Gluten and Psychopathy

Hi Stevie,

For question #1, if you look at the PDF available on the web, it says:

“If a cooked foodstuff is eaten along with the same product in a raw state there is no reaction.

The raw product has neutralized the action which this same product, with its critical temperature surpassed, would have called forth. In other words, the raw product has, so to say, reestablished the virtues of the product altered by a high temperature. Such a reestablishment is also possible when two different products have been absorbed, but with one condition; their critical temperature must either be the same, or else the critical temperature of the raw product must be higher than the critical temperature of the overheated one.

If the critical temperature of a raw product is lower than that of the overheated one, the reaction is sure to take place; in this case, even the augmentation of the quantity of the raw products does not help.

This law remains the same when the raw product is mixed with several overheated ones of the same critical temperature.

If several cooked foodstuffs are taken, each with a different critical temperature, along with raw food, reaction takes place, even if the raw product has a higher critical temperature than that of any of the cooked foodstuffs.” (I hate protected PDFs!)

So, what I don’t understand is how, for example, eating raw apples (197 F) with overcooked cabbage and bananas (192 F) would prevent an inflammatory reaction. In fact, I’m surprised that foods like figs, potatoes, and oranges didn’t cause immunoreactions in some people; raw or cooked, they make me feel miserable.
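
Just to get the quoted rules straight in my own head, they boil down to a simple decision procedure. Here is a minimal sketch that treats the rules literally as stated; the function name and the reading of "cabbage and bananas" as sharing one critical temperature are my own assumptions, not anything from the paper:

```python
def reaction_expected(raw_temp_f, cooked_temps_f):
    """Apply the quoted rules: does eating this raw item together with these
    overheated items still produce a "reaction" (digestive leukocytosis)?"""
    distinct = set(cooked_temps_f)
    if len(distinct) > 1:
        # Last quoted rule: several cooked items with *different* critical
        # temperatures always react, even if the raw item's is higher.
        return True
    (cooked_temp,) = distinct
    # A single cooked critical temperature (one item, or several sharing it):
    # neutralized only if the raw critical temperature is equal or higher.
    return raw_temp_f < cooked_temp

# My worked example, reading cabbage and bananas as both 192 F:
print(reaction_expected(197, [192, 192]))  # False: no reaction predicted
# If their critical temperatures actually differ, the last rule applies:
print(reaction_expected(197, [192, 188]))  # True: reaction predicted anyway
```

So by the paper's own rules, the apples/cabbage/bananas case only works out to "no reaction" if the two cooked items really do share one critical temperature; the 188 F figure above is purely hypothetical.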

I am also curious if eating these overcooked foods at a lower temperature would produce the same response, especially water. The chemical properties of water don’t change at all through boiling. Perhaps the body experiences a “mini fever” when eating/drinking hot products, “tricking” the body into producing an immune response?

It’s hard to take the paper’s claims into consideration when the author does not describe in detail how exactly he measured the WBC count, among other technical details.

Please note that I am not trying to discredit a raw diet; I would just like to see more supporting evidence. Thanks for the book suggestion. I will check it out the next time I’m at the bookstore. I may end up trying the diet soon to see if I can survive at college; no home cooking allowed!

As far as your humor goes, I think that we should try to explore as many angles as we can to find answers for the topic. So I don’t mind if you mean to be tongue-in-cheek about things; don’t expect me to pick up on it, though! :) Thank you.
 
Stevie Argyll said:
zlyja

I am back looking at another hypothesis.
The cooked food factor
In 1930, research was conducted to demonstrate the effect of food (cooked/processed vs. raw/natural) on the immune system. It was tested and documented at the Institute of Clinical Chemistry in Lausanne, Switzerland, under the direction of Dr. Paul Kouchakoff.

Probably the best way to look at food - what people eat and how they eat it, and what large-scale effects it has - is to look at it archaeologically. Martin Jones, a professor of archaeological science at Cambridge, specializing in the study of the remains of early food, wrote an interesting book entitled "Feast: Why Humans Share Food."

book blurb said:
When you next share a meal with family or friends, ask yourself why humankind partakes of a ritual that to most species would be anathema. The human tendency to sit together peacefully over food is an extraordinary phenomenon. ... how did this strange and powerful behaviour become part of the human way of life?

When you think about it, that may be a key to a lot of things: the sharing of food/resources. Think of "The Last Supper" and its significance. That is an idea that has exercised me to no end. What was so blasted important about sharing a meal that it became the centerpiece of a religion? Was there something about it that was older than Christianity as we know it?

I know what Gurdjieff and others had to say about it - the esoteric meaning - but frankly, that isn't satisfactory. I think it may be a ritual that was handed down as a clue to reveal who was and was not pathological. Those who share willingly and generously are not; those who don't, those who seek to assimilate resources, to "own" things, are.

For most species, fire spells threat. So does direct eye contact and the opening of the mouth and the exposure of teeth. Combine these elements with the placing of food between a group of individuals other than parent and child, and you have a recipe for conflict and violence in any other species.

At some point, some group of humans turned all of these signals around and transformed them into the very essence of humanity. This change in behavior is evidenced by central hearths with clear spaces around them where diners gather in a circle. In the sharing of food and the accompanying courtesies, humans exhibit their fundamental distinction and separation from the animal kingdom.

So, was the "last supper" something like: "eat together in remembrance of me - you are not animals, you are spiritual beings."

Of all the things that humans do, sharing food is the LEAST "evolutionary." No other species followed this path of mutual sharing and support which required being able to conceptualize space and time abstractly. What logic of nature, as we witness it in the animal kingdom, would follow such a trajectory so foreign to the natural world?

These issues penetrate to the most fundamental differences between views about humanity and the world at large. Perhaps they bear witness to those having a soul and those who do not?

But, getting back to cooking food: The earliest evidence of cooking goes back as far as 180,000 years, and there are a few other candidates for signs of cooking that go back even further - at a number of sites, they are attested as far back as 500,000 years. Certainly, Neanderthal was cooking his food, though he did not have central hearths and left no evidence of free sharing of food. The evidence at Neanderthal sites is more like the sharing systems of apes and chimpanzees, who DO share food in a hierarchical sort of way. There is limited evidence that they cared for the elderly and infirm, and a lot of evidence that women and children (and the infirm and elderly) were given the offal and less desirable parts to eat; but it WAS a primitive sharing, if you want to call it that.

What is cooking for?

The entire rest of the animal kingdom seems to do quite well without it - in fact, seems to prefer to do without it. Humans enjoy a lot of raw food and we even consider some of these to be haute cuisine.

Among the MATERIAL changes brought about in the archaeological record by cooking is the changing size of teeth and crania in fossil remains. As teeth got smaller, brains got bigger. It seems that cooking serves to reduce the COST of digestion.

Alongside the fossil record of teeth getting smaller and brains getting bigger is the modern physiological record of living mammals, which reveals a corresponding pattern in the organs of the gut. Not only are human teeth small in relation to overall body size, we also have a relatively small gut. Our bodily eating machinery has shrunk, just as the brain has grown. Shrinking guts may have a great deal to do with growing brains.

Different parts of our bodies do different kinds of work and use up a lot of energy doing so. Some tissues have relatively low running costs; other organs - the liver, gut and brain - burn up a lot of energy. There is a limit to how big and active they can get and how efficiently they can run, determined by how much food any animal can consume and release as sugar/nutrients. By looking across the mammal kingdom and measuring the mass of different organs, one can arrive at a mammalian norm. When we compare that norm with the respective masses in our own bodies, we discover that the modern human brain is two to three times as large as what would be predicted from looking at mammals as a group.

When we think about this, we naturally have to think about the energy economy of the body: the need to balance the books in terms of energy. With a big brain that uses a lot of sugar/nutrients, something else has got to get smaller. The two other big spenders of energy in the body are the liver and the gut. The liver can't get smaller because it is tasked with managing the chemical balance of the body, detoxing, etc. So it seems that, in humans, it was the gut that got smaller, becoming half the size of that which would be predicted by mammalian norms.
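
To put rough numbers on that trade-off, here is a toy version of the arithmetic (the "expensive tissue" argument). The masses and per-kilogram running costs below are approximate illustrative assumptions of mine, not figures from Jones' book:

```python
# Toy energy-budget arithmetic for the brain/gut trade-off described above.
# All numbers are rough illustrative assumptions, not data from the text.

W_PER_KG_BRAIN = 11.0   # assumed resting cost of brain tissue, watts per kg
W_PER_KG_GUT = 12.0     # assumed resting cost of gut tissue, watts per kg

expected = {"brain_kg": 0.45, "gut_kg": 1.9}   # rough mammalian norm for a ~65 kg body
observed = {"brain_kg": 1.30, "gut_kg": 1.1}   # rough modern-human values

extra_brain_cost = (observed["brain_kg"] - expected["brain_kg"]) * W_PER_KG_BRAIN
smaller_gut_savings = (expected["gut_kg"] - observed["gut_kg"]) * W_PER_KG_GUT

print(f"extra brain cost    ~ {extra_brain_cost:.1f} W")     # roughly 9 W
print(f"smaller-gut savings ~ {smaller_gut_savings:.1f} W")  # roughly 10 W
```

On those assumed numbers the two figures nearly cancel, which is the whole point: the enlarged brain is paid for almost entirely by the shrunken gut, and the total resting budget stays about the same.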

So, archaeologically speaking, the record seems to show that the change to cooking enabled humans to grow massive brains by reducing the size and energy consumption of the gut.

The really economic nature of this can be considered in perspective: a cow turns grass into beef with a suite of four specialist stomachs, one with an army of bacterial guest-workers that also need feeding. The grass can be chemically dismantled almost entirely.

The human gut does a poor job with fiber. Yes, fiber can sweep out our gut because it remains undigested, but it is very poor at meeting our nutritional needs.

Digesting meat is not as energetically costly as digesting cellulose. Animal tissue can supply energy and protein in a concentrated and unoccluded form. Another way to ease the workload of the gut is to do some of the digestion OUTSIDE the body: cooking.

Digestion is a chemical process in which large, indigestible molecules are turned into small, easily absorbed molecules. It can be achieved in a variety of ways, including fermentation: getting yeast and bacteria to do part of the job of digestion first.

Brain capacities have been measured or estimated for a broad range of hominid fossils and it is possible to track key periods in which the brain increased in size substantially. Obviously, we can't measure the gut in fossils but we CAN measure the teeth. Reduction in tooth size very likely marks the easing of pressure on internal digestion.

A surge in brain size coincides with the earliest evidence of ashes, burnt bone, and charcoal in conjunction with hominids, and the subsequent appearance of hearths coincides with tooth reduction as well.

Cooking required a certain amount of brainpower and brainpower required cooking. Which drove the development?

It seems that stresses in the environment demanded solutions and finding solutions exercised the brain and those who could find solutions - including cooking - survived. The activities involved with getting food in a stressful environment placed demands on intelligence and the brain and quite possibly, the most imperative demand was that of cooperation and social cohesion which demanded the expansion of the neo-cortex. This could only happen with the reduction of something else: the gut. In short, the link between brain complexity and external digestion created a positive feed-back loop. The more effective external digestion became in reducing demands on the gut, the larger the potential size of the brain. Larger brains enabled more external digestion ideas/technologies, enhancing the cycle.

The building of a brain requires a LOT of protein, a great deal of energy (sugar), and an extensive suite of vitamins and minerals. In primates, this work is done by the placenta before birth and at birth, the great majority of its brain growth has already been accomplished. Most mammals give birth to "toddlers" that are able to take on some fairly grown-up tasks within minutes of birth. Human babies have a lot of growing to do still. So, a large part of brain growth in humans depends on mother's milk and, ultimately, the diet of the mother at a stage in her life when she is least able to compete for food.

Beyond proteins and sugars, a lot of brain growth and development depends on fats and rather specific fats at that. One of the most common fatty acids in myelin - the sheathing of neurons - is oleic acid which is the most abundant fatty acid in human milk. Two other fats crucial to optimal development of the brain and eyes are called DHA and AA. DHA is the most abundant fat in the brain. Mother and child can manufacture both DHA and AA from two other fats, but these two source fats cannot be synthesized in the human body. They have to be consumed as part of the diet. That is why they are called "essential fatty acids" (EFAs).

EFAs are "nature's antifreeze." They are important in keeping the flesh of cold ocean fish supple however cold the environment. Thus, the oil from cold water ocean fish is one of the best sources of EFAs. In the plant world, they are found in leafy vegetables and oily seeds such as walnut, hemp, and flax.

Not only do we need EFAs for our brain, we need them in the right balance: there are two kinds: omega 3 EFAs and omega 6 EFAs. A mixed diet of meat, fish and vegetables will take care of this balance. Too much meat without balancing fish and veggies, and you get too many omega 6 EFAs.
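
If you want to make the "balance" concrete, the usual way is to tally the omega-6 to omega-3 ratio over a day's foods. A minimal sketch, with entirely made-up gram figures standing in for real composition data:

```python
# Toy omega-6 : omega-3 tally over a day's foods.
# The gram figures are made-up placeholders, not real composition data.
foods = [
    # (name, omega-6 grams, omega-3 grams)
    ("grain-fed meat", 4.0, 0.3),
    ("oily fish",      0.3, 2.5),
    ("leafy greens",   0.1, 0.4),
]
omega6 = sum(o6 for _, o6, _ in foods)
omega3 = sum(o3 for _, _, o3 in foods)
print(f"omega-6 : omega-3 ratio ~ {omega6 / omega3:.1f} : 1")
```

Drop the fish and greens from that list and the ratio shoots up, which is the "too much meat without balancing fish and veggies" problem in miniature.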

Interestingly, one source of fat that has the right balance of EFAs is the soft, oily fat of the horse.

Cooking breaks down long-chain carbohydrates and other related molecules to simpler sugars. It also generates a series of more complicated molecules by chemically linking carbohydrates and proteins.

Cooking tenderizes meat and generally softens food and this is of benefit to two vulnerable groups of people: the very young and the elderly.

Cooking also alters the nutritional quality of the food. The thermal breakdown of molecules does not just soften food; it also causes those molecules to be rearranged, sometimes for the better in nutritional terms, sometimes for the worse. In the case of plant foods, cooking performs the important task of breaking down toxic substances. Toxins are widespread in vegetation and are one of the principal means of a plant's natural defence against being eaten (lectins, for example).

Even when cooking has a mixed or negative impact on overall nutritional quality, it has a positive effect on energetic costs of consumption. Cooking greatly enhances the profit margin by taking care of much of the chemical disassembly of the food outside the body and is, in the end, the key to the evolutionary advantages.

So, that is the long-view.
 
Hi Laura

I am aware of the cooked theory and also of the fact that cooked grains/cereals produce a mass of carbohydrate energy stocks, leaving free time from foraging. Some have posited that increased tool use led to bigger brains.
There is another theory, in the book Left In The Dark, which explains the expansion of the brain in primates by a move from an insect diet to a fruit/forage diet. The fruit diet led to steroidal suppression (a side effect of the meatless diet being limited aggression) - I can't remember all the details (the book is packed in a box somewhere), but his hypothesis was that this chemical change led to an increase in brain growth from around 600,000 years ago until 200,000 to 150,000 years ago, when brain growth stopped. He hypothesises that cooked food and carnivorous dietary changes led to the arrest in brain growth, and states that the brain has been shown to have shrunk by 5% since the introduction of the cooked-food diet. Meat consumption led to increased aggression due to steroidal effects. _http://leftinthedark.org.uk/ is the website.
 
Stevie Argyll said:
Hi Laura

I am aware of the cooked theory and also of the fact that cooked grains/cereals produce a mass of carbohydrate energy stocks, leaving free time from foraging. Some have posited that increased tool use led to bigger brains.
There is another theory, in the book Left In The Dark, which explains the expansion of the brain in primates by a move from an insect diet to a fruit/forage diet. The fruit diet led to steroidal suppression (a side effect of the meatless diet being limited aggression) - I can't remember all the details (the book is packed in a box somewhere), but his hypothesis was that this chemical change led to an increase in brain growth from around 600,000 years ago until 200,000 to 150,000 years ago, when brain growth stopped. He hypothesises that cooked food and carnivorous dietary changes led to the arrest in brain growth, and states that the brain has been shown to have shrunk by 5% since the introduction of the cooked-food diet. Meat consumption led to increased aggression due to steroidal effects. _http://leftinthedark.org.uk/ is the website.

Ummm... having checked out the above website and related materials, I don't think that this theory is actually in the running for an explanation of much of anything. Fact is, brain size was reduced by the introduction of agriculture as the archaeology shows.

This guy, Tony Wright, sounds like a nutcase. See here:
http://en.wikipedia.org/wiki/Tony_Wright_%28sleep_deprivation%29

One example of his nuttiness: "Wright proposes that once we strayed from tropical fruit diets, the biochemistry was simply no longer present to support optimal neurological development."

Fact is, it seems that the largest brains and most "human" behavior originated in the cold, cold north with the eating of meat.
 
Laura said:
Stevie Argyll said:
Hi Laura

I am aware of the cooked theory and also of the fact that cooked grains/cereals produce a mass of carbohydrate energy stocks, leaving free time from foraging. Some have posited that increased tool use led to bigger brains.
There is another theory, in the book Left In The Dark, which explains the expansion of the brain in primates by a move from an insect diet to a fruit/forage diet. The fruit diet led to steroidal suppression (a side effect of the meatless diet being limited aggression) - I can't remember all the details (the book is packed in a box somewhere), but his hypothesis was that this chemical change led to an increase in brain growth from around 600,000 years ago until 200,000 to 150,000 years ago, when brain growth stopped. He hypothesises that cooked food and carnivorous dietary changes led to the arrest in brain growth, and states that the brain has been shown to have shrunk by 5% since the introduction of the cooked-food diet. Meat consumption led to increased aggression due to steroidal effects. _http://leftinthedark.org.uk/ is the website.

Ummm... having checked out the above website and related materials, I don't think that this theory is actually in the running for an explanation of much of anything. Fact is, brain size was reduced by the introduction of agriculture as the archaeology shows.

This guy, Tony Wright, sounds like a nutcase. See here:
http://en.wikipedia.org/wiki/Tony_Wright_%28sleep_deprivation%29

One example of his nuttiness: "Wright proposes that once we strayed from tropical fruit diets, the biochemistry was simply no longer present to support optimal neurological development."

Fact is, it seems that the largest brains and most "human" behavior originated in the cold, cold north with the eating of meat.

His sleep deprivation study had something to do with investigating spiritual traditions' use of this.

He said something along the lines that the smart monkeys were fruitarians, and that the flavonoid/nutrient effect on the pineal gland increases hormone output such as melatonin, etc., which had a feedforward effect. When I have finished my renovations I will dig it out and find out what his references were, if any - maybe it was a product of a sleep-deprived brain :)
 
Stevie Argyll said:
He said something along the lines that the smart monkeys were fruitarians, and that the flavonoid/nutrient effect on the pineal gland increases hormone output such as melatonin, etc., which had a feedforward effect. When I have finished my renovations I will dig it out and find out what his references were, if any - maybe it was a product of a sleep-deprived brain :)

Ummm... he obviously hasn't done many monkey studies or read them. That's one thing I've been reading a LOT of, trying to get a handle on what is truly "human" and what is not.
 
wetroof said:
Supposedly there is such a thing as lactose-free milk, so maybe we could drink that.

Also, I'm just wondering... I drank tons and tons of milk as a kid; is it possible my body would keep producing lactase because of the constant consumption of milk?

I remember when I was younger, like 4-5, I was diagnosed as lactose intolerant? Or maybe my mom just thought so? Anyway, I drank soy milk for a while, but then I drank regular milk again and kept drinking it.

I have drunk a lot of milk ever since with no problem... like a gallon a week. Now I have cut back to 1/4 of a gallon. Maybe I will start trying to observe some effect it has on me now that I am drinking less.

Stores in Austria now sell lactose-free milk. My sister bought it the other day, and on the carton it explains that lactose-free simply means that the lactose has been split into glucose and galactose, which makes the milk taste a bit sweet. But as far as I understand, that doesn't solve any of the other problems dairy products may cause.

My first reaction to the mainstream producers now offering lactose-free milk was quite positive, because I took it as a sign of increasing awareness regarding dairy and health issues. Now I am wondering if this is not just damage limitation, in the sense that people won't stop buying milk altogether; they'll rather go for the "healthier" product, which really isn't any healthier.

Kind regards,
Finduilas
 
The issue that concerns us is not lactose but rather the protein, casein, which, as it happens, also binds with the opiate receptors in the gut, producing the same effects as gluten. There are other issues, too, of course, and from our point of view, there is no reason to drink milk and a lot of reasons to cut out all dairy entirely.
 
Thank you for your insights on the development of man, Laura! I have never thought about the inverse size of the brain and the gut being related to more efficient energy consumption and thus intelligence.

I’ve found a little bit that may be related to the topic. Not much to do with other prolamins, though. Apologies if this may have been mentioned elsewhere:

I found an article [_http://www.psychologytoday.com/blog/the-moral-molecule/200906/moral-sentiments-in-the-brain] that mentioned how oxytocin increases empathy, while high testosterone decreases it. I find that very interesting, yet contradictory to the topic, since gluten has been associated with increased estrogen. Zinc is believed to regulate testosterone in people [_http://www.ncbi.nlm.nih.gov/pubmed/8875519], and both kinds of violent offenders listed in the article mentioned earlier reported low plasma levels of zinc. Additionally, estrogen may correspond to serum copper [_http://www.ncbi.nlm.nih.gov/pubmed/18338309]. The “Type A” offenders thus might improve on a gluten-free diet to lower their estrogen levels, but I am not sure how to interpret the “depressed copper to sodium ratio” along with the low plasma zinc in the “Type B” patients.

For reference:
“""Walsh studied 24 pairs of brothers. Each pair lived together, and included one violent, delinquent sibling and one sibling with no academic or behavioral problems. Hair samples taken from the non-delinquent siblings revealed no abnormalities, while samples from the delinquent siblings showed two markedly abnormal patterns. One pattern of biochemical abnormalities ("type A") was seen in subjects who exhibited episodic violence, while another ("type B") was found in psychopathic subjects who showed no conscience or remorse, were pathological liars, and often tortured animals or set fires as children.
A controlled study by Walsh et al. of 192 violent and non-violent males found the same pattern: 92 of 96 violent subjects had type A or type B biochemical profiles, while only five of the 96 non- violent subjects had abnormal profiles.
According to Walsh, type A subjects have elevated serum copper, depressed plasma zinc, high blood lead levels, and abnormal blood histamine levels. Hair analysis reveals an elevated copper- to-sodium ratio that Walsh calls "quite striking." Type B subjects have elevated blood histamine, low plasma zinc, and elevated lead levels, and hair samples show a depressed copper-to-sodium ratio. Walsh also has identified "type C" and "type D" profiles associated with low-to-moderate levels of aggression.
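
To keep the two quoted profiles straight for myself, here is a rough sketch that restates them as a lookup. The quote only gives qualitative markers ("elevated", "depressed", "abnormal"), so the inputs are flags judged elsewhere rather than lab values; the function and its branching are my paraphrase of the quote, not Walsh's actual method:

```python
# Sketch restating the quoted Type A / Type B marker patterns as a lookup.
# Inputs are pre-judged flags ("high"/"low"/"normal"/"abnormal"); the quote
# gives no numeric thresholds, so none are assumed here.

def walsh_profile(serum_copper, plasma_zinc, blood_lead, histamine, hair_cu_na):
    if plasma_zinc == "low" and blood_lead == "high":   # shared by both profiles
        if serum_copper == "high" and hair_cu_na == "high":
            return "Type A (episodic violence pattern, per the quote)"
        if histamine == "high" and hair_cu_na == "low":
            return "Type B (psychopathic pattern, per the quote)"
    return "no match to the quoted A/B patterns"

print(walsh_profile("high", "low", "high", "abnormal", "high"))  # -> Type A
print(walsh_profile("normal", "low", "high", "high", "low"))     # -> Type B
```

The main thing this makes obvious is that the hair copper-to-sodium ratio runs in opposite directions in the two groups, which is the detail that distinguishes them most sharply in the quote.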

Then there’s this [_http://www.ncbi.nlm.nih.gov/pubmed/14675747] study relating to thyroid hormones:

“This study is a follow-up investigation of a forensic psychiatric sub-population 6–8 years after forensic psychiatric evaluation. The aim was to examine the long-term validity of biological markers of psychopathy and antisocial behavior over time. Data on criminal records were obtained at follow-up from the National Council for Crime Prevention. Basic data included findings of psychiatric and psychological assessments, as well as values for serum triiodothyronine (T3) and free thyroxin (FT4), and platelet monoamine oxidase (MAO) activity, all obtained during the forensic psychiatric examination. Criminal recidivists at follow-up had higher serum T3 levels than non-recidivists, and much higher values than normal controls, while their levels of free T4 were lower. The T3 levels in criminal recidivists correlated to psychopathy- and aggression-related personality traits as measured by the Karolinska Scale of Personality. In violent recidivists, a remarkably high correlation was noted between T3 levels and Irritability and Detachment, traits that have previously been linked to the dopaminergic system. Stepwise multiple regression analyses confirmed the relationships of T3 levels and platelet MAO activity with personality traits in criminal recidivists. The predictive validity of biological markers of psychopathy, T3 and platelet MAO, measured during forensic psychiatric investigation, is stable over time. The results indicate chronic alterations of the hypothalamic–pituitary–thyroid axis in this group of subjects.”

Wiki:
“Production of T3 and its prohormone thyroxine (T4) is activated by thyroid-stimulating hormone (TSH), which is released from the pituitary gland. This pathway is regulated via a closed-loop feedback process: Elevated concentrations of T3, and T4 in the blood plasma inhibit the production of TSH in the pituitary gland. As concentrations of these hormones decrease, the pituitary gland increases production of TSH, and by these processes, a feedback control system is set up to regulate the amount of thyroid hormones that are in the bloodstream.

“T3 increases the basal metabolic rate and, thus, increases the body's oxygen and energy consumption. The basal metabolic rate is the minimal caloric requirement needed to sustain life in a resting individual. T3 acts on the majority of tissues within the body, with a few exceptions including the spleen and testis. It increases the production of the Na+/K+ -ATPase and, in general, increases the turnover of different endogenous macromolecules by increasing their synthesis and degradation.
Protein
T3 stimulates the production of RNA Polymerase I and II and, therefore, increases the rate of protein synthesis. It also increases the rate of protein degradation, and, in excess, the rate of protein degradation exceeds the rate of protein synthesis. In such situations, the body may go into negative ion balance.
Glucose
T3 potentiates the effects of the β-adrenergic receptors on the metabolism of glucose. Therefore, it increases the rate of glycogen breakdown and glucose synthesis in gluconeogenesis. It also potentiates the effects of insulin, which have opposing effects.
Lipids
T3 stimulates the breakdown of cholesterol and increases the number of LDL receptors, therefore increasing the rate of lipolysis.
T3 also affects the cardiovascular system. It increases the cardiac output by increasing the heart rate and force of contraction. This results in increased systolic blood pressure and decreased diastolic blood pressure. The latter two effects act to produce the typical bounding pulse seen in hyperthyroidism.
T3 also has profound effect upon the developing embryo and infants. It affects the lungs and influences the postnatal growth of the central nervous system. It stimulates the production of myelin, neurotransmitters, and axon growth. It is also important in the linear growth of bones.
Neurotransmitters
T3 may increase serotonin in the brain, particularly in the cerebral cortex, and down-regulate 5HT-2 receptors, based on studies in which T3 reversed learned helplessness in rats and physiological studies of the rat brain.[5]”
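
The closed-loop regulation described in the first Wiki paragraph is easier to see as a toy simulation: a low circulating hormone level raises TSH, TSH drives thyroid output, and the rising hormone level suppresses TSH again. All of the constants below are arbitrary illustrative values, not physiological parameters:

```python
# Toy discrete-time sketch of the TSH <-> T3/T4 negative feedback loop
# described in the excerpt above. Constants are arbitrary and illustrative.

set_point = 1.0   # hormone level at which TSH release is fully suppressed
gain = 0.5        # how strongly the pituitary responds to a shortfall
clearance = 0.2   # fraction of circulating hormone cleared each step

hormone = 0.1     # start well below the set point
for step in range(10):
    tsh = max(0.0, gain * (set_point - hormone))  # shortfall -> more TSH
    hormone += tsh - clearance * hormone          # TSH-driven output minus clearance
    print(f"step {step}: TSH={tsh:.2f}  T3/T4={hormone:.2f}")
```

The level settles at a steady state where TSH-driven output just offsets clearance; it is only a cartoon of the negative feedback the excerpt describes, not a model of the real axis.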

Just to note: I think the MAO that they refer to is MAO-B (has to do with blood platelets), and not MAO-A aka the enzyme related to the warrior gene.

Psychopaths have defects in the hypothalamus and pituitary gland (remembering that oxytocin is released from the pituitary gland), right? Well, then, that can help to explain these elevated levels of T3 in the blood. It’s strange that T3 is supposed to increase serotonin, though, since psychopaths are known to have more dopamine than serotonin in their bodies. Also, cortisol (the stress hormone) supposedly inhibits the conversion of T4 into T3, and I believe it has been mentioned somewhere that psychopaths tend to have lower cortisol levels than most people.

The bit about glucose and T3’s increase of an insulin response is also very interesting, as too much insulin can be a cause of hypoglycemia, which Dr. Blaylock associated with increased violence.

There’s also the bit about T3 breaking down cholesterol, which calls back to the blog post mentioned earlier about low HDL in psychopaths.

Just some pure speculation on the bit about the increased rate of protein degradation: maybe the consumption of wheat, with its high concentration of nonessential amino acids (such as glutamic acid) encourages the decrease in natural protein synthesis? Then again, that could probably happen if you eat too much of any nonessential protein. Also, I don’t really know what they mean when they say that decreased protein will create a “negative ion balance”. All I see is some stuff for crystal bracelets and air ionizers.

I have read that celiacs often have thyroid problems that do not subside until they get off the gluten. This article [http://thyroid.about.com/cs/latestresearch/a/celiac.htm] mentions that a gluten-free diet has helped people with both under-active and over-active thyroids. For reference, the low TSH and high T3 might indicate hyperthyroidism in the psychopaths from the study.

And here’s some other interesting stuff, although I’m not sure if it can relate to psychopathy:

From Aragorn http://www.cassiopaea.org/forum/index.php?topic=9796.msg72850#msg72850:
i. Gliotoxin is highly toxic to SH-SY5Y cells and there is a correlation between the toxicity and the cellular redox status.

ii. Gliotoxin reduces the number of neurites, but does not affect the cell bodies morphologically, at non-cytotoxic concentrations. This indicates that the toxin may induce peripheral axonopathy in vivo.

iii. The intracellular free Ca2+ concentration is increased after exposure to gliotoxin, an effect that is the most ubiquitous feature of neuronal cell death. Simultaneously, calpains and caspases, proteases known to be involved in neuronal death and axonal degeneration, are activated.

iv. The observed irreversible neurite degenerative effects of gliotoxin are mainly dependent on caspase activation, whereas calpains are involved in the gliotoxin-induced cytotoxicity.

v. Gliotoxin induces a decreased rate of protein synthesis at noncytotoxic concentration, which may contribute to the degeneration of neurites

(Axelsson 2006, 5)

_www.diva-portal.org/diva/getDocument?urn_nbn_se_su_diva-1312-2__fulltext.pdf

I found the above interesting in light of how gastrin releasing peptide increases calcium ion concentration, how the GRP response is increased in celiacs, and the bit about glutamate opening up calcium channels. There was another study that did not reproduce the finding of candida producing gliotoxin, but it is commonly accepted that the mold Aspergillus fumigatus produces gliotoxin. There is an interesting paper [“Aspergillus fumigatus toxicity and gliotoxin levels in feedstuff for domestic animals and pets in Argentina”] that says that this mold contaminates grains all the time, with the exception of well-fermented corn silage. They suggest that, while the mold is still present in the silage, it does not produce gliotoxins in the acidic environment.

“Boudra and Morgavi (2005) demonstrated, in in vitro studies, that acidic conditions and agitation exert a negative effect on gliotoxin production. However, the presence of the fungus does not imply the presence of mycotoxins, just as the presence of mycotoxins does not imply the presence of the fungus. Environmental conditions that allow fungal growth are not always the same as those allowing mycotoxin production (CAST 2003).”

I wonder if strains of other microbes may have something to do with it.

Also, this page [http://www.ei-resource.org/illness-information/environmental-illnesses/candida-and-gut-dysbiosis/] mentions that allergic reactions to A. fumigatus increase with candida overgrowth. There’s also a reference to a paper saying that the structure of gluten is similar to that in the cellular wall of C. albicans, which it says could explain gluten sensitivity. Unfortunately, the original paper is in Russian and I can’t make heads or tails of the English abstract [http://www.ncbi.nlm.nih.gov/pubmed/15646124].

There was a study [http://www.reuters.com/article/idUSCOL07536520080130] in which Aspergillus niger prolyl endoprotease broke down gluten and gliadin particles as well. They said it was promising, but in a follow-up study, some people had problems (can’t find the URL anymore):

[Followup: Can Prolyl Endoprotease Enzyme Treatment Mitigate the Toxic Effect of Gluten in Coeliac Patients?
Greetje J. Tack, Jolanda M. van de Water, E. M. Kooy-Winkelaar, Jeroen van Bergen, Gerrit A. Meijer, Boudewina M. von Blomberg, Marco W. Schreurs, Maaike J. Bruins, Luppo Edens, Chris J. Mulder, Frits Koning

Background: Coeliac disease is caused by a small intestinal inflammatory T-cell response to poorly digested proline-rich peptides from gluten. The enzyme, Aspergillus niger-derived prolyl endopeptidases (AN-PEP), appeared highly effective in degrading proline-rich gluten epitopes in vitro. Therefore, this enzyme might be a promising new treatment for coeliac patients.

Aim/methods: Safety, feasibility and efficacy of AN-PEP in coeliac patients was determined in a randomised double-blind semi-cross-over pilot study. Patients consumed a 7 g gluten-containing breakfast with AN-PEP-containing jam for 2 weeks (1st phase). After a two-week wash-out period (2nd phase), patients were randomised to gluten with either AN-PEP (n=7) or placebo (n=7) for 2 weeks (3rd phase). Before and after the 1st phase and after the 3rd phase, measurements were performed. Biopsies were taken to determine severity of intestinal damage (Marsh classification; stage M0 - normal mucosa, stage MIIIC - total villous atrophy), and deterioration was defined as an increase of ≥2 Marsh grades. In addition, serum endomysial (EMA), tissue transglutaminase (tTGA), and IgA (GAA) and IgG (GAG) gliadin antibodies were determined. Also, quality of life (QoL) questionnaires were collected.

Results: At inclusion, all patients (n=16) had negative antibodies and M0/I lesions. Only 2 of 16 patients were excluded after the 1st period due to an increase of 2 Marsh grades. No significant changes in antibodies were seen on gluten plus AN-PEP except for 2 patients showing weakly positive GAA. The QoL gastrointestinal problem subscale did not change, suggesting that AN-PEP plus gluten was well-tolerated. Also no changes were seen on the subscales worries, social, and emotional. During the 3rd randomization period, only 2 of 7 patients on placebo deteriorated to grade MIIIA. Only 1 of these 2 patients showed positive GAA and GAG antibodies. In the AN-PEP group, only 1 of 7 patients developed a MIIIA, showing weakly positive GAA. EMA and tTGA did not significantly change throughout the whole study. During the 3rd period, no change was found on the QoL subscales worries and social. Both placebo and AN-PEP groups showed minor deterioration on the subscales emotional and gastrointestinal.

Conclusion: AN-PEP combined with a gluten challenge is safe and well-tolerated. Since only 2 of 7 subjects on placebo significantly deteriorated on a 2-wk gluten challenge, probably a longer gluten intake is needed to detect significant serological and histological changes.]

When I find more info, I'll post it here. Thanks. :)
 
In reference to the text I discussed above, see:

Food For Thought: Meat-Based Diet Made Us Smarter

http://www.npr.org/templates/story/story.php?storyId=128849908

Notice the date posted! Funny!

The linked article has some images worth checking out.

August 2, 2010

Our earliest ancestors ate their food raw — fruit, leaves, maybe some nuts. When they ventured down onto land, they added things like underground tubers, roots and berries.

It wasn't a very high-calorie diet, so to get the energy you needed, you had to eat a lot and have a big gut to digest it all. But having a big gut has its drawbacks.

"You can't have a large brain and big guts at the same time," explains Leslie Aiello, an anthropologist and director of the Wenner-Gren Foundation in New York City, which funds research on evolution. Digestion, she says, was the energy-hog of our primate ancestor's body. The brain was the poor stepsister who got the leftovers.

Until, that is, we discovered meat.

"What we think is that this dietary change around 2.3 million years ago was one of the major significant factors in the evolution of our own species," Aiello says.

That period is when cut marks on animal bones appeared — not a predator's tooth marks, but incisions that could have been made only by a sharp tool. That's one sign of our carnivorous conversion. But Aiello's favorite clue is somewhat ickier — it's a tapeworm. "The closest relative of human tapeworms are tapeworms that affect African hyenas and wild dogs," she says.

So sometime in our evolutionary history, she explains, "we actually shared saliva with wild dogs and hyenas." That would have happened if, say, we were scavenging on the same carcass that hyenas were.

But dining with dogs was worth it. Meat is packed with lots of calories and fat. Our brain — which uses about 20 times as much energy as the equivalent amount of muscle — piped up and said, "Please, sir, I want some more."

Carving Up The Diet


As we got more, our guts shrank because we didn't need a giant vegetable processor any more. Our bodies could spend more energy on other things like building a bigger brain. Sorry, vegetarians, but eating meat apparently made our ancestors smarter — smart enough to make better tools, which in turn led to other changes, says Aiello.

"If you look in your dog's mouth and cat's mouth, and open up your own mouth, our teeth are quite different," she says. "What allows us to do what a cat or dog can do are tools."

Tools meant we didn't need big sharp teeth like other predators. Tools even made vegetable matter easier to deal with. As anthropologist Shara Bailey at New York University says, they were like "external" teeth.

"Your teeth are really for processing food, of course, but if you do all the food processing out here," she says, gesturing with her hands, "if you are grinding things, then there is less pressure for your teeth to pick up the slack."

Our teeth, jaws and mouth changed as well as our gut.

A Tough Bite To Swallow

But adding raw meat to our diet doesn't tell the whole food story, according to anthropologist Richard Wrangham. Wrangham invited me to his apartment at Harvard University to explain what he believes is the real secret to being human. All I had to do was bring the groceries, which meant a steak — which I thought could fill in for wildebeest or antelope — and a turnip, a mango, some peanuts and potatoes.

As we slice up the turnip and put the potatoes in a pot, Wrangham explains that even after we started eating meat, raw food just didn't pack the energy to build the big-brained, small-toothed modern human. He cites research that showed that people on a raw food diet, including meat and oil, lost a lot of weight. Many said they felt better, but also experienced chronic energy deficiency. And half the women in the experiment stopped menstruating.

It's not as if raw food isn't nutritious; it's just harder for the body to get at the nutrition.

Wrangham urges me to try some raw turnip. Not too bad, but hardly enough to get the juices flowing. "They've got a tremendous amount of caloric energy in them," he says. "The problem is that it's in the form of starch, which unless you cook it, does not give you very much."

Then there's all the chewing that raw food requires. Chimps, for example, sometimes chew for six hours a day. That actually consumes a lot of energy.

"Plato said if we were regular animals, you know, we wouldn't have time to write poetry," Wrangham jokes. "You know, he was right."

Tartare No More

One solution might have been to pound food, especially meat — like the steak I brought. "If our ancestors had used stones to mash the meat like this," Wrangham says as he demonstrates with a wooden mallet, "then it would have reduced the difficulty they would have had in digesting it."

But pounding isn't as good as cooking that steak, says Wrangham. And cooking is what he thinks really changed our modern body. Someone discovered fire — no one knows exactly when — and then someone got around to putting steak and veggies on the barbeque. And people said, "Hey, let's do that again."

Besides better taste, cooked food had other benefits — cooking killed some pathogens on food.

But cooking also altered the meat itself. It breaks up the long protein chains, and that makes them easier for stomach enzymes to digest. "The second thing is very clear," Wrangham adds, "and that is the muscle, which is made of protein, is wrapped up like a sausage in a skin, and the skin is collagen, connective tissue. And that collagen is very hard to digest. But if you heat it, it turns to jelly."

As for starchy foods like turnips, cooking gelatinizes the tough starch granules and makes them easier to digest too. Even just softening food — which cooking does — makes it more digestible. In the end, you get more energy out of the food.

Yes, cooking can damage some good things in raw food, like vitamins. But Wrangham argues that what's gained by cooking far outweighs the losses.

As I cut into my steak (Wrangham is a vegetarian; he settles for the mango and potatoes), Wrangham explains that cooking also led to some of the finer elements of human behavior: it encourages people to share labor; it brings families and communities together at the end of the day and encourages conversation and story-telling — all very human activities.

"Ultimately, of course, what makes us intellectually human is our brain," he says. "And I think that comes from having the highest quality of food in the animal kingdom, and that's because we cook."

So, as the Neanderthals liked to say around the campfire: bon appetit.

and: Are We Meat Eaters or Vegetarians?
http://www.proteinpower.com/drmike/low-carb-library/are-we-meat-eaters-or-vegetarians-part-i/
http://www.proteinpower.com/drmike/low-carb-library/are-we-meat-eaters-or-vegetarians-part-ii/
 
Hi Stevie, hi Laura,

The evidence is clear that cooking meat was more efficient. It is also clear that anatomical changes occurred in correspondence with cooking meat; we can then gather that anatomical changes will continue to reflect any changes made in our diet. However, that these anatomical changes occurred, and that the processes we "discovered" or "evolved" to were more efficient, is no evidence that we, as a species, were on the "right" path, or are on the "right" path as we continue to evolve to more efficient ways of processing fuel.

In the future, the most efficient source of fuel may be the emotions of other beings rather than their flesh :O
Is this where we are heading??
Consequently, not requiring the consumption of physical matter, we may rid ourselves of our bodies altogether! Existing in a completely different non-physical realm?

Meat-Based Diet Made Us Smarter
I wonder though, did it come at the expense of our neighbours, as well as our own perception of being connected to them and the environment?
Was this the beginning of seeing ourselves as separate to our environment, and seeing those around us as potential food for our consumptional gratification?
Were our feelings of empathy toward our environment inversely proportional to our perceptions of separation?


I hear the points; I am still reading and processing the data, however. It's all such a complex and slippery issue, this 3D/4D STS/STO and everything in between.
In the meantime, I have rhetorical questions that niggle at me re eating meat:

- We began eating meat because it served us better?
- It came at the cost of our neighbours (in the garden?)?
- Eating meat was the first infantile stage of our predatory nature? Coinciding with the "The Fall"?

Cass session October 23, 1994
Q: (L) Are there any leftover dinosaurs in the jungles of Africa or South America?
A: No.
Q: (L) Is a vegetarian style of eating good for one?
A: Not usually.
Q: (L) What did human beings eat before the Fall?
A: Vegetarian.
Q: (L) So, until we go through the transition we are not really designed to be vegetarian?
A: Correct.

Q: (L) Would we be able to use this source to do readings for other people to give them information that they need?
A: Open.
Q: (L) Would you be willing to help us to give readings for other people?
A: OK.

So, we are where we are - we experienced the Fall, and we must accept where we are and find our way back "home". So, no, we may not have been on the "right" track by eating meat, but we did begin eating meat, and now we are meat eaters - our physicality requiring this for survival.

I think I am answering my own questions here.

Can anyone inform me what STO 4D beings consume to survive? I don't recall coming across this information; I have been looking out for it, though.
 
