Does Lying and Believing Lies Damage the Brain?

Laura

Among the many interesting things said by psychologist Andrew Lobaczewski in his seminal work, “Political Ponerology,” one that interested me the most was his remark about "the first criterion for ponerogenesis" being the atrophy of critical faculties.

One phenomenon all ponerogenic groups and associations have in common is the fact that their members lose (or have already lost) the capacity to perceive pathological individuals as such, interpreting their behavior in a fascinated, heroic, or melodramatic way. The opinions, ideas, and judgments of people carrying various psychological deficits are endowed with an importance at least equal to that of outstanding individuals among normal people.

The atrophy of natural critical faculties with respect to pathological individuals becomes an opening to their activities, and, at the same time, a criterion for recognizing the association in concern as ponerogenic. Let us call this the first criterion of ponerogenesis.

[...]

Thus, whenever we observe some group member being treated with no critical distance, although he betrays one of the psychological anomalies familiar to us, and his opinions being treated as at least equal to those of normal people, although they are based on a characteristically different view of human matters, we must derive the conclusion that this human group is affected by a ponerogenic process and if measures are not taken the process shall continue to its logical conclusion. We shall treat this in accordance with the above described first criterion of ponerology, which retains its validity regardless of the qualitative and quantitative features of such a union: the atrophy of natural critical faculties with respect to pathological individuals becomes an opening to their activities, and, at the same time, a criterion for recognizing the association in concern as ponerogenic. [...]

Any human group affected by the process described herein is characterized by its increasing regression from natural common sense and the ability to perceive psychological reality. Someone considering this in terms of traditional categories might consider it an instance of “turning into half-wits” or the development of intellectual deficiencies and moral failings.

One of the reasons this particular point is so interesting to me is because we (myself, fellow researchers and editors of SOTT.net) have observed this "turning into half-wits" over and over again. It's the damnedest thing! The instant an individual makes a decision to believe a lie, it's as though their ability to use accurate reasoning about anything else - not just a contentious item - grinds to a halt.

Most often, this sort of dynamic occurs in very emotional situations where the individual is heavily invested in NOT seeing the truth about a person or a situation, for any number of reasons. They may be involved in a close relationship with the person, or they may have been brought up to "believe" a certain way and, in spite of evidence that their faith is misplaced, refuse to accept the facts.

Which, of course, leads to the consideration of Faith itself. Soren Kierkegaard suggested that religion is, of its essence, not persuasion of the truth of a doctrine, but commitment to a position which is inherently absurd. Human beings attain their identity by believing something that deeply offends their minds (or others). To exist, he says, we must believe, and to really believe means to believe something that is dreadfully hard to believe. You can't just believe something plausible because that is easy... So, for some people, it may be that believing lies is some kind of proof that they are in control of their choices, they aren't being pushed around or dominated by irritating things like facts and evidence. (Sounds really rather childish, doesn't it?)

So, anyway, some time ago I happened to read this:

Truth Serum
By: Kaja Perina

http://www.psychologytoday.com/articles/200201/truth-serum


Focuses on the use of functional magnetic resonance imaging in lie detection in forensic psychology. Impact of lying on the brain's activity.

Lying generates unique brain activity that can be measured by functional magnetic resonance imaging (fMRI), brain scans that could someday morph into a forensic tool far more potent than the flawed polygraph test.

Researchers gave 18 subjects a playing card, then offered them money to lie to a computer about the card while undergoing an fMRI. When subjects lied, the scans revealed increased activity in several regions of the brain, including the anterior cingulate gyrus, which is implicated in conflict monitoring, attention and response inhibition. Head researcher Daniel Langleben, M.D., a professor of psychiatry at the University of Pennsylvania, says this confirms that the brain's "default" response is to tell the truth. "No area of the brain works harder to tell the truth than to lie," says Langleben.

Forensic experts are hopeful because fMRIs measure complex mental processes, while polygraph tests pick up skin and blood pressure changes that can be misleading. Langleben says the next step to ready brain scans for forensic use is to monitor spontaneous acts of deception. Langleben modeled his study on classical deception research, in which subjects are instructed to lie. The study was presented at the Society for Neuroscience annual meeting.

Psychology Today Magazine, Jan/Feb 2002

I thought I'd take a deeper look at this "anterior cingulate gyrus". I found this:

Anterior Cingulate Gyrus Dysfunction and Selective Attention Deficits in Schizophrenia: [15O]H2O PET Study During Single-Trial Stroop Task Performance

Cameron S. Carter, M.D., Mark Mintun, M.D., Thomas Nichols, B.S., and Jonathan D. Cohen, M.D., Ph.D.

OBJECTIVE: Attentional deficits are a prominent aspect of cognitive dysfunction in schizophrenia. The anterior cingulate gyrus is proposed to be an important component of frontal attentional control systems. Structural and functional abnormalities have been reported in this region in schizophrenia, but their relationship to attentional deficits is unknown. The authors investigated the function of the anterior cingulate gyrus and the related neural systems that are associated with selective attention in patients with schizophrenia.

METHOD: While subjects performed multiple blocks of a single-trial Stroop task, [15O]H2O positron emission tomography scans were obtained. Fourteen patients with schizophrenia were compared with 15 normal subjects matched for age, gender, and parental education.

RESULTS: The patients with schizophrenia responded at the same rate but made more errors in color naming during the color-incongruent condition. Consistent with the authors' hypothesis, patients with schizophrenia showed significantly less anterior cingulate gyrus activation while naming the color of color-incongruent stimuli.

CONCLUSIONS: Patients with schizophrenia fail to activate the anterior cingulate gyrus during selective attention performance. This finding adds to the understanding of the functional significance of the structural and metabolic abnormalities in schizophrenia that have been previously reported in this region of the brain. (Am J Psychiatry 1997; 154:1670–1675)


The wikipedia article on the topic is interesting: http://en.wikipedia.org/wiki/Anterior_cingulate_cortex

[...] The ACC is connected with the prefrontal cortex and parietal cortex as well as the motor system and the frontal eye fields making it a central station for processing top-down and bottom-up stimuli and assigning appropriate control to other areas in the brain. The ACC seems to be especially involved when effort is needed to carry out a task such as in early learning and problem solving. Many studies attribute functions such as error detection, anticipation of tasks, motivation, and modulation of emotional responses to the ACC.

ACC response in Stroop task experiments (designed to measure adherence to sequential decision-making paths) remains relatively elevated in typical human subjects, as the alternative - spontaneity - is sacrificed. Rehearsing a task that originally produced spontaneous, novel responses to the point of producing rigid, stereotypic responses results in a diminished ACC response.

Whereas most funded research is concentrated on reduced task focus - often diagnosed subjectively as attention deficit hyperactivity disorder (ADHD) - recent research using monkeys has revealed that heightened ACC activity (generally associated with reduced dopamine utilization) reduces capacity to learn how to use visual cues for anticipating rewards. [...]

Evidence for the role of the ACC as having an error detection function comes from consistent observations of error related negativity (ERN) uniquely generated within the ACC upon error occurrences. [...]

Stimulation of the anterior cingulate (also known as Area 25) with low dosages of electric current in neurosurgical studies has been shown to improve depression in a portion of test subjects.

Studying the effects of damage to the ACC provides insights into the type of functions it serves in the intact brain. Behavior that is associated with lesions in the ACC includes: inability to detect errors, severe difficulty with resolving stimulus conflict [...] emotional instability, inattention, and akinetic mutism. There is evidence that damage to the ACC is present in patients with schizophrenia, where studies have shown patients have difficulty in dealing with conflicting spatial locations in a Stroop-like task and have abnormal ERNs. Participants with ADHD were found to have reduced activation in the dorsal area of the ACC when performing the Stroop task. Together these findings corroborate results from imaging and electrical studies about the variety of functions attributed to the ACC.

There is evidence that this area may have a role in Obsessive Compulsive Disorder due to the fact that what appears to be an unnaturally low level of glutamate activity in this region has been observed in patients with the disorder,[22] in strange contrast to many other brain regions which are thought to have excessive glutamate activity in OCD.

Helen S. Mayberg and two collaborators described how they cured 4 of 6 depressed people -- individuals virtually catatonic with depression despite years of talk therapy, drugs, even shock therapy -- with pacemakerlike electrodes in area 25. A decade earlier Mayberg had identified area 25 as a key conduit of neural traffic between the "thinking" frontal cortex and the phylogenetically older central limbic region that gives rise to emotion. She subsequently found that area 25 appeared overactive in these depressed people — "like a gate left open," as she puts it — allowing negative emotions to overwhelm thinking and mood. Inserting the electrodes closed this gate and rapidly alleviated the depression of two-thirds of the trial's patients [23].

The ACC has also been suggested to have possible links with social anxiety, along with the amygdala, but this research is still in its early stages. [...]

The ACC area in the brain is associated with many functions that require conscious experience by the viewer. Higher ACC activation levels were found for more emotionally aware female participants when shown short ‘emotional’ video clips. Better emotional awareness is associated with improved recognition of emotional cues or targets which is reflected by ACC activation.

The idea of awareness being associated with the ACC has some evidence behind it, in that it seems to be the case that when subjects' responses are not congruent with actual responses, a larger ERN is produced.[12]

One study found an ERN even when subjects were not aware of their error. Awareness may not be necessary to elicit an ERN, but it could influence the effect of the amplitude of the feedback ERN. Relating back to the reward based learning theory, awareness could modulate expectancy violations. Increased awareness could result in decreased violations of expectancies and decreased awareness could achieve the opposite effect. Further research is needed to completely understand the effects of awareness on ACC activation.


There were some other interesting possible clues, but right now I just wonder: does lying, or holding onto a lie (even if one is only lying to the self), cause some kind of damage to this area of the brain? Or, if not actual damage, does it set up a pattern of activity that affects other areas of the brain in a detrimental way? One suspects that even when people believe a lie, some part of their brain knows the truth; they know, at some level, that they are lying or believing lies (which amounts to lying to the self).

I also wonder what kinds of results would show up doing these kinds of scans on psychopaths? Do psychopaths know they are lying in all cases? And if that is the case, does it have the same physiological effect on them as it does on an individual with a conscience?

Just a whole lot of thoughts and questions...
 
A fascinating story well worth reading about a lying relationship and how certain lies can destroy lives, if not minds:

The Secret That Became My Life
http://www.psychologytoday.com/articles/201312/the-secret-became-my-life
 
And on the other hand....

Why People Can't Accept Facts and Prefer to Believe Lies

Quote from: Barbara Oakley in "Evil Genes"

A recent imaging study by psychologist Drew Westen and his colleagues at Emory University provides firm support for the existence of emotional reasoning. Just prior to the 2004 Bush-Kerry presidential elections, two groups of subjects were recruited - fifteen ardent Democrats and fifteen ardent Republicans. Each was presented with conflicting and seemingly damaging statements about their candidate, as well as about more neutral targets such as actor Tom Hanks (who, it appears, is a likable guy for people of all political persuasions). Unsurprisingly, when the participants were asked to draw a logical conclusion about a candidate from the other - "wrong" - political party, the participants found a way to arrive at a conclusion that made the candidate look bad, even though logic should have mitigated the particular circumstances and allowed them to reach a different conclusion. Here's where it gets interesting.

When this "emote control" began to occur, parts of the brain normally involved in reasoning were not activated. Instead, a constellation of activations occurred in the same areas of the brain where punishment, pain, and negative emotions are experienced (that is, in the left insula, lateral frontal cortex, and ventromedial prefrontal cortex). Once a way was found to ignore information that could not be rationally discounted, the neural punishment areas turned off, and the participant received a blast of activation in the circuits involving rewards - akin to the high an addict receives when getting his fix.

In essence, the participants were not about to let facts get in the way of their hot-button decision making and quick buzz of reward. "None of the circuits involved in conscious reasoning were particularly engaged," says Westen. "Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones." [...]

Ultimately, Westen and his colleagues believe that "emotionally biased reasoning leads to the 'stamping in' or reinforcement of a defensive belief, associating the participant's 'revisionist' account of the data with positive emotion or relief and elimination of distress. 'The result is that partisan beliefs are calcified, and the person can learn very little from new data,'" Westen says. Westen's remarkable study showed that neural information processing related to what he terms "motivated reasoning" ... appears to be qualitatively different from reasoning when a person has no strong emotional stake in the conclusions to be reached.

The study is thus the first to describe the neural processes that underlie political judgment and decision making, as well as to describe processes involving emote control, psychological defense, confirmatory bias, and some forms of cognitive dissonance. The significance of these findings ranges beyond the study of politics: "Everyone from executives and judges to scientists and politicians may reason to emotionally biased judgments when they have a vested interest in how to interpret 'the facts,'" according to Westen. 
 
Laura said:
I also wonder what kinds of results would show up doing these kinds of scans on psychopaths? Do psychopaths know they are lying in all cases? And if that is the case, does it have the same physiological effect on them as it does on an individual with a conscience?

Good question. After reading Predators by Anna C. Salter, where she says it is very difficult to determine if someone is lying, especially with an experienced liar, it would be interesting to see the physiological effect on those liars.
 
Laura said:
There were some other interesting possible clues, but right now, I just wonder if lying, holding onto a lie, even if one is only lying to the self, causes some kind of damage to this area of the brain? Or, if not actual damage, just sets up a pattern of activity that affects other areas of the brain in a detrimental way? One suspects that even when people believe a lie that some part of their brain knows the truth and they know, at some level, that they are lying or believing lies (which amounts to lying to the self).

Very interesting. When a person holds onto that kind of belief, or keeps believing lies, it can lead to their disintegration (or reflect a disintegration already under way), and no doubt that includes brain damage. Using Gurdjieff's terms, when one is a machine that has been run by lies all of its life, it can be much more difficult to change oneself to live in Truth and be real, especially when the brain is damaged. It's quite sobering to see how much effort goes into lying and believing the lies, and how much that affects the brain.
 
From:

Your Brain on Lies, Damned Lies and ‘Truth Serums’

Apparently, every person lies an average of 1.65 times every day. However, since that average is self-reported, maybe take that figure with a pinch of salt. The truth is, most people are great at lying. The ability to conjure up a plausible alternative reality is, when you think about it, seriously impressive, but it takes practice. From about the age of 3 young children are able to make up false information, at a stunning rate of one lie every 2 hours – though admittedly the lies from a toddler’s wild imagination are relatively easy to identify.

When we lie, brain cells in the prefrontal cortex – the planning, ‘executive’ of the brain – work harder than when we tell the truth. This may be reflected in the physical structure of our brains as well: pathological liars have been shown to have more white ‘wiring’ matter and less grey matter in the prefrontal cortex of their brain than other people. But how can we tell if someone is telling a lie, or telling the truth?
 

Neurologically I think this would manifest as a functional change in the signal from the limbic brain to the ACC, so instead of merely conveying information about its response and how one's feelings have changed to adjust to the new reality, it instead is a command that increases the cognitive load on the prefrontal cortex (according to Perceval's quotation) so the limbic brain can just sit on its tush and feel good about itself. One centre is essentially enslaving and swallowing another, to the degree that the functioning of the ACC is perverted towards those ends.
 
Here is an excerpt from The Anatomy of Evil by Adrian Raine. I thought it was pretty interesting, and also adds some more details about this white matter increase in liars:

Pinocchio’s Nose and The Lying Brain

I want to extend our neurodevelopmental argument by looking at structural abnormalities of the brain that take the form of advantages, not disadvantages. We’ll combine this theme with a core question. The brains of violent and psychopathic offenders may be deformed, but can this also apply to other offenders? What about me and you when we tell a fib or two? Are there brain bases to less serious forms of offending?

Lying is pervasive. At some level, most of us lie most days of the week. We lie about almost anything. When do we lie most? Community surveys show it's on our first date with a new person. And this gives us a clue as to why we lie so much: it's impression management. If we were brutally honest all the time, we'd likely never get that first kiss. Plus we'd make life really miserable for everyone. Do you really want me to tell you what I honestly think of that dreadful new haircut? That gaudy shirt? Your bad-mannered new boyfriend? No, you don't. So we use lies to smooth out the rough-and-tumble of everyday social encounters. "That new hairdo suits you!" "That shirt really brings out your personality." "Your new boyfriend is a perfect match for you!" We gain the affection and friendship of others, and at times simply do more good than harm. None of us are saints, but most of us are not serious psychopathic sinners either.

Most of us, that is. For others, lying goes a bit too far. One of the twenty traits of the psychopath is pathological lying and deception. They lie left, right and center. Sometimes for good reason, and sometimes, perplexingly, for no reason. When I worked with psychopaths, before conducting my induction interview I would review in detail their whole case file. And given that I was working in top-security prisons with long-term prisoners, their files were fairly complete. The information about their life trajectories, behaviors, and personalities gave me a good basis upon which to determine whether the prisoner I was working with was a pathological liar. When someone says something that conflicts with what you know, you have a good opportunity to challenge him. You can check if what he says back to you sounds like sense or seems a sham.

The trouble with psychopaths, though, is that they really are extraordinarily good at lying. Just when you think you've nabbed them telling an enormous whopper, they have the uncanny ability to reel off a seemingly convincing explanation for the discrepancy without batting an eye. Believe me, against your own better professional judgment you could walk out of that interview room believing that you must have gotten your facts wrong – only to read the file again and check in with the senior probation officer and realize he duped you. You really have to experience it to believe it.

It might surprise you to learn that I don't have a clue who is and who is not a psychopath, even after four years of working full time with them in prison and thirty years of academic research. I'm just not that fast on the uptake in this arena. If I met you for the first time and we chatted for an hour, I would be none the wiser as to whether you were a psychopath or not. I'll come back to that later. But it's not just me. Whether you like it or not, you too are completely clueless when it comes to knowing if someone is lying to you or not.

Don't take it personally – we're all hopeless, not just you and me. Police officers, customs officers, FBI agents, and parole officers are no better than plain old undergraduates in their ability to detect deception. They actually believe they are good at lie detection; they don't even recognize their own mistakes. Doctors don't know when you are lying to them about made-up symptoms in your attempts to get the medications you want.

Why are we so bad at knowing who is a liar? It’s because all the things that we think are signs of lying are quite unrelated to the ability to detect deception. Think of a time when you did not have any tangible background evidence or context to tell if someone was lying to you, but you judged that person as lying based on how they spoke and behaved. I bet you were basing this on things like their shifty gaze, hesitations in their speech, their fidgeting, or their going off-topic into some detail. In reality, none of these are related to lying. They give us false clues, and we are misled by them.

But how about kids? Surely we are better in judging when a child lies to us? Aren’t we?

Well, no, we're not. In one study on this topic, children of different ages were videotaped sitting in a room with the experimenter. Behind them is an interesting toy. The experimenter tells the child he must go out of the room and that the child should not peek at the toy while he is away. The experimenter goes away for a while and comes back. Some kids peek, some don't. The experimenter then asks the child if he or she peeked. Of those who deny peeking, some are telling the truth and some are lying. Experimenters then show the videotapes to a range of individuals to see how good they are at telling when a child is lying. Being correct 50 percent of the time would be the level of chance, because in this scenario, 50 percent of the film clips show a child lying and 50 percent show a child telling the truth.

The tapes are given to undergraduate students. Surely working out if a kid is lying has to be easier than most university exams. But these smart undergraduates are correct 51 percent of the time, not significantly above chance.

So let's see how customs officers fare when viewing the same tapes – they have a boatload of experience in picking out deceptive travelers. They are at 49 percent – below chance levels – though to give them some credit, they are not significantly worse than the hapless undergraduates.

Okay, so let's go to cops, as surely they are streetwise about these fledgling psychopathic liars. Nice try, but actually the police are at 44 percent accuracy levels, significantly lower than chance, and significantly poorer than undergraduates or customs officers. Next time a cop stops you, accuses you of a traffic violation that you deny, and will not believe your protestation, remind him about this study.

So let's try again. Maybe eleven-year-olds are sophisticated liars, and so we might understand how overall accuracy levels with these kids are at a miserable 39 percent. But can't we tell if a four-year-old is lying? Actually, we cannot. Accuracy levels are at 40 percent at this age, 47 percent at age five, and 43 percent at age six. Parents, you think you know what your kids get up to, but actually you don't even have a clue with your own toddler. That's how bad the story is. Sorry, mate, but you really are as hapless as I am at figuring out who a psychopathic liar is.
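As an aside on what "significantly above chance" means in these accuracy figures: the excerpt doesn't report how many judgments each percentage is based on, but an exact binomial test makes the point concrete. The sample size of 100 judgments below is purely hypothetical, chosen for illustration; it is not from the study.

```python
from math import comb

def binom_two_sided_p(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: probability, under chance level p,
    of an outcome at least as unlikely as observing k correct out of n."""
    # Full probability mass function of Binomial(n, p)
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    # Sum every outcome no more likely than the observed one
    cutoff = pmf[k] * (1 + 1e-9)  # small relative slack for float round-off
    return sum(q for q in pmf if q <= cutoff)

# Hypothetical: raters correct on 51 of 100 lie/truth judgments,
# tested against the 50 percent chance level.
print(binom_two_sided_p(51, 100))  # about 0.92, well above 0.05: indistinguishable from chance
```

With a pool this small, even the police officers' 44 percent would not come out significantly below chance, so the studies summarized above presumably pooled considerably more judgments per group.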

But here's a ray of hope for you. I have two ten-year-old monkeys at home who are always getting into mischief. And yes, Andrew and Philip are clever and skillful liars – just like most kids. When I want to know who did what, before I pop the question I tell them that it's important to be honest and they should promise to tell the truth. Research indicates that getting young children to talk about moral issues first and then asking them to promise to tell the truth significantly encourages a truthful answer – boosting lie detection accuracy from 40 percent to 60 percent.

This research on children made me and my lab intrigued about what makes a psychopath a good liar. People may be hapless at lie detection, but perhaps machines have a mechanism to better delve inside the minds of Machiavellians. Psychopaths may be able to lie to us face to face, but perhaps the signature of a pathological liar resides below the surface, inside their brains. Might pathological liars have a physical advantage over the rest of us when it comes to pulling a fast one?

In our study we assessed whether people had a history of repeatedly lying throughout life. We assessed this in our psychiatric interviews on antisocial personality disorder and psychopathy. We also measured it using questionnaires, and by cross-checking notes between our lab assistants.

For example, on one day our research assistant was struck by the fact that a participant walked on his toes. Upon questioning, our participant told a detailed and convincing story of how he was in a motorbike accident resulting in damage to his heels. The very next day, he was being assessed by a different research assistant on a different floor of our building and he walked perfectly normally. The con only came to light when our research assistants traded notes. A typical pathological lie: deception but without any obvious gain or motivation.

We ended up with a group of twelve who fulfilled criteria for pathological lying and conning by their own admission. But you might reasonably ask how we know if people are telling the truth about their lying. The answer is that – to be honest – we can never be sure that our pathological liars were truthful in admitting that they repeatedly con, manipulate, and lie throughout their lives. But we can be sure that if they are telling the truth, then they are indeed pathological liars. And if they are lying about their lying, then they really have to be pathological liars! So, armed with this logic, we went ahead anyway and scanned their brains.

We had two control groups for good measure. One group of twenty-one was not antisocial and did not lie – or at least they claimed they didn't. These were the "normal" controls. The other group, of sixteen, had committed as many criminal offenses as the pathological liar group – but they were not pathological liars. These individuals made up the "antisocial" control group. These two control groups were then compared with the pathological liar group.

What came out was an unusual finding in the field that must be credited to Yaling Yang, who took the lead on this study. [...] The volume of white matter in the prefrontal cortex was greater in pathological liars than in both control groups. They had a 22 percent volume increase compared with normal controls, and a 26 percent increase compared with criminal controls. The white matter volume increase was particularly true of the more ventral, lower areas of the prefrontal cortex. As you might expect, liars also had significantly higher verbal IQs than the other two groups, but this did not explain away the structural brain differences. As Sean Spence, a leading expert on lying, commented in his editorial on this work, the white matter increase is very unusual, as virtually no other clinical disorder has been associated with this abnormality.

[…] Lying is a complex executive function that requires a lot of frontal lobe processing. Telling the truth is easy. Lying is much harder and requires more processing resources. We think that increased prefrontal white matter provides the individual with a boost in the cognitive capacity to lie because it reflects greater connectivity between subregions both within the prefrontal cortex and in other brain areas. Let’s consider lying a little more.

Lying involves theory of mind. When I lie to you about where I was at eleven p.m. on Wednesday, January 7, I need to have an understanding of what you know about the facts of the case –and what you do not know. I need to have a sense of what you think is plausible, and what is not. For this “mind reading” we need to involve other subregions in the temporal and parietal lobes and connect them to the prefrontal cortex. We have discussed the behavioral cues that give people away when they lie. But extensive studies also show that during lie-telling, individuals suppress unnecessary body movements. When I’m telling you the truth about where I was on the night of January 7 and I have nothing to hide, I may gesture with my hands, raise my eyebrows when making a point in the story, and look up into space for a second or two.

Liars tend to not do that. They sit still and suppress motor activity because they are cognitively focusing on their story. All of their processing resources are going into this activity. Suppression requires prefrontal regulation of the motor and somatosensory areas of the brain that control motor and body movements. Greater white-matter connectivity will facilitate that. While liars are busy building the believable façade of their story, they also have to take care not to look too nervous. This involves suppression of limbic emotional regions that include the amygdala. So again, prefrontal-limbic connectivity is important. The more white-matter wiring there is in the prefrontal cortex, the better all these functions can be subserved.

We think that the cause of the greater white-matter volumes in pathological liars is neurodevelopmental. Again, we are talking about an increase in volume, rather than a decrease. From a neurodevelopmental perspective, throughout childhood there is massive expansion of brain size. Brain weight reaches adult values between the ages of ten and twelve, with a very significant increase in the absolute volume of white matter by this age. We also know that children become most adept at lying at the same time –by ten years of age. Interestingly, then, the neurodevelopmental increase in white matter parallels developmental changes in the ability of children to lie. This suggests that the increased white matter we find in pathological liars does indeed facilitate their ability to lie. Based on this perspective, we think that the increased prefrontal white matter found in adult psychopathic liars predisposes them to deception and cunning.

The increase in white matter, then, might “cause” pathological lying. But could it be the other way around? You’ll likely recall from your childhood the late-nineteenth-century Italian children’s story about Pinocchio, the puppet whose nose grew every time he told a lie. Could it be that the act of pathological lying causes the physical increase in white matter in the prefrontal cortex?

This “Pinocchio’s nose” hypothesis is not as ridiculous as it may sound. It’s the concept of brain plasticity. The more time that musicians spend in practicing the piano, the greater the development of their white matter, especially in childhood. Practicing lying in childhood might particularly enhance prefrontal white matter. But even in adults, extensive practice has been found to correlate with brain structure. London taxi drivers have to undergo three years of extensive training to learn their way around 25,000 convoluted city streets. MRI studies have shown that these taxi drivers have a greater volume of the hippocampus compared with matched controls, and also compared with London bus drivers, who do not undergo such extensive training. Just as working in the gym can build up your muscles, mental effort can flex your brain.

In the case of pathological liars, it’s as if a criminal lifestyle makes for a criminal brain. It’s a different story from the one Lombroso was telling in Italy in the nineteenth century –the idea that brain impairment causes crime. But we cannot yet discount the alternative environmental explanation that lying causes brain change.
 
The observed increase in white matter in the prefrontal cortex of liars is interesting. It indicates that greater connectivity in the prefrontal cortex (white matter consists of myelinated axons and supporting glial cells, which speed signal transmission) is conducive to lying. Yaling Yang, the scientist who found the prefrontal white matter connection to lying, has also proposed that a decrease in prefrontal white matter makes lying more difficult.

http://www.ocf.berkeley.edu/~issues/articles/13.1_Gee_J_Liar_Liar_Brain.html

“Based on developmental theories, a child starts showing the ability to lie at around age three. This ability quickly develops and peaks approximately at age ten. This is the period of time [during which] the white matter volume in the brain increases dramatically to almost 60 percent in a normal child,” he says. Yang also adds that research demonstrates that a decrease in prefrontal white matter results in a limited ability to lie; he uses autistic children, who cannot tell lies very successfully, to illustrate this finding. Yang indicates that “autistic children only show 10 percent white matter increase between the ages of three to ten. Failure in the development of white matter [may have resulted] in their [lying] impairment.”

Found another interesting paper on this topic of lying: "The Truth about Lying: Inhibition of the Anterior Prefrontal Cortex Improves Deceptive Behavior". The full text is here:
http://cercor.oxfordjournals.org/content/20/1/205.long
The role of the anterior prefrontal cortex (aPFC) in lying was investigated with transcranial direct current stimulation of the area with different polarities. The experimental design was interesting:

The aim of this study was therefore 1) to realize an experimental setup, in which participants should decide themselves, which questions they would answer truthfully and which ones with a lie and 2) to investigate the causal contribution of the aPFC in deceptive behavior by modulating the excitability of this brain region through tDCS. Three experiments were conducted to test the specificity of the transcranial stimulation effect.

In the first experiment, 22 healthy subjects participated in a mock crime, in which they were supposed to steal money and then to attend an interrogation with a modified version of the Guilty Knowledge Test (GKT). In addition to verbal response (truth vs. lie) reaction time (RT), skin-conductance response (SCR) and feelings of guilt while deceiving the interrogator were assessed. In a double-blind repeated-measures design, subjects received cathodal or sham tDCS of their aPFC during the interrogation of the mock crime.

Furthermore, in order to measure skillful lying, we developed a ratio called “lying quotient” (LQ) relating the frequency of lies to critical questions with the frequency of lies to uncritical questions. Skillful lying meant that a person intending to appear innocent should not simply lie on all questions, because this behavior would appear rather suspicious. Instead, as in a real criminal interrogation, the suspects had to decide themselves which questions they would answer truthfully and which ones with a lie. Accordingly, a subject achieved a relatively high LQ if he/she answered all “critical items” (whose correct answer only the interrogator and the thief knew, e.g., the true color of the wallet) with a lie, but all “uncritical items” truthfully. To increase motivation for deceptive behavior, participants were told that they were allowed to keep the stolen money in case they could convince the interrogator that they were not guilty.

To test the specificity of the applied stimulation polarity and stimulation site, we conducted a second experiment with 22 healthy volunteers in which the stimulation polarity was reversed. For “anodal” tDCS of the aPFC, the anodal electrode was placed over FP2 (international EEG 10/20 system), and the cathodal electrode was placed over PO3 (left parieto-occipital cortex) as a control area. In randomized order, anodal or sham tDCS of the aPFC was applied during the interrogation.

A further 20 healthy subjects participated in a third experiment, in which the Stroop test (Stroop 1935) was used as a “control task.” In experiments 1 and 2, subjects intending to deceive the interrogator had to inhibit the truth as a prepotent response and give instead a deceitful answer. The Stroop task is a widely used index of executive control (MacLeod 1991; Swick and Jovanovic 2002) that tests the ability to inhibit a prepotent response but does not include deceiving the counterpart.

Experimental Design

Experiments 1 and 2 consisted of a thief role-play, in which money (20 Euros) was stolen and a subsequent interrogation, in which the suspects were asked questions about the course of the mock crime according to the GKT paradigm. The GKT (Lykken 1959, 1960) utilizes a series of multiple-choice questions, each having 1 true alternative and several false alternatives, chosen so that an innocent suspect would not be able to discriminate them from the relevant alternative (e.g., “the color of the stolen wallet was: red? black? brown? blue? gray?”). Thus, if the subject's physiological responses to the relevant alternative are consistently larger than the control alternatives, knowledge about the crime is inferred (for a meta-analysis on the validity of the GKT see Ben-Shakar and Elaad 2003). The role-play was organized as follows: Two subjects were asked to pick 1 of 2 chits of paper from a cup. The subjects were told that on 1 chit was written “thief” and on the other one “innocent attendee.” The subjects were asked to memorize their roles but not to tell the instructor which role they had chosen. After the roles were assigned by drawing lots, the subjects were told to go to an office and wait there for 20 min until the interrogation. This office consisted of a main room and an adjoining room. Both rooms were shown to the subjects before assigning the roles, and they were told that the innocent attendee should wait during the mock crime in the main room, while the thief should go to the adjoining room and search there for money with the intention to steal it. Money could be placed at several locations. Therefore, the thief should not only search for the money thoroughly but also as quickly as possible. The subjects were further told that after the money has been stolen, both subjects will be suspected to be the thief. Each of them will attend independently of each other 2 interrogations with an investigator who will play the role of a police inspector. 
In the interrogation, the subjects will be asked questions, which they should answer as quickly as possible with a “yes” or a “no.” Additionally, the SCR and the RT will be recorded. The subjects were also told that during each of the 2 interrogations, they will receive different types of tDCS. The true “thief” should lie in such a skillful manner that the interrogator would believe he/she is innocent. Skillful lying meant that a person intending to appear innocent should not simply lie on all questions, because this behavior would appear rather suspicious. Instead, as in a real criminal interrogation, the suspects had to decide themselves which questions they would answer truthfully and which ones with a lie. To enhance the motivation of the subjects to identify themselves with their role and to make the role-play as realistic as possible, subjects were told that they were allowed to keep the stolen money in case they could convince the interrogator that they were not guilty. However, in reality, 1 of the 2 subjects was a collaborator of the experiment, a fact unknown to the subject, and “thief” was written on both pieces of paper, but the collaborator knew that he had to play the role of “innocent attendee.” The goal of the investigation was to elucidate whether the subjects would show different deceptive behavior during cathodal transcranial DC stimulation of the aPFC than during anodal or sham stimulation.

Lying Quotient (LQ) Measurement

In order to measure skillful lying, we developed a ratio called lying quotient (LQ):
LQ = ((Ncrit/Ntot_crit) - (Nuncrit/Ntot_uncrit)) * 100

where Ncrit = Frequency of lies on critical questions, Ntot_crit = Total number of critical questions,
Nuncrit = Frequency of lies on uncritical questions, and Ntot_uncrit = Total number of uncritical questions.

Skillful lying meant that a person intending to appear innocent should not simply lie on all questions, because this behavior would appear rather suspicious. Instead, as in a real criminal interrogation, the suspects had to decide themselves which questions they would answer truthfully and which ones with a lie.

In the interrogation, a modified version of the GKT was applied, consisting of 10 critical and 7 uncritical questions, each with 4 choices. An uncritical question was one whose answer would be known even by an innocent attendee who had been in the room but did not steal the money (e.g., “On the chair in the small room there was a jacket. Was the color of the jacket: green? blue? black? brown?”). In contrast, a critical question was one whose answer would be known only by the thief (e.g., “In the pocket of the jacket there was a wallet. Was the color of the wallet: green? blue? black? brown?”).

According to formula (1), the LQ can range from −100 to +100. A maximally skillful liar would have an LQ of 100 if he/she lies on all critical questions but answers all uncritical questions truthfully. Subjects who simply decide to lie on all questions, independently of their relevance to the criminal act, will have an LQ of 0. A quite odd behavior would be if a subject answered all critical questions truthfully but lied on all uncritical questions; in such a case, that subject would get an LQ of −100. Besides providing a direct measure of skillful lying, an important advantage of the LQ is that it enables us to control for subjects’ bias strategies, i.e., a predisposition to answer almost all questions in an interrogation with a lie or truthfully, independently of whether they are critical or not. A subject who decides to lie on all questions would not admit knowing any critical information, but would still appear dishonest, because he/she denies knowing information that he/she should know even as an innocent attendee. In contrast to this strategy, another subject might prefer to answer almost all questions truthfully. Such a subject would appear very honest; however, he/she would increase the possibility of being detected as the thief, because he/she would admit knowing a lot of information that only the delinquent could have known.
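The LQ arithmetic above is simple enough to sketch in a few lines of Python (the function and variable names here are mine, not the paper's):

```python
def lying_quotient(lies_critical, n_critical, lies_uncritical, n_uncritical):
    """Formula (1): lie rate on critical items minus lie rate on
    uncritical items, scaled to the range -100..+100."""
    return ((lies_critical / n_critical)
            - (lies_uncritical / n_uncritical)) * 100

# The interrogation used 10 critical and 7 uncritical questions.
print(lying_quotient(10, 10, 0, 7))  # skillful liar: lies only on critical items -> 100.0
print(lying_quotient(10, 10, 7, 7))  # lies on everything, regardless of relevance -> 0.0
print(lying_quotient(0, 10, 7, 7))   # odd reverse strategy -> -100.0
```

As the examples show, the subtraction is what cancels out a blanket "lie on everything" or "answer everything truthfully" bias, leaving only the selective, skillful component of the lying.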

Result Discussion

Most remarkably, we observed that inhibiting the excitability of the aPFC with cathodal tDCS did not lead to impairment but rather to a significant within-subject improvement of deceptive behavior. This effect was expressed in faster RTs for telling lies (but not for telling the truth), a decrease in sympathetic SCR and in feelings of guilt while deceiving the interrogator, and a significantly higher LQ, which reflects skillful lying.
[...]
The intriguing question that remains is: why did cathodal tDCS lead to “improvement” of deceptive behavior and not to its impairment?

Recent neuroimaging studies have emphasized that the aPFC (BA 9/10) plays a crucial role in moral cognition (Greene et al. 2001; Moll et al. 2002, 2005). Moll et al. (2002, 2005) found increased activation of the aPFC when a moral judgment condition was compared with a non-emotional factual judgment, but not when moral judgments were compared with a social emotional condition, during which a more ventral region was activated. Greene et al. (2001) used a moral judgment task that involved classic moral dilemmas (e.g., should you kill an innocent person in order to save 5 other people?) and found increased activation of the aPFC during emotionally loaded moral judgments. Moreover, neuroimaging studies have also emphasized the importance of the aPFC in social interaction (Stuss et al. 2001; Decety and Sommerville 2003; Amodio and Frith 2006; Heatherton et al. 2006; Raine and Yang 2006). Heatherton et al. (2006) have shown that making judgments about the self relative to an intimate other selectively activates the aPFC. Stuss et al. (2001) have demonstrated on patients with limited focal frontal and nonfrontal lesions that the frontal lobes are necessary for “theory of mind,” which includes inferences about feelings of others and empathy for those feelings. The anterior PFC, the ventral PFC, and the amygdala are regions that have been shown to be involved in both antisocial behavior and moral decision making (Raine and Yang 2006).

Taking these findings into account, the aPFC seems to be crucially involved in socio-emotional judgments. Suppressing the excitability of this region or focal lesions should therefore show an impact on antisocial and moral behavior. In respect to our study, deceiving another person in order to obtain personal profit seems to create a moral conflict, and if a person is relieved from this moral conflict, he/she might be able to deceive unhinderedly with faster RT, less feelings of guilt and less sympathetic arousal as demonstrated here. Suppressing cortical excitability by cathodal tDCS or low-frequency repetitive transcranial magnetic stimulation (rTMS) has previously been shown to induce so-called paradoxical improvement of performance through “disinhibition” processes (Hilgetag et al. 2001; Kobayashi et al. 2004; Fecteau et al. 2007). Kobayashi et al. (2004) have, for example, demonstrated that suppression of the primary motor cortex by low-frequency rTMS enhances motor performance with the ipsilateral hand by releasing the contralateral motor cortex from transcallosal inhibition. Using tDCS, Fecteau et al. (2007) have recently shown that enhancing DLPFC activity diminished risk-taking behavior, but only when coupled with inhibitory modulation over the contralateral DLPFC. Intriguingly, Koenigs et al. (2007) have also shown that a lesion of the PFC leads to an increase of utilitarian moral decisions. An increase in antisocial behavior following PFC impairment is supposed to result from a release of limbic areas from PFC executive control (Moll et al. 2005). However, it is not the aim of this study to state that the aPFC is the only cortical region whose stimulation can modulate deceptive behavior. Neuroimaging studies have indicated that other cortical areas, especially the DLPFC (Phan et al. 2005; Abe et al. 2006, 2007) and the superior temporal sulcus (Phan et al. 2005), are also involved in deception, and that in different types of deception (e.g., lies that are rehearsed and part of a coherent story vs. spontaneous noncoherent lies) different cortical networks are involved (Ganis et al. 2003; Abe et al. 2007). Priori et al. (2008) have recently demonstrated that tDCS of the DLPFC alters RT in deception of experienced events but had no effect on RTs in deception of new events. Thus, future studies will have to investigate the effect of stimulation of different cortical areas in different types of lies and the duration of these effects in relation to the stimulation parameters.

A further interesting question is, why anodal tDCS, which has been shown to increase cortical excitability (Gartside 1968; Nitsche and Paulus 2001; Antal et al. 2004), did not lead to opposite effects compared with cathodal tDCS resulting in an impairment of deceptive behavior and an increase of feelings of guilt while deceiving the interrogator? Although our data show that concerning the LQ and feelings of guilt there is a tendency toward lower LQ and higher feelings of guilt during anodal tDCS compared with sham tDCS (cf. Figs 3b and 4c), these changes did not reach significance. It is plausible to assume that disruption of the PFC can have an effect on social cognition (Anderson et al. 1999), moral reasoning (Koenigs et al. 2007), or even on deception as shown in the present study, however, increasing the excitability in a “normal functioning” PFC does not necessarily have to lead to opposite effects presumably due to ceiling effects. However it is tempting to test in patients with “impaired” PFC if increasing cortical excitability by anodal tDCS can help to remedy functional deficits.


Psychopathy Discussion

The findings of the present study are also particularly interesting in the light of clinical evidence suggesting that psychopaths, who are classified as pathological liars, have significantly less gray matter in their PFC (Yang et al. 2005) and, remarkably, do not show higher SCR when telling lies (Verschuere et al. 2005). We have previously demonstrated that in psychopaths, activation in limbic-prefrontal regions (amygdala, orbitofrontal cortex, insula, and the anterior cingulate) and SCR during anticipation of aversive events are pathologically reduced (Veit et al. 2002; Birbaumer et al. 2005). In a social reactive aggression paradigm, Lotze et al. (2007) have shown that during retaliation, subjects with high psychopathic scores had less BA 9/10 activation in comparison to subjects with low psychopathic scores. These findings are in accordance with the results of other research groups reporting decreased prefrontal blood flow (for a review, see Blair 2007) and deficient autonomic responses, for example, SCR, in anticipation of threatening events (Blair et al. 1997; Hare et al. 1978). Moreover, several studies (Anderson et al. 1999; Moll et al. 2005) have also shown that in psychopaths and patients with aPFC lesions, moral cognition is impaired. Thus, our findings support the hypothesis that a dysfunction of the aPFC and its specific connections may underlie certain psychopathological conditions that are characterized by the absence of sympathetic arousal while performing a wrongful act such as deceiving in a criminal interrogation.
 
I have been thinking about the truth lately and how important it is to stay as close to the truth as possible. I believe this is the best way to protect oneself. Believing lies steers you away from your own truth, and because of this people get into trouble and go down a path they don't want, a path to unhappiness. The problem is that I believe our essence is the truth, but in order to function in the world we can't always act truthfully, can't act from our essence in every situation; this is when our personality takes over. I believe buffers are one of the many reasons why people are so twisted upside-down: they can't see the truth of their interactions and the situations they got themselves into. If a situation they are part of turns out unfavorably for them, they buffer their actions and believe lies to make themselves feel comfortable. But they are not safe; they are just denying the inevitable, because if they don't know the truth of their actions and situations, they are destined to repeat the same cycle over and over again, and their being won't grow. It is important to see our blind spots, acknowledge when we are wrong, and see each side objectively. However, our personality has over the years made it easy for us to justify and run away from the truth, and it is very hard to undo all the layers of BS to get down to seeing it. When you do, and you can no longer buffer, making a mistake or doing wrong becomes so debilitating that there is nothing left to do other than find out what the truth is, so you can move on and create a better life for yourself and the people you interact with in the future. Lies don't only damage the brain; they damage your life and the lives of others.

Believing one lie leads to another, and choosing a lie over the truth leads to yet another, until you are so far into lies that the truth is too far away to get back to. It takes years of work and/or painful experiences, because once you reach the point where you don't want the pain (emotional or physical) anymore, your only choice is to seek the truth; in doing this you will then choose truth over lies in order to live life pain-free... The devil is a lie.

As a teacher I wish I could open a school or create a course that teaches people how to look for the truth: how to see the truth in themselves, in others, and in life; how to know what a lie is; how their lives will truthfully be better if they stay closer to the truth; and how and why lies are so damaging... IMO this is one of life's most important lessons, and it is universal: everyone can benefit from this topic regardless of sex, race, culture, and so on.
 
I think it is not only lying but also a sort of one-sided thinking and ignoring: it may be logical and true in general and with examples, but when it is not true of oneself or of one's life circumstances, true only of oneself, it would be destructive. If we tried to move our legs in a different way, we would stumble and be hurt. Probably neurons work in some similar way, and lying and losing logic makes them work badly. I think we have to be true and logical, and then to think about freedom and understanding. But we have psyches, so it is not that easy: being logically true may also be psychopathic, etc. It is subtle; that's what I feel about it.
 
Pinocchio’s Nose and The Lying Brain
...
This “Pinocchio’s nose” hypothesis is not as ridiculous as it may sound. It’s the concept of brain plasticity. The more time that musicians spend in practicing the piano, the greater the development of their white matter, especially in childhood. Practicing lying in childhood might particularly enhance prefrontal white matter. But even in adults, extensive practice has been found to correlate with brain structure. London taxi drivers have to undergo three years of extensive training to learn their way around 25,000 convoluted city streets. MRI studies have shown that these taxi drivers have a greater volume of the hippocampus compared with matched controls, and also compared with London bus drivers, who do not undergo such extensive training. Just as working in the gym can build up your muscles, mental effort can flex your brain.
...

The compulsiveness of pathological liars to lie when there is no benefit, and when lying could actually be detrimental due to increased chances of discovery, is curious. Perhaps it's a form of ongoing practice for them, to keep those prefrontal connections strong. Since psychopaths have little to no ability to forecast situations and determine outcomes based on their behavior, their prefrontal cortex seems to serve as their "lie center". If that's all it does for those sorts, lying would be an unconscious act, the way someone might tap their finger or scratch their chin. They are continually exercising the portion of their brain that is most active. When there are no useful occasions for doing it, they just make them up.
 
The previous posts reminded me of a work of art I saw recently on the cover of my RTV guide. It was a prize-winning design in a contest. The artist, Roel Seidell, must have had some inkling about brain plasticity and the self-destructiveness of lying, I suppose.

Here it is:

cover_seidell_roel.jpg
 
I've met a guy who claimed all the time that his best friend was Pinocchio - it was a whole trip; he was also telling me about channels, all from his own mind. There was also a circumstance when different people were saying the same things, as if they were exactly the same person, so I figured there is some collective awareness (unawareness) - I had actually figured that before. It was sometimes too subtle and clear to be random. But sometimes this 'truth' is too similar to HAARP (or whatever the proper name for the issue would be; HAARP could be too general) - a barometer of samsara, a linear mechanism, so I am not that sure about it: that's why I'm not always sure if it's subtle or stupid. It may be we just have to obey some rules of practice because of the limitation of the brain, not because there are rules - pretty much, it was claimed that going to 4 is dying: it could be because of that limitation - accept the limitation until having a new body, or be careful with practice (by practice I mean the C's term learning and Gurdjieff's doing). If this dying is only symbolic, then it already happens everywhere.

These are my feelings; I'm somehow unable to add any new intellectual stuff, but the issue is real. My doubts are about where it is a matter of being true and logical, and where it is about limitation, being in the box, or imprisonment.
 
How lying takes our brains down a 'slippery slope'

Telling small lies desensitises our brains to the associated negative emotions and may encourage us to tell bigger lies in future, reveals new UCL research funded by Wellcome and the Center for Advanced Hindsight.

The research, published in Nature Neuroscience, provides the first empirical evidence that self-serving lies gradually escalate and reveals how this happens in our brains.

The team scanned volunteers' brains while they took part in tasks where they could lie for personal gain. They found that the amygdala, a part of the brain associated with emotion, was most active when people first lied for personal gain. The amygdala's response to lying declined with every lie while the magnitude of the lies escalated. Crucially, the researchers found that larger drops in amygdala activity predicted bigger lies in future.

"When we lie for personal gain, our amygdala produces a negative feeling that limits the extent to which we are prepared to lie," explains senior author Dr Tali Sharot (UCL Experimental Psychology). "However, this response fades as we continue to lie, and the more it falls the bigger our lies become. This may lead to a 'slippery slope' where small acts of dishonesty escalate into more significant lies."

The study included 80 volunteers who took part in a team estimation task that involved guessing the number of pennies in a jar and sending their estimates to unseen partners using a computer. This took place in several different scenarios. In the baseline scenario, participants were told that aiming for the most accurate estimate would benefit them and their partner. In various other scenarios, over- or under-estimating the amount would either benefit them at their partner's expense, benefit both of them, benefit their partner at their own expense, or only benefit one of them with no effect on the other.

When over-estimating the amount would benefit the volunteer at their partner's expense, people started by slightly exaggerating their estimates, which elicited strong amygdala responses. Their exaggerations escalated as the experiment went on while their amygdala responses declined.
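The escalation dynamic described in the study can be caricatured as a toy simulation (the decay rate and scaling below are invented for illustration; this is not the authors' actual model): each self-serving lie blunts the aversive "amygdala" signal, and a weaker signal permits a bigger lie next time.

```python
def simulate_escalation(n_lies=10, decay=0.7, max_lie=10.0):
    """Toy slippery-slope model: lie magnitude grows as the
    normalized aversion signal decays with repetition."""
    aversion = 1.0            # emotional brake, starts at full strength
    magnitudes = []
    for _ in range(n_lies):
        magnitudes.append(max_lie * (1.0 - aversion))  # weaker brake, bigger lie
        aversion *= decay     # repetition blunts the response
    return magnitudes

lies = simulate_escalation()
# each lie is larger than the one before, mirroring the escalation pattern
assert all(later > earlier for earlier, later in zip(lies, lies[1:]))
```

The point of the sketch is just the feedback loop: the brake and the behavior are coupled, so any blunting of the emotional response compounds over repeated acts rather than staying constant.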

"It is likely the brain's blunted response to repeated acts of dishonesty reflects a reduced emotional response to these acts," says lead author Dr Neil Garrett (UCL Experimental Psychology). "This is in line with suggestions that our amygdala signals aversion to acts that we consider wrong or immoral. We only tested dishonesty in this experiment, but the same principle may also apply to escalations in other actions such as risk taking or violent behaviour."

Dr Raliza Stoyanova, Senior Portfolio Developer, in the Neuroscience and Mental Health team at Wellcome, said: "This is a very interesting first look at the brain's response to repeated and increasing acts of dishonesty. Future work would be needed to tease out more precisely whether these acts of dishonesty are indeed linked to a blunted emotional response, and whether escalations in other types of behaviour would have the same effect."
 