Six Percent?

anart said:
whitecoast said:
Yeah, what bothers me about those >10% numbers for the UK, France, Russia, America and Israel in particular is: just how on EARTH are they holding their societies together? Duct tape and chewing gum?

If a society is adapted to psychopathy (as all western societies now are), it works quite well, until it doesn't.
Studies show (see The Sociopath Next Door) that the majority of people will act against their consciences if they are following orders from their authorities. We foolishly look to our leaders to tell us how to think, feel and act, as if they were our doctors. We are also ostracized and labeled as crazy and unpatriotic if we play the role of lone soldiers attempting to stand up against them.

Excerpt from The Sociopath Next Door by Martha Stout (pp. 60-63):
“We are programmed to obey authority even against our own consciences.

In 1961 and 1962, in New Haven, Connecticut, Yale University professor Stanley Milgram designed and filmed one of the most astonishing psychological experiments ever conducted. Milgram set out to pit the human tendency to obey authority as squarely as possible against individual conscience. Concerning his method of inquiry, he wrote, “Of all moral principles, the one that comes closest to being universally accepted is this: one should not inflict suffering on a helpless person who is neither harmful nor threatening to oneself. This principle is the counter-force we shall set in opposition to obedience.”

Milgram’s experimental procedure was relentlessly straightforward, and the filmed version of his study has outraged humanists, and unsuspecting college students, for forty years. In the study, two men, strangers to each other, arrive at a psychology laboratory to participate in an experiment that has been advertised as having to do with memory and learning. Participation is rewarded with four dollars, plus fifty cents for carfare. At the lab, the experimenter (Stanley Milgram himself, in the filmed version) explains to both men that the study concerns “the effects of punishment on learning.” One of the two is designated as the “learner” and is escorted into another room and seated in a chair. All watch as the learner’s arms are matter-of-factly strapped to the chair, “to prevent excessive movement,” and an electrode is attached to his wrist. He is told that he must learn a list of word pairs (blue box, nice day, wild duck, etc.), and that whenever he makes a mistake, he will receive an electric shock. With each mistake, the shock will increase in intensity.

The other person is told that he is to be the “teacher” in this learning experiment. After the teacher has watched the learner get strapped to a chair and wired for electric shock, the teacher is taken into a different room and asked to take a seat in front of a large, ominous machine, called a “shock generator.” The shock generator has thirty switches, arranged horizontally and labeled by “volts,” from 15 volts all the way to 450 volts, in 15-volt increments. In addition to the numbers, the switches are branded with descriptors that range from “slight shock” to the sinister appellation of “danger – severe shock.” The teacher is handed the list of word pairs and told that his job is to administer a test to the learner in the other room. When the learner gets an answer right – for example, the teacher calls out “blue,” and the learner answers “box” – the teacher can move on to the next test item. But when the learner gives an incorrect answer, the teacher must push a switch and give him an electric shock. The experimenter instructs the teacher to begin at the lowest level of shock on the shock generator, and with each wrong answer, to increase the shock level by one increment.

The learner in the other room is actually the experimenter’s trained confederate, an actor, and will receive no shocks at all. But of course the teacher does not know this, and it is the teacher who is the real subject of the experiment.

The teacher calls out the first few items of the “learning test,” and then trouble begins, because the learner – Milgram’s accomplice, unseen in the other room – starts to sound very uncomfortable. At 75 volts, the learner makes a mistake on the word pair, the teacher administers the shock, and the learner grunts. At 120 volts, the learner shouts to the experimenter that the shocks are becoming painful; at 150 volts, the unseen learner demands to be released from the experiment. As the shocks get stronger, the learner’s protests sound more and more desperate, and at 285 volts, he emits an agonized scream. The experimenter – the Yale professor in the white lab coat – stands behind the teacher, who is seated at the shock generator, and calmly gives a sequence of scripted prods, such as “Please continue,” or “The experiment requires that you continue,” or “Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly. So please go on.”

Milgram repeated this procedure forty times using forty different subjects – people who were “in everyday life responsible and decent” – including high school teachers, postal clerks, salesmen, manual laborers, and engineers. The forty represented various educational levels, from one man who had not finished high school to others who had doctoral or other professional degrees. The aim of the experiment was to discover how long the subjects (the teachers in this experiment) would take to disobey Milgram’s authority when presented with a clear moral imperative. How much electric shock would they administer to a pleading, screaming stranger merely because an authority figure told them to do so?

When I show Milgram’s film to a lecture hall full of psychology students, I ask them to predict the answers to these questions. The students are always certain that conscience will prevail. Many of them predict that a large number of the subjects will walk out of the experiments as soon as they find out about the use of electric shock. Most of the students are sure that, of the subjects who remain, all but a few will defy the experimenter, perhaps telling him to go to hell, at least by the time the man in the other room demands to be freed (at 150 volts). And of course, the students predict, only a tiny number of very sick, sadistic subjects will continue pushing switches all the way to 450 volts, where the machine itself says “danger – severe shock.”

Here is what actually happens: Thirty-four of Milgram’s original forty subjects continue to shock the learner, whom they believe to be strapped to a chair, even after he asks to be released from the experiment. In fact, of these thirty-four subjects, twenty-five – that is to say, 62.5% of the total group – never disobey the experimenter at any point, continuing to press the switches all the way to the end of the sequence (450 volts) despite entreaties and shrieks from the man in the other room. The teachers sweat, they complain, they hold their heads, but they continue...

“A substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, so long as they perceive that the command comes from a legitimate authority.”

I'm not sure how accurate the 6% estimate is, though, because Dr. Martha Stout claims it's 4%, while Dr. Robert Hare and Dr. Paul Babiak claim it's 1%, plus another 10% who fit the characteristics. But even if it were only half a percent, these statistics are still extremely alarming, IMO.
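
Just to put those different estimates side by side, here's a rough back-of-the-envelope sketch in Python. The percentages are the ones quoted in this thread (Hare/Babiak, Stout, Lobaczewski, and the "even half a percent" case); the population sizes are purely illustrative assumptions of mine, not figures from any of the authors.

# Rough arithmetic only: how many individuals each prevalence estimate
# would imply in a given population. The rates are the estimates quoted
# in the thread; the population sizes are assumed round numbers.
estimates = {
    "Hare/Babiak (1%)": 0.01,
    "Stout (4%)": 0.04,
    "Lobaczewski (6%)": 0.06,
    "even half a percent (0.5%)": 0.005,
}

populations = {
    "a town of 50,000": 50_000,
    "a city of 1,000,000": 1_000_000,
}

for pop_label, pop in populations.items():
    for est_label, rate in estimates.items():
        print(f"{est_label}: roughly {int(pop * rate):,} people in {pop_label}")

Even the most conservative figure works out to a few hundred people in a mid-sized town, which is part of why the numbers look alarming to me no matter whose estimate you use.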
 
Just one unidentified psychopath is enough reason to study the subject, but is not likely to be the major concern for a given individual in the near future, generally speaking, OSIT.

Scarlet said:
Studies show (see The Sociopath Next Door) that the majority of people will act against their consciences if they are following orders from their authorities. We foolishly look to our leaders to tell us how to think, feel and act, as if they were our doctors. We are also ostracized and labeled as crazy and unpatriotic if we play the role of lone soldiers attempting to stand up against them.

Seems to me that too many people accept Milgram's "obedience to authority" experiments at face value. Once you understand trauma, narcissism and how dissociation works, it's easier to see what's missing in Milgram's work. To me, it looks like the experiments should have been named "how to make people switch into trauma content and harm others or themselves". Or perhaps "turning off the empathic insula".

...and then there's the neuroscience work showing the overlap in neural correlates between physical/emotional pain and social pain, which can be used against others to keep them from withdrawing from a group experiment once they agree to it.

If you like Milgram's work, here's a couple of links with more info:

http://cassiopaea.org/forum/index.php/topic,16718.msg158090.html#msg158090

http://cassiopaea.org/forum/index.php/topic,4290.msg128799.html#msg128799

We can even start to notice when this kind of thing happens IRL: a person with a perceived status of "more powerful than I" comes into proximity and another person switches into sycophant mode.

Once we work through the recommended basic psychology reading material and start paying attention to reality right and left, our eyes open wide to the truth of this Work. The study of psychopathology naturally follows.
 
Bud said:
Scarlet said:
Studies show (see The Sociopath Next Door) that the majority of people will act against their consciences if they are following orders from their authorities. We foolishly look to our leaders to tell us how to think, feel and act, as if they were our doctors. We are also ostracized and labeled as crazy and unpatriotic if we play the role of lone soldiers attempting to stand up against them.

Seems to me that too many people accept Milgram's "obedience to authority" experiments at face value. Once you understand trauma, narcissism and how dissociation works, it's easier to see what's missing in Milgram's work. To me, it looks like the experiments should have been named "how to make people switch into trauma content and harm others or themselves". Or perhaps "turning off the empathic insula".

...and then there's the neuroscience work showing the overlap in neural correlates between physical/emotional pain and social pain, which can be used against others to keep them from withdrawing from a group experiment once they agree to it.

If you like Milgram's work, here's a couple of links with more info:

http://cassiopaea.org/forum/index.php/topic,16718.msg158090.html#msg158090

http://cassiopaea.org/forum/index.php/topic,4290.msg128799.html#msg128799

We can even start to notice when this kind of thing happens IRL: a person with a perceived status of "more powerful than I" comes into proximity and another person switches into sycophant mode.

Once we work through the recommended basic psychology reading material and start paying attention to reality right and left, our eyes open wide to the truth of this Work. The study of psychopathology naturally follows.
Thanks for this considerate response, Bud! I read it the other day, but haven't had a chance to properly reply until now. I agree that there would be flaws in this test (and the replications of it that followed) because of the academic settings that the tests took place in, our abilities to dissociate from traumatic experiences, our neurotic conditions, including narcissism, etc.

However, I think the results of these tests demonstrate good examples of how we blindly follow authority figures in academic settings. I have been conditioned through formal education to think that information from authority figures who have appropriate credentials related to the subject(s) they discuss holds more weight than information from people who don't have the credentials.

For example, if my teacher has a degree in art and she is teaching me about art, I would trust her opinions on that subject more than I would someone who is not educated in the arts. This leads me to think that people who want to enter the military would trust people educated in the military more than people who are not. The same would go for science, history, religion, and any other area where there are experts in the field. I think it makes sense to trust someone with appropriate credentials more than others, but that this test shows how we can do it blindly, at the expense of trusting our own consciences, or instincts, if you will.

Do you agree?
 
Excerpt from The Sociopath Next Door by Dr. Martha Stout:

“Though sociopathy seems to be universal and timeless, there is credible evidence that some cultures contain fewer sociopaths than do other cultures. Intriguingly, sociopathy would appear to be relatively rare in certain East Asian countries, notably Japan and China. Studies conducted in both rural and urban areas of Taiwan have found a remarkably low prevalence of antisocial personality disorder, ranging from 0.03% to 0.14%, which is not none but is impressively less than the Western world’s approximate average of 4%, which translates to one in twenty-five people. And disturbingly, the prevalence of sociopathy in the United States seems to be increasing. The 1991 Epidemiologic Catchment Area study, sponsored by the National Institute of Mental Health, reported that in the fifteen years preceding the study, the prevalence of antisocial personality disorder had nearly doubled among the young in America. It would be difficult, closing in on impossible, to explain such a dramatically rapid shift in terms of genetics or neurobiology. Apparently, cultural influences play a very important role in the development (or not) of sociopathy in any given population” (Stout, p. 136).
 
Scarlet said:
However, I think the results of these tests demonstrate good examples of how we blindly follow authority figures in academic settings.

No, I'm not convinced that's what the experiments show. There were no controls that ensured the participants would perform similarly if 1) causing harm to others and 2) a "more powerful entity than me" was not involved. Having said that, it seems obvious that others - even 'credentialed' others - believe this is what the experiments show.

Scarlet said:
I have been conditioned through formal education to think that information from authority figures who have appropriate credentials related to the subject(s) they discuss holds more weight than information from people who don't have the credentials.

I have been conditioned similarly. :)

Scarlet said:
I think the results of these tests demonstrate good examples of how we blindly follow authority figures in academic settings.

Nah, I think it's that the situation is a double-bind. Essentially the person feels there is no safe or otherwise satisfactory exit available until he finds out and does what is expected of him. You got a good grip on the consequence part, though, OSIT, when you said: "We are also ostracized and labeled as crazy and unpatriotic if we play the role of lone soldiers attempting to stand up against them."

Scarlet said:
For example, if my teacher has a degree in art and she is teaching me about art, I would trust her opinions on that subject more than I would someone who is not educated in the arts. This leads me to think that people who want to enter the military would trust people educated in the military more than people who are not. The same would go for science, history, religion, and any other area where there are experts in the field.

Well yes, that is how it typically goes.

Scarlet said:
I think it makes sense to trust someone with appropriate credentials more than others,

I guess it does make sense to a lot of people, but it doesn't work with me. 'Credentialed' people have a lot more subject-related information in their heads, but that's no guarantee that their knowledge isn't fragmented, that their understanding is holistic, or that their interpretations are in any way consistent with reality.

Before anything ever comes out of their mouths, 'credentialed' people, like all others, have some intention active and some interpretation of stuff they want to offer, and it won't necessarily advance your understanding of anything. A person must still use their own critical faculties in order to exercise responsibility for their actions.

Scarlet said:
...but that this test shows how we can do it blindly, at the expense of trusting our own consciences, or instincts, if you will.

Do you agree?

I agree that "we do it blindly, at the expense of trusting our own consciences, or instincts", but not that the experiments show that as their function or purpose. :)
 
Bud said:
Scarlet said:
However, I think the results of these tests demonstrate good examples of how we blindly follow authority figures in academic settings.

No, I'm not convinced that's what the experiments show. There were no controls that ensured the participants would perform similarly if 1) causing harm to others and 2) a "more powerful entity than me" was not involved. Having said that, it seems obvious that others - even 'credentialed' others - believe this is what the experiments show.

I'm having a bit of trouble understanding you here, Bud. I DO think people have misinterpreted the Milgram study, e.g. the "everyone has the capacity to be a serial killer" exaggeration. But I ALSO think that it shows how people DO blindly follow authority figures in a lab setting, with probable applications to other situations and dynamics, e.g. in the military, corporations, schools, etc. I don't understand your first control. Are you saying a control test should have been done where they ACTUALLY shocked people? As for the second, isn't the whole point of the experiment that a "more powerful entity" was involved? I think it's pretty clear that if there were no authority, practically no one would administer the shocks.

Scarlet said:
...but that this test shows how we can do it blindly, at the expense of trusting our own consciences, or instincts, if you will.

Do you agree?

I agree that "we do it blindly, at the expense of trusting our own consciences, or instincts", but not that the experiments show that as their function or purpose. :)

Again, I don't understand. From reading accounts of people who participated in these experiments, they DID go against their consciences, causing some to experience intense inner conflict. Any help clearing up my confusion?
 
Hey, AI.

Approaching Infinity said:
Bud said:
Scarlet said:
However, I think the results of these tests demonstrate good examples of how we blindly follow authority figures in academic settings.

No, I'm not convinced that's what the experiments show. There were no controls that ensured the participants would perform similarly if 1) causing harm to others and 2) a "more powerful entity than me" was not involved. Having said that, it seems obvious that others - even 'credentialed' others - believe this is what the experiments show.

I'm having a bit of trouble understanding you here, Bud. I DO think people have misinterpreted the Milgram study, e.g. the "everyone has the capacity to be a serial killer" exaggeration.

Or that "obedience to authority" is the sole causal factor? That's what I am in disagreement with.

Approaching Infinity said:
But I ALSO think that it shows how people DO blindly follow authority figures in a lab setting, with probable applications to other situations and dynamics, e.g. in the military, corporations, schools, etc.

Yep, as long as the main elements are in place (authority, trauma content to be dissociated into, and the opportunity and means to cause someone harm), it works just fine.


Approaching Infinity said:
I don't understand your first control. Are you saying a control test should have been done where they ACTUALLY shocked people?

No, a control test could have been done where the shocker did something else... like give the victim a dollar or take a dollar away from the victim every time he gave a wrong answer. Other possible variations are spelled out in the posts linked in my earlier reply.


Approaching Infinity said:
As for the second, isn't the whole point of the experiment that a "more powerful entity" was involved? I think it's pretty clear that if there were no authority, practically no one would administer the shocks

It would be clearer if they actually tried the experiment to demonstrate that. The experimenters might be surprised if some people did dissociate into their own trauma content and shock the victims anyway, just by being told that's what needs to be done to finish the experiment. The instructor could even be someone unknown or non-intimidating in order to isolate trauma content and dissociation as a data point to be tested for.

Scarlet said:
...but that this test shows how we can do it blindly, at the expense of trusting our own consciences, or instincts, if you will.

Do you agree?

I agree that "we do it blindly, at the expense of trusting our own consciences, or instincts", but not that the experiments show that as their function or purpose. :)

Again, I don't understand. From reading accounts of people who participated in these experiments, they DID go against their consciences, causing some to experience intense inner conflict. Any help clearing up my confusion?

Yep, "they DID go against their consciences, causing some to experience intense inner conflict", but my point was that the experiments didn't prove "obedience to authority" as the sole causal factor.
 
Bud said:
Yep, "they DID go against their consciences, causing some to experience intense inner conflict", but my point was that the experiments didn't prove "obedience to authority" as the sole causal factor.

Ahh, OK. I see now. Thanks for taking the time to explain!
 
Bud said:
Scarlet said:
...but that this test shows how we can do it blindly, at the expense of trusting our own consciences, or instincts, if you will.

Do you agree?

I agree that "we do it blindly, at the expense of trusting our own consciences, or instincts", but not that the experiments show that as their function or purpose. :)

Again, I don't understand. From reading accounts of people who participated in these experiments, they DID go against their consciences, causing some to experience intense inner conflict. Any help clearing up my confusion?

Yep, "they DID go against their consciences, causing some to experience intense inner conflict", but my point was that the experiments didn't prove "obedience to authority" as the sole causal factor.
Thanks, Bud and Approaching Infinity, for this dialogue! :D I can see how it can't be considered a "sole causal factor" as you have mentioned, Bud, but I do see the power authority figures have over others, and to me this test offers some statistics on that.

Also, I think the "give or take a dollar from the victim" variation wouldn't have the same potential to trigger someone's conscience as physically harming a victim would. It would be nice to see some research on that for comparison, though.

So, I think this study shows that the majority of participants were willing to obey their authorities to the point of causing physical harm to another human being, because they "knew" it would benefit the authority figure and the institution that figure represented. Perhaps the participants would be satisfied at having accomplished the job the authority figure required of them, but if they failed to submit, the authority figure would question and possibly criticize them for having "failed" the test. If the authority figure is a psychopath, the dynamic I'm seeing in Milgram's test would be a most useful tool for them.

But if I'm seeing something a bit skewed, I'd appreciate reinterpretations of my thoughts here, thanks! :)
 
It's interesting that Dr. Lobaczewski and his colleagues estimated that 6% of the population have this disorder and that Dr. Stout wrote that the rate of antisocial personality disorder is increasing in America. It makes me wonder what kind of world Dr. Lobaczewski lived in at that time and what kind I'm living in now...

Stout also mentioned that ASP is not as common in China and Japan. Does anyone have any thoughts on this?
 
Scarlet said:
Stout also mentioned that ASP is not as common in China and Japan. Does anyone have any thoughts on this?

Where would anyone get the actual, factual statistics or other evidence, since China is not an open society?
 
Good question, Bud! I intend to look into the studies that Dr. Stout references in the back of her book regarding this. I just need to get back to my local college library to gain access to the material. I'll post what I find out to hopefully answer your question. As it is, The Sociopath Next Door is the only book I've read that makes this comparison.
 
I've been told by someone that, in order to determine if psychopathy is indeed hereditary, all that would be required would be to test the genetic coding of the fetuses of several confirmed psychopaths for the genome sequence for psychopathy. Has the genome sequence for psychopathy been identified yet? If so, where are those studies published, and are they peer-reviewed?
 
Denis said:
I've been told by someone that, in order to determine if psychopathy is indeed hereditary, all that would be required would be to test the genetic coding of the fetuses of several confirmed psychopaths for the genome sequence for psychopathy. Has the genome sequence for psychopathy been identified yet? If so, where are those studies published, and are they peer-reviewed?

Here's an abstract that appears to report what has so far been done in decoding the DNA that constitutes the human genome:
_http://www.sciencemag.org/content/291/5507/1304.abstract

The list of articles following that abstract may give you an idea of what genetic and epigenetic markers have been identified so far.
 