Distribution Source: MIT World and iTunesU
Content Source: MIT
Format: Video
Length: 1 hour 50 minutes and 58 seconds
Link: The Lucifer Effect
I was worried this afternoon - after watching the first few minutes of the MIT World video (a new source recommended by a commenter - thanks), I knew I wanted to write on this topic, but I also wanted to go to the gym. Given that the video is nearly two hours long, doing both seemed impossible. So I decided to see if iTunesU carried the video as well. Sure enough, I was able to download the full video to my phone in five minutes... not only that, I was also able to plug my phone into the treadmill at the gym and watch the video while running. Pretty cool.
This week's topic focuses on the human capacity for both good and evil, from the perspective of Dr. Philip Zimbardo. Dr. Zimbardo is most famous for his Stanford Prison Experiment, in which he gathered a group of "normal" Stanford students and randomly assigned them to be prisoners or guards. The results are fascinating, and are taught in every Psychology 101 course in the country. In short, the experiment had to be called off after six days because the prisoner-guard dynamic had spun so far out of control. For me, this experiment has always reinforced the importance of critical thinking and maintaining individuality in the face of social pressures. If a few dozen smart, regular Stanford kids can abuse each other so quickly, we are all susceptible to situational and systemic pressures pushing us to do something that falls outside of our moral code.
Dr. Z draws an interesting parallel between his Stanford Prison Experiment and the tragedy of Abu Ghraib. I call it a tragedy because it was, in my mind, entirely unnecessary and damaging to everyone involved: those who were abused, those who took the pictures and carried out the abuse, and the United States and how it is perceived globally. Following the release of the pictures, Dr. Z highlights how the government - like any institution faced with a scandal - pointed to the abuse as the work of a few "bad apples." If it weren't so serious, this shallow explanation would be laughable.
Few events have received as much scrutiny and military, government, and journalistic review as the Abu Ghraib scandal. Across the board, these reviews describe a fundamentally screwed up institution. As in the Stanford Prison Experiment, most of the abuses took place on the night shift. For three months, no senior officer so much as visited the prison after hours. The stress level was extremely high - one Army reservist was in charge of over 1,000 prisoners, 60 Iraqi policemen, and 12 Army reservists. He had received no specific training for the job and, as mentioned, had no supervision. The chaotic conditions included constant weapons smuggling by Iraqi policemen, a never-ending sewer stench, power blackouts, prisoner escapes, grenade attacks, noise, and rationed water. The head Army reservist worked 40 days straight in 12-hour shifts, and between shifts he slept at the prison. In social psychology, this 100% engulfment is called a "total situation."
Because of its proximity to dangerous Iraqi slums, the British had advised the US not to use the Abu Ghraib prison. Furthermore, for the first time, Military Intelligence units were actively encouraging the Military Police (the Army reservists) to help break down prisoners. Of course, that is not the role of the police, whose job is to keep order in the prison. Viewed in the context of the administration's policy condoning "soft" torture tactics, it isn't hard to imagine how prisoner abuse resulted.
None of this serves to excuse any of the behavior that took place. Rather, it shows how putting "normal" people into a terrible situation, coupled with a lack of training and supervision and tacit (in some cases explicit) approval from superiors, results in total disaster. Dr. Z and others had the opportunity to meet with and review the files of those who took the pictures and committed some of the acts; in his opinion and that of military psychologists, these people were very normal. Indeed, Dr. Z points out that the lead officer was someone with the capacity to be a hero in a different situation. Yet instead he became a perpetrator of evil, smearing someone with his own feces and forcing others to simulate sex acts while naked... how is this possible?
Dr. Z's underlying point is that good and evil are hardly black and white. The human brain has an unbelievable capacity to be selfish or caring, heroic or villainous, creative or destructive. In other words, both good and evil are core aspects of human nature, and people can be transformed by powerful situational forces. After describing some other historical examples (the Jim Jones mass suicide, Eichmann and the Nazis, etc.), he presents ten simple lessons on how to create evil in good people:
- Create an ideology to justify any means ("national security", etc.)
- Take small steps/minor action first
- Successively increase small actions
- Make sure a seemingly “just authority” is in charge
- Introduce a compassionate leader who gradually changes into an authoritarian monster
- Implement ever-changing/vague rules
- Re-label situational actors & actions (“teacher helping” when the reality is an aggressor hurting)
- Provide social models of compliance
- Allow verbal dissent, but insist on behavioral compliance (verbal dissent is the feel good thing)
- Make exiting difficult (this, he says, is the key to date rape…)
Perhaps some of you have seen a few of these steps in action, whether from bosses or from religious or government leaders. Dr. Z views corporate or institutional evil as the biggest evil, because it has the capacity to affect many people. In the case of corporations or governments, the rules of action are defined not by ethics but by laws. The question is often not "what is right?" but "what can we get away with?" He also describes how corporate evil always begins with the first little step - perhaps in the name of being a "team player." None of this is meant to be conspiratorial; it should instead reinforce to all of us that doing things because "that's what has always been done" or because someone "says so" is a poor reason that can have serious consequences. It's also clear to me that in a corporate or institutional setting, many of these evils can happen in marginal, seemingly insignificant ways... with powerful disincentives to stand up for what is right.
So how do we keep ourselves from being even marginally evil? Dr. Z has also conveniently put together a list of twenty ways of preventing unwanted influences... while I won't list all of them here, you can click on this link to see the full list. Probably the most useful for me will be the following:
In all authority confrontations: be polite, individuate yourself and the other person, and make it clear it is not “your problem” in the process or situation; describe the problem objectively, do not get emotional, and state clearly the remedy sought and the positive consequences expected – hold off on threats and costs to them or their agency as a last resort.
See y'all next week.
Well thought out and well said. As usual. Sorry I missed out on Vegas. Woulda been dope sauce.
I think situational forces turn good people bad more often than we'd like to think. Maybe not Abu Ghraib bad but selfish bad, douchey bad and thoughtless bad.
Of course, this all depends on our collective definition of "bad"...