Welcome back. I'm assuming that you've now seen the Quiet Rage documentary. If not, please watch that video first and then come back and watch this one. Okay? I won't go anywhere—I promise I'll stay right here in this video. Okay? So, if you haven't watched, shoo! Go watch! We good? Okay.

I think one of the most amazing aspects of the Stanford Prison Experiment is how closely it parallels real-life cases of prison abuse and human rights violations—for example, the kind of prison abuse we saw at Abu Ghraib in Iraq. In both cases, the guards stripped the prisoners, used sexual tactics to humiliate them, put bags over their heads, and performed the worst abuses at night, when the guards thought they weren't being observed. In fact, the parallels were so uncanny that when the Abu Ghraib scandal first made the news in 2004, the Stanford Prison Experiment website suddenly started getting more than a quarter million page views per day, and DVD copies of Quiet Rage were shown to U.S. military leaders in Iraq, as well as to at least one member of the U.S. House Armed Services Committee. The U.S. military even began requiring all guards at Abu Ghraib to watch Quiet Rage so that similar abuses wouldn't occur in the future.

Of course, as with Stanley Milgram's research on obedience, the Stanford Prison Experiment raises significant ethical questions, and I very much hope that you'll share your thoughts about them in the class discussion forums. For example, what do you think about the trade-off between the information gained from the research and the stress experienced by the participants? How do we weigh those two different things? Just to put a human face on the question, consider the case of Richard Yacco, Prisoner 1037, whose parents saw him during visitors' night on Day 4 of the experiment, and who, the next day, was released early from the study because Professor Zimbardo noticed symptoms of depression.
Here's a letter that Richard Yacco's mother wrote to Professor Zimbardo after seeing her son during visitor hours. "My husband and I visited our son at 'Stanford County Prison.' It seemed very real to me. I had not expected anything quite so severe nor had my son when he volunteered, I am sure. It gave me a depressed feeling and I was somewhat upset when I saw him. He looked very haggard and his chief complaint seemed to be that he had not seen the sun for so long. I asked if he was sorry he volunteered and he answered that at first he had been. However, he had gone through several different moods and he was more resigned. This will be the hardest earned money he will ever earn in his life, I am sure. Mother of '1037.'"

So you can see that the power of the role even spilled over to the parents, one of whom referred to herself as "Mother of 1037." Was this level of stress unethical? Reasonable people can reach different conclusions. My own view is that a high level of stress is not unethical in and of itself. After all, there are many medical and psychological studies that involve a high level of stress, and Professor Zimbardo did remove Richard Yacco as soon as he detected signs of depression. In fact, the father thought that the son should continue participating in the study, and Richard Yacco later told a reporter that he didn't think he was going through any kind of depression.

The more serious ethical challenge, I think—and one that's shared with Stanley Milgram's research on obedience—is that the study didn't follow two bedrock principles that now govern research with human participants. The first of these is informed consent: researchers need to let people know of any reasonably foreseeable factors that might influence their decision whether to participate.
You might reasonably expect, for example, that imprisoning people could lead them to feel stress, or helplessness, or isolation, which means that for the consent to be truly informed, participants would need to be warned in advance that they might experience strong negative emotions if they choose to participate. Second, participants need to be informed that they have a right to withdraw from the research once participation has begun. The wording of these two principles is from the American Psychological Association, but the basic ideas are widely shared around the world. In the case of the Stanford Prison Experiment, these standards had not yet been adopted by the research community, so the consent form that participants signed didn't secure fully informed consent, and it contained a provision that Stanford would never approve today. Specifically, the form said the following: "I will only be released from participation for reasons of health deemed adequate by the medical advisers to the research project or for other reasons deemed appropriate by Dr. Philip Zimbardo, Principal Investigator of the project." In other words, the consent form asked participants to sign away their right to withdraw—a definite no-no by modern standards.

If you'd like to read more about the Stanford Prison Experiment, the ethics involved, and Professor Zimbardo's perspective looking back on the study, the most comprehensive source of information is "The Lucifer Effect," Professor Zimbardo's bestseller, which devotes over 200 pages to the Stanford Prison Experiment. I'm also very pleased to say that Random House has made Chapter 11 of the book available for free to anyone in our class, so thank you, Random House! This particular chapter, entitled "The Stanford Prison Experiment: Ethics and Extensions," is one of the most fascinating chapters of the book.
Another good source of information is the Stanford Prison Experiment website, PrisonExp.org, which is a Social Psychology Network partner site that has hundreds of pages and links related to the research, including archival materials from the experiment. Professor Zimbardo and Social Psychology Network co-developed the site in 1999, which means it's about a century old in web years. Since the '90s, it's received over 130 million page views, and you're warmly invited to visit.

And speaking of visiting, for those of you who can make it to Akron, Ohio, one other place you might consider visiting is the Center for the History of Psychology, which has an entire exhibit on the Stanford Prison Experiment, including one of the original cell doors. These are just a few photos I took in 2012.

So, in the final analysis, what's the main lesson of the Stanford Prison Experiment? Here's what Professor Zimbardo wrote in "The Lucifer Effect": "Good people can be induced, seduced, and initiated into behaving in evil ways... The primary simple lesson the Stanford Prison Experiment teaches is that situations matter. Social situations can have more profound effects on the behavior and mental functioning of individuals, groups, and national leaders than we might believe possible." This lesson—that situations matter—is very consistent with results from Stanley Milgram's research on obedience and Solomon Asch's research on conformity, but it's important not to misinterpret that lesson as saying that the situation is the only thing that matters, or that individual characteristics never matter, or that situations matter more than individual characteristics. The Stanford Prison Experiment was never intended to show that prisoner and guard roles always, or even usually, lead to abuse. Rather, it demonstrated that when people in positions of authority become deindividuated—when they lose sight of their personal identity and become absorbed in a role—the situation can spin out of control.
Here's one way to think about it. The main point of Milgram's research wasn't that 65% of people in the baseline condition delivered the highest electric shock; it was that under certain circumstances, people will obey a stranger's command to harm another person. The main point of Asch's research wasn't that people conformed on 32% of all critical trials; it was that under certain circumstances, people will go along with the group, even if it means contradicting the evidence of their senses. And in the Stanford Prison Experiment, the main point wasn't that guards will become abusive 65% of the time, or 32% of the time, or 10% of the time. In fact, the main lesson isn't even about prisoners or guards. It's that we need to guard ourselves against situational factors that can lead us to behave in destructive ways.

You know, just as climate change is often referred to as an inconvenient truth by climate scientists and environmentalists, social psychology has its own inconvenient truth, and that truth is that situational factors sometimes override the best of intentions and lead good people to do bad things. If we ignore that inconvenient truth in favor of a simplistic view of the world as basically made up of good guys and bad guys, we risk getting blindsided by powerful situational factors that, in some cases, can get us into deep trouble: group pressures, authority figures, assigned roles, and so on. That's the enduring lesson of the experiment. I'll see you next time.