The Thin Line

The Lucifer Effect: Understanding How Good People Turn Evil
By Philip Zimbardo
(Random House, 551 pp., $27.95)

WHY DO human beings commit despicable acts? One answer points to individual dispositions; another answer emphasizes situational pressures. In 2005, Secretary of State Condoleezza Rice stressed the importance of individual dispositions in describing terrorists as “simply evil people who want to kill.” Situationists reject this view. They believe that horrible acts can be committed by perfectly normal people. The most extreme situationists insist that in the right circumstances, almost all of us might be led to commit atrocities.

The situationist view has received strong support from some of the most famous experiments in social science, conducted by the psychologist Stanley Milgram in the early 1960s. In those experiments, ordinary people were asked to administer electric shocks to a person sitting in an adjacent room. Milgram’s subjects were told, falsely, that the purpose of the experiments was to test the effects of punishment on memory. Unbeknown to the experiment’s subjects, the person in that adjacent room was Milgram’s confederate, and there were no real shocks. The apparent shocks were delivered by a simulated shock generator, offering thirty clearly delineated voltage levels, ranging from 15 to 450 volts, accompanied by verbal descriptions ranging from “Slight Shock” to “XXX.” As the experiment unfolded, the subject was asked to administer increasingly severe shocks for incorrect answers, well past the “Danger, Severe Shock” level, which began at 375 volts.

In Milgram’s original experiments, the subjects included forty men between the ages of twenty and fifty. They came from a range of occupations, including engineers, high school teachers, and postal clerks. What do you think you would do as a participant in such an experiment? What do you think that others would do? Most people predict that in such studies almost all subjects would refuse to proceed to the end of the series of shocks. The expected break-off point is the “Very Strong Shock” of 195 volts. In Milgram’s experiment, however, every one of the forty subjects went beyond 300 volts. A large majority--twenty-six of the forty subjects, or 65 percent--went to the full 450-volt shock, five steps beyond “Danger, Severe Shock.” Replications of Milgram’s experiments, with thousands of diverse people in numerous countries, show essentially the same behavior. And women do not behave differently from men.

Milgram explained his results as demonstrating obedience to authority, in a way reminiscent of the behavior of Germans under Nazi rule. Indeed, he conducted his experiments in large part to understand how the Holocaust could have happened. Milgram concluded that ordinary people will follow orders even if the result is to produce great suffering in innocent others. Asked whether something like Nazism could occur in the United States, Milgram memorably replied that “if a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town.”

Milgram’s experiments involved an authority figure, in the form of a professor explicitly asking people to participate in an apparently reputable experiment. In 1971, Philip Zimbardo conducted a different study of situational influences. The basic idea was to ask ordinary young men to act as either prisoners or guards in a mock prison for a period of two weeks. Zimbardo’s remarkable finding was that after just a few days, apparently normal people, acting as guards in the mock prison, turned cruel and even sadistic--not because anyone ordered them to act that way, but as a result of the role in which they found themselves. As a consequence of the cruelty of the guards and the disintegration of several of the prisoners, Zimbardo’s experiment had to be terminated after only six days.

In his new book, Zimbardo gives the full story of the Stanford Prison Experiment for the first time. Generalizing from that story, he suggests that dispositionism is a serious error, that good and evil are largely a function of our contexts and our roles, and that almost all of us are capable of real evil, given the proper situation. Zimbardo uses his experiment to cast light on diverse problems, including the conduct of American soldiers at Abu Ghraib, airplane accidents, human inaction in the face of evident cruelty, the mistreatment of patients in hospitals, and the behavior of suicide bombers and terrorists in general.

The Stanford Prison Experiment started with an ad in a local newspaper asking for volunteers for a study of prison life, lasting two weeks and paying $15 a day (about $75 by current standards). Seventy of those who answered the ad were called to Stanford for interviews and to take a series of psychological tests. All seventy were American college students; most had completed summer-school courses at Stanford or Berkeley. Twenty-four of them were selected on the ground that they were the most healthy and normal. Half were randomly assigned to be prison guards; the other half were randomly assigned to be prisoners. All of them indicated that they would prefer to be prisoners, in part on the ground that while they could not imagine being a prison guard after college, they could imagine being in jail, and they thought that they might learn from the experience. (Remember, this was 1971.) All of them agreed to participate through informed consent forms. They were also informed that if they were assigned the role of prisoners, they would suffer deprivations of their civil rights and have only minimally adequate diet and medical care. Those assigned to be prisoners were also told to wait at home on a particular Sunday, when they would be contacted to begin the experiment.

On that day, they were surprised to find themselves “arrested” by actual Stanford police officers (enlisted by Zimbardo), who handcuffed them, searched them, warned them of their rights, and booked them at police headquarters. Brought to a mock prison in the basement of the Stanford psychology department, they were stripped, deloused, and made to wear smocks, without underwear, and with numbers sewn on front and back. They were also forced to wear ankle chains and nylon stocking caps (not having been asked to shave their heads). They walked in uncomfortable rubber thongs. Having worked with one of Zimbardo’s graduate students, the guards read the prisoners a series of rules: “prisoners will be allowed five minutes in the lavatory,” “prisoners must address each other by number only,” “prisoners must never refer to their condition as an ‘experiment’ or a ‘simulation,’” and others. Somewhat ominously, prisoners were told that the last rule was the most important: “Failure to obey any of the above rules may result in punishment.”

The first day of the experiment was awkward for guards and prisoners alike, and not terribly eventful. Some of the guards did seem to relish their role, asking prisoners to do push-ups as “punishment” for laughing at some of the guards’ comments. Whenever a prisoner showed an irreverent attitude, he was likely to be asked to do more push-ups. Some guards engaged in acts of what Zimbardo calls “arbitrary cruelty”--say, by leaning on prisoners and pushing them back with billy clubs.

Things got much worse on Monday. On that day the prisoners staged a rebellion, ripping off their numbers, refusing to obey commands, and mocking the guards. Zimbardo asked the guards to take steps to control the situation, and they did exactly that. Their responses consisted of forcing the prisoners to do jumping jacks and push-ups; stripping them naked in their cells; depriving them of meals, pillows, blankets, and beds; and placing them in solitary confinement. Some of the prisoners were baffled by the sheer aggressiveness of the response, with one screaming wildly, “No, no, no! This is an experiment! Leave me alone! Shit, let go of me, fucker! You’re not going to take our fucking beds!” The rebellion was effectively crushed.

As the behavior of the guards became increasingly aggressive and humiliating, one of the prisoners, a fellow named Doug, broke down and asked to be released. Zimbardo, having adopted the role of “prison superintendent,” met with him privately. Zimbardo told Doug that he would forfeit his payment if he quit early, asked him to serve as an informer in return for “special privileges,” and generally convinced him to continue. Returning to the prison, Doug falsely announced to the other prisoners that they could not leave. Shortly thereafter his own stress reactions appeared to become hysterical, even pathological, as he threatened violence against both the guards and himself; and he was indeed released. On each of the next three days, another prisoner showed acute stress reactions and had to be released. The remaining prisoners became subdued and “zombie-like.”

What of the guards? The picture was one of growing cruelty, aggression, and dehumanization. Sometimes without provocation, the guards stripped the prisoners naked, hooded them, chained them, denied them food or bedding privileges, put them into solitary confinement, and made them clean toilet bowls with their bare hands. There was sexual humiliation as well. On Thursday, one of the most aggressive guards, nicknamed “John Wayne,” called out to several of the prisoners, “See that hole in the ground? Now do twenty-five push-ups, fucking that hole! You hear me!” The prisoners dutifully obeyed. He continued, “Now you two, you’re male camels. Stand behind the female camels and hump them.” Submitting to the order, the prisoners simulated sodomy.

The experiment ended prematurely after Zimbardo enlisted the help of Christina Maslach, a recent Stanford Ph.D. in psychology who was starting her career as an assistant professor at Berkeley. In Maslach’s own words, “I looked at the line of hooded, shuffling, chained prisoners, with guards shouting orders at them…. I was overwhelmed by a chilling, sickening feeling.” Refusing to engage Zimbardo’s claim that this was “amazing stuff,” Maslach ended up in a heated argument with him (notwithstanding the fact that they were romantically involved at the time). She describes the “fight” as “too long and too traumatic,” but eventually Zimbardo acknowledged that the experiment had had an adverse effect on him as well as on the student subjects. He decided to halt the experiment on Friday. (Maslach and Zimbardo later married, and the book is dedicated to her.)

Zimbardo wants to draw large lessons from his experiment. He insists that individual dispositions are far less important than we tend to think they are, and that situational pressures can lead decent people to commit terrible acts. Recall that the prisoners and the guards were randomly assigned to their roles: “The line between Good and Evil, once thought to be impermeable, proved instead to be quite permeable.” Those assigned to be prisoners behaved as prisoners and were in a sense broken by the role. Those assigned to be guards behaved badly, even viciously, their general normality notwithstanding. “At the start of this experiment, there were no differences between the two groups; less than a week later, there were no similarities between them,” Zimbardo writes. Notably, the prisoners were skeptical of the claim of random assignment and insisted, after the conclusion of the experiment, that the guards were taller than they were. (They were wrong; the two groups had the same average height.)

In pointing to the power of situational influences, Zimbardo draws on a wide range of evidence. In one experiment, twenty-one of twenty-two nurses were willing to follow a doctor’s orders to give a twenty-milliliter dose of the fictitious drug “Astrogen”--even though the label clearly stated that five milliliters was the usual dose and warned that ten milliliters was the maximum. Similar deference to authority can be found outside of social science experiments. Almost half of the nurses surveyed responded that they could remember a time when they had actually “carried out a physician’s order that [they] felt could have had harmful consequences to the patient.” According to one study, excessive obedience on the part of the first officer, in deference to bad judgments by the captain, may be responsible for as many as one-quarter of airplane accidents.

Or consider the fact that no fewer than sixty-eight fast-food restaurants have been subject to successful “strip-search scams,” in which a male caller, masquerading as a police officer named Scott, informs an assistant store manager that an employee at the restaurant has committed theft. Having learned a great deal about the local conditions, “Officer Scott” asks the manager for the name of an attractive female employee who, Scott says, has been engaged in theft and is likely to have contraband on her now. “Officer Scott” is then allowed to talk to the employee, and he tells her that she has two choices. She can come to police headquarters to be strip-searched or instead be strip-searched at that very moment by a fellow employee. Believing herself to be innocent, the employee consents to the latter. “Officer Scott” then instructs that fellow employee to search the young woman’s most private places, with the store’s video cameras looking on. (In some cases, “Officer Scott” asked for still more humiliating acts, and the employees were obedient. I omit the details.)

Drawing on studies of the apparent normality of those involved in Nazi war crimes, Zimbardo gives a social-science twist to Hannah Arendt’s claim about the “banality of evil.” He adds that suicide bombers are themselves quite normal by every objective measure. Contrary to a widespread view, they are not poor or badly educated, or mentally ill. In his account, their actions are made possible by their social networks and the successful use of “a variety of social psychological and motivational principles to assist in turning collective hatred and general frenzy into a dedicated, seriously calculated program of indoctrination and training for individuals to become youthful living martyrs.”

In explaining what makes atrocities possible, Zimbardo places a large emphasis on “de-individuation”--a process by which both perpetrators and victims become essentially anonymous and are thereby transformed into a type or a role. The very decision to wear a particular uniform can have significant behavioral effects; warriors who change their appearance in preparation for war are more likely to brutalize their enemies. During the process of de-individuation, people enter a state of arousal in which they do not face the ordinary social sanctions and in which their own moral doubts are silenced. In Zimbardo’s account, de-individuation ensures the triumph of “the Dionysian trait of uninhibited release and lust” over the “Apollonian central trait” of “constraint and the inhibition of desire.”

Zimbardo believes that these general points, and the Stanford Prison Experiment in particular, help to explain the horrific behavior of American soldiers at Abu Ghraib. Recall the well-publicized incidents, some of them photographed, in which soldiers humiliated prisoners by leading them around by dog leashes, forcing them to simulate fellatio, and making them masturbate in front of a cigarette-smoking female soldier (herself giving a high-five salute of approval). American personnel also threatened male detainees with rape, beat them with broom handles and chairs, punched and kicked them, and forced them to wear women’s underwear. Zimbardo argues that such abuses were a predictable consequence of situational forces, not (as prominent military leaders have urged) of the dispositions of rogue soldiers or a few bad apples: “It was as though the worst-case scenario of our prison experiment had been carried out over months under horrendous conditions, instead of in our brief, relatively benign simulated prison.” Zimbardo insists that the real blame lies not with the soldiers but with the situation in general, including “leadership failures, little or no mission-specific training, inadequate resources, and interrogation-confession priorities.”

Despite his emphasis on the human susceptibility to commit evil acts, Zimbardo ends on an upbeat note, exploring how people might resist pressures and offering numerous examples of heroic behavior. He calls attention to the value of learning to assert one’s independence and of being alert to the surprising force of the social “frames” in which we find ourselves. (Some of his advice is banal and even embarrassing: “I can oppose unjust systems,” or “I made a mistake!” or—worst of all—“I am Me, the best I can be.”) Zimbardo’s catalogue of heroes includes people who rebelled against apartheid, McCarthyism, and the Vietnam War. He finds that “heroic behavior is rare enough not to be readily predictable by any psychological assessments of personality,” though some people display heroism even under immense pressure to conform or to obey.

Zimbardo’s discussion is passionate and illuminating, and he accumulates massive evidence on behalf of his central claim, about the power of situational pressures. Still, it is hard to evaluate his conclusions without exploring two problems. The first involves the precise mechanisms that produce the outcomes that he describes. The second involves the limits of situationism, which does not, I think, have the unambiguous support that Zimbardo claims for it. To approach these problems, we need to return to Milgram’s experiments.

Why, exactly, did the subjects in those experiments show a willingness to administer painful shocks to innocent people? Were the subjects in a Dionysian state of de-individuation? Certainly not. When they obeyed the experimenter, it was because they deferred to his knowledge and his authority. If an apparently reputable experimenter asks subjects to proceed, most might believe, not unreasonably, that any harm done to the victims is not serious, and that the experiment actually has significant benefits for society. (In Milgram’s experiments, they would have been right to think so.) On this account, many of the subjects would have put their moral qualms to one side not out of blind obedience, but because of a judgment that any such qualms were probably ill-founded. And if nurses are willing to follow doctors’ instructions even to the point of exceeding a recommended dose, it might be because they think that if the doctor wants to go past the recommended dose, it is probably for perfectly good reasons. When nurses carry out the wishes of doctors and put their own doubts to one side, it is probably because they are willing to defer to those who have more information.

The behavior of first officers in the cockpit, and of those victimized by the “Officer Scott” scam, can be understood in similar terms. The straightforward lesson is that people often follow the instructions of authorities who appear to have superior expertise. Much of the time such deference is desirable, a sensible rule of thumb. As a general practice, nurses should not second-guess the medical decisions of doctors. In the cockpit, first officers should usually follow the captain. Citizens should generally cooperate with the police. But problems can arise, and tragedy too, if a clever experiment or ordinary life produces a situation in which the authorities are wrong or have lost their moral compass. On this view, the main social-science finding is that people commit bad acts when their usually sensible rule of thumb goes wrong in particular cases.

Yet none of this is sufficient to explain the Stanford Prison Experiment. Neither prisoners nor guards were deferring to an experimenter who was asking them to engage in morally objectionable actions. To be sure, Zimbardo did ask the guards to impose real pressure on the prisoners: “We can create fear in them, to some degree. We can create a notion of the arbitrariness that governs their lives, which are totally controlled by us…. They will be able to do nothing and say nothing that we don’t permit. We’re going to take away their individuality in various ways.” But the guards went far beyond Zimbardo’s instructions. Zimbardo did not tell the guards to force prisoners to clean toilets with their bare hands or to pretend to sodomize one another.

The prisoners’ reactions are more understandable and less interesting than those of the guards. College students in or near Palo Alto suddenly find themselves handcuffed, stripped, and jailed. It is an experiment, to be sure; but it is lived as if it were reality. Early on, a fellow prisoner tells them that they cannot leave; they are going to be held as prisoners, and to be treated however the guards like, for the full two weeks (unless they completely break down). In a setting of this kind, the student prisoners might well become desperate or panicked. Others might rebel against the guards, but when the rebellion is broken, they might well shut down and become “zombies” for the duration of the experiment. Fighting and shutting down are emphatically human reactions to terrifying situations, and in Zimbardo’s experiment, those reactions were perfectly understandable.

The most interesting puzzle is the behavior of the guards. How could ordinary college students show such a high level of aggression and cruelty? It is true that no authority was issuing orders to the prison guards. But Zimbardo specifically instructed guards to assume a particular role, in which they had “total power,” with the task of producing “the required psychological state in the prisoners for as long as the study lasted.” Zimbardo, a professor at Stanford, told college students to make the students “feel as though they were in prison.” These instructions, alongside the very role of the guard, conveyed certain information about what should be done. Those who find themselves operating as prison guards know that they should behave in certain ways. This is no less true in an experimental setting than elsewhere. Indeed, the experimental setting might have aggravated the behavior of some of the guards, who knew that certain safeguards were in place and that their specific task was to induce “the required psychological state.”

If prisoners rebel, guards are certainly supposed to respond. Recall that Sunday, the first day of Zimbardo’s experiment, was awkward and largely uneventful. It was on Monday that the prisoners undertook their rebellion, and Zimbardo instructed his guards that it was their job to control the situation. In the aftermath things started to get ugly, and they went downhill from there. Perhaps a different experiment, at a different place and with different subjects, would have had a different outcome—not necessarily a good one, but with significantly less in the way of cruelty and abuse.

These speculations may be wrong, but even if they are right they do not amount to an objection to situationism. Notice here a big difference between Milgram’s experiments and what happened at the Stanford mock prison. Milgram’s initial studies were replicable and replicated (repeatedly). We know for a fact that most people are willing to follow the experimenter’s instructions and to administer painful shocks. We even have good clues about what underlies their behavior. Zimbardo himself thinks that it would be ethically problematic to try to replicate his experiment, and very few efforts have been made to do so. (The most noteworthy, in Australia, produced a conclusion broadly similar to Zimbardo’s.) I do not mean to doubt Zimbardo’s convincing argument that social roles and contextual pressures can lead normal people to do very bad things. But we do not know whether and when prison guards will show the same degree of cruelty and sadism.

For purely situational accounts of human behavior, there is an evident problem. The Stanford Prison Experiment uncovered significant differences among both prisoners and guards. Some of the prisoners could not handle the situation and essentially screamed, “Let me out of here!”—in part, perhaps, as a strategic effort to escape a terrible situation. Some of the guards did their jobs, but without cruelty, and they did various favors for the prisoners. These identifiably “good guards” were altogether different from others, whose behavior was sadistic. To his credit, Zimbardo acknowledges the diversity of behavior on the part of the guards. In his own words, “some guards have transformed into perpetrators of evil, and other guards have become passive contributors to the evil through their inaction.” So dispositions did matter. There is a real difference between the “perpetrators” and the “passive contributors.”

Here is one way to think about Zimbardo’s findings, designed to sort out the relationship between dispositions and social contexts. In experimental settings and in the real world, most people will be reluctant to harm others. Often their reluctance can be overcome with appropriate incentives and the right information. If people can be assured that any harm is small or nonexistent, or necessary to produce some greater good, they might well put their moral qualms to one side. (Recall Milgram’s experiments.) If people can be assured that any harm is deserved, or part of legitimate punishment, then they might well be willing to inflict harm. (Prison guards do not refuse to put recalcitrant prisoners in solitary confinement.) But—and this is the key point—different people have radically different “thresholds” that must be met before they will be willing to harm others.

Some people—the “bad guards” in Zimbardo’s experiment—have a real capacity for sadism and cruelty; that capacity is built into their dispositions. If such people are instructed to act sadistically, or merely authorized to do so, they will. Other people have somewhat higher thresholds. They will require strong situational assurance that harming others is justified or acceptable, all things considered. Still other people—Zimbardo’s “heroes”—have exceedingly high thresholds, or perhaps their moral convictions operate as an absolute barrier. There is a continuum of thresholds from the sadists to the heroes, or from the devils to the saints.

If all this is right, we can understand why different prison experiments might have different outcomes. A great deal depends on the initial mix of dispositions. A group of low-threshold guards will behave very differently from a group of high-threshold guards, in part because of their antecedent inclinations, and in part because of social interactions among them. One of the more remarkable findings in modern social science is that after deliberating with one another, groups of people typically end up at a more extreme point in line with pre-existing tendencies. It follows that a set of low-threshold guards might well become very cruel indeed, whereas a set of high-threshold guards will probably behave pretty well. With mixed groups, we could easily imagine a range of outcomes, ranging from extreme cruelty to comparative generosity. If the low-threshold guards act first and influence their high-threshold colleagues, cruelty is likely; if the high-threshold guards act first and influence the low-threshold types, the outcome will be much better. (Hierarchical relationships at many organizations—including schools, workplaces, and religious organizations—can be understood in roughly analogous terms. Teachers, employers, and religious leaders may take on some of the characteristics of aggressive prison guards; individual thresholds and social interactions make all the difference.)

A great deal depends as well on the specific incentives and on existing information. Most low-threshold types will not show cruelty unless they are given at least some incentive to do so. Those with relatively high thresholds might be willing to show considerable aggression if their incentives are strong enough. Of course, beliefs can have a significant impact. Suppose that people are informed that aggression is justified or necessary in the circumstances. Perhaps they learn, or are told, that the victims of their aggression are wrongdoers who deserve whatever they get. Or perhaps they learn that they are a part of a group of people who have been systematically humiliated by others, and who are entirely justified in responding to past humiliation. Or perhaps they learn that certain individuals, or certain groups, are bad by disposition, innately inferior, perhaps even subhuman, and must be treated accordingly. Dispositions are partly a product of beliefs, contributing to low or high thresholds; and once belief-driven dispositions are in place, social situations can add fresh information, often overcoming the relevant threshold.

What emerges is a clear challenge to the most ambitious claims for situationism, and a more complicated understanding of the relationship between individual dispositions and social situations. And there is a final point. Zimbardo shows that the very assumption of a particular social role automatically conveys a great deal of information about appropriate behavior: consider the roles of nurse, first officer, and prison guard. But social roles are not fixed. Nurses and first officers need not think that they should always follow doctors and captains, and prison guards need not feel free to brutalize prisoners. Perhaps the largest lesson of Zimbardo’s experiment involves the importance of ensuring that a constant sense of moral responsibility is taken to be part of, rather than inconsistent with, a wide range of social roles.

Cass Sunstein is a contributing editor to The New Republic. This article appeared in the May 21, 2007 issue of the magazine.