Baumeister explains the acquisition of sadism with the help of a theory of motivation proposed by the psychologist Richard Solomon, based on an analogy with color vision.258 Emotions come in pairs, Solomon suggested, like complementary colors. The world as seen through rose-tinted goggles eventually returns to neutral, but when the goggles are removed, it looks greenish for a while. That is because our sense of neutral white or gray reflects the present status of a tug-of-war between circuits for the color red (more accurately, longer wavelengths) and circuits for the color green (medium wavelengths). When red-sensitive neurons are overactivated for a protracted period, they habituate and relax their tug, and the rosy tint in our consciousness fades out. Then when the goggles are removed, the red- and green-sensitive neurons are equally stimulated, but the red ones have been desensitized while the green ones are ready and rested. So the green side predominates in the tug-of-war, and greenness is what we experience.
Solomon suggested that our emotional state, like our perception of the color of the world, is kept in equilibrium by a balance of opposing circuits. Fear is in balance with reassurance, euphoria with depression, hunger with satiety. The main difference between opposing emotions and complementary colors is in how they change with experience. With the emotions, a person’s initial reaction gets weaker over time, and the balancing impulse gets stronger. As an experience is repeated, the emotional rebound is more keenly felt than the emotion itself. The first leap in a bungee jump is terrifying, and the sudden yoiiiiing of deceleration exhilarating, followed by an interlude of tranquil euphoria. But with repeated jumps the reassurance component strengthens, which makes the fear subside more quickly and the pleasure arrive earlier. If the most concentrated moment of pleasure is the sudden reversal of panic by reassurance, then the weakening of the panic response over time may require the jumper to try increasingly dangerous jumps to get the same degree of exhilaration. The action-reaction dynamic may be seen with positive initial experiences as well. The first hit of heroin is euphoric, and the withdrawal mild. But as the person turns into a junkie, the pleasure lessens and the withdrawal symptoms come earlier and are more unpleasant, until the compulsion is less to attain the euphoria than to avoid the withdrawal.
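To make the dynamic concrete, here is a minimal simulation sketch of an opponent-process model in Python. The parameters (initial strengths, habituation and strengthening rates) are invented for illustration and are not drawn from Solomon's or Baumeister's work; the point is only to show how a weakening primary reaction and a strengthening opponent reaction tilt the net feeling from aversion toward an after-reaction of pleasure.

```python
# Illustrative sketch of opponent-process dynamics. All numbers are invented
# for illustration; they are not fitted to Solomon's or Baumeister's data.
# The primary "a-process" (e.g., fear) habituates with repeated exposures,
# while the opposing "b-process" (e.g., reassurance) strengthens, so the net
# feeling drifts from aversion toward a pleasant after-reaction.

def net_affect_over_trials(trials=5, a0=10.0, b0=2.0,
                           habituation=0.8, strengthening=1.3):
    """Return (net feeling during the event, rebound after it) per exposure."""
    history = []
    a, b = a0, b0           # strengths of the primary (a) and opponent (b) processes
    for _ in range(trials):
        during = a - b      # net feeling while the stressor is present (positive = aversive)
        after = b           # once the stressor ends, only the slow opponent process remains
        history.append((round(during, 2), round(after, 2)))
        a *= habituation    # the primary reaction habituates with repeated exposure
        b *= strengthening  # the opponent reaction strengthens with repeated exposure
    return history

if __name__ == "__main__":
    for i, (during, after) in enumerate(net_affect_over_trials(), start=1):
        print(f"exposure {i}: net aversion during = {during}, pleasant rebound after = {after}")
```

With these toy numbers the event itself is aversive on the first exposure but net pleasurable by the fifth, which is the qualitative pattern the opponent-process account requires.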
According to Baumeister, sadism follows a similar trajectory.259 An aggressor experiences a revulsion to hurting his victim, but the discomfort cannot last forever, and eventually a reassuring, energizing counteremotion resets his equilibrium to neutral. With repeated bouts of brutality, the reenergizing process gets stronger and turns off the revulsion earlier. Eventually it predominates and tilts the entire process toward enjoyment, exhilaration, and then craving. As Baumeister puts it, the pleasure is in the backwash.
By itself the opponent-process theory is a bit too crude, predicting, for example, that people would hit themselves over the head because it feels so good when they stop. Clearly not all experiences are governed by the same tension between reaction and counterreaction, nor by the same gradual weakening of the first and strengthening of the second. There must be a subset of aversive experiences that especially lend themselves to being overcome. The psychologist Paul Rozin has identified a syndrome of acquired tastes he calls benign masochism.260 These paradoxical pleasures include consuming hot chili peppers, strong cheese, and dry wine, and partaking in extreme experiences like saunas, skydiving, car racing, and rock climbing. All of them are adult tastes, in which a neophyte must overcome a first reaction of pain, disgust, or fear on the way to becoming a connoisseur. And all are acquired by controlling one’s exposure to the stressor in gradually increasing doses. What they have in common is a coupling of high potential gains (nutrition, medicinal benefits, speed, knowledge of new environments) with high potential dangers (poisoning, exposure, accidents). The pleasure in acquiring one of these tastes is the pleasure of pushing the outside of the envelope: of probing, in calibrated steps, how high, hot, strong, fast, or far one can go without bringing on disaster. The ultimate advantage is to open up beneficial regions in the space of local experiences that are closed off by default by innate fears and cautions. Benign masochism is an overshooting of this motive of mastery, and as Solomon and Baumeister point out, the revulsion-overcoming process can overshoot so far as to result in craving and addiction. In the case of sadism, the potential benefits are dominance, revenge, and sexual access, and the potential dangers are reprisals from the victim or victim’s allies. Sadists do become connoisseurs—the instruments of torture in medieval Europe, police interrogation centers, and the lairs of serial killers can be gruesomely sophisticated—and sometimes they can become addicts.
The fact that sadism is an acquired taste is both frightening and hopeful. As a pathway prepared by the motivational systems of the brain, sadism is an ever-present danger to individuals, security forces, or subcultures who take the first step and can proceed to greater depravity in secrecy. Yet it does have to be acquired, and if those first steps are blocked and the rest of the pathway bathed in sunlight, the path to sadism can be foreclosed.
IDEOLOGY
Individual people have no shortage of selfish motives for violence. But the really big body counts in history pile up when a large number of people carry out a motive that transcends any one of them: an ideology. Like predatory or instrumental violence, ideological violence is a means to an end. But with an ideology, the end is idealistic: a conception of the greater good.261
Yet for all that idealism, it’s ideology that drove many of the worst things that people have ever done to each other. They include the Crusades, the European Wars of Religion, the French Revolutionary and Napoleonic Wars, the Russian and Chinese civil wars, the Vietnam War, the Holocaust, and the genocides of Stalin, Mao, and Pol Pot. An ideology can be dangerous for several reasons. The infinite good it promises prevents its true believers from cutting a deal. It allows any number of eggs to be broken to make the utopian omelet. And it renders opponents of the ideology infinitely evil and hence deserving of infinite punishment.
We have already seen the psychological ingredients of a murderous ideology. The cognitive prerequisite is our ability to think through long chains of means-ends reasoning, which encourage us to carry out unpleasant means as a way to bring about desirable ends. After all, in some spheres of life the ends really do justify the means, such as the bitter drugs and painful procedures we undergo as part of a medical treatment. Means-ends reasoning becomes dangerous when the means to a glorious end include harming human beings. The design of the mind can encourage the train of theorization to go in that direction because of our drives for dominance and revenge, our habit of essentializing other groups, particularly as demons or vermin, our elastic circle of sympathy, and the self-serving biases that exaggerate our wisdom and virtue. An ideology can provide a satisfying narrative that explains chaotic events and collective misfortunes in a way that flatters the virtue and competence of believers, while being vague or conspiratorial enough to withstand skeptical scrutiny.262 Let these ingredients brew in the mind of a narcissist with a lack of empathy, a need for admiration, and fantasies of unlimited success, power, brilliance, and goodness, and the result can be a drive to implement a belief system that results in the deaths of millions.
But the puzzle in understanding ideological violence is not so much psychological as epidemiological: how a toxic ideology can spread from a small number of narcissistic zealots to an entire population willing to carry out its designs. Many ideological beliefs, in addition to being evil, are patently ludicrous—ideas that no sane person would ever countenance on his or her own. Examples include the burning of witches because they sank ships and turned men into cats, the extermination of every last Jew in Europe because their blood would pollute the Aryan race, and the execution of Cambodians who wore eyeglasses because it proved they were intellectuals and hence class enemies. How can we explain extraordinary popular delusions and the madness of crowds?
Groups can breed a number of pathologies of thought. One of them is polarization. Throw a bunch of people with roughly similar opinions into a group to hash them out, and the opinions will become more similar to one another, and more extreme as well.263 The liberal groups become more liberal; the conservative groups more conservative. Another group pathology is obtuseness, a dynamic that the psychologist Irving Janis called groupthink.264 Groups are apt to tell their leaders what they want to hear, to suppress dissent, to censor private doubts, and to filter out evidence that contradicts an emerging consensus. A third is animosity between groups.265 Imagine being locked in a room for a few hours with a person whose opinions you dislike—say, you’re a liberal and he or she is a conservative or vice versa, or you sympathize with Israel and the other person sympathizes with the Palestinians or vice versa. Chances are the conversation between the two of you would be civil, and it might even be warm. But now imagine that there are six on your side and six on the other. There would probably be a lot of hollering and red faces and perhaps a small riot. The overall problem is that groups take on an identity of their own in people’s minds, and individuals’ desire to be accepted within a group, and to promote its standing in comparison to other groups, can override their better judgment.
Even when people are not identifying with a well-defined group, they are enormously influenced by the people around them. One of the great lessons of Stanley Milgram’s experiments on obedience to authority, widely appreciated by psychologists, is the degree to which the behavior of the participants depended on the immediate social milieu.266 Before he ran the experiment, Milgram polled his colleagues, students, and a sample of psychiatrists on how far they thought the participants would go when an experimenter instructed them to shock a fellow participant. The respondents unanimously predicted that few would exceed 150 volts (the level at which the victim demands to be freed), that just 4 percent would go up to 300 volts (the setting that bore the warning “Danger: Severe Shock”), and that only a handful of psychopaths would go all the way to the highest shock the machine could deliver (the setting labeled “450 Volts—XXX”). In fact, 65 percent of the participants went all the way to the maximum shock, long past the point when the victim’s agonized protests had turned to an eerie silence. And they might have kept on shocking the presumably comatose subject (or his corpse) had the experimenter not brought the proceedings to a halt. The percentage barely budged with the sex, age, or occupation of the participants, and it varied only a small amount with their personalities. What did matter was the physical proximity of other people and how they behaved. When the experimenter was absent and his instructions were delivered over the telephone or in a recorded message, obedience fell. When the victim was in the same room instead of an adjacent booth, obedience fell. And when the participant had to work in tandem with a second participant (a confederate of the experimenter), then if the confederate refused to comply, so did the participant. But when the confederate complied, more than 90 percent of the time the participant did too.
People take their cues on how to behave from other people. This is a major conclusion of the golden age of social psychology, when experiments were a kind of guerrilla theater designed to raise consciousness about the dangers of mindless conformity. Following a 1964 news report—almost entirely apocryphal—that dozens of New Yorkers watched impassively as a woman named Kitty Genovese was raped and stabbed to death in their apartment courtyard, the psychologists John Darley and Bibb Latané conducted a set of ingenious studies on so-called bystander apathy.267 The psychologists suspected that groups of people might fail to respond to an emergency that would send an isolated person leaping to action because in a group, everyone assumes that if no one else is doing anything, the situation couldn’t be all that dire. In one experiment, as a participant was filling out a questionnaire, he or she heard a loud crash and a voice calling out from behind a partition: “Oh . . . my foot . . . I . . . can’t move it; oh . . . my ankle . . . I can’t get this thing off me.” Believe it or not, if the participant was sitting with a confederate who continued to fill out the questionnaire as if nothing was happening, 80 percent of the time the participant did nothing too. When the participants were alone, only 30 percent failed to respond.
People don’t even need to witness other people behaving callously to behave in uncharacteristically callous ways. It is enough to place them in a fictive group that is defined as being dominant over another one. In another classic psychology-experiment-cum-morality-play (conducted in 1971, before committees for the protection of human subjects put the kibosh on the genre), Philip Zimbardo set up a mock prison in the basement of the Stanford psychology department, divided the participants at random into “prisoners” and “guards,” and even got the Palo Alto police to arrest the prisoners and haul them to the campus hoosegow.268 Acting as the prison superintendent, Zimbardo suggested to the guards that they could flaunt their power and instill fear in the prisoners, and he reinforced the atmosphere of group dominance by outfitting the guards with uniforms, batons, and mirrored sunglasses while dressing the prisoners in humiliating smocks and stocking caps. Within two days some of the guards took their roles too seriously and began to brutalize the prisoners, forcing them to strip naked, clean toilets with their bare hands, do push-ups with the guards standing on their backs, or simulate sodomy. After six days Zimbardo had to call off the experiment for the prisoners’ safety. Decades later Zimbardo wrote a book that analogized the unplanned abuses in his own faux prison to the unplanned abuses at the Abu Ghraib prison in Iraq, arguing that a situation in which a group of people is given authority over another group can bring out barbaric behavior in individuals who might never display it in other circumstances.
Many historians of genocide, like Christopher Browning and Benjamin Valentino, have invoked the experiments of Milgram, Darley, Zimbardo, and other social psychologists to make sense of the puzzling participation, or at least acquiescence, of ordinary people in unspeakable atrocities. Bystanders often get caught up in the frenzy around them and join in the looting, gang rapes, and massacres. During the Holocaust, soldiers and policemen rounded up unarmed civilians, lined them up in front of pits, and shot them to death, not out of animus to the victims or a commitment to Nazi ideology but so that they would not shirk their responsibilities or let down their brothers-in-arms. Most of them were not even coerced by a threat of punishment for insubordination. (My own experience in carrying out instructions to shock a laboratory rat against my better judgment makes this disturbing claim utterly believable to me.) Historians have found few if any cases in which a German policeman, soldier, or guard suffered a penalty for refusing to carry out the Nazis’ orders.269 As we shall see in the next chapter, people even moralize conformity and obedience. One component of the human moral sense, amplified in many cultures, is the elevation of conformity and obedience to praiseworthy virtues.
Milgram ran his experiments in the 1960s and early 1970s, and as we have seen, many attitudes have changed since then. It’s natural to wonder whether Westerners today would still obey the instructions of an authority figure to brutalize a stranger. The Stanford Prison Experiment is too bizarre to replicate exactly today, but thirty-three years after the last of the obedience studies, the social psychologist Jerry Burger figured out a way to carry out a new one that would pass ethical muster in the world of 2008.270 He noticed that in Milgram’s original studies, the 150-volt mark, when the victim first cries out in pain and protest, was a point of no return. If a participant didn’t disobey the experimenter then, 80 percent of the time he or she would continue to the highest shock on the board. So Burger ran Milgram’s procedure but broke off the experiment at the 150-volt mark, immediately explaining the study to the participants and preempting the awful progression in which so many people tortured a stranger over their own misgivings. The question is: after four decades of fashionable rebellion, bumper stickers that advise the reader to Question Authority, and a growing historical consciousness that ridicules the excuse “I was only following orders,” do people still follow the orders of an authority to inflict pain on a stranger? The answer is that they do. Seventy percent of the participants went all the way to 150 volts and so, we have reason to believe, would have continued to fatal levels if the experimenter had permitted it. On the bright side, almost twice as many people disobeyed the experimenter in the 2000s as did in the 1960s (30 percent as compared to 17.5 percent), and the figure might have been even higher if the diverse demographics of the recent study pool had been replaced by the white-bread homogeneity of the earlier ones.271 But a majority of people will still hurt a stranger against their own inclinations if they see it as part of a legitimate project in their society.
Why do people so often impersonate sheep? It’s not that conformity is inherently irrational.272 Many heads are better than one, and it’s usually wiser to trust the hard-won wisdom of millions of people in one’s culture than to think that one is a genius who can figure everything out from scratch. Also, conformity can be a virtue in what game theorists call coordination games, where individuals have no rational reason to choose a particular option other than the fact that everyone else has chosen it. Driving on the right or the left side of the road is a classic example: here is a case in which you really don’t want to march to the beat of a different drummer. Paper currency, Internet protocols, and the language of one’s community are other examples.
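As a small illustration of what makes driving-side a coordination game, here is a sketch in Python with an arbitrary payoff scheme (each driver earns 1 for matching conventions, 0 for a mismatch; the numbers are mine, not from any source). It simply checks which joint choices are Nash equilibria and confirms that both all-left and all-right qualify, so the value lies entirely in conformity rather than in either option itself.

```python
# Driving-side coordination game with arbitrary illustrative payoffs:
# both drivers earn 1 if they choose the same side and 0 if they mismatch.

SIDES = ("left", "right")
payoff = {(a, b): (1, 1) if a == b else (0, 0) for a in SIDES for b in SIDES}

def is_nash_equilibrium(a, b):
    """Neither driver can gain by unilaterally switching sides."""
    a_best = all(payoff[(a, b)][0] >= payoff[(alt, b)][0] for alt in SIDES)
    b_best = all(payoff[(a, b)][1] >= payoff[(a, alt)][1] for alt in SIDES)
    return a_best and b_best

for a in SIDES:
    for b in SIDES:
        status = "equilibrium" if is_nash_equilibrium(a, b) else "not an equilibrium"
        print(f"({a}, {b}): {status}")
```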
But sometimes the advantage of conformity to each individual can lead to pathologies in the group as a whole. A famous example is the way an early technological standard can gain a toehold among a critical mass of users, who use it because so many other people are using it, and thereby lock out superior competitors. According to some theories, these “network externalities” explain the success of English spelling, the QWERTY keyboard, VHS videocassettes, and Microsoft software (though there are doubters in each case). Another example is the unpredictable fortunes of bestsellers, fashions, top-forty singles, and Hollywood blockbusters. The sociologist Duncan Watts set up two versions of a Web site in which users could download garage-band rock music.273 In one version users could not see how many times a song had already been downloaded. The differences in popularity among songs were slight, and they tended to be stable from one run of the study to another. But in the other version people could see how popular a song had been. These users tended to download the popular songs, making them more popular still, in a runaway positive feedback loop. The amplification of small initial differences led to large chasms between a few smash hits and many duds—and the hits and duds often changed places when the study was rerun.
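A toy simulation, in the spirit of the study described above though not its actual protocol or data, shows how this positive feedback plays out. All parameters here (number of songs, number of users, and the weighting rule) are invented for illustration; with social influence turned on, a song's chance of being downloaded grows with its running download count, so early random differences snowball into a few runaway hits.

```python
# Toy cumulative-advantage simulation (invented parameters; not the actual
# experiment). With social influence, download probability scales with a
# song's current popularity, so small early differences get amplified.

import random

def run_market(n_songs=20, n_users=2000, social_influence=True, seed=None):
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(n_songs)]   # intrinsic appeal of each song
    downloads = [1] * n_songs                           # start every song with one download
    for _ in range(n_users):
        if social_influence:
            # weight = appeal * current popularity -> rich-get-richer feedback
            weights = [q * d for q, d in zip(quality, downloads)]
        else:
            weights = quality                           # popularity counts are hidden
        choice = rng.choices(range(n_songs), weights=weights)[0]
        downloads[choice] += 1
    return sorted(downloads, reverse=True)

print("independent condition, top 5:", run_market(social_influence=False, seed=1)[:5])
print("social condition, top 5:     ", run_market(social_influence=True, seed=1)[:5])
```

Run with the same songs in both conditions, the social-influence market typically produces a far more top-heavy and less predictable distribution of downloads, the qualitative pattern of smash hits and duds described above.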