The Social Animal
As Haidt has shown in a series of studies, most people have strong intuitive (and negative) reactions to these scenarios, even though nobody is harmed in any of them. Usually, Haidt’s research subjects cannot say why they find these things so repulsive or disturbing. They just do. The unconscious has made the call.
Furthermore, if the rationalist folk theory, with its emphasis on Level 2 moral reasoning, were correct, then you would expect people who do moral reasoning all day to be, in fact, more moral. Researchers have studied this, too. They’ve found there’s relatively little relationship between moral theorizing and noble behavior. As Michael Gazzaniga wrote in his book Human, “It has been hard to find any correlation between moral reasoning and proactive moral behavior, such as helping people. In fact, in most studies, none has been found.”
If moral reasoning led to more moral behavior, you would expect people who are less emotional to also be more moral. Yet at the extreme end, this is the opposite of the truth. As Jonah Lehrer has pointed out, when most people witness someone else suffering, or read about a murder or a rape, they experience a visceral emotional reaction. Their palms sweat and their blood pressure surges. But some people show no emotional reaction. These people are not hyper-rational moralists; they are psychopaths. Psychopaths do not seem to be able to process emotion about others’ pain. You can show them horrific scenes of death and suffering and they are unmoved. They can cause the most horrific suffering in an attempt to get something they want, and they will feel no emotional pain or discomfort. Research on wife batterers finds that as these men become more aggressive their blood pressure and pulse actually drop.
Finally, if reasoning led to moral behavior, then those who could reach moral conclusions would be able to apply their knowledge across a range of circumstances, based on these universal moral laws. But in reality, it has been hard to find this sort of consistency.
A century’s worth of experiments suggests that people’s actual behavior is not driven by permanent character traits that apply from one context to another. Back in the 1920s, Yale psychologists Hugh Hartshorne and Mark May gave ten thousand schoolchildren opportunities to lie, cheat, and steal in a variety of situations. Most students cheated in some situations and not in others. Their rate of cheating did not correlate with any measurable personality traits or assessments of moral reasoning. More recent research has found the same general pattern. Students who are routinely dishonest at home are not routinely dishonest at school. People who are courageous at work can be cowardly at church. People who behave kindly on a sunny day may behave callously the next day, when it is cloudy and they are feeling glum. Behavior does not exhibit what the researchers call “cross-situational stability.” Rather, it seems to be powerfully influenced by context.
The Intuitionist View
The rationalist assumptions about our moral architecture are now being challenged by a more intuitionist view. This intuitionist account puts emotion and unconscious intuition at the center of moral life, not reason; it stresses moral reflexes, alongside individual choice; it emphasizes the role perception plays in moral decision making, before logical deduction. In the intuitionist view, the primary struggle is not between reason and the passions. Instead, the crucial contest is within Level 1, the unconscious-mind sphere itself.
This view starts with the observation that we all are born with deep selfish drives—a drive to take what we can, to magnify our status, to appear superior to others, to exercise power over others, to satisfy lusts. These drives warp perception. It wasn’t as if Mr. Make-Believe consciously set out to use Erica, or attack her marriage. He merely saw her as an object to be used in his life quest. Similarly, murderers don’t kill people they regard as fully human like themselves. The unconscious has to first dehumanize the victim and change the way he is seen.
The French journalist Jean Hatzfeld interviewed participants in the Rwandan genocide for his book Machete Season. The participants were caught up in a tribal frenzy. They began to perceive their neighbors in radically perverse ways. One man Hatzfeld spoke with murdered a Tutsi who lived nearby: “I finished him off in a rush, not thinking anything of it, even though he was a neighbor, quite close on my hill. In truth, it came to me only afterward: I had taken the life of a neighbor. I mean, at the fatal instant I did not see in him what he had been before; I struck someone who was no longer either close or strange to me, who wasn’t exactly ordinary anymore, I’m saying like the people you meet every day. His features were indeed similar to those of the person I knew, but nothing firmly reminded me that I had lived beside him for a long time.”
These deep impulses treat conscious cognition as a plaything. They not only warp perception during sin; they invent justifications after it. We tell ourselves that the victim of our cruelty or our inaction had it coming; that the circumstances compelled us to act as we did; that someone else is to blame. The desire pre-consciously molds the shape of our thought.
But not all the deep drives are selfish ones, the intuitionists stress. We are all descended from successful cooperators. Our ancestors survived in families and groups.
Other animals and insects share this social tendency, and when we study them, we observe that nature has given them faculties that help them with bonding and commitment. In one study in the 1950s, rats were trained to press a lever for food. Then the experimenter adjusted the machine so that the lever sometimes provided food but sometimes delivered an electric shock to another rat in the next chamber. When the eating rats noticed the pain they were causing their neighbors, they adjusted their eating habits. They would not starve themselves. But they chose to eat less, to avoid causing undue pain to the other rats. Frans de Waal has spent his career describing the sophisticated empathy displays evident in primate behavior. Chimps console each other, nurse the injured, and seem to enjoy sharing. These are not signs that animals possess morality, but they do have the psychological building blocks for it.
Humans also possess a suite of emotions to help with bonding and commitment. We blush and feel embarrassed when we violate social norms. We feel instantaneous outrage when our dignity has been slighted. People yawn when they see others yawning, and those who are quicker to sympathetically yawn also rate higher on more complicated forms of sympathy.
Our natural empathy toward others is nicely captured by Adam Smith in The Theory of Moral Sentiments, in a passage that anticipates the theory of mirror neurons: “When we see a stroke aimed and just ready to fall upon the leg or arm of another person, we naturally shrink back our leg, our own arm; and when it does fall, we feel it in some measure and are hurt by it as well as the sufferer.” We also feel a desire, Smith added, to be esteemed by our fellows. “Nature, when she formed man for society, endowed him with an original desire to please, and an original aversion to offend his brethren. She taught him to feel pleasure in their favorable, and pain in their unfavorable regard.”
In humans, these social emotions have a moral component, even at a very early age. Yale professor Paul Bloom and others conducted an experiment in which they showed babies a scene featuring one figure struggling to climb a hill, another figure trying to help it, and a third trying to hinder it. At as early as six months, the babies showed a preference for the helper over the hinderer. In some plays, there was a second act, in which the hindering figure was either punished or rewarded. In this case, the eight-month-olds preferred characters who punished the hinderer over those who were nice to it. This reaction illustrates, Bloom says, that people have a rudimentary sense of justice from a very early age.
Nobody has to teach a child to demand fair treatment; children protest unfairness vigorously and as soon as they can communicate. Nobody has to teach us to admire a person who sacrifices for a group; the admiration for duty is universal. Nobody has to teach us to disdain someone who betrays a friend or is disloyal to a family or tribe. Nobody has to teach a child the difference between rules that are moral—“Don’t hit”—and rules that are not—“Don’t chew gum in school.”
These preferences also emerge from somewhere deep inside us. Just as we have a natural suite of emotions to help us love and be loved, so, too, we have a natural suite of moral emotions to make us disapprove of people who violate social commitments, and approve of people who reinforce them. There is no society on earth where people are praised for running away in battle.
It’s true that parents and schools reinforce these moral understandings, but as James Q. Wilson argued in his book The Moral Sense, these teachings fall on prepared ground. Just as children come equipped to learn language, equipped to attach to Mom and Dad, so, too, they come equipped with a specific set of moral prejudices, which can be improved, shaped, developed, but never quite supplanted.
These sorts of moral judgments—admiration for someone who is loyal to a cause, contempt for someone who betrays a spouse—are instant and emotional. They contain subtle evaluations. If we see someone overcome by grief at the loss of a child, we register compassion and pity. If we see someone overcome by grief at the loss of a Maserati, we register disdain. Instant sympathy and complex judgment are all intertwined.
As we’ve seen so often in this story, the act of perception is a thick process. It is not just taking in a scene but, almost simultaneously, weighing its meaning, evaluating it, and generating an emotion about it. In fact, many scientists now believe that moral perceptions are akin to aesthetic or sensual perceptions, emanating from many of the same regions of the brain.
Think of what happens when you put a new food into your mouth. You don’t have to decide if it’s disgusting. You just know. Or when you observe a mountain scene. You don’t have to decide if a landscape is beautiful. You just know. Moral judgments are in some ways like that. They are rapid intuitive evaluations. Researchers at the Max Planck Institute for Psycholinguistics in the Netherlands have found that evaluative feelings, even on complicated issues like euthanasia, can be detected within 200 to 250 milliseconds after a statement is read. You don’t have to think about disgust, or shame, or embarrassment, or whether you should blush or not. It just happens.
In fact, if we had to rely on deliberative moral reasoning for our most elemental decisions, human societies would be pretty horrible places, since the carrying capacity of that reason is so low. Thomas Jefferson anticipated this point centuries ago:
He who made us would have been a pitiful bungler, if He had made the rules of our moral conduct a matter of science. For one man of science, there are thousands who are not. What would have become of them? Man was destined for society. His morality, therefore, was to be formed to this object. He was endowed with a sense of right and wrong merely relative to this. This sense is as much a part of nature, as the sense of hearing, seeing, feeling; it is the true foundation of morality.
Thus, it is not merely reason that separates us from the other animals, but the advanced nature of our emotions, especially our social and moral emotions.
Moral Concerns
Some researchers believe we have a generalized empathetic sense, which in some flexible way inclines us to cooperate with others. But there is a great deal of evidence to suggest that people are actually born with more structured moral foundations, a collection of moral senses that are activated by different situations.
Jonathan Haidt, Jesse Graham, and Craig Joseph have compared these foundations to the taste buds. Just as the human tongue has different sorts of receptors to perceive sweetness, saltiness, and so on, the moral modules have distinct receptors to perceive certain classic situations. Just as different cultures have created different cuisines based on a few shared flavor senses, so, too, have different cultures created diverse understandings of virtue and vice, based on a few shared concerns.
Scholars disagree on the exact structure of these modules. Haidt, Graham, and Brian Nosek have defined five moral concerns. There is the fairness/reciprocity concern, involving issues of equal and unequal treatment. There is the harm/care concern, which includes things like empathy and concern for the suffering of others. There is an authority/respect concern. Human societies have their own hierarchies, and react with moral outrage when that which they view with reverence (including themselves) is not treated with proper respect.
There is a purity/disgust concern. The disgust module may have first developed to repel us from noxious or unsafe food, but it evolved to have a moral component—to drive us away from contamination of all sorts. Students at the University of Pennsylvania were asked how it would feel to wear Hitler’s sweater. They said it would feel disgusting, as if Hitler’s moral qualities were a virus that could spread to them.
Finally, and most problematically, there is the in-group/loyalty concern. Humans segregate themselves into groups. They feel visceral loyalty to members of their group, no matter how arbitrary the basis for membership, and feel visceral disgust toward those who violate loyalty codes. People can distinguish between members of their own group and members of another group in as little as 170 milliseconds. These categorical differences trigger different activation patterns in the brain. The anterior cingulate cortices of Caucasian and Chinese subjects activate when they see members of their own group endure pain, but much less when they see members of another group enduring it.
The Moral Motivation
In the intuitionist view, the unconscious soulsphere is a coliseum of impulses vying for supremacy. There are deep selfish intuitions. There are deep social and moral intuitions. Social impulses compete with asocial impulses. Very often social impulses conflict with one another. Compassion and pity may emerge at the cost of fortitude, toughness, and strength. The virtue of courage and heroism may clash with the virtue of humility and acceptance. The cooperative virtues may clash with the competitive virtues. Our virtues do not fit neatly together into a complementary or logical system. We have many ways of seeing and thinking about a situation, and they are not ultimately compatible.
This means that the dilemma of being alive yields to no one true answer. In the heyday of the Enlightenment, philosophers tried to ground morality in logical rules, which could fit together like pieces of a logical puzzle. But that’s not possible in the incompatible complexity of human existence. The brain is adapted to a fallen world, not a harmonious and perfectible one. Individuals contain a plurality of moral selves, which are aroused by different contexts. We contain multitudes.
But we do have a strong impulse to be as moral as possible, or to justify ourselves when our morality is in question. Having a universal moral sense does not mean that people always or even often act in good and virtuous ways. It’s more about what we admire than what we do, more about the judgments we make than our ability to live up to them. But we are possessed by a deep motivation to be and be seen as a moral person.
Moral Development
The rationalist view advises us to philosophize in order to become more moral. The intuitionist view advises us to interact. It is hard or impossible to become more moral alone, but over the centuries, our ancestors devised habits and practices that help us reinforce our best intuitions, and inculcate moral habits.
For example, in healthy societies everyday life is structured by tiny rules of etiquette: Women generally leave the elevator first. The fork goes on the left. These politeness rules may seem trivial, but they nudge us to practice little acts of self-control. They rewire and strengthen networks in the brain.
Then there is conversation. Even during small talk, we talk warmly about those who live up to our moral intuitions and coldly about those who do not. We gossip about one another and lay down a million little markers about what behavior is to be sought and what behavior is to be avoided. We tell stories about those who violate the rules of our group, both to reinforce our connections with one another and to remind ourselves of the standards that bind us together.
Finally, there are the habits of mind transmitted by institutions. As we go through life, we travel through institutions—first family and school, then the institutions of a profession or a craft. Each of these comes with certain rules and obligations that tell us how to do what we’re supposed to do. They are external scaffolds that penetrate deep inside us. Journalism imposes habits that help reporters keep a mental distance from those they cover. Scientists have obligations to the community of researchers. In the process of absorbing the rules of the institutions we inhabit, we become who we are.
The institutions are idea spaces that existed before we were born, and will last after we are gone. Human nature may remain the same, eon after eon, but institutions improve and progress, because they are the repositories of hard-won wisdom. The race progresses because institutions progress.
The member of an institution has a deep reverence for those who came before her and built up the rules that she has temporarily taken delivery of. “In taking delivery,” the political theorist Hugh Heclo writes, “institutionalists see themselves as debtors who owe something, not creditors to whom something is owed.”
A teacher’s relationship to the craft of teaching, an athlete’s relationship to her sport, a farmer’s relationship to her land is not a choice that can be easily reversed when psychic losses exceed psychic profits. There will be many long periods when you put more into your institutions than you get out of them. Institutions are so valuable because they inescapably merge with who we are.
In 2005 Ryne Sandberg was inducted into the Baseball Hall of Fame. His speech is an example of how people talk when they are defined by their devotion to an institution: “I was in awe every time I walked onto the field. That’s respect. I was taught you never, ever disrespect your opponent or your teammates or your organization or your manager and never, ever your uniform. Make a great play, act like you’ve done it before; get a big hit, look for the third base coach and get ready to run the bases.”