Wired for Culture: Origins of the Human Social Mind
On the other hand, the altruistic side of our sense of fairness can produce surprising acts of human kindness that, if we observed them in any other animal, we would think we were watching an animated Disney film. In the late 1960s, the social scientist Harvey Hornstein left wallets in public places throughout New York City. The wallets contained money and identification so that if someone found a wallet, he or she could contact the owner. To everyone’s surprise, around half of the wallets—along with the money—were returned. True, wallets with more money in them were less likely to be returned, but the majority of people went out of their way to return a wallet to someone unknown to them, at a personal cost of time and effort and with no promise of any recompense.
Doing the right thing is something we take for granted, even in this anonymous situation, but in no other animal on Earth would the thought that returning the wallet was the “right” thing to do even come to mind. Why do we behave this way? The economist Kaushik Basu points out that this expectation for fairness runs deep in our minds. Consider, he says, that when you ride in a taxi and you get to your destination, if you are like most people you reflexively pay the taxi driver rather than running off. It is an action you probably give very little thought to—just what we do in such situations. Even so, Basu reminds us that this is a revealing action because most of the time no one else observes us, and the thought of running off must have occurred to everyone who has ever ridden in a taxi, even though almost none of us do it. Equally, when we pay taxi drivers, they do not turn around and demand further payment. Both of you might be behaving this way out of fear of getting the police involved or of violent reprisal by the other, but still we somehow feel such behavior would be wrong or unfair.
Are we programmed somehow to do the right thing—as when we return lost wallets or pay taxi drivers—even at a cost to ourselves, just because it is the “right” thing to do, and to expect the same from others? Social scientists and economists who get people to participate in an economic exchange called “the ultimatum game” think so. In this game, volunteers are given a sum of money, say, $100. They are told they have to give some of it to an anonymous other person, but the amount they offer is up to them. The other person can either accept the offer, in which case both people keep their portions of the money, or reject the offer, in which case neither person gets anything. Volunteers are told they will not ever see each other and that the experiment involves just this one exchange.
Now, a rational recipient should accept any offer, as something free is better than nothing. Knowing this, the person with the money should offer the smallest possible amount. But neither party behaves this way. The game has been played with university students, and in cultures around the world, including hunter-gatherer societies. Time after time, recipients reject as “unfair” offers below 20–40 percent of the sum given to the first person, and both parties walk away empty-handed. Those making offers seem to expect this, and the typical offer is often around 40 percent of the total. In a related exchange known as “the Dictator game,” donors are told they can offer whatever amount they wish and that the recipient does not have any choice in the matter. Offers are lower, but people still give away some of their money.
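For readers who like to see the logic laid bare, the payoff rules of the game are simple enough to sketch in a few lines of code. This is an illustrative model only, not anything from the experiments themselves; the 30 percent rejection threshold is an assumption chosen from within the 20–40 percent band of rejected offers mentioned above.

```python
# Payoff rules of the ultimatum game. The responder's fairness
# threshold (30% here) is an illustrative value drawn from the
# 20-40% band of offers that real recipients tend to reject.

def ultimatum(pot, offer, threshold=0.30):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if offer < threshold * pot:
        return (0, 0)            # offer rejected: both walk away with nothing
    return (pot - offer, offer)  # offer accepted: each keeps a share

# The "rational" minimal offer is rejected by a typical human responder,
print(ultimatum(100, 1))   # (0, 0)
# while the typical ~40 percent offer is accepted.
print(ultimatum(100, 40))  # (60, 40)
```

The puzzle the chapter explores is why flesh-and-blood recipients behave like this threshold rule rather than like the "accept anything" strategy that standard economic reasoning predicts.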
What is going on? Would you reject a low offer? If so, why? Would you give more than the minimum? If so, why? Remember, you are not going to see this person ever again. Some researchers interpret our behavior in these games as evidence that humans are hard-wired for altruism—that we both offer it and expect it from others. They say our actions are governed by a principle of strong reciprocity, a deep moral sense to behave in ways that benefit others, even when this means suffering a personal cost. According to these researchers, our strong reciprocity is evidence that human social behavior evolved by the process of group selection. This is the idea, which we saw earlier, that natural selection can choose among competing groups of people. The most successful groups in our past were those in which individuals put aside their own interests to pull together, even when that meant sacrificing their own well-being. This altruistic behavior is supposedly what we are seeing in the ultimatum game: donors give more than they need to, and recipients expect this. When donors give a small amount, recipients punish them, even though this means that the recipient gives up some money. Strong reciprocity is, according to some social scientists and economists, why we pay taxi drivers and why we return wallets, but also why we might even go to war for our country.
Group selection can work, but its effects are weak and it will always be opposed by selection promoting individual interest over the good of the group: when everyone else is pulling together, it might pay you to hold back. Could it be, then, that people reject low offers in the ultimatum games, not out of any sense of duty to the group but simply because a low offer is not a fair way for two people to divide up money that neither one really has much claim to? My hunch is that if you are like most people reading this account of the ultimatum game, you are feeling that it would indeed be unfair for someone to make you a low offer. This is especially true in the ultimatum game experiments, because the donor and recipient know their roles are arbitrary and could just as easily have been reversed. They also know—because they have been told—that the money has to be divided. In such circumstances, it is indeed only fair to divide the money equally, or at least nearly so, if we wish to acknowledge that someone is entitled to more just by the luck of the draw.
Fairness sounds good, but still, why do we think things must be fair? And why do we turn down the offer of free money? Why not just take whatever is on offer, and walk out of the experiment better off for it? What use is it to turn down what you consider a miserly offer in order to “punish” someone you are never going to see again (punishing yourself in the bargain)? Well, when something is unfair, most of us feel an emotion of indignation or anger, and it makes us want to lash out and punish the other person. We do so by rejecting their offer. But why do we do this? If we think about it, this is a spiteful act on our part. Yes, we punish that other person, maybe we feel better for it, and perhaps the punishment makes it less likely the person will behave that way in the future. But our spite also benefits anyone else who might do business with him or her. Given that it has cost us to behave this way, this is an act of altruism on our part. We don’t expect this kind of altruism to evolve, because your actions help others at a cost to you.
Another possibility avoids all these problems and doesn’t require any notions of strong reciprocity or putting our self-interest aside. It is that rejecting the offer signals your disapproval and sends out a message that you are not someone to be trifled with. This is just what we expect of an emotion—an expectation for fairness—that has evolved to watch out for our interests. But wait, to whom are you sending this message in the ultimatum game? You have been told the experiment is anonymous, that you will never meet the donor. Maybe, but is that how people really feel in these experiments? You can be told the exchange is anonymous and that you will never encounter the person again, but that doesn’t mean you can simply switch off the normal emotions that natural selection has created in us for ensuring we are not taken advantage of in reciprocal exchanges. The experimenters who conduct these studies are, in effect, asking their volunteers to leave behind at the door all of their evolved psychology for long-term relations. Robert Trivers in his “Reciprocal Altruism Thirty Years Later” put it more witheringly, saying, “you can be aware you are in a movie theatre watching a perfectly harmless horror film and still be scared to death…. I know of no species, humans included, that leaves any part of its biology at the laboratory door; not prior experiences, nor natural proclivities, nor ongoing physiology, nor arms and legs or whatever.”
In the real world, few interactions are of the sort concocted in the ultimatum games. Our psychology is the psychology of repeated interactions, and in that context, turning down a low offer sends a message to the person you are dealing with—and to any others who might witness or hear of the event—not to try to take advantage of you in the future. Turning the offer down might cost you something now, but it pays its way as an investment in future interactions (this is why punishment is effective in tit-for-tat: it reins in cheats). What might look spiteful is actually a way of improving your longer-term prospects. Emotions that guide this sort of behavior are important for a species like ours that lives in small social groups in which people live a long time, and can therefore be expected to see each other repeatedly. The experimental situation of the ultimatum game, perhaps unwittingly, elicits the sense of having an audience precisely because volunteers are told the exchange is anonymous. It seems not to occur to the experimenters who run the ultimatum and other related games that the mere fact of telling someone their actions are anonymous is a refutation of that statement! Someone is watching.
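The arithmetic behind this argument can be made concrete with a toy repeated game. The adaptive proposer below is a hypothetical opponent invented purely for illustration: it opens with stingy offers and switches to fair ones only after being punished once by a rejection. The specific numbers are assumptions, not data from the experiments.

```python
# Toy repeated ultimatum game: a responder who rejects stingy offers
# forgoes one round's money but earns more over repeated play, because
# the (hypothetical) adaptive proposer raises its offer once punished.

def lifetime_payoff(rejects_low, rounds=10):
    """Total money earned by the responder over `rounds` interactions."""
    total = 0
    offer = 10          # the proposer starts stingy
    for _ in range(rounds):
        if rejects_low and offer < 30:
            offer = 40  # punished once, the proposer turns fair
        else:
            total += offer
    return total

print(lifetime_payoff(False))  # the doormat accepts everything: 100
print(lifetime_payoff(True))   # the punisher loses round one, then: 360
```

Under these assumptions the "spiteful" rejection more than pays for itself, which is the sense in which the emotion can be an investment rather than an act of altruism.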
To return to Basu’s example, why do we pay taxi drivers? One obvious reason that separates these people from ultimatum gamers is that the drivers have earned their payment. Riders know this, and know it will make drivers more tenacious about getting paid. Our disposition to act fairly is also most acutely switched on in face-to-face exchanges, because we have come to expect reciprocity in our dealings. But that disposition is not one of behaving altruistically because this makes groups strong or because it is the right thing to do. Rather, it is an emotion that ensures we are not taken advantage of, and in any exchange we know that the other person is having the same thoughts as us. Indeed, no one should assume that taxi drivers do their jobs as a favor to you, any more than you have their interests in mind. Not too many years ago the taxi rank at the airport of one of Southern California’s major cities was a marketplace, not a highly regulated economy with fixed fares. Someone wanting a ride could walk among the drivers bargaining for the best offer. Under these circumstances, the law of supply and demand is at its most efficient. When there were lots of drivers but few passengers, fares came down. But if you showed up when only one taxi was in the rank, you could be charged more or less whatever amount the driver could get away with.
When the artificial trappings of the ultimatum games are removed, our supposed strong reciprocity fades. The economist John List got people at a baseball trading card convention to approach card dealers and ask them for the best card they could buy for $20. In another situation, they had $65 to spend. Dealers consistently took advantage of the buyers, selling cards worth well below those amounts. Revealingly, though, it was especially the dealers from out of town—and thus unlikely to encounter the buyer again—who cheated buyers the most. List also produced a variant of the Dictator game in which, in addition to letting donors offer whatever they wanted, he gave them the option of taking some money from the other player. List’s hunch was that this removed some of the “demand” to give money (although List must acknowledge that it might also have created an expectation to take it). His hunch proved correct. Donations by Dictators fell to less than half of what they were in the conventional setting, and 20 percent of the Dictators took money.
TRUST AND THE DIFFUSION OF COOPERATION
OUR ATTACHMENT to fairness and justice has its origins in our self-interest, and we have seen that we can respond violently when we think justice is imperiled by selfish behavior. Trivers reminds us that victims feel the sense of injustice far more strongly than do bystanders and they feel it far more than do the perpetrators. People normally raise the issue of “fair play” when they are losing. “Envy,” says Trivers, “is a trivial emotion compared to our sense of injustice. To give one possible example, you do not tie explosives to yourself to kill others because you are envious of what they have, but you may do so if these others and their behavior represent an injustice being visited upon you and yours.”
Still, the remarkable feature of human cooperation is that, as in the case of a suicide bomber, it is not restricted to reciprocal relations between pairs of people. Many, perhaps most, of our day-to-day interactions are reciprocal—such as when we buy a loaf of bread—but our cooperation routinely balloons in complexity well beyond exchanges between pairs of people. If most altruism and cooperation in animals can be arrayed into two levels—that driven by helping relatives and that which prospers from direct reciprocity or exchange between two parties—there is a third level to human cooperation that is diffuse, symbolic, and artfully indirect. On a day-to-day basis, we act in ways that cannot possibly be directly reciprocated, such as when we routinely and unself-consciously hold doors for people, form lines obediently and admonish those who don’t, help the weak, elderly, or the disabled, return items of value, aid people in distress, pay taxes, and give to charities.
If our helping, sharing, and altruism stopped at these acts, we might be happy to let it go there as a charming peculiarity of our nature, something born of our ability to transcend our biological existence, to be empathic and to understand others’ needs. But while the cost of returning a wallet or holding a door may be small, helping someone in distress might not be. As we have seen, our style of help moves beyond the eusociality of the social insects to the ultra-sociality seen only in our species: the most vivid and outré form of our altruism comes to us in battlefield accounts of soldiers who fall on a grenade, charge a machine-gun nest, help others to safety under fire, or fly an airplane Kamikaze-style into an enemy ship. Few of these will have left offspring to regale with stories of their heroism, or if they have, those offspring will suddenly find themselves without one of their parents.
The next chapter asks how this kind of strange selfless behavior can arise in a Darwinian world in which it is the survivors who float to the top. We have seen here that there are some who think we are guided by a psychology of doing what is best for the group, even at cost to ourselves. But an unusual idea from genetics discussed in the next chapter shows how costly acts ranging from so-called honor killings to a disposition to suicide can, remarkably, be understood as self-interested behavior.
CHAPTER 6
Green Beards and the
Reputation Marketplace
That human society is a marketplace in which
reputations are bought and sold
FOR MOST PEOPLE the sight of their nation’s flag being raised, the sound of their national anthem being played, watching their nation compete against others in international events, or the loss of one of their soldiers in battle causes a familiar emotion. We often call it “nationalism,” a diffuse and warm pride in one’s country or people, and a tendency to feel an affinity toward them that we do not always or so easily extend to others from different nations or societies. It has been the great and sometimes terrible achievement of human societies to create the conditions that make people share this sense. It can get us unwittingly to practice a kind of cultural nepotism that disposes us in the right circumstances to treat other members of our nation or group as a special and limited kind of relative, willing to be more helpful and trusting than we would normally be toward others. It is the emotion of encountering a stranger on holiday in a foreign land and finding them to be from your country. But it is also the emotion that gets the people of one nation to cheer while those of another suffer from a deadly act of terrorism.
These are all descriptions of our ultra-social nature, a nature that sees us acting altruistically toward others, especially other members of our societies, without expecting that help to be directly returned. That altruism ranges from simple acts such as holding doors and giving up seats on trains, to volunteering your time and contributing to charities, but also to risking your life in war for people you might not know and are not even related to. None of these acts is directly reciprocated, or not necessarily so, and the risks of exploitation by others who do not share your altruism far exceed those of simple reciprocal exchanges between two people. Our altruistic dispositions are so strong that they even extend to helping other species. What other animal would ever put in the time and effort to save tigers, or adopt an abandoned dog, or call out the fire brigade to rescue a cat up a tree?
Where does this sense of cultural altruism come from, and why do we feel it so strongly and so naturally? How do I know whom to extend these feelings to and in what circumstances, and why am I more likely to trust someone from my own group even if I have never met them? How could it ever be in your interest to risk your life in war? We saw the outlines of an answer to these questions in Chapter 2, and here we will see just how easy it can be to get this cultural altruism to evolve.
GREEN BEARDS, VENTURE CAPITALISTS,
AND GOOD SAMARITANS
IN THIS chapter we will adopt an approach that uses thought experiments and hypothetical scenarios to understand our evolved psychology. Evolutionary biologists often use such an approach in an attempt to simplify what can seem like overwhelmingly complicated situations, such as our public behavior. The risk is that the simplifications can seem to remove any realism from the examples. But the reward of this thought-experiment approach is that it often returns insights that we hadn’t expected, or ones that fit so well with how we actually behave that we think the simplifications have captured something fundamental about our underlying dispositions and motivations.
We want to think about how a disposition to behave altruistically toward one another could evolve among an imaginary group of people initially lacking that disposition. This imaginary group of people could represent our “state of nature” before we learned how to cooperate with people outside of our immediate families. To see how a disposition toward cooperation might evolve in such a group, consider that a gene, an idea, or just an emotion arose in one or even a few of them that caused them to help people whom they thought were “helpers” like themselves. It could be as simple as just some good feeling that you get from helping these people. It is not a disposition to help any one individual in particular, or even to expect him or her to help you in return. It is a disposition to help people whom you think are helpers like you. You don’t need to know why you have this disposition, or where it comes from; it could simply be something you feel.
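A back-of-envelope calculation shows why such a disposition can spread. Assume, purely for illustration, that a helper pays a cost to confer a larger benefit on any fellow helper it meets, that helpers recognize one another perfectly, and that interaction partners are drawn at random from the population. None of these assumptions comes from the text; they are the simplest case of the thought experiment.

```python
# Expected per-interaction payoffs in a population where a fraction
# p of individuals are "helpers" who aid (only) other helpers.
# Benefit b and cost c are illustrative; the argument needs only b > c.

def expected_payoffs(p, b=3.0, c=1.0):
    """Return (helper_payoff, non_helper_payoff) per random interaction."""
    # A helper meets another helper with probability p: as a giver it
    # pays c, and (equally often, as a recipient) it gains b.
    helper = p * (b - c)
    # A non-helper is never recognized as a helper, so it is never
    # helped and never pays a cost.
    non_helper = 0.0
    return helper, non_helper

h, nh = expected_payoffs(p=0.05)
print(h > nh)  # True: helpers out-earn non-helpers even when rare
```

As long as the benefit exceeds the cost, helpers do better than non-helpers at any frequency, so the disposition to help those you take to be helpers can gain a foothold and spread from even a few initial bearers.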