The Rational Animal: How Evolution Made Us Smarter Than We Think

by Douglas T. Kenrick


  With the emergency declared, things began to look more hopeful, as the world swiftly came to Zambia’s aid. Within weeks the United States had sent thirty-five thousand tons of food to the distressed nation, enough to sustain the population until the next harvest. Much of the food consisted of donations from American farmers, some of whom had surpluses from a bountiful harvest.

  But few could have anticipated what happened next. To the shock of the world, President Mwanawasa rejected the aid!

  Some observers speculated that this startling episode might be a ploy by an evil dictator hoping to bring his people to their knees. As a wealthy leader, after all, he wasn’t the one who was going to starve. But in reality President Mwanawasa had support for the food boycott from the government and from much of the population. Other observers conjectured that perhaps the food aid was so low in quality as to be barely edible. But the supplies sent to Zambia were identical to the food many Americans consumed on a daily basis. Instead, at the heart of the matter were two words that appeared in small print on the food crates: “genetically modified.” Many outraged Zambians refused to even touch the crates, leading President Mwanawasa to categorically assert, “We would rather starve than get genetically modified foods.” To the Zambians the thirty-five thousand tons of crops from the United States were not food but poison.

  Although observers decried Zambia’s decision as an error of astounding proportions, we suspect that many people in President Mwanawasa’s shoes would have reacted just as he did. The president’s decision may have been an error, but it stemmed from a deeply rational bias designed to avoid a much costlier mistake. Here we take a closer look at the ancestral biases guiding people’s decisions. While biases are often viewed as deficiencies and equated with poor decision making, an evolutionary perspective stresses that many biases stem from adaptive tendencies that helped our ancestors solve evolutionary problems. We humans are born to be biased—and for good reason. Although these innate biases can sometimes lead to errors, the very nature of these errors often reveals a deeply intelligent brain.

  DEFECTIVE BRAINS

  We have already looked at a few of the human biases discovered by behavioral economists, the scientists wearing Rolling Stones T-shirts under their white lab coats, who enjoy having a good chuckle at human behavior. And why not? It’s sometimes hard not to chuckle at our long list of gross errors in judgment. Our perceptions of reality are grossly warped by phenomena such as the false consensus bias, our tendency to overestimate the degree to which other folks agree with us, which results in half of us being shocked after every election. And reality is further warped by the overconfidence bias, the tendency for most of us to believe we are better than average on most traits, even though this is statistically untenable (half of the population, by definition, falls below the median on any given trait, but of course we don’t mean you or me). Overconfidence sometimes reaches absurd levels, as in the case of people hospitalized after auto accidents, who persist in believing they are better-than-average drivers.

  Scrolling down the long list of documented errors and biases, it’s easy to see humans as being like Keanu Reeves’s character Neo in The Matrix. Our perception of the world is so skewed by our brains that we seem to be out of touch with the true nature of reality.

  But from the evolutionary psychologist’s perspective, it would be surprising if the human mind were really so woefully muddled. If our ancestors were so out of touch with reality, how in the world would they have managed to survive, let alone reproduce and then raise offspring who themselves survived, thrived, and reproduced? Brains are expensive mechanisms, requiring more energy than most other bodily organs (our brains make up only 2 percent of our bodily mass yet gobble up as much as 20 to 30 percent of the calories we consume). These calories are not wasted, though. Human brains, like those of other animals, are designed with incredible efficiency, allowing us to thrive in a remarkable range of environments.

  We are not saying that the brain is free of biases or that people don’t sometimes make moronic choices. But we are saying that it’s time to reconsider what it means for judgments and decisions to be considered smart.

  DOES ADAPTIVE = ACCURATE?

  A critical distinction between a traditional and an evolutionary perspective involves the question of whether it’s always smart to be accurate. According to most social scientists, people should strive to uncover and know the pure and undistorted truth, which would enable us to make more accurate judgments. But will natural selection necessarily build organisms designed to seek truth? Maybe not. Indeed, in some instances evolution might even disfavor truth and accuracy. What matters instead is that people’s judgments enhance fitness. And if it so happens that being biased and inaccurate helps achieve that end in some cases, then in those cases we should expect to see a mind that consistently produces biased and inaccurate judgments. “The principal function of nervous systems,” according to cognitive scientist Patricia Churchland, is “to get the body parts where they should be in order that the organism may survive. . . . Truth, whatever that is, definitely takes the hindmost.”

  We are not saying that someone who is hopelessly delusional and incapable of ever seeing the truth is going to be evolutionarily successful. For most problems, it usually pays a higher dividend to be accurate rather than inaccurate. But the mind is not designed to strive for accuracy and truth 100 percent of the time. As we’ll see, there’s a good reason why it sometimes makes evolutionary sense to warp reality.

  Consider the following example: if an object is moving toward you at 20 feet per second, and is currently 120 feet away, how long will it take for the object to hit you? The accurate answer is 6 seconds (6.001 seconds if you’re a physicist concerned about air friction). A guess of four seconds would certainly be inaccurate; indeed, it would be a clear error in judgment. Yet the mind is wired to intentionally make this exact error. When our eyes see an approaching object, our brains tell us that this object will hit us sooner than it actually will. In fact, merely hearing the sound of an approaching object (the swoosh of a bird diving through the air, the rustling of someone in the bushes) will result in the same error. The bias to sense that approaching sounds are going to arrive sooner than they really will is known as auditory looming. One study found that this “error” was made by 100 percent of people.
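
  To make the arithmetic concrete, here is a minimal sketch in Python of the looming bias, treating the perceptual “safety margin” as a simple scaling factor; the 20 percent figure below is purely illustrative, not a value reported in the research.

```python
# Time-to-contact arithmetic for the approaching-object example.
distance_ft = 120.0  # current distance to the object
speed_ft_s = 20.0    # approach speed

actual_time = distance_ft / speed_ft_s  # 6.0 seconds: the accurate answer

# Auditory looming: the brain reports an earlier arrival than physics does.
# The 0.8 scaling factor is a hypothetical illustration of that bias,
# not a measured value from the studies described above.
SAFETY_MARGIN = 0.8
perceived_time = actual_time * SAFETY_MARGIN  # 4.8 seconds: jump now

print(f"Actual time to contact:    {actual_time:.1f} s")
print(f"Perceived time to contact: {perceived_time:.1f} s")
```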

  Like many errors and biases that seem irrational on the surface, auditory looming turns out, on closer examination, to be pretty smart. Other animals, like rhesus monkeys, have evolved the same bias. This intentional error functions as an advance warning system, manned by the self-protection subself, providing individuals with a margin of safety when confronted with potentially dangerous approaching objects. If you spot a rhinoceros or hear an avalanche speeding toward you, auditory looming will motivate you to jump out of the way now rather than wait until the last second. The evolutionary benefits of immediately getting out of the way of approaching dangers were so strong that natural selection endowed us—and other mammals—with brains that intentionally see and hear the world inaccurately. Although this kind of bias might inhibit economically rational judgment in laboratory tasks, it leads us to behave in a deeply rational manner in the real world. Being accurate is not always smart.

  NOT ALL ERRORS ARE CREATED EQUAL

  Although many of our decision errors and biases might look like random design flaws at the surface level, a deeper look often reveals that they aren’t random flaws at all. The mind evolved not simply to be biased but to make specific kinds of errors and not others. Consider one of our friends, who recently purchased a house with his family. During the first night in their new home, the family endured a frightening ordeal. As everyone was sleeping, a ceiling-mounted smoke detector sounded a piercing alarm. Our friend woke up to an acrid smell and quickly remembered in horror that he had left several boxes of packing materials next to the stove. His wife hustled frantically to get the kids out of the house, while he managed to call 911.

  Thankfully, it was a false alarm. There was no fire after all—the neighbors were just burning wood in their grungy, resin-coated fireplace. A few weeks later the smoke alarm shrieked again, waking everyone in the wee hours of the morning. Once again, there was no fire. A few weeks later, there was another false alarm, and then another. Annoyed that the smoke detector was making so many errors, our friend decided to adjust the device to reduce its sensitivity. As he was about to make the smoke detector less responsive, though, his young daughter became visibly distraught. “But daddy,” she cried, “what if there is a fire and the alarm doesn’t sound?”

  This is the smoke detector dilemma. Do you want to set your smoke detector to be less sensitive or more sensitive? This depends on how much you value not being annoyed versus not being trapped in a burning home. Most people would rather have an oversensitive smoke detector, because by tolerating the occasional irritating error they ensure that their families remain alive. We intentionally bias the judgments of our smoke detectors because this helps ensure our survival.

  The smoke detector dilemma is the same conundrum natural selection had to resolve in designing many of our own built-in decision-making systems. For many decisions, our brains are wired like ancestral smoke detectors. A smoke detector is designed to make judgment calls despite having incomplete information. When it senses trace amounts of airborne particles that might resemble smoke, the device needs either to start screaming, “Fire! Fire! Fire!” or to remain silent. Our brains have similarly evolved to make judgment calls without having all the pertinent information.

  Imagine you’re about to run an errand and need to decide whether to take an umbrella. You must choose whether to lug this clunky bumbershoot around town all day or take a chance and leave it at home. Your decision will depend on whether you think it’s going to rain. The forecast says there is a 50 percent chance of rain, and you see some clouds in the sky. But you also know there can be clouds without rain, and weather forecasts are notoriously inaccurate. Like a smoke detector, you need to make a decision based on imperfect information.

  The decision-making process will inevitably produce some errors. We simply can’t be right all the time in a world with imperfect and incomplete information. But not all errors are created equal. There are two fundamentally different types of errors, one of which is much costlier than the other. In the umbrella case, one type of error is that you bring the umbrella, and it doesn’t rain. Known as a false alarm, this error is like a smoke detector sounding without a fire. It’s annoying but not a huge deal. Alternatively, a second type of error is that it rains, and you fail to bring your umbrella; this is known as a miss. When it comes to detecting threats, misses are often much costlier than false alarms. In the umbrella case, a miss means you may be drenched to the bone and ruin your new dress jacket. In the smoke detector case, a miss has even more dire consequences: you could die.

  Natural selection creates systems, like the brain, that are biased to minimize the costlier error. This built-in bias to avoid evolutionarily expensive errors is known as the smoke detector principle. Evolutionary psychologists Martie Haselton and Randy Nesse believe that natural selection engineered human judgment and decision making to be biased according to the same principle. Like a good smoke detector, our brain is rigged to sound the alarm even when there is no fire, forcing us to tolerate the inconvenience of false alarms to avoid potentially lethal misses. Because our evolutionary tendencies steer us toward avoiding costly errors, our decisions will result in more small errors. But we are disposed to produce little errors so that we avoid big mistakes.
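
  The logic of the smoke detector principle can be captured in a few lines of code. The sketch below compares the expected cost of the two umbrella policies; all of the cost figures and the 50 percent rain probability are hypothetical numbers chosen for illustration, not values from the text.

```python
# Expected-cost comparison for the umbrella decision, following the
# smoke detector principle. All costs and the rain probability are
# hypothetical; only the logic comes from the text.
P_RAIN = 0.5
COST_FALSE_ALARM = 1.0  # lugging an umbrella on a dry day (annoying)
COST_MISS = 20.0        # getting soaked and ruining a jacket (costly)

# If you bring the umbrella, the only possible error is a false alarm;
# if you leave it home, the only possible error is a miss.
cost_bring = (1 - P_RAIN) * COST_FALSE_ALARM
cost_leave = P_RAIN * COST_MISS

print(f"Bring umbrella: expected cost {cost_bring:.1f}")
print(f"Leave umbrella: expected cost {cost_leave:.1f}")
# Because misses are far costlier than false alarms, the biased policy
# (always bring it) minimizes the expected cost.
```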

  MONEY UP IN SMOKE

  The smoke detector principle underlies a plethora of decision errors, such as when you expend the extra effort to put on your seat belt, then don’t get in an accident. The decision to wear a seat belt produces an error 999 times out of 1,000. In the overwhelming majority of situations, we drive around strapped into an irritating harness that provides no benefit (it’s akin to lugging around an unwieldy umbrella on a sunny day). Yet after a few decades of griping, most sensible people now happily choose to commit the seat belt error, and the result has been a dramatic drop in auto fatalities compared to our belt-resisting grandparents’ day. The smoke detector principle leads people to make this small error to avoid the much costlier mistake of getting into an accident without wearing a seat belt.

  The smoke detector perspective on decision errors is radically different from how such errors are normally viewed. Many economists, for example, argue that people are especially prone to making errors when it comes to money. Books such as Why Smart People Make Big Money Mistakes and Mean Markets and Lizard Brains allege that people are foolish financial investors. They claim that one of the most common money mistakes is that people take too little risk with their financial investments—humans are notoriously risk averse. Because US stocks have historically outperformed all other types of investments over the long run, many rational types are befuddled that more people are not investing in stocks. If you have saved up $5,000 and have many years left before retirement, the smart and rational Econ would put this money in a mix of investments that include a fair share of risky stocks. Instead, many people choose to tuck the $5,000 away in a bank—where it is likely to earn a piddling interest rate that barely covers the loss due to inflation. From a traditional economic perspective, this seems irrational.

  But think about the same financial decision in light of the smoke detector principle. The decision about what to do with your $5,000 life-savings account could produce two types of errors. One possibility is that you put the money in the bank and the stock market booms. Instead of having a nice return on your investment, you have only your $5,000 plus a measly 1.8 percent interest. This is the supposedly egregious error that makes risk aversion appear so irrational.

  We agree that this is an error, but making this error won’t cost you your shirt and tie (and your life savings). Consider the other possibility: you invest all your money in stocks, and the market plummets. Now your life savings are gone. Indeed, one of us moved a good portion of his retirement funds from low-interest bond accounts into stocks in 2001. Within a few months of that decision, the stock market took a historically unprecedented dive in value and then another nosedive a short time after that. As we mentioned in the book’s opening pages, he is now in a position to retire sometime around age seventy-nine or eighty and, even then, may have to live out his golden years in a modest hut somewhere in Ecuador.
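
  The same asymmetry can be sketched with toy numbers. In the comparison below, only the 1.8 percent bank rate comes from the text; the boom and crash returns are hypothetical assumptions chosen to make the two errors easy to compare.

```python
# A toy comparison of the two possible errors with a $5,000 nest egg.
# The boom and crash returns are illustrative assumptions, not
# historical data; only the 1.8 percent bank rate appears in the text.
SAVINGS = 5_000.0
BANK_RATE = 0.018     # the "measly" interest mentioned above
BOOM_RETURN = 0.30    # hypothetical bull-market gain
CRASH_RETURN = -0.50  # hypothetical crash

# Error 1 (false alarm): money sits in the bank while the market booms.
foregone_gain = SAVINGS * BOOM_RETURN - SAVINGS * BANK_RATE

# Error 2 (miss): money rides the market into a crash.
crash_loss = abs(SAVINGS * CRASH_RETURN)

print(f"Cost of the bank error:  ${foregone_gain:,.0f} in foregone gains")
print(f"Cost of the stock error: ${crash_loss:,.0f} of savings lost")
# A foregone gain stings, but a gutted nest egg can be ruinous, so a
# loss-wary bias avoids the costlier of the two mistakes.
```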

  The smoke detector principle has preset our brains to be wary of situations in which we could lose our resources, because this is the much costlier error. The human tendency toward risk aversion may lead to errors, but it is a calculated bias—engineered by natural selection to avoid a much bigger mistake.

  DIFFERENT BIASES FOR DIFFERENT SUBSELVES

  When are we inclined to be overly conservative in our judgments, and when are we inclined to be more carefree? It depends on which of our subselves is currently in charge of making judgments and decisions. Different subselves have different evolutionary goals, and the criteria for what makes a smart decision differ quite drastically depending on whether you’re currently worried about disease as opposed to seeking a mate or caring for your children. Let’s go inside the mind of our different subselves to see how this works.

  THE BEHAVIORAL IMMUNE SYSTEM: IN THE MIND OF YOUR DISEASE-AVOIDANCE SUBSELF

  Think about the last time you sneezed. Was your body in real danger? Probably not. When we sneeze, our body has frequently made an error, because there is no real threat. Instead, some dander particle or a whiff of chili powder sets off a false alarm. But we are physiologically wired to sneeze at the slightest of respiratory irritations because our body’s defenses evolved to be safe rather than sorry.

  Our immune system is rigged the same way. An army of white blood cells is ready to go to war if it senses any germ-like substance swimming in our bloodstream. But sending the entire body to war can be expensive, using up precious bodily resources. So natural selection endowed us with a crude first line of defense against pathogens—what evolutionary psychologist Mark Schaller calls the behavioral immune system. This psychological system is a set of disease-avoidant thoughts and behaviors operated by our disease-avoidance subself.

  The behavioral immune system is a pathogen detector. Just as smoke detectors are sensitive to anything that could resemble smoke, the behavioral immune system is hypersensitive to anything associated with disease. The system is triggered by the sight, sound, or smell of any person, place, or thing that could signal dangerous pathogens in the vicinity. Its alarm goes off anytime we’re exposed to unsightly sores on someone’s arm, the scent of decaying meat, or a man coughing on the bus. When our senses detect something or someone that smells, looks, or sounds strange or different, our disease-avoidance subself makes sure that our behavior changes accordingly.

  One study by Josh Tybur and Angela Bryan found that when people were exposed to an unpleasant odor, they were more willing to use condoms. In this case the pathogen cue was superficial and harmless (a squirt from a gag aerosol with a sulfur dioxide scent reminiscent of flatulence and descriptively named “Liquid Ass”). Nevertheless, the harmless stench triggered people’s behavioral inclination to protect themselves from sexually transmitted disease.

  The realm of psychology (the behavioral immune system) and the realm of biology (the physical immune system) are often thought of as independent spheres. But evolution has harnessed both psychology and biology in a brilliantly coordinated, deeply rational system. As it turns out, activating the behavioral immune system in turn triggers a red alert in the body’s physical immune system, the standing army of T cells and other germ-killing lymphocytes. Mark Schaller conducted a clever experiment to test just how closely the two systems work together. He and his team asked people to watch a slide show containing pictures of people who looked sick. The photos depicted folks with rashes and pox, as well as images of people coughing and sneezing as mucus spewed out of their noses. Study participants gave blood samples both before and after watching the images. The researchers then exposed these samples to a bacterial infection to measure the presence of interleukin-6 (IL-6), a pro-inflammatory cytokine that white blood cells make when they detect microbial intruders. A higher level of IL-6 indicates that the body has begun to mount a more aggressive immune response to infection—it’s the equivalent of the immune system’s troops preparing to go to battle.
