•Failing to account for your risk tolerance
•Failing to plan ahead when decisions are linked over time
But, in addition to these process mistakes, there’s an entirely different category of errors that can undermine even the most carefully considered decisions. We call these errors ‘‘psychological traps.’’ They arise because our minds sometimes play serious tricks on us.
For half a century psychologists and decision researchers have been studying the way our minds function when we make decisions. This research, in the laboratory and in the real world, has revealed that we develop unconscious routines to cope with the complexity inherent in most decisions. These routines, known as heuristics, serve us well in most situations. In judging distance, for example, our minds often rely on a heuristic that relates clarity with closeness. The clearer an object appears, the closer it must be. The fuzzier it appears, the further away it must be. This simple mental shortcut helps us to make the continuous stream of distance judgments required to navigate the world.
Yet, like most heuristics, it isn’t foolproof. On days that are hazier than normal, our eyes will tend to trick our mind into thinking that things are more distant than they actually are. Because the resulting distortion poses few dangers for most of us, we can safely ignore it. For airline pilots, though, the distortion could be catastrophic. That’s why pilots always use objective measures of distance in addition to their vision.
Researchers have identified a whole series of such flaws in the way we think. Some, like the clarity heuristic, take the form of sensory misperceptions. Others take the form of biases. Others appear simply as irrational anomalies in our thinking. What makes all these traps so dangerous is their invisibility. Because most are hard-wired into our thinking process, we fail to recognize them—even when we’re falling right into them.
Though we can’t rid our minds of these ingrained flaws, we can learn to understand the traps and compensate for them. In this chapter, we examine some of the most common psychological traps and how they affect decision making. By familiarizing yourself with them and the diverse forms they take, you’ll be better able to ensure that the decisions you make are sound and reliable. The best protection against these traps is awareness.
Overrelying on First Thoughts: The Anchoring Trap
How would you answer these two questions?
•Is the population of Turkey greater than 35 million?
•What’s your best estimate of Turkey’s population?
If you’re like most people, the figure of 35 million cited in the first question—a figure that we chose arbitrarily—influenced your answer to the second question. Over the years, we’ve posed these questions to many groups of people. In half the cases we use 35 million in the first question; in the other half we use 100 million. Without fail, the answers to the second question increase by many millions when the larger figure is used in the first question. This simple test illustrates the common and often pernicious mental phenomenon known as anchoring. In considering a decision, the mind gives disproportionate weight to the first information it receives. Initial impressions, ideas, estimates, or data ‘‘anchor’’ subsequent thoughts.
Anchors take many guises. They can be as simple and seemingly innocuous as a comment offered by your spouse or a statistic appearing in the morning newspaper. They can be embedded in the wording of your decision problem. One of the most common types of anchors is a past event or trend. A forecaster attempting to project the number of patients who will visit a medical clinic next January often begins by looking at the number who visited last January. The historical number becomes the anchor, which the forecaster then adjusts based on other factors. Although this approach may often lead to a reasonably accurate estimate, it tends to give too much weight to the past figure and not enough weight to other factors. Particularly in situations characterized by rapid change, the historical anchor can lead to poor forecasts and, in turn, to misguided choices.
Whatever their source, anchors often prejudice our thinking in ways that prevent us from making good decisions. Because anchors have the effect of establishing the terms on which a decision will be made, they are often used by savvy negotiators as a bargaining tactic. Say you’ve been looking for a work of art to hang over the fireplace in your living room. You visit a local art dealer and see on display a unique and compelling painting by an unknown young artist—a work that has no clear market value (and no price tag!). You estimate its worth at approximately $1,200, but when you begin talking about the painting with the dealer, he immediately suggests a price of $2,800. As an opening gambit, that price may be designed to anchor you, to shift your sense of the piece’s worth upward. If you respond by attempting to bargain down from $2,800, the final cost may be unduly influenced by the dealer’s initial proposal—the anchor.
What can you do about it? The effect of anchors in decision making has been documented in thousands of experiments. Anchors influence the decisions of everyone—doctors, lawyers, managers, homeowners, students. No one can avoid their influence; they’re just too widespread. You can, however, reduce their impact by using the following techniques:
•Always view a decision problem from different perspectives. Try using alternative starting points and approaches rather than seizing on and sticking with the first line of thought that occurs to you. After exploring various paths, reconcile any differences in their implications.
•Think about the decision problem on your own before consulting others, to avoid becoming anchored by their ideas.
•Seek information and opinions from a variety of people to widen your frame of reference and push your mind in fresh directions. Be open-minded.
•Be careful to avoid anchoring other people from whom you solicit information and counsel. Tell them as little as possible about your own ideas, estimates, and tentative decisions. If you say too much, you may simply get back your own preconceptions (which have become your advisors’ anchors).
•Prepare well before negotiating. You’ll be less susceptible to anchoring tactics.
Keeping on Keeping On: The Status Quo Trap
You inherit 100 shares of a blue-chip stock that you would never have bought yourself. You can sell the shares and reinvest the money for a minimal commission and no tax consequences. What will you do?
When faced with this situation, a surprising number of people hang on to the inherited shares. They find the status quo comfortable, and they avoid taking action that would upset it. ‘‘Maybe I’ll rethink it later,’’ they say. But later is always later.
In fact, most decision makers display a strong bias toward alternatives that perpetuate the current situation. On a broad scale, we can see this tendency at work whenever a radically new product is introduced. The first automobiles, revealingly called ‘‘horseless carriages,’’ looked very much like the buggies they replaced. The first ‘‘electronic newspapers’’ appearing on the World Wide Web looked very much like their print precursors.
Many psychological experiments have shown the magnetic attraction of the status quo. In one, a group of people were randomly given one of two gifts—half received a decorated mug, the other half, a large Swiss chocolate bar. They were then told they could effortlessly exchange the gift they received for the other gift. You might expect that about half would have wanted to make the exchange, but only one in ten actually did. The status quo exerted its power even though it had been arbitrarily established only minutes before!
Other experiments have shown that the pull of the status quo is even stronger when there are several alternatives to it as opposed to just one. More people choose the status quo when there are two alternatives to it, A and B, than when there is only one, A. Why? Choosing between A and B requires more effort.
What can you do about it? First of all, remember that, in any given decision, maintaining the status quo may indeed be the best choice—but you don’t want to choose it just because it is the status quo. Use these techniques to lessen the pull of the present:
•Always remind yourself of your objectives and examine how they would be served by the status quo. You may find that elements of the current situation are incompatible with those objectives.
•Never think of the status quo as your only alternative. Identify other options and use them as counterbalances, carefully evaluating all their pluses and minuses.
•Ask yourself whether you would choose the status quo alternative if, in fact, it weren’t the status quo.
•Avoid exaggerating the effort or cost involved in switching from the status quo.
•Put the status quo to a rigorous test. Don’t simply compare how the status quo is with how the other alternatives would be. Things can change with the status quo, too.
•If several alternatives are clearly superior to the status quo, don’t default to the status quo because you have a hard time picking the best one. Force yourself to choose one.
Protecting Earlier Choices: The Sunk-Cost Trap
Three months ago, your eight-year-old car suddenly required serious engine repairs. Faced with spending $3,000 on the engine work or junking the car and buying a new one, you chose the repairs. Now, however, your transmission’s shot, and fixing it will cost you another $1,500. Alternatively, you could sell the car as is for $1,000 and buy a new one. You know that the car will likely require further expensive repairs in the future, though you hope it won’t happen soon. What will you do?
If you’re like most people, you’ll decide to fix the transmission, not wanting to ‘‘lose’’ the $3,000 you already spent on the engine. But that’s the wrong reason for the choice! Would you make the same choice if the engine repair had (miraculously) been done for free? Almost certainly not—yet that’s how you should think about the problem. What matters now is the current condition of the car and the economic pros and cons of the two alternatives. The past is past; what you spent then is irrelevant to your decision today.
As this example illustrates, we tend to make choices in a way that justifies past choices, even when the past choices no longer seem valid. Our past decisions create what economists term ‘‘sunk costs’’—old investments of time or money that are now unrecoverable. We know, rationally, that sunk costs are irrelevant to the present decision, but nevertheless they prey on our psyche, leading us to make wrong-headed decisions. We may have refused, for example, to sell a stock or a mutual fund at a loss, forgoing other, more attractive investments. Or we may have poured enormous effort into improving the performance of an employee whom we know we shouldn’t have hired in the first place. Remember, your decisions influence only the future, not the past.
Why can’t people free themselves from past decisions? Sometimes it’s just fuzzy thinking. But frequently it’s because they are unwilling, consciously or not, to admit to a mistake (even if the ‘‘mistake’’ was caused by bad luck rather than a bad decision). Acknowledging a decision that’s gone awry may be purely a private matter, involving only one’s self-esteem, but in many cases it’s a very public matter, inviting critical comments or negative assessments from friends, family members, colleagues, or bosses. If you fire your poorly performing recent hire, you’re making a public admission of poor judgment. It seems psychologically safer to let him stay on, even though all you’re doing is compounding the error.
What can you do about it? For all decisions with a history, you will need to make a conscious effort to set aside any sunk costs— whether psychological or economic—that will muddy your thinking about the choice at hand. Try these techniques:
•Seek out and listen carefully to the views and arguments of people who weren’t involved with the earlier decisions and hence are unlikely to have a commitment to them.
•Examine why admitting to an earlier mistake distresses you. If the problem lies in your own wounded self-esteem, deal with it head-on. Remind yourself that even smart choices can have bad consequences and that even the most experienced decision makers are not immune to errors in judgment. Remember the wise words of the noted investor Warren Buffett: ‘‘When you find yourself in a hole, the best thing you can do is stop digging.’’
•If you worry about being second-guessed by others, make this consequence an explicit part of your decision process. Also consider how you would explain your new choice to these people.
•If you fear sunk-cost biases in your subordinates at work, pick one who was previously uninvolved to make the new decision. (See the example below.)
Avoiding the Sunk-Cost Bias: Reassigning Bankers
The sunk-cost bias shows up with disturbing regularity in banking, where it can have particularly dire consequences. When a borrower’s business runs into trouble, a lender will often advance additional funds in hopes of providing the business with some breathing room to recover. If the business does have a good chance of coming back, that’s a good investment. Otherwise, it’s throwing good money after bad.
Fifteen years ago, we helped a major U.S. bank recover after making many bad loans to foreign businesses. We found that bankers responsible for originating the problem loans were far more likely to advance additional funds—repeatedly, in many cases—than were bankers who took over the accounts after the original loans were made. Too often, the original bankers’ strategy—and loans—ended in failure. Having been trapped by an escalation of commitment, they had tried, consciously or unconsciously, to protect their earlier, flawed decisions. They had fallen victim to the sunk-cost bias. The bank finally solved the problem by instituting a policy requiring that a loan be immediately reassigned to another banker as soon as any problem became serious. The new banker would be able to take a fresh, unbiased look at whether offering more funds had merit.
Seeing What You Want to See: The Confirming-Evidence Trap
For a while you’ve been concerned that the stock market has gone too high, and you’ve all but decided to sell most of your portfolio and invest the cash in a money market mutual fund. But before you call your broker, you decide to do one more thing to check the wisdom of selling. You call a friend, who you know sold out her portfolio last week, to find out her reasoning. She presents a strong case for an imminent market decline. What do you do?
You’d better not let that conversation be the clincher, because you’ve probably just fallen into the confirming-evidence trap. This trap leads us to seek out information that supports our existing instinct or point of view while avoiding information that contradicts it. What, after all, did you expect your friend to give other than a strong argument in favor of her own decision?
The confirming-evidence trap affects not only where we go to collect evidence but also how we interpret the evidence we do receive, leading us to give too much weight to supporting information and too little to conflicting information. If you had read an article on the stock market in an investing magazine, for example, you would have tended to be less critical of arguments in favor of selling stock and more critical of arguments in favor of remaining in the market.
In one psychological study of this phenomenon, groups opposed to and supporting capital punishment read two reports of careful research on the effectiveness of the death penalty. One report concluded that the death penalty was effective; the other concluded that it was not. In spite of being exposed to solid scientific information supporting counterarguments, the members of both groups became even more convinced of the validity of their own position after reading both reports. They automatically accepted the supporting information and dismissed the conflicting information.
There are two fundamental psychological forces at work here. First is our tendency to subconsciously decide what we want to do before we figure out why we want to do it. Second is our tendency to be more engaged by things we like than by things we dislike—a tendency well documented even in babies. Naturally, then, we are drawn to information that confirms our subconscious leanings.
What can you do about it? It’s not that you shouldn’t make the choice toward which you’re subconsciously drawn. It’s just that you want to be sure it’s the smart choice. You need to put it to the test. Here’s how:
•Get someone you respect to play devil’s advocate, to argue against the decision you’re contemplating. Better yet, build the counterarguments yourself. What’s the strongest reason to do something else? The second strongest reason? The third? Consider the position with an open mind.
•Be honest with yourself about your motives. Are you really gathering information to help you make a smart choice, or are you just looking for evidence confirming what you think you’d like to do?
•Expose yourself to conflicting information. Always make sure that you examine all the evidence with equal rigor and understand its implications. And don’t be soft on the disconfirming evidence.
•In seeking the advice of others, don’t ask leading questions that invite confirming evidence.
Posing the Wrong Question: The Framing Trap
A young priest asked his bishop, ‘‘May I smoke while praying?’’ The answer was an emphatic ‘‘No!’’ Later, encountering an older priest puffing on a cigarette while praying, the younger priest scolded, ‘‘You shouldn’t be smoking while praying! I asked the bishop, and he said I couldn’t.’’
‘‘That’s strange,’’ the older priest replied. ‘‘I asked the bishop if I could pray while I’m smoking, and he told me that it was okay to pray at any time.’’
As this old joke shows, the way you ask a question can profoundly influence the answer you get. The same is true in decision making. If you frame your problem poorly, you’re unlikely to make a smart choice.
In a recent case involving automobile insurance, framing made a $200 million difference. To reduce insurance costs, two neighboring states, New Jersey and Pennsylvania, made similar changes in their laws. Each state gave drivers a new option: by accepting a limited right to sue, they could lower their premiums. In New Jersey you automatically got the limited right to sue unless you specified otherwise, but in Pennsylvania the choice was framed so that you automatically got the full right to sue unless you specified otherwise. In New Jersey, about 80 percent of drivers chose the limited right to sue, while in Pennsylvania only 25 percent chose it. The different frames in this case established different status quos, creating biases that in large part determined consumers’ behavior. As a result, Pennsylvania failed to gain approximately $200 million in expected insurance and litigation savings.