Predictably Irrational
Beyond spam, I get a lot of other proposals and requests. I was particularly curious about one that came from my local cable company. It promised one month of free digital cable. Since I’m interested in any offer that has the word free in it (see Chapter 3, “The Cost of Zero Cost”), I decided to take this one on. I called the company, and in a matter of days, a technician came to my house and installed my free digital cable. A month later, I received a bill for the free digital cable and found that it actually cost $60. When I called the customer service department, the nice fellow who took my call patiently explained that, unfortunately, I had problems with reading comprehension. He pointed out that the terms of the agreement were clearly explained in the seven-point font at the bottom of the company’s ad. After I paid for the regular analog service, the box, the connection fee, and the remote, it said, I would “get for free” the difference between that amount ($60) and the standard amount ($79) for digital cable.
I generally consider myself a fairly trusting person, but with all the dubious offers and news about the bad behavior of businesses, I feel myself becoming less trusting and more suspicious. It seems that I’m always looking for the catch, and it turns out that I’m not alone in this paranoid mindset. Some years ago, two very perspicacious researchers, Marian Friestad and Peter Wright, suggested that people in general are starting to understand that the offers companies put before us are in their best interest and not ours. As a consequence, we’ve become more distrustful—not only of those who are trying to swindle us but of everyone.
Free Money
After talking about our own experience of increasing disillusionment, Ayelet Gneezy (a professor at the University of California at San Diego), Stephen Spiller (a doctoral student at Duke University), and I decided to try to measure the extent of the public’s suspicion of companies. Our first question was how to measure the extent of distrust. We could, of course, ask each person in a group, “How suspicious do you feel on a scale from 1 to 10, 1 being not at all suspicious and 10 being very suspicious?” The problem with this type of measurement is that it’s difficult to tell exactly what 5.7 on such a scale really means. Would it suggest that people believed that the amazing offer from their cable company was real? Or would it mean that they were skeptical enough to read the fine print first? In addition, a lot of our research has shown that people often have wrong intuitions about their own behavior—they can say one thing but do another.
So, we decided to measure a behavior that would tell us people’s degree of distrust, and our tool of choice was a free money experiment. One lovely spring day, we set up a booth at a big commercial center in Cambridge, Massachusetts, manned by some undergraduate students. Above our booth was a large sign that read “Free Money.”
On the front of the booth was a smaller sign that indicated how much money we would give people for free. Sometimes the sign said $1, sometimes $5, and other times $10, $20, or even $50. The busy people who worked at this commercial center passed a sign saying “Free Money, $20” (or whichever denomination it happened to be at the time) on their way to lunch or as they were leaving for the day. They also saw a pile of bills in the given denomination on the table. How many people do you think stopped and took us up on our offer?
We didn’t expect people to slow down and simply pick one of the bills off the table, taking us (or our sign) completely literally. We thought they would first ask if we were serious. On hearing us say, “Yes, we are, and you are welcome to take a bill,” we assumed they would take a bill and go on their way. And lo, every so often someone would walk by the booth slowly, reading the sign and glancing at the students, who really did not look like crooks.
Here’s how the scenario generally went:
MAN: (warily approaching the booth and eyeing the $50 bills) Is this a trick?
STUDENT: (smiling) Absolutely not.
MAN: Is there something I have to sign?
STUDENT: Nothing to sign.
MAN: There must be a catch.
STUDENT: No.
MAN: Is this for real?
STUDENT: Yes. Help yourself! One to a customer.
Reassured, the man would look around and help himself to a bill. He would hold it up for a second, as if waiting for something to happen. Then, he would turn around and begin to walk slowly away. His pace would eventually pick up, and he would disappear around a corner.
When we offered $1, only 1 percent of those who passed our booth actually stopped to check it out. When we offered $5, a few more did, and so on up to $50. But even when we offered $50, only 19 percent of passersby stopped and took a bill. We were surprised and slightly disappointed with this low level of trust; 19 percent is not a high level of success, particularly when free money is (literally) on the table.
It is important to realize that in this setting, people did not necessarily have to believe that the money we offered was entirely free. They might have expected to have to do something in return—answer a short survey, for example. But even if they suspected as much, the payment might still have been a worthwhile return on the small investment of answering a few questions.
Clearly, the vast majority believed that this was some kind of trick—so much so that it was not worthwhile to even ask. Occasionally, one of the students would approach someone who had clearly seen the offer on the booth but chosen to ignore it. When asked why they chose not to approach the booth, respondents said they believed that it was some kind of scheme. (A favorite claim made by economists is that there are no $100 bills lying on the sidewalk; if there were, the argument goes, someone would have already picked them up.) To us, this looked like evidence of deep mistrust.
The Tragedy of the Commons
Trust, like money, is a crucial lubricant for the economy. When people trust other people, a merchant, or a company, they are more likely to buy, lend, and extend credit. In the old days, business was conducted on a gentleman’s handshake. But when the handshake results in a swindle, trust disappears and all subsequent transactions—whether between cheaters or the genuinely good-hearted—become more difficult.
A good analogy for social distrust can be found in the “tragedy of the commons.” This phrase can be traced back to Oxford professor William Forster Lloyd, who described the phenomenon in his 1833 book on population. He noted that in medieval England, parishes had common land on which each member of the community could graze a limited number of cattle and sheep. Keeping the number of animals low allowed the grass to grow back at a speed that kept its level more or less the same. This approach was rather successful when all the farmers stuck to the rule. Unfortunately, in their selfish desire to improve their own financial situations, some of the farmers increased the number of their animals to a level that the land could not sustain. This strategy was very good (at least in the short run) for the individual farmers who had more animals, but each additional cow or sheep resulted in less grass for all of the animals. As the grass dwindled, all the livestock on the commons became malnourished and underproductive—a result that hurt everyone, including the greedy farmers.
Today, psychologists, economists, and environmentalists use the phrase “the tragedy of the commons” to describe the same basic principle: when we use a common resource at a rate that is slower than the rate at which it replenishes, all is well. However, if a few individuals get greedy and use more than their share, the system of consumption becomes unsustainable, and in the long term, everybody loses. In essence, the tragedy of the commons is about two competing human interests. On one hand, an individual should care about the sustainability of shared resources in the long term because everyone, including the individual, benefits from it. At the same time, in the short term, the individual benefits immediately from taking more than his or her fair share. (Social scientists refer to such betrayers of social contracts as “defectors.”)
Of course, if we all cared about the common good or thought about the long-term consequences of our actions, we might not run into resource-sharing problems. But because human beings tend to focus on short-term benefits and our own immediate needs, such tragedies of the commons occur frequently. Take the wild salmon population, for example. While it is ideal for fishermen in general to limit their own catches so that the salmon population can be sustained, it’s more profitable for an individual fisherman to overfish in a given year. But if too many fishermen even slightly surpass the sustainable limit, the overall fish population becomes depleted. (For this reason, salmon fishermen are now legally limited to a set catch every year.)
The current energy crisis is another example of the tragedy of the commons. Although there is a finite amount of fossil fuel in the world, some countries, industries, businesses, and individuals use far more than others while making little effort to minimize their impact on the common pool. Even public resources such as clean air, land, trees, and water fall victim to this problem. In the absence of cooperation among all the players in protecting such resources, a small number of misbehaving entities can have a devastating effect on everyone.
The Public Goods Game
The following thought experiment offers an interesting example of the phenomenon of the tragedy of the commons. Imagine that I offer you and three other people $10 each, which is yours to keep. But I also give you an opportunity to make more money. You can put as much of your $10 as you want into the “group pot.” Once all the players have privately decided how much of their money to put in the group pot, all the money doubles and then is split evenly among the four participants, regardless of how much money each individual contributed. How much of your $10 would you put into the group pot? If all four of you put in $10, the group pot would double from $40 to $80, be divided four ways, and each of you would take away a nice $20.
So let’s say you put in your $10, thinking that the other players will do the same. Once the pot is divided, however, you only get $15 back, not $20. What happened? It turns out that one of the other players—let’s call him “Bernie”—decided to cheat. At the beginning of the game, Bernie realized that he would make the most money if he withheld his $10 while everyone else contributed to the central pot. So Bernie kept his $10 while the other three players placed their $10 in the central pot, making the total $30. This doubled to $60 and was split four ways (remember, Bernie gets a share of the group pot whether or not he contributes), so each of the three contributors got $15 back. Bernie, meanwhile, made out with $25 (his original $10 plus $15 from the split), more than any other player.
Now, let’s say you get a chance to play the same game again with the same players (I give you all new $10 bills). How would you play this time? You don’t want to be too trusting because you know that Bernie might defect again. So you only put in $4. It turns out that the other three players feel the same way and make the same decision, making the total in the group’s pot $12 (Bernie again does not put in any money). The pot doubles to $24, which is split evenly among four people. Each of the three contributing players gets $6 (in addition to the $6 they kept for themselves), while Bernie winds up with $16 in total.
Now, your trust has been eroded. You play a few more games, but you don’t put in any of your money. Each time you end up with the $10 you started with. You don’t lose anything, but since you don’t trust others to behave in a cooperative way (and neither do the other players), you don’t gain anything either. In contrast, if you and the others had acted cooperatively, each of you would have wound up with as much as $20 per game.
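For readers who want the arithmetic behind these rounds laid out explicitly, here is a minimal sketch in Python. The rules are exactly those described above—each player starts with $10, the group pot doubles, and the doubled pot is split evenly among all four players regardless of who contributed—while the function name and round labels are my own shorthand, not the author’s.

```python
# A minimal sketch of the four-player public goods game described above.
# Each player starts with $10, contributes any amount to a group pot,
# the pot doubles, and the doubled pot is split evenly among all four
# players regardless of who contributed what.
# (The function name and the round labels are mine, not the author's.)

ENDOWMENT = 10   # dollars each player starts with
MULTIPLIER = 2   # the group pot doubles
PLAYERS = 4

def payoffs(contributions):
    """Return each player's final payoff: what they kept plus their share."""
    pot = sum(contributions) * MULTIPLIER
    share = pot / PLAYERS
    return [ENDOWMENT - c + share for c in contributions]

# The three rounds described in the text (the last player is "Bernie"):
print(payoffs([10, 10, 10, 10]))  # everyone cooperates: each ends with $20
print(payoffs([10, 10, 10, 0]))   # Bernie defects: $15, $15, $15 vs. his $25
print(payoffs([4, 4, 4, 0]))      # trust erodes: $12, $12, $12 vs. his $16
```

Running the three rounds reproduces the payoffs in the story: $20 apiece when everyone cooperates, $15 for each contributor against Bernie’s $25 when he defects, and $12 against $16 once contributions shrink to $4.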
Viewed this way, the Public Goods Game illustrates how we, as a society, share the public good of trust. When we all cooperate, trust is high and the total value to society is maximal. But distrust is infectious. When we see people defect by lying in their advertisements, proposing scams, etc., we start acting similarly; trust deteriorates, and everybody loses, including the individuals who initially gained from their selfish acts.
If we start to think about trust as a public good (like clean air and water), we see that we can all benefit from higher levels of it: communicating with others, making financial transactions, simplifying contracts, and carrying out many other business and social activities all become easier. Without constant suspicion, we can get more out of our exchanges with others while spending less time making sure that others will fulfill their promises to us. Yet as the tragedy of the commons exemplifies, in the short term it is beneficial for each individual to violate and take advantage of the established trust.
I suspect that most people and companies miss or ignore the fact that trust is an important public resource and that losing it can have long-term negative consequences for everyone involved. It doesn’t take much to violate trust. Just a few bad players in the market can spoil it for everyone else.
Erosion of Trust over Time
In the picaresque film Little Big Man, the protagonist, a once-trusting but increasingly cynical young Jack Crabb (played by Dustin Hoffman), takes up with a one-eyed snake-oil salesman named Merriweather, who promises him that “if you stick with Merriweather, you’ll wear silk.” Crabb and Merriweather travel from town to town in the Old West, selling a miracle drug that Merriweather promises can make the blind see and the crippled walk, but that is really just a brew concocted from rattlesnake heads. Of course, people get sick after swallowing it, and the townspeople take revenge by tarring and feathering the two tricksters. (In the real world of 1903, an analogous medicine called Rexall Americanitis Elixir was “especially recommended for nervous disorders, exhaustion, and all troubles arising from Americanitis.”20) Still, old-fashioned trust abusers like Merriweather and the makers of Rexall Americanitis Elixir are but mild offenders relative to those of the modern age.
Today, it’s easy for one individual to start selling “wonder pills” that promise to help you lose weight, keep or regrow your hair, enhance your sex life, and energize your workday. Unsuspecting people who buy such concoctions get all the benefits of placebos and, in the process, lose plenty of money. Meanwhile, the dishonest pill-purveyors gain substantially while further eroding the overall level of trust (at least for those who didn’t benefit from the placebo effect). This erosion not only makes it harder for the next wonder pill salesperson to peddle goods (which by itself is a good thing), but also makes it more difficult for us to believe those who truly deserve to be trusted.
THERE’S ANOTHER ASPECT to this problem. Imagine that you are an honest person and you want to remain that way. How should you behave in a world where most people are untrustworthy and most individuals don’t trust others anyway?
For a concrete example, suppose you’ve just joined an online dating site. If you suspect that most people on the site slightly exaggerate their vital and biographical statistics, you’re right! When Günter Hitsch and Ali Hortaçsu (both professors at the University of Chicago) and I looked into the world of online dating, we discovered that men cared mostly about women’s weight and women cared mostly about men’s height and income. We also discovered, perhaps not surprisingly, that the online women reported their weight to be substantially below average, while the men claimed to be taller and richer than average. This suggests that both men and women know what the other half is looking for, and so they cheat just a little bit when describing their own attributes. A fellow who is 5'9" and earns $60,000 annually typically gives himself an extra inch and a $30,000 raise, describing himself as being 5'10" and making $90,000. Meanwhile, his potential partner remembers her weight in college and, with a 5 percent discount, becomes 133 pounds.
But what happens if you’re a 5'9" man and you decide to be honest, since you believe honesty is a critical component of a good relationship? You’re going to be penalized, because women reading your profile will assume that your stated 5'9" really means 5'8" or 5'7". By refusing to cheat, you have substantially lowered your market value. So what do you do? You sigh, stuff your hands in your pockets, and realize that there’s a lot stacked against honesty. Given the importance of finding a partner, and having surrendered to the sad fact that everyone cheats a little, you too, I suspect, would give in and decide to fudge the facts just a little bit.
Of course, once we begin to cheat, even if only by a little, over time it can become a habit. Consider, for example, the process of writing a résumé. I see many of my students’ résumés when they apply for jobs or graduate school and ask me for letters of recommendation. In their desire to stand out and grab the attention of the prospective employer, and because they think that everyone exaggerates a bit, they do too. Accordingly, anyone who’s ever taken intro to statistics is suddenly “fluent in statistical analysis,” a part-time job spent inputting data for an experiment turns into “assisting in data analysis,” and a two-month internship in Paris becomes “fluent in French.” In fact, the situation is so severe that when my research assistants show me their résumés, I sometimes feel as though the projects we have worked on together are actually theirs and that I’ve been assisting them.
How Deep Is Your Mistrust?
Following our “free money” experiment, Ayelet, Stephen, and I set up an experiment to see just how deep people’s mistrust of companies really runs. Specifically, we wanted to find out the degree to which people would doubt obviously truthful statements when those statements were associated with a brand.
We started out by asking people whether they thought that completely unambiguous statements such as “the sun is yellow” and “a camel is bigger than a dog” were true or false, and 100 percent of the participants agreed they were true. Then we asked another group of people to evaluate the same statements, with the added information that they were made by either Procter & Gamble, the Democratic Party, or the Republican Party. Would giving these statements a corporate or political origin color our participants’ impressions, and would they be more likely to suspect the truthfulness of these statements?