The weight-loss experiment is a small one; but consider a large-scale experiment Humana is running today. Although McCallister believes all people should have access to affordable healthcare, he recognizes that the Medicare bureaucracy has very little incentive to invest in preventive care. This, McCallister says, leads to “fraud, abuse, and overuse of services.” In the face of a huge generation of rapidly aging baby boomers and ballooning healthcare costs, he thinks there’s a much better way of delivering patient care—one that focuses on patient wellness, which he believes saves both money and lives.
To that end, the company recently adopted a mantra: help people achieve lifelong well-being. But what works? To find out, he hired a consultant named Judi Israel to build a “behavioral economics consortium.” As part of this consortium, we helped design some field experiments and behavioral interventions. Our common goal was to see what kinds of interventions best helped patients improve or stabilize their health while managing costs.
For example, consider a senior citizen on Medicare who suffers a heart attack. She survives the attack, receives appropriate treatment, and goes home. But then she ends up back in the hospital within a month for some comparatively preventable reason, such as failing to take her prescribed medications. Each hospital readmission costs an average of $10,000, not including “extras” such as prescriptions, rehabilitative services, and so on. Given that a whopping one in five patients on Medicare is readmitted to the hospital within a month of his or her first admission,3 these costs can be massive—and readmission is no fun for the patient, either. Humana, which covers the costs Medicare doesn’t, has a vested interest in addressing the situation.
So the firm did a little poking around in its databases and discovered that a substantial number of the two million Medicare-enrolled members it insures were being readmitted. The company tasked its analytics team with building a model to address the problem. Among other insights, the team found that members who suffered from chronic health problems (diabetes, obesity, heart disease, pneumonia, congestive heart failure, and so on) were the most likely to be readmitted. Accordingly, Humana made a point of following up with patients after they were released from the hospital. All patients receive an automated phone call offering help or advice via a toll-free number, but patients with chronic problems receive a call from a nurse who walks them through the steps of their rehabilitative care and makes sure they stay on track. And patients who suffer from several chronic problems at once receive a home visit from a nurse who monitors and coaches them along. More than 100,000 Humana Medicare members with multiple chronic illnesses receive this kind of help.
Through controlled tests, Humana has discovered that a proactive, low-cost, and simple intervention, such as sending a nurse to visit the patient, can save significant amounts of money while helping the patient. We continue to work with Humana on simple behavioral interventions that we expect to deliver significant bottom-line improvements.
From a business and healthcare industry standpoint, these moves all make sense. “Our industry has not been innovative,” McCallister insists. “This nation is productive on the back of technology, but there is no innovation in insurance or healthcare outside of products. We are trying to solve a big problem—to control healthcare spending and address deteriorating health at the same time. Maybe what we learn from our experiments here can spread.”
The Price Is Right
Field experiments focusing on products, services, and prices are not just the domain of big companies such as Intuit and Humana. They may, in fact, be even more crucial for smaller businesses, many of which teeter on the brink of bankruptcy daily.
In the summer of 2009, Uri and his wife Ayelet received a call from a fellow we will call “George,” a winery owner in Temecula, California, a lovely, languid town about an hour northeast of San Diego. George asked for their help with pricing his wines—clearly one of the most important business decisions he needed to make. They were delighted to take up the invitation to visit George’s winery, taste some of his products, and possibly help him in the process.4
When Uri and Ayelet asked him how he’d chosen prices in the past, they heard about the usual suspects: he looked at how other wineries priced similar wines, relied on intuition, carried over the previous year’s prices, and so on. He expected the business professors to come over, look around, do some quick calculations—and come up with the magic numbers that would make him rich. You can imagine how disappointed he was when, after having spent some time with him (and his lovely cabernet), Uri and Ayelet told him they had no idea what the “right” price was, and that the magic number didn’t exist. He almost took away the wine he’d already poured for them.
In an attempt to save their drinks, Uri and Ayelet did offer him help, in the form of a method—no magic, no equations, and no superior knowledge—just a simple experimental design. Pricing wines is a particularly tricky task since quality is not objective. We automatically assume a connection between price and quality; all else being equal, if a laptop costs more because it weighs less, people think it’s better. And that’s how much of the world works—evidence that runs counter to this basic intuition is hard to find.
Is this also the case with wines? You’d assume so, since the price range for wines is so enormous—you can pay a few bucks for a bottle of rotgut, or $10,000 for a bottle of 1959 Domaine de la Romanée-Conti. Research suggests that even when evaluating the quality of a product is subjective (as is the case with wine, since people have different taste preferences), increasing its price may increase its attractiveness to consumers.
Visitors to George’s winery, like those at other wineries in the region, can taste different wines and then choose to buy from the selection. Consumers typically come to Temecula for wine trips, going from one winery to another, sampling, and buying wine. The wine with which Uri and Ayelet experimented was a 2005 cabernet sauvignon, a “wine with complex notes of blueberry, black currant liqueur, and a hint of citrus.” The price George had previously chosen for it was $10, and it sold well.
For the experiment, we manipulated the price of the cabernet to be $10, $20, or $40 on different days over the course of a few weeks. Each experimental day, George greeted the visitors and told them about the tasting. Then visitors went to the counter, where they met the person who administered the tasting and handed them a single printed page containing the names and prices of the nine sample wines, ranging from $8 to $60, of which visitors could try six of their choice. As in most wineries, the list was constructed from “light to heavy,” starting with white wines, moving to red wines, and concluding with dessert wines. Visitors typically chose wines going down the list, and the cabernet sauvignon was always number seven. Tastings took between fifteen and thirty minutes, after which visitors could decide whether to buy any of the wines.
The results shocked George. Visitors were almost 50 percent more likely to buy the cabernet when he priced it at $20 than when he priced it at $10! That is, when we increased the price, the wine became more popular.
Following the experiment, George happily adopted the results and changed the price of this wine to $20. That almost cost-free experiment, and the pricing change it prompted, increased the winery’s total profits by 11 percent. Since the vast majority of the winery’s clients are one-time visitors (the winery sells most of its wines in its own store), very few people noticed the change in price.
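To see why the higher price paid off, it helps to compare the expected revenue per tasting-room visitor under each price. The short sketch below does that arithmetic; the purchase rates in it are hypothetical placeholders (the experiment reported only that buyers were roughly 50 percent more likely at $20), so it illustrates the comparison rather than the winery’s actual numbers.

```python
# Sketch: expected revenue per visitor in a simple price test.
# Purchase rates are hypothetical; the text reports only that visitors
# were roughly 50 percent more likely to buy at $20 than at $10.

def revenue_per_visitor(price, purchase_rate):
    """Expected revenue a single tasting-room visitor generates."""
    return price * purchase_rate

baseline_rate = 0.10   # assumed share of visitors buying at $10
lift = 1.5             # ~50 percent more likely to buy at the higher price

rev_10 = revenue_per_visitor(10, baseline_rate)
rev_20 = revenue_per_visitor(20, baseline_rate * lift)

print(f"Expected revenue per visitor at $10: ${rev_10:.2f}")
print(f"Expected revenue per visitor at $20: ${rev_20:.2f}")
# With these placeholder rates, the $20 price brings in three times as much
# per visitor: double the price times 1.5 times the purchase rate.
```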
Be Creative
Finding the “right” price is important. But sometimes you need more. It’s not just about the price, but also about how it’s collected.
A few years ago, Amber Brown, a graduate student at the University of California, San Diego, went to work for Disney Research—a to-die-for kind of job for a young psychologist. Disney has an in-house, interdisciplinary group of researchers that uses science to try to improve the company’s performance and explore new technologies, marketing, and economics. As is the case with Humana, this group understands the importance of using behavioral research to simultaneously improve both the customers’ experience and the company’s bottom line.
At about the same time Amber nabbed her job, we were becoming interested in an emerging behavioral pricing approach: pay-what-you-want. A famous example comes from the British band Radiohead. In 2007, the band released an album as a digital download and encouraged fans to log on to its website and download it for any price they chose. Fans could get the album for free, or pay as little as 65 cents (the cost of handling by the credit card company), or more. But would fans pay for something they could get for free? Interestingly, hundreds of thousands of people downloaded the album from the band’s website, and many of them (around 50 percent) paid something for it. (By the way, as our friend, the recent Nobelist Al Roth, likes to say, “Columbus wasn’t the first to discover America; he was the last.” After Columbus, everyone knew about the “new” continent. The same is true here: Radiohead wasn’t the first to discover this pricing strategy, but the group is famous enough to be the “last”—no one will ever need to “rediscover” it.)
This example shows that even in markets, people are not completely selfish. But the data from Radiohead’s model, and from other companies that had used it, left many questions open. Clearly people paid more than they had to, but whether the pricing strategy had positive or negative consequences for the band remained unclear. Did the band make or lose money relative to a standard pricing scheme?
We decided to study the pay-what-you-want scheme in a field experiment.5 We thought a combination of a pay-what-you-want pricing strategy and charity might be an interesting way to go. We called this combination Shared Social Responsibility (SSR) because instead of the company alone deciding how much to give to the charity, customers could share in the donations, too. If people could pay what they wanted for an item, would they pay more if we appealed to the “better angels of their nature”?
So together with Disney Research, we designed a large field experiment, including over 100,000 participants, to test the effect of pay-what-you-want pricing combined with charity. We set up our experiment at a roller coaster–like ride at a Disney park, where riders can afterward buy a snapshot of themselves screaming and laughing.
We offered the photo either for its regular price of $12.95 or under a pay-what-you-want scheme. We also added treatments in which half of the revenue from selling the picture went to a well-known and well-liked charity. This experimental design resulted in four different treatments that we ran over different days during a month-long period.
The figure below shows the profits per rider:
As you can see, we found that at the standard fixed price of $12.95, the charitable component only slightly increased demand—raising the revenue per rider by just a few cents. But what happened when participants could choose their own price? The demand rates went through the roof. Sixteen times more people (8 percent instead of 0.5 percent) bought the photo. But since they only paid about a dollar on average, Disney didn’t make any money from them. (Remember: we are interested in running experiments in which we can find a win-win solution for both companies and their customers. That’s the best way to make changes that stick.)
And what about the result we were most interested in? When we mixed the pay-what-you-want scheme with charity, 4 percent of the people bought the picture, but they paid much more (roughly $5) for it. Adding the charity option proved very profitable. In fact, the amusement park stood to make an additional $600,000 a year by offering the pay-what-you-want/charity combination at this one location alone. More generally, making this change also increased the benefit to the charity—and presumably to the customers, who felt they were doing something good.
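Turning the reported figures into revenue per rider makes the comparison concrete. The quick sketch below uses the purchase rates and average payments quoted above; it ignores the cost of producing each photo (which, as noted, wiped out the gains in the plain pay-what-you-want condition), so it is a rough back-of-the-envelope check rather than Disney’s actual accounting.

```python
# Rough revenue-per-rider comparison built from the figures quoted above.
# Photo production costs are ignored, so this is only a sketch.

treatments = {
    # name: (share of riders who buy, average price paid, share the park keeps)
    "fixed price $12.95":          (0.005, 12.95, 1.0),
    "pay-what-you-want":           (0.080,  1.00, 1.0),
    "pay-what-you-want + charity": (0.040,  5.00, 0.5),  # half of revenue donated
}

for name, (buy_rate, avg_paid, park_share) in treatments.items():
    gross = buy_rate * avg_paid   # revenue per rider before any split
    kept = gross * park_share     # the park's share after the donation
    print(f"{name:<30} gross ${gross:.3f}/rider, park keeps ${kept:.3f}/rider")
```

Even after giving half of the revenue away, the pay-what-you-want/charity combination leaves the park with a few more cents per rider than the fixed price does; multiplied over the millions of riders a popular attraction sees in a year, that is how small per-rider differences become six-figure gains.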
An important takeaway from our experiment is that if you want your clients to act unselfishly, you need to show you can do the same. When Disney agreed to experiment with the new pricing, the company signaled to its customers that it cared about charitable causes and, more importantly, was willing to share the risk of acting on that concern. More generally, we learned that being creative with your pricing strategies proves you can do good while doing well (as we discussed in Chapters 9 and 10).
How Can We Get You to Respond?
As we mentioned in the previous chapter, we’re all used to the piles of junk mail with offers that sound too good to be true (probably because they are not so good or not so true). Many of us never open such mail—we just “file” it in the trash without even looking at it. Those who do open it usually ignore the content or requests. Knowing this, how can a company get your attention through direct mail (or social media)?
Imagine you open a direct-mail plea and a $20 bill falls from the envelope. Whoever sent the mail likely has your attention now. Curious, you read the enclosed letter. The company is asking you to complete and return a short survey. Would you do it? What if there was only $10 enclosed? Or just $1?
Earlier, we showed how charities like Smile Train and WonderWork.org have successfully used reciprocity, the basic principle that if someone does something nice for you, you should do something nice in return. But what if you’re not a charity?
In this direct-mail case, the company has sweetly sent you cash and is asking you to do something for it in return. Let’s say you’re the chief marketing officer of a big chain store, and you ask yourself whether it makes sense to appeal to people’s sense of reciprocity when asking them to respond to a direct-mail pitch. Your company has lots of experience in sending surveys and is also good at collecting data. But it isn’t very good at figuring out which kinds of incentives work best when it comes to direct mail.
With our colleague Pedro Rey-Biel (of the Universitat Autònoma de Barcelona), we analyzed the results of a large field experiment comprising 29 treatments and 7,250 “club members” who were already registered customers of a big chain store.6 The company sent letters asking club members to complete a fifteen-minute survey. The company was interested in the question: Which is better—paying customers ahead of time to respond to a direct-mail plea, or promising to pay them after they’ve responded?
Put another way: Would more people respond—and would the study be more cost-effective—if the company used the reciprocity angle and sent people money with the survey in hopes that they would fill it out? Or would it be smarter to do things the old-fashioned way? That is, would it be better to treat people more like employees and make the reward contingent on having done the work? Or should the company forget the whole incentive thing and just send out surveys without a reward?
In one treatment, the company sent letters with cash, ranging from $1 to $30 (we called this the “social” treatment, since reciprocity is a social phenomenon), to about half the addressees. In another treatment, the company promised to send 3,500 people checks (with the same amounts as in the other treatment) if they filled out the survey (we called this the “contingent” treatment). In the control treatment, the company just sent the survey to 250 people and asked them to respond. The chart below shows the response.
This chart shows the “breaking point” was around $15. Up to $15, giving people money up front made them feel like reciprocating and therefore more likely to return the survey, even for amounts as small as $1. In fact, significantly more people responded to a dollar enclosed in the envelope than to the promise “You’ll get a dollar if you fill out the survey and send it back.” But after $15, more people responded to the contingent, fill-out-our-survey-and-then-we’ll-pay-you approach.
Importantly, the contingent approach was less expensive than the pay-up-front approach. That makes sense: after all, sending money only to those who return the survey is cheaper than paying everyone regardless of whether they respond. The average cost of a returned survey in the social treatment was $45.40, more than double the cost in the contingent treatment ($20.97). Overall, the total cost of the social treatments was almost three times that of the contingent treatments ($38,820 vs. $13,212).
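A simple way to see why the contingent scheme ends up cheaper per completed survey is to write out what each approach spends to get one questionnaire back. In the sketch below, the payment amount, mailing cost, and response rates are all hypothetical placeholders; only the structure of the comparison comes from the experiment described above.

```python
# Sketch: cost per returned survey under the two payment schemes.
# All dollar amounts and response rates are hypothetical; only the
# structure of the comparison comes from the experiment in the text.

def cost_per_return_prepaid(payment, mailing_cost, response_rate):
    """Everyone gets the cash up front, whether or not they respond."""
    return (payment + mailing_cost) / response_rate

def cost_per_return_contingent(payment, mailing_cost, response_rate):
    """Only respondents get paid; mailing cost is still spent on everyone."""
    return payment + mailing_cost / response_rate

payment = 10.0          # dollars offered (hypothetical)
mailing_cost = 1.0      # cost to print and mail one letter (hypothetical)
rate_prepaid = 0.30     # assumed response rate with cash enclosed
rate_contingent = 0.25  # assumed response rate with payment only promised

print(f"Prepaid:    ${cost_per_return_prepaid(payment, mailing_cost, rate_prepaid):.2f} per returned survey")
print(f"Contingent: ${cost_per_return_contingent(payment, mailing_cost, rate_contingent):.2f} per returned survey")
# Even when prepayment lifts the response rate, paying every addressee
# regardless of whether they respond drives the cost per return up quickly.
```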
What can companies that send out direct mail learn from this exercise? If your budget allows you to pay only $1 for a returned survey, put the buck in the envelope. People (at least the nice ones) will be happy to get it and will reciprocate in kind. But if you can spend enough money per person, you’ll be better off paying only the people who send the survey back. You might, of course, sample different people in the two cases; our bet is that you’ll get more people who think like economists when you make the payment contingent and more noneconomist thinkers when you don’t.
A Trip to China
In Chapter 4, we talked about the way framing a bonus as a gain or a loss affected teacher and student performance. Framing can be an important tool for businesses, too. Let’s say that you are the marketing manager of a product called Sunny Sunscreen SPF 50 Lotion, and you are deciding what kind of spin to put on a campaign. Your “gain-framed,” or positive, message might go like this: “Use Sunny Sunscreen to decrease your risk of getting skin cancer” or “Use Sunny Sunscreen to help your skin stay healthy.” Alternatively, a “loss-framed,” or negative, message could be “Without Sunny Sunscreen, you increase your risk of developing skin cancer” or “Without Sunny Sunscreen, you cannot guarantee the health of your skin.”
Similarly, a manager can tell employees, “If we boost production by 10 percent this year, we will all be in for a bonus!” Or he could say, “If we don’t boost production by 10 percent this year, none of us will get a bonus.” Which kind of framing do you think is the better motivator?