These future antibiotics are a public good (since they protect the public’s health), and so there ought to be significant government-provided incentives to correct this market failure—for example, extended patent protection or shared development costs. Otherwise, we will remain stuck in a free rider problem: without such incentives, we are all free-riding on the development of these drugs, which leads directly to their underproduction.
No one knows when these drugs will be needed, and given their typical ten-year development timeline, there is no time to waste. A deadly bacterial outbreak could be around the corner. In any situation like this one, where risk and reward are split across different entities, you want to look out for risk-related unintended consequences.
BE CAREFUL WHAT YOU WISH FOR
“The best laid schemes o’ mice an’ men [often go awry],” penned poet Robert Burns in 1785. In other words, things don’t always go as planned. Consider Gosplan, the agency charged with central economic planning for the Soviet Union for most of the twentieth century. Its plans often involved setting economy-wide target amounts for commodities (wheat, tires, etc.), which were then broken down into production targets for specific facilities. In 1990, economist Robert Heilbroner described some of the complications with this system in “After Communism,” published in The New Yorker:
For many years, targets were given in physical terms—so many yards of cloth or tons of nails—but that led to obvious difficulties. If cloth was rewarded by the yard, it was woven loosely to make the yarn yield more yards. If the output of nails was determined by their number, factories produced huge numbers of pinlike nails; if by weight, smaller numbers of very heavy nails. The satiric magazine Krokodil once ran a cartoon of a factory manager proudly displaying his record output, a single gigantic nail suspended from a crane.
Goodhart’s law summarizes the issue: When a measure becomes a target, it ceases to be a good measure. This more common phrasing is from Cambridge anthropologist Marilyn Strathern in her 1997 paper “‘Improving Ratings’: Audit in the British University System.” However, the “law” is named after English economist Charles Goodhart, whose original formulation in a conference paper presented at the Reserve Bank of Australia in 1975 stated: “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”
Social psychologist Donald T. Campbell formulated a similar “law” (known as Campbell’s law) in his 1979 study, “Assessing the Impact of Planned Social Change.” He explains the concept a bit more precisely: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”
Both describe the same basic phenomenon: When you try to incentivize behavior by setting a measurable target, people focus primarily on achieving that measure, often in ways you didn’t intend. Most importantly, their focus on the measure may not correlate to the behavior you hoped to promote.
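To make the mechanism concrete, here is a toy simulation in Python of the nail factory described above. It is a minimal sketch: the strategies and their numbers are invented for illustration, but it shows how optimizing whatever single measure is rewarded ignores the quality that was never measured.

    # Toy illustration of Goodhart's law. Each strategy is one way the
    # factory could spend the same production capacity; all numbers are
    # invented for illustration.
    strategies = [
        {"name": "pin-like nails", "count": 1_000_000, "weight_kg": 100,   "usefulness": 0},
        {"name": "normal nails",   "count": 10_000,    "weight_kg": 500,   "usefulness": 10},
        {"name": "one giant nail", "count": 1,         "weight_kg": 1_000, "usefulness": 0},
    ]

    def best_strategy(measure):
        # The factory maximizes whichever measure it is rewarded on.
        return max(strategies, key=lambda s: s[measure])["name"]

    print(best_strategy("count"))       # pin-like nails
    print(best_strategy("weight_kg"))   # one giant nail
    print(best_strategy("usefulness"))  # normal nails: the goal the
                                        # targets were meant to stand in for

Once count or weight becomes the target, the unmeasured quality the planners actually cared about drops out of the optimization entirely.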
High-stakes testing culture—be it for school examinations, job interviews, or professional licensing—creates perverse incentives to “teach to the test,” or worse, cheat. In the city of Atlanta in 2011, 178 educators were implicated in a widespread scandal involving correcting student answers on standardized tests, ultimately resulting in eleven convictions and sentences of up to twenty years on racketeering charges. Similarly, hospitals and colleges have been increasingly criticized for trying to achieve rankings at the expense of providing quality care and education, the very things the rankings are supposed to be measuring.
In A Short History of Nearly Everything, Bill Bryson describes a situation in which paleontologist Gustav Heinrich Ralph von Koenigswald accidentally created perverse incentives on an expedition:
Koenigswald’s discoveries might have been more impressive still but for a tactical error that was realized too late. He had offered locals ten cents for every piece of hominid bone they could come up with, then discovered to his horror that they had been enthusiastically smashing large pieces into small ones to maximize their income.
It’s like a wish-granting genie who finds loopholes in your wishes, meeting the letter of the wish but not its spirit, and rendering you worse off than when you started. In fact, there is a mental model for this more specific situation, called the cobra effect, describing when an attempted solution actually makes the problem worse.
This model gets its name from a situation involving actual cobras. When the British were governing India, they were concerned about the number of these deadly snakes, and so they started offering a monetary reward for every snake brought to them. Initially the policy worked well, and the cobra population decreased. But soon, local entrepreneurs started breeding cobras just to collect the bounties. After the government found out and ended the policy, all the cobras that were being used for breeding were released, increasing the cobra population even further.
A similar thing happened under French rule of Vietnam. In Hanoi, the local government created a bounty program that paid for each rat tail turned in. Enterprising rat catchers, however, would cut off a rat’s tail and then release the rat, so it could go on breeding and producing more tails. Whenever you create an incentive structure, you must heed Goodhart’s law and watch out for perverse incentives, lest you be overrun by cobras and rats!
The Streisand effect applies to an even more specific situation: when you unintentionally draw more attention to something when you try to hide it. It’s named for entertainer Barbra Streisand, who sued a photographer and website in 2003 for displaying an aerial photo of her mansion, which she wanted to remain private. Before the suit, the image had been downloaded a total of six times from the site; after people saw news stories about the lawsuit, the site was visited hundreds of thousands of times, and now the photo is free to license and is on display on Wikipedia and many other places. As was said of Watergate, It’s not the crime, it’s the cover-up.
[Illustration: Streisand Effect]
A related model to watch out for is the hydra effect, named after the Lernaean Hydra, a beast from Greek mythology that grows two heads for each one that is cut off. When you arrest one drug dealer, another quickly steps in to meet the demand. When you shut down an internet site where people share illegal movies or music, more pop up in its place. Regime change in a country can result in an even worse regime.
An apt adage is Don’t kick a hornet’s nest, meaning don’t disturb something that is going to create a lot more trouble than it is worth. With all these traps—Goodhart’s law, along with the cobra, hydra, and Streisand effects—if you are going to think about changing a system or situation, you must account for and quickly react to the clever ways people may respond. There will often be individuals who try to game the system or otherwise subvert what you’re trying to do for their personal gain or amusement.
If you do engage, another trap to watch out for is the observer effect, where the act of observation changes the thing being observed, depending on how, or even by whom, it is observed. An everyday example is using a tire pressure gauge: in order to measure the pressure, you must let out some of the air, reducing the pressure of the tire in the process. Or, when the big boss comes to town, everyone is on their best behavior and dresses in nicer clothes.
The observer effect is certainly something to be aware of when making actual measurements, but you should also consider how people might indirectly change their behavior as they become less anonymous. Think of how hard it is to be candid when you know the camera is rolling. Or how differently you might respond to giving a colleague performance feedback in an anonymous survey versus one with your name attached to it.
In “Chilling Effects: Online Surveillance and Wikipedia Use,” Oxford researcher Jonathon Penney studied Wikipedia traffic patterns before and after the 2013 revelations by Edward Snowden about the U.S. National Security Agency’s internet spying tactics, finding a 20 percent decline in terrorism-related article views involving terms like al-Qaeda, Taliban, and car bomb. The implication is that when people realized they were being watched by their governments, some of them stopped reading articles that they thought could get them into trouble. The name for this concept is chilling effect.
[Figure: Wikipedia Chilling Effect]
In the legal context where the term chilling effect originated, it refers to when people feel discouraged, or chilled, from freely exercising their rights, for fear of lawsuits or prosecution. More generally, chilling effects are a type of observer effect where the threat of retaliation creates a change in behavior.
Sometimes chilling effects are intentional, such as when someone is made an example of to send a message to others about how offenders will be treated. For instance, a company will aggressively sue another over its patents to scare off other companies that might be thinking of competing with it.
Many times, though, chilling effects are unintentional. Mandated harassment reporting can give victims pause when contemplating reaching out for help, since they might not yet be ready for that level of scrutiny.
Fear of harassment also curbs usage of social media. In a June 6, 2017, Pew Research study, 13 percent of respondents said they stopped using an online service and 27 percent said they chose not to post something online after witnessing online harassment toward others.
In your personal relationships, you might find yourself walking on eggshells around a person you know has an anger management problem. Similarly, some romantic partners may not be totally honest about their relationship grievances if they perceive their partner as having one foot out the door.
Another unintentional chilling effect, similar to the Wikipedia study discussed above, was found in an MIT study, “Government Surveillance and Internet Search Behavior,” which showed that post-Snowden, people also stopped searching for as many health-related terms on Google, even though those terms weren’t directly related to illegal activity of any kind. As people understand more about corporate and government tracking, their searching of sensitive topics in general has been chilled. The authors noted: “Suppressing health information searches potentially harms the health of search engine users and, by reducing traffic on easy-to-monetize queries, also harms search engines’ bottom line.”
This negative unintended consequence could be considered collateral damage, a term that in a military context refers to injuries or damage inflicted on unintended, or collateral, targets. You can apply this model to any negative side effects that result from an action. The U.S. government maintains a No Fly List of people who are prohibited from commercial air travel within, into, or out of the U.S. There have been many cases of people with the same names as those on the list who experienced the collateral damage of being denied boarding and missing flights, including a U.S. Marine who was prevented from boarding a flight home from his military tour in Iraq. When people are deported or jailed, even for good reason, collateral damage can be inflicted on their family members. For instance, the loss of income could take a financial toll, or children could experience the trauma of growing up without one or both parents, possibly ending up in foster care.
Sometimes collateral damage can impact the entity that inflicted the damage in the first place, which is called blowback. Blowback can occur well after the initial action. The U.S. supported Afghan insurgents in the 1980s in their fight against the USSR; years later, some of these same groups joined al-Qaeda to fight against the U.S., using some of the very weapons the U.S. had provided decades earlier.
Like Goodhart’s law and related models, observer and chilling effects concern unintended consequences that can happen after you take a deliberate action—be it a policy, experiment, or campaign. Again, it is best to think ahead about what behaviors you are actually incentivizing by your action, how there might be perverse incentives at play, and what collateral damage or even blowback these perverse incentives might cause.
Take medical care as a modern example. Fee-for-service medicine, prevalent in the United States, pays healthcare providers based on how much treatment is provided. Quite simply, the more treatment that is provided, the more money that is made, effectively incentivizing quantity of treatment. If you have a surgery, any additional care required (follow-up surgeries, tests, physical therapy, medications, etc.) will be billed separately by the provider conducting the treatment, including any care resulting from surgical complications that might arise. Each piece of the treatment is generally individually profitable to the providers.
With value-based care, by contrast, there is usually just one reimbursement payment for everything related to the surgery, including much of the directly related additional care. The provider conducting the surgery is on the hook for some of that additional care, sometimes even if it is administered by other providers. This payment scheme therefore incentivizes quality over quantity: providers are focused on determining exactly the right amount of treatment, because they face financial consequences for over- or under-providing care.
This straightforward change in how medicine is billed (one lump-sum payment to one provider versus many payments to multiple providers) significantly shifts the incentives for healthcare providers. The Medicare system in the United States is moving to this value-based reimbursement model both to reduce costs and to improve health outcomes, taking advantage of the better alignment between payment and quality care.
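A back-of-the-envelope sketch shows how the incentives flip. This is a hypothetical calculation in Python; all of the dollar amounts and the cost structure are invented for illustration, not taken from real billing data.

    # Hypothetical provider economics under the two billing schemes.
    # All dollar figures are invented for illustration.
    SURGERY_FEE = 10_000       # fee-for-service payment for the surgery
    COMPLICATION_FEE = 4_000   # each complication is billed separately
    BUNDLED_PAYMENT = 12_000   # one lump sum covering related care

    BASE_COST = 8_000          # provider's cost of the surgery itself
    COMPLICATION_COST = 3_000  # provider's cost of treating a complication

    def fee_for_service_margin(complications):
        # More treatment means more revenue, so each complication
        # actually adds to the provider's margin.
        revenue = SURGERY_FEE + complications * COMPLICATION_FEE
        return revenue - (BASE_COST + complications * COMPLICATION_COST)

    def value_based_margin(complications):
        # One payment regardless, so each complication eats into the margin.
        return BUNDLED_PAYMENT - (BASE_COST + complications * COMPLICATION_COST)

    for c in range(3):
        print(c, fee_for_service_margin(c), value_based_margin(c))
    # complications: 0 -> 2000 vs 4000; 1 -> 3000 vs 1000; 2 -> 4000 vs -2000

Under these invented numbers, fee-for-service margin rises with each complication, while the value-based margin falls, rewarding the provider for getting the surgery right the first time.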
In other words, seemingly small changes in incentive structures can really matter. You should align the outcome you desire as closely as possible with the incentives you provide. You should expect people generally to act in their own perceived self-interest, and so you want to be sure this perceived self-interest directly supports your goals.
IT’S GETTING HOT IN HERE
In the first section of this chapter, we warned about the tyranny of small decisions, where a series of isolated and seemingly good decisions can nevertheless add up to a bad outcome. There is a broader class of unintended consequences to watch out for that similarly involves making seemingly good short-term decisions that nevertheless add up to a bad long-term outcome. The mental model often used to describe this class of unintended consequences is called the boiling frog: Suppose a frog jumps into a pot of cold water. Slowly the heat is turned up and up and up, eventually boiling the frog to death.
It turns out real frogs generally jump out of the hot water in this situation, but the metaphorical boiling frog persists as a useful mental model describing how a gradual change can be hard to react to, or even perceive. The boiling frog has been used as a cautionary tale in a variety of contexts, from climate change to abusive relationships to the erosion of personal privacy. It is sometimes paired with another animal metaphor, also scientifically untrue—that of the ostrich with its head in the sand, ignoring the signs of danger. In each case the unintended consequence of not acting earlier is eventually an extremely unpleasant state that is hard to get out of—global warming, domestic violence, mass surveillance.
These unintended consequences are likely to arise when people don’t plan for the long term. The term from finance for these situations is short-termism: focusing on short-term results, such as quarterly earnings, over long-term results, such as five-year profits. If you focus only on short-term financial results, you won’t invest enough in the future. Eventually you will be left behind by competitors who are making those long-term investments, or you could be swiftly disrupted by new upstarts (which we cover in Chapter 9).
There are many examples of the deleterious effects of short-termism in everyday life. If you put off learning new skills because of the tasks in front of you, you will never expand your horizons. If you decorate your house one piece at a time in isolation, you won’t end up with a cohesive décor. If there are additions to the tax code without any thought to long-term simplification, it eventually becomes a bloated mess.
The software industry has a name for the consequences of short-termism: technical debt. The idea comes from writing code: if you prioritize short-term code fixes, or “hacks,” over long-term, well-designed code and processes, then you accumulate debt that will eventually have to be paid down through future code rewrites and refactors. Accumulating technical debt isn’t necessarily harmful—it can help projects move along faster in the short term—but it should be taken on knowingly, not accumulated obliviously like a boiling frog.
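As a contrived illustration of what the “debt” looks like in practice, consider these two Python versions of the same pricing logic (the function names and discount numbers are invented for this example):

    import math

    # The quick hack: each new discount rule is bolted on as another branch.
    # Fast to ship today, but every future change must touch this function.
    def discounted_price_hacky(price, customer_type):
        if customer_type == "student":
            return price * 0.90
        elif customer_type == "senior":
            return price * 0.85
        elif customer_type == "employee":  # added in a hurry last quarter
            return price * 0.70
        return price  # every new customer type means another elif, forever

    # Paying down the debt: the same behavior, restructured so that adding
    # a rule is a one-line data change instead of new control flow.
    DISCOUNTS = {"student": 0.10, "senior": 0.15, "employee": 0.30}

    def discounted_price(price, customer_type):
        return price * (1 - DISCOUNTS.get(customer_type, 0.0))

    # Same answers either way; only the maintenance burden differs.
    assert math.isclose(discounted_price_hacky(100, "senior"),
                        discounted_price(100, "senior"))

Each hack ships sooner, but the interest accrues as every later change becomes harder; the refactor is the repayment.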
If you have been involved in any small home repairs, you’re probably familiar with this model. When something small is broken, many people opt for a short-term fix today, DIY-style (or even duct-tape-style), because it is cheaper and faster. However, these “fixes,” which may not be up to building standards, may cost you in the long run. In particular, the item may need to be repaired again at greater cost, such as when you want to sell your home.
Startup culture has extended this concept to other forms of “debt”: Management debt is the failure to put long-term management team members or processes in place. Design debt means not having a cohesive product design language or brand style guide. Diversity debt refers to neglecting to make necessary hires to ensure a diverse team. This model can likewise be extended to any area to describe the unintended consequences of short-term thinking: relationship debt, diet debt, cleaning debt.
[Illustration: Technical Debt, contrasting the customer’s view with the developer’s view]
In these scenarios, you need to keep up with your “payments” or else the debt can become overwhelming: the out-of-control messy house, the expanding waistline, or the deteriorating relationship. These outstanding debts impact your long-term flexibility. The general model for this impact comes from economics and is called path dependence, meaning that the set of decisions, or paths, available to you now is dependent on your past decisions.