Several larger public prediction markets also exist, such as PredictIt, which focuses on political predictions in the manner described above. While this market has successfully predicted many election outcomes across the world, in 2016 it failed to correctly predict both the election of Donald Trump and the UK’s Brexit vote. Retrospective analysis showed that diversity of opinion seemed lacking and that participants in the prediction market likely didn’t have enough direct contact with Trump or Brexit supporters. In addition, predictors were not operating fully independently, instead being influenced by the initial outsized odds against Trump and Brexit.
Another project, called the Good Judgment Project, crowdsources predictions for world events. Its co-creator, Philip E. Tetlock, studied thousands of participants and discovered superforecasters, people who repeatedly make excellent forecasts. He found that these superforecasters consistently beat the world’s leading intelligence services in their predictions of world events, even though they lack the classified intelligence these services have access to!
In a book entitled Superforecasting, Tetlock examines characteristics that lead superforecasters to make such accurate predictions. As it happens, these are good characteristics to cultivate in general:
Intelligence: Brainpower is crucial, especially the ability to enter a new domain and get up to speed quickly.
Domain expertise: While you can learn about a particular domain on the fly, the more you learn about it, the more it helps.
Practice: Good forecasting is apparently a skill you can hone and get better at over time.
Working in teams: Groups of people can outperform individuals as long as they avoid groupthink.
Open-mindedness: People who are willing to challenge their beliefs tend to make better predictions.
Training in past probabilities: People who looked at probabilities of similar situations in the past were better able to assess the current probability, avoiding the base rate fallacy (see Chapter 5).
Taking time: The more time people took to make the prediction, the better they did.
Revising predictions: Forecasters who continually revised their predictions based on new information successfully avoided confirmation bias (see Chapter 1).
Using prediction markets and the techniques of superforecasters can help you improve your scenario analysis by making it more accurate and focusing it on the events that are actually more likely to occur. As we’ve seen in Chapters 2 and 4, many unpredictable changes will inevitably occur; however, by spending time with these mental models, you can be better prepared for these changes. Even if you cannot predict exactly what will happen, you may envision similar scenarios and your preparation for those scenarios will help you.
In this chapter as a whole, we’ve seen an array of decision models that surpass the simple pro-con list that we started with. When you’ve arrived at a decision using one or more of these mental models, a good final step is to produce a business case, a document that outlines the reasoning behind your decision.
This process is a form of arguing from first principles (see Chapter 1). You are laying out your premises (principles) and explaining how they add up to your conclusion (decision). You are making your case. Taking this explicit step will help you identify holes in your decision-making process. In addition, a business case provides a jumping-off point to discuss the decision with your colleagues.
A business case can range from very short and informal (a few paragraphs) to extremely detailed and formal (a massive report) and is often accompanied by a presentation. In its final form it is used to convince others (or yourself!) that the decision is the right one. By using the mental models from this chapter, you can put together compelling business cases to help you and your organization make excellent decisions.
And it’s not just for business. We started this chapter by discussing a potential career change. Knowing what you know now, you can approach that same problem in a much better way. For example, you could do scenario analysis to better uncover and imagine how different possible career futures could unfold. You could then analyze the most promising career paths more systematically and numerically through cost-benefit analysis, or with a decision tree if some of the choices are probabilistic in nature. Then, in the end, you can put all of it together into a succinct business case that lays out the argument for your next career move.
KEY TAKEAWAYS
When tempted to use a pro-con list, consider upgrading to a cost-benefit analysis or decision tree as appropriate.
When making any quantitative assessment, run a sensitivity analysis across inputs to uncover key drivers and appreciate where you may need to seek greater accuracy in your assumptions. Pay close attention to any discount rate used.
Beware of black swan events and unknown unknowns. Use systems thinking and scenario analysis to more systematically uncover them and assess their impact.
For really complex systems or decision spaces, consider simulations to help you better assess what may happen under different scenarios.
Watch out for blind spots that arise from groupthink. Consider divergent and lateral thinking techniques when working with groups, including seeking more diverse points of view.
Strive to understand the global optimum in any system and look for decisions that move you closer to it.
7
Dealing with Conflict
IN ADVERSARIAL SITUATIONS, nearly every one of your choices directly or indirectly affects other people, and these effects can play a large role in how a conflict turns out. In the words of English poet John Donne, “No man is an island.”
In Chapter 6, we discussed mental models that help you with making decisions. In this chapter we will give you more models to help with decision making, with a focus on guiding you through adversarial situations.
As an example, consider the arms race. The term was originally used to describe a race between two or more countries to accumulate weapons for a potential armed conflict. It can also be used more broadly to describe any type of escalating competition. Think of the Cold War between the U.S. and the Soviet Union after World War II, where both countries kept accumulating more and more sophisticated nuclear weapons. And that’s not even the only arms race from the Cold War: the two countries also competed intensely for dominance of the Olympics (medal race) and space exploration (space race).
Arms races are prevalent in our society. For example, many employers in the U.S. have increasingly required college or even more advanced degrees as a condition of employment, even though many of these jobs don’t use the knowledge acquired from these degrees.
[Illustration: Arms Race (Growing Educational Demand for Employment)]
And getting these degrees is increasingly more expensive as the result of another arms race, in which colleges spend more and more on making their campuses feel like resorts. Stereotypical cinder-block dorm rooms with a mini-fridge and a communal bathroom down the hall are being replaced with apartment-style suites that come with stainless-steel appliances and private bathrooms. And, according to The New York Times, some schools have even been building “lazy rivers” like the ones at amusement parks! This arms race has directly contributed to the cost of U.S. higher education going sky high.
Getting into an arms race is not beneficial to anyone involved. There is usually no clear end to the race, as all sides continually eat up resources that could be spent more usefully elsewhere. Think about how much better it would be if the money spent on making campuses luxurious was instead invested in better teaching and other areas that directly impact the quality and accessibility of a college education.
Unfortunately, situations like this are common in everyday personal life too: many people go into considerable debt trying to keep up with their social circles (or circles they aspire to belong to) by buying bigger houses, fancier cars, and designer clothes, and by sending their kids to expensive private schools. The phrase keeping up with the Joneses describes this phenomenon and comes from the name of a comic strip that followed the McGinis family, who were fixated on matching the lifestyle of their neighbors, the Joneses.
Based on what you know about us so far, you might not be surprised to find out that we send our sons to engineering camps in the summer. Last year, the closest offering for one of these camps was at a private school on the Philadelphia Main Line, a highly affluent region. When Lauren was waiting to pick up one of our sons, she overheard a group of campers arguing over which of their families owned the most Teslas. While comparisons of social status are not uncommon, it was disheartening to see elementary school–aged children engage in this kind of discussion, especially one so extreme.
As an individual, avoiding an arms race means not getting sucked into keeping up with the Joneses. You want to use your income on things that make you fulfilled (such as on family vacations or on classes that interest you), rather than on unfulfilling status symbols.
As an organization, avoiding an arms race means differentiating yourself from the competition instead of pursuing a one-upmanship strategy on features or deals, which can eat away at your profit margins. By focusing on your unique value proposition, you can devote more resources to improving and communicating it rather than to keeping up with your competition. The satirical publication The Onion famously parodied the corporate arms race between razor blade manufacturers, as depicted below.
In the rest of this chapter, we explore mental models to help you analyze and better deal with conflicts like arms races. We hope that after reading it, you will be equipped to emerge from any adversarial situation with the best outcome for yourself.
PLAYING THE GAME
Game theory is the study of strategy and decision making in adversarial situations, and it provides several foundational mental models to help you think critically about conflict. Game in this context refers to a simplified version of a conflict, in which players engage in an artificial scenario with well-defined rules and quantifiable outcomes, much like a board game.
In most familiar games—chess, poker, baseball, Monopoly, etc.—there are usually winners and losers. However, game theorists recognize that in real-life conflicts there isn’t always a clear winner or a clear loser. In fact, sometimes everyone playing the game can win and other times everyone can lose.
The most famous “game” from game theory is called the prisoner’s dilemma. It can be used to illustrate useful game-theory concepts and can also be adapted to many life situations, including the arms race.
Here’s the setup: Suppose two criminals are captured and put in jail, each in their own cell with no way to communicate. The prosecutor doesn’t have enough evidence to convict either one for a major crime but does have enough to convict both for minor infractions. However, if the prosecutor could get one of the prisoners to turn on their co-conspirator, the other one could be put away for the major crime. So the prosecutor offers each prisoner the same deal: betray your partner and you get a reduced sentence; if your partner stays silent while you betray them, you walk free and they go to prison for the major crime.
In game theory, diagrams can help you study your options. One example is called a payoff matrix, showing the payoffs for possible player choices in matrix form (see 2 × 2 matrix in Chapter 4). From the prisoner’s perspective, the payoff matrix looks like this:
Prisoner’s Dilemma
Payoff Matrix: Sentences Received (A’s sentence, B’s sentence)

                     B remains silent       B betrays A
A remains silent     1 year, 1 year         10 years, 0 years
A betrays B          0 years, 10 years      5 years, 5 years
Here’s where it gets interesting. The simplest formulation of this game assumes that the consequences for the players are only the prison sentences listed, i.e., there is no consideration of real-time negotiation or future retribution. If, as a player, you are acting independently and rationally, the dominant strategy given this formulation and payoff matrix is always to betray your partner: No matter what they do, you’re better off betraying, and that’s the only way to get off free. If your co-conspirator remains silent, you go from one to zero years by betraying them, and if they betray you too, you go from ten to five years.
The rub is that if your co-conspirator follows the same strategy, you both go away for much longer than if you both just remained silent (five years versus one year). Hence the dilemma: do you risk their betrayal, or can you trust their solidarity and emerge with a small sentence?
The dual betrayal with its dual five-year sentences is known as the Nash equilibrium of this game, named after mathematician John Nash, one of the pioneers of game theory and the subject of the biopic A Beautiful Mind. A Nash equilibrium is a set of player choices for which no player can improve their outcome by unilaterally changing their own strategy. In this case, the Nash equilibrium is the strategy of dual betrayals, because if either player instead chose to remain silent, that player would get a longer sentence. To both get a shorter sentence, they’d have to act cooperatively, coordinating their strategies. That coordinated strategy is unstable (i.e., not an equilibrium) because either player could then betray the other to better their outcome.
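If you like to check this kind of reasoning computationally, here is a minimal Python sketch (our own illustration, not from the original text) that brute-forces the pure-strategy Nash equilibria of the payoff matrix above. It treats each payoff as a prison sentence in years, so each player prefers a lower number, and it keeps only the outcomes where neither prisoner can shorten their own sentence by unilaterally switching.

from itertools import product

ACTIONS = ["silent", "betray"]

# sentences[(A's action, B's action)] = (A's sentence, B's sentence)
sentences = {
    ("silent", "silent"): (1, 1),
    ("silent", "betray"): (10, 0),
    ("betray", "silent"): (0, 10),
    ("betray", "betray"): (5, 5),
}

def is_nash_equilibrium(a_action, b_action):
    """True if neither player can reduce their own sentence by
    unilaterally switching to a different action."""
    a_cost, b_cost = sentences[(a_action, b_action)]
    # Can A do better by deviating while B stays put?
    if any(sentences[(alt, b_action)][0] < a_cost for alt in ACTIONS):
        return False
    # Can B do better by deviating while A stays put?
    if any(sentences[(a_action, alt)][1] < b_cost for alt in ACTIONS):
        return False
    return True

for a, b in product(ACTIONS, ACTIONS):
    if is_nash_equilibrium(a, b):
        print(f"Nash equilibrium: A={a}, B={b}, sentences={sentences[(a, b)]}")
# Prints only the dual betrayal: A=betray, B=betray, sentences=(5, 5)

Running it prints only the dual-betrayal outcome, matching the reasoning above. The same check works for the arms race matrix below if you swap in ranked values for the win/lose outcomes.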
In any game you play, you want to know whether there is a Nash equilibrium, as that is the most likely outcome unless something is done to change the parameters of the game. For example, the Nash equilibrium for an arms race is choosing a high arms strategy where both parties continue to arm themselves. Here’s an example of a payoff matrix for this scenario:
Arms Race
Payoff Matrix: Economic Outcomes (A’s outcome, B’s outcome)

                B disarms             B arms
A disarms       win, win              lose big, win big
A arms          win big, lose big     lose, lose
As you can see, the arms race directly parallels the prisoner’s dilemma. Both A and B arming (the lose-lose situation) is the Nash equilibrium, because if either party unilaterally switched to disarming, they’d be worse off, exposing themselves to an even poorer outcome, such as an invasion they couldn’t defend against (denoted as “lose big”). The best outcome again results from being cooperative, with both parties agreeing to disarm (the win-win situation), thus opening up the opportunity to spend those resources more productively. That’s the arms race equivalent of remaining silent, but it is also an unstable situation, since either side could then better their situation by arming again (and potentially invading the other side, leading to a “win big” outcome).
In both scenarios, a superior outcome is much more likely if everyone involved does not consider the situation as just one turn of the game but, rather, if both sides can continually take turns, running the same game over and over—called an iterated or repeated game. When we mentioned earlier the possibility of future retribution, this is what we were talking about. What if you have to play the game with the same people again and again?
In an iterated game of prisoner’s dilemma, cooperating in a tit-for-tat approach usually results in better long-term outcomes than constant betrayal. You can start out cooperating, and thereafter follow suit with what your opponent has recently done. In these situations, you want to wait for your opponent to establish a pattern of bad behavior before you reciprocate in kind. You don’t want to destroy a previously fruitful relationship based on one bad choice by your counterpart.
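To make this concrete, here is another Python sketch (again our own illustration; the strategy and function names are hypothetical, and the payoffs reuse the sentences from the matrix above). It simulates repeated rounds of the prisoner’s dilemma and shows that two tit-for-tat players fare far better over time than two constant betrayers.

SENTENCE = {  # (my move, their move) -> my sentence this round, in years
    ("silent", "silent"): 1,
    ("silent", "betray"): 10,
    ("betray", "silent"): 0,
    ("betray", "betray"): 5,
}

def tit_for_tat(my_history, their_history):
    # Cooperate on the first round, then copy the opponent's previous move.
    return "silent" if not their_history else their_history[-1]

def always_betray(my_history, their_history):
    return "betray"

def play(strategy_a, strategy_b, rounds=20):
    history_a, history_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        total_a += SENTENCE[(move_a, move_b)]
        total_b += SENTENCE[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b  # total years served by each player

print(play(tit_for_tat, tit_for_tat))      # (20, 20): mutual cooperation
print(play(always_betray, always_betray))  # (100, 100): mutual betrayal
print(play(tit_for_tat, always_betray))    # (105, 95)

A gentler variant of this sketch, waiting for two betrayals in a row before reciprocating, captures the advice above about not destroying a previously fruitful relationship over a single bad choice.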
Similarly, cooperation pays off in most long-term life situations where reputation matters. If you are known as a betrayer, people will not want to be your friend or do business with you. On the other hand, if people can trust you based on your repeated good behavior, they will want to make you their ally and collaborate with you.
In any case, analyzing conflicts from a game-theory perspective is a sound approach to help you understand how your situation is likely to play out. You can write out the payoff matrix and use a decision tree (see Chapter 6) to diagram different choice scenarios and their potential outcomes, from your perspective. Then you can figure out how you get to your desired outcome.
NUDGE NUDGE WINK WINK
To get to a desired outcome in a game, you may have to influence other players to make the moves you want them to make, even if they may not want to make them initially. In these next few sections, we present mental models that can help you do just that. They work well in conflict situations but also in any situation where influence is useful. First, consider six powerful yet subtle influence models that psychologist Robert Cialdini presents in his book Influence: The Psychology of Persuasion.
Cialdini recounts a study (since replicated) showing that waiters increase their tips when they give customers small gifts. In the study, a single mint increased tips by 3 percent on average, two mints increased tips by 14 percent, and two mints accompanied by a “For you nice people, here’s an extra mint” increased tips by 23 percent.
The mental model this study illustrates is called reciprocity, whereby you tend to feel an obligation to return (or reciprocate) a favor, whether that favor was invited or not. In many cultures, it is generally expected that people in social relationships will exchange favors like this, such as taking turns driving a carpool or bringing a bottle of wine to a dinner party. Quid pro quo (Latin for “something for something”) and I’ll scratch your back if you’ll scratch mine are familiar phrases that relate to this model.
Reciprocity also explains why some nonprofits send you free address labels with your name on them along with their donation solicitation letters. It also explains why salespeople give out free concert or sports tickets to potential high-profile clients. Giving someone something, even if they didn’t ask for it, significantly increases the chances they will reciprocate.