Super Thinking


by Gabriel Weinberg


  A sister mental model to deterrence is containment. In global conflicts, containment is an attempt to contain the enemy, to prevent its further expansion, be it expanding geographically (e.g., invading a neighboring country), militarily (e.g., obtaining more nuclear weapons), or politically (e.g., spreading an ideology). Containing an ongoing conflict can save you energy and resources. Think of it like treating a cut before it gets infected or removing an early-stage tumor before it metastasizes.

  Containment acknowledges that an undesirable occurrence has already happened, that you cannot easily undo it, and so instead you’re going to try to stop it from spreading or occurring again in the future. For example, HIV isn’t yet easily curable, but if you catch it early, with modern treatments you can usually contain it such that it does not develop into AIDS.

  You should apply a containment strategy in situations where you want to stop something bad from spreading, such as a negative rumor or a harmful business practice. For example, Facebook and Twitter probably couldn’t have gotten rid of fake news from their platforms in the run-up to the 2016 U.S. election, but they could have done a better job of containing it.

  These types of situations can get out of hand quickly, so you often first want to stop the bleeding, by a quick and dirty method if necessary. Once the situation has stabilized, you can take a step back, find the root cause (see Chapter 1), and then try to find a more reliable long-term solution. In an emergency medical situation, you may use a tourniquet to stop actual bleeding. The metaphorical equivalent is situationally dependent, but it usually involves doing something fast and definitive, such as issuing a clear apology.

  In some cases, the best short-term option might be shutting down the area where the problem exists, kind of like amputating an infected limb to prevent sepsis. In a personal context, that might mean severing a toxic relationship, at least for the time being. In an organizational context, it might mean terminating a project or employee.

  Another containment tactic is quarantine, the restriction of the movement of people or goods in order to prevent the spread of disease. Your spam folder is a form of quarantine, curbing the impact of suspicious emails. Twitter started dealing with aggressive bots and people by quarantining them behind an additional tap or click so that fewer people see their messages.

  A related tactic is flypaper theory, which calls for you to deliberately attract enemies to one location where they are more vulnerable, like attracting flies to flypaper, usually also directing them far away from your valuable assets. A former commander of U.S. ground forces in Iraq, General Ricardo Sánchez, described the benefits of this strategy in a 2003 CNN interview with regard to preventing terrorism on U.S. soil: “This is what I would call a terrorist magnet, where America, being present here in Iraq, creates a target of opportunity. . . . But this is exactly where we want to fight them. . . . This will prevent the American people from having to go through their attacks back in the United States.”

  In a computing context, this is known as a honeypot, which is used to attract and trap malicious actors for study, in the same way honey lures bears. A honeypot may be a special set of servers set up to look like the core servers where valuable data is stored, but which instead are isolated servers set up specifically to entrap hackers. A sting operation by police where they lure criminals into a place to arrest them could be called an offline honeypot.

  Without containment, bad circumstances can spread, possibly leading to a domino effect, where more negative consequences unfold in inevitable succession like falling dominoes (see also cascading failure in Chapter 4). In the game-theory context, this effect could be a series of player choices that lands you in a bad outcome. Consider an iterated game of prisoner’s dilemma. While in each turn it is attractive to betray the other players because you get outsized yields that turn, doing so, especially repeatedly, in most cases leads to everyone else following suit, leaving you and everyone else stuck in the suboptimal Nash equilibrium.
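The dynamic described above can be sketched in a few lines of code. This is a minimal simulation, assuming a standard (hypothetical) prisoner's dilemma payoff matrix: mutual cooperation pays 3 each, mutual defection 1 each, and a lone defector gets 5 while the betrayed cooperator gets 0. Pitting an always-defect player against a tit-for-tat player shows how betrayal triggers retaliation and traps both in the low-payoff equilibrium:

```python
# Iterated prisoner's dilemma with a standard illustrative payoff matrix:
# keys are (my move, their move); "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game and return each player's total payoff."""
    history_a, history_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each strategy sees the opponent's past moves
        move_b = strategy_b(history_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        total_a += pay_a
        total_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b

always_defect = lambda opp_history: "D"
tit_for_tat = lambda opp_history: opp_history[-1] if opp_history else "C"

# The defector wins only the first round (5), then everyone defects
# (nine rounds of 1 each), far below mutual cooperation's 30 each.
print(play(always_defect, tit_for_tat))  # (14, 9)
print(play(tit_for_tat, tit_for_tat))    # (30, 30)
```

The outsized payoff from the first betrayal is quickly swamped by the retaliation it provokes, which is the domino effect in miniature.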

  In the Cold War, the primary worry for the West was the spread of communism, and the dominoes were countries that might fall, one after another, which was justification to fight containment wars, as in Korea and Vietnam. The thought was that if Korea and Vietnam fell, then Laos and Cambodia might be next, and more and more countries would fall until all of Asia (even places like India) would eventually be subsumed by communism.

  Domino Effect

  However, be aware that the domino effect is invoked a lot more than is warranted, because people are generally bad at determining both the likelihood that events might occur and the causal relationship between events. These miscalculations often manifest in three related models, usually fallacious though not always, that you should be on the lookout for.

  The first is the slippery slope argument: arguing that one small thing leads to an inevitable chain of events and a terrible final outcome (in the eyes of the person making the argument). Here is an example of a common slippery slope argument: “If we allow any gun control, then it will eventually result in the government taking all guns away.” This line of reasoning is usually fallacious because there often isn’t 100 percent inevitability in each piece of the logical chain.

  The second model is broken windows theory, which proposes that visible evidence of small crimes, for example broken windows in a neighborhood, creates an environment that encourages worse crimes, such as murder. The thinking goes that broken windows are a sign that lawlessness is tolerated, and so there is a perceived need to hold the line and prevent a descent into a more chaotic state (see herd immunity in Chapter 2).

  While interventions associated with broken windows theory are intuitively appealing, it is unclear how effective they are at actually reducing widespread criminal activity relative to alternatives. Related theories often take the form of a contagion metaphor, where something the person doesn’t like (e.g., rap music, homosexuality, socialism) is compared to a disease that will spread through society, continually becoming more virulent if left unchecked.

  The third model to watch out for is gateway drug theory, which makes the claim that one drug, such as marijuana, is a gateway to more dangerous drug use. However, the evidence for this claim is also murky at best (see correlation does not imply causation in Chapter 5). You should question any situation where one of these models arises and analyze its veracity for yourself (see arguing from first principles in Chapter 1).

  Nevertheless, there are instances when a model like this can be true. Consider how businesses sometimes capture customers through a loss leader strategy, where one product is priced low (the gateway drug) to increase demand for complementary products with higher margins. The prototypical example is a supermarket discounting milk to draw in customers, who will almost certainly leave with more items. Similarly, companies sell mobile phones or printers for low prices knowing they will make up the money in the long run through monthly service plans or high ink prices. We have nearly given up on letting our kids download free apps because we anticipate the endless nagging about in-app purchases.

  When analyzing these domino-effect situations, write down each step in the logical chain (list each domino) and try to ascribe a realistic probability to each event (the probability that each will fall). Even if no step is 100 percent certain, some dominoes may still be likely to fall. In that case, you need to ask yourself: Is that acceptable? Do I need to engage in more active containment, or can I take a more wait-and-see approach? For example, with gun control, banning assault rifles is extremely unlikely to lead to the government taking away all guns, but it might very well lead to more gun control of other assault-like weapons or add-ons. A 2017 Politico/Morning Consult poll found 72 percent of Americans favored both “banning assault-style weapons” and “banning high-capacity magazines.”
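Listing the dominoes and their probabilities amounts to multiplying conditional probabilities down the chain, since every earlier domino must fall for a later one to fall. A sketch, with entirely made-up step probabilities (not real estimates) and assuming each step depends only on the one before it:

```python
# Probability that a slippery-slope chain plays out to each stage.
# Each value is the chance that step happens GIVEN the previous one did.
# These numbers are illustrative only.
steps = [
    ("assault-rifle ban passes", 0.5),
    ("ban extends to similar weapons", 0.4),
    ("ban extends to most firearms", 0.1),
    ("all guns confiscated", 0.02),
]

chain_probability = 1.0
for event, p in steps:
    chain_probability *= p  # all prior dominoes must also have fallen
    print(f"P(chain reaches '{event}') = {chain_probability:.4f}")
```

Even with generous step probabilities, the chance of the final domino falling collapses toward zero (here 0.0004), while the first one or two dominoes remain quite likely, which is exactly the distinction the text asks you to draw.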

  If you are in no position to meaningfully deter or contain an emerging conflict that you’d like to avoid, appeasement may be a necessary evil. This involves making concessions to opponents in order to avoid direct or further conflict with them. The most famous example of appeasement occurred in 1938 when Britain allowed Germany to annex the Sudetenland, an important piece of Czechoslovakia, to avoid an armed conflict with Hitler’s army. Of course, the conflict Britain sought to avoid happened anyway. And that’s the worry with appeasement: you may just be delaying the inevitable.

  As a parent, sometimes appeasement is necessary to get through the day. For instance, we tend to bend the rules when we are traveling. Everyone is tired, and much of the time is spent in crowded, cramped hotel rooms or cars. At times like these, our normal diplomacy, deterrence, and containment tactics don’t work as smoothly. As a result, the kids often end up with way more snacks and screen time than they would normally get—appeasement tactics that effectively prevent meltdowns and fights.

  Deterrence, containment, and appeasement are all strategic mental models to keep you out of costly direct conflict. You want to enlist these models when other conflict-avoidance models have failed and you are still faced with a situation that you think you can’t “win,” as when engaging would create so much damage it isn’t worth it or when you want to preserve your resources for more fruitful engagements (see opportunity cost in Chapter 3).

  Finally, as Joshua said in WarGames, sometimes “the only winning move is not to play.” An increasingly common example is conflict with the online troll, someone whose whole game is to irritate people and bait them into arguments they can’t win. As a result, the best move is usually not to engage with them (don’t feed the trolls; don’t stoop to their level; rise above the fray), though, as in any situation, you have to assess it on a case-by-case basis, and where reporting mechanisms exist, you should consider them too. Any parent will similarly tell you that you need to pick your battles.

  CHANGING THE GAME

  From a game-theory perspective, deterrence and related models effectively change a game, adjusting how players perceive their payoff matrix and therefore what decisions they make when playing the game. When you practice deterrence through a credible threat, you enumerate a red line, which describes a figurative line that, if crossed, would trigger retaliation (see commitment in Chapter 3). That threat of retaliation causes other players to reconsider their choices. This line is also referred to as a line in the sand, describing a figurative line (drawn in the sand) that you do not intend to be crossed.
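The payoff-matrix adjustment can be made concrete. In this sketch, with hypothetical numbers, a credible retaliation threat attaches a cost to crossing the red line, flipping the opponent's best response:

```python
# How a credible threat changes an opponent's best response.
# Payoff values are hypothetical, from the aggressor's point of view.
def best_response(payoffs):
    """Return the action with the highest payoff for the player."""
    return max(payoffs, key=payoffs.get)

without_threat = {"cross the red line": 10, "hold back": 4}

# A credible threat attaches a retaliation cost to crossing the line.
retaliation_cost = 12
with_threat = dict(without_threat)
with_threat["cross the red line"] -= retaliation_cost  # 10 - 12 = -2

print(best_response(without_threat))  # cross the red line
print(best_response(with_threat))     # hold back
```

Note that the threat only works if it is believed: the opponent reasons from the adjusted matrix, so a threat they discount as a bluff leaves the original payoffs, and the original best response, in place.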

  When using this strategy, you must give enough notice so that others can adjust their strategies based on your threat. You also have to explain exactly what you intend to do when the red line is crossed. The most severe threat is a so-called nuclear option, signaling that you will undertake some kind of extreme action if pressed. For example, North Korea has repeatedly threatened the literal nuclear bombing of South Korea if invaded.

  Another extreme tactic is a zero-tolerance policy, where even a minor infraction results in strict punishment. For example, a zero-tolerance drug policy would have you fired from your job or expelled from school on the first offense, as opposed to a series of punishments that escalate to an extreme measure.

  The problem with these tactics is that someone can call your bluff, challenging you to act on your threat, claim, or policy and show that you mean it. At that point, if you don’t follow through on your promise of action, you will lose significant credibility, and your opponent’s payoff matrix might not change the way you want it to. For that reason, you should be prepared to follow through on whatever deterrence threats you make.

  Another common situation to look out for is a war of attrition, where a long series of battles depletes both sides’ resources, eventually leaving vulnerable the side that starts to run out of resources first. Each battle in a war of attrition hurts everyone involved, so in these situations you want either to have more resources at the start, make sure you are losing resources at a much slower rate than your opponents, or both.
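The two levers named above, starting stock and rate of loss, can be illustrated with a toy depletion loop (all numbers hypothetical):

```python
# War of attrition: the side that runs out of resources first loses,
# so both starting stock and burn rate matter. Numbers are illustrative.
def attrition_winner(resources_a, burn_a, resources_b, burn_b):
    """Deplete both sides each round; return whoever still has resources."""
    while resources_a > 0 and resources_b > 0:
        resources_a -= burn_a
        resources_b -= burn_b
    if resources_a > 0:
        return "A"
    if resources_b > 0:
        return "B"
    return "draw"

# B starts with less but bleeds far more slowly, so B outlasts A:
# A lasts 100/10 = 10 rounds, while B could last 60/4 = 15.
print(attrition_winner(100, 10, 60, 4))  # B
```

The ratio of stock to burn rate, not stock alone, determines who outlasts whom, which is why a smaller side can still win a war of attrition by losing resources more slowly.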

  The most famous military example is Germany’s invasion of Russia in World War II, the deadliest conflict in human history. Over the course of the invasion, military losses for the Soviets were more than ten million, compared with more than four million for Germany. Russia had significantly more resources, however, and Germany was never able to capture Moscow. This war of attrition accounted for 80 percent of deaths suffered by the German armed forces in all of World War II, depleting their resources enough to open them up to defeat on all fronts.

  Big companies often use this strategy against upstarts through various means, such as protracted lawsuits, price wars, marketing campaigns, and other head-to-head face-offs, bleeding them dry. In sports, a team may use this strategy if it is more physically fit than its opponent, such that at the end of the game, the fitter team can push to victory. It’s essentially a waiting game.

  Because a war of attrition is a long-term strategy, it can counterintuitively make sense to lose a battle intentionally, or even many battles, to win the eventual war. The winner of such a battle gets a hollow victory, sometimes referred to as an empty victory or Pyrrhic victory. The latter is named after King Pyrrhus of Epirus, Greece, whose army suffered irreplaceable casualties in defeating the Romans at the Battle of Heraclea, and then ultimately lost the war. In sports and gaming, this scenario is known as a sacrifice play. Examples include bunts and sacrifice flies in baseball and intentionally giving up a piece to get better board position in chess.

  From the other side, though, if you see that you are going to lose a war of attrition, you need to find a way out or a way to change the game. One way to do that is to engage in guerrilla warfare, which focuses your smaller force on nimbler (guerrilla) tactics that the unwieldy larger force has trouble reacting to effectively (see leverage in Chapter 3). Max Boot, author of Invisible Armies, recounted in a 2013 interview on NPR, titled “American Revolution Reinvents Guerrilla Warfare,” how the colonists in the American Revolution used guerrilla warfare right from the start of the conflict:

  Well, it first of all comes down to not coming out into the open, where you could be annihilated by the superior firepower of the enemy. The British got a taste of how the Americans would fight on the very first day of the Revolution, with the shot heard around the world, the Battle of Lexington and Concord, where the British regulars marched through the Massachusetts countryside.

  And the Americans did not mass in front of them but instead chose to slither on their bellies—these Yankee scoundrels, as the British called them—and fired from behind trees and stone walls. And not come out into the kind of open gentleman’s fight that the British expected, and instead, took a devastating toll on the British regiment.

  This concept has a direct parallel in guerrilla marketing, where startup businesses use unconventional marketing techniques to promote their products and services on relatively small budgets. Examples of this type of marketing include PR stunts and viral videos, often taking direct aim at larger competitors, much like guerrilla warriors taking aim at a larger army. As an example, Dollar Shave Club, a subscription razor service, launched its product with a viral video. While it couldn’t compete on the bigger businesses’ terms (e.g., expensive TV and print ads), its edgy launch video, entitled “Our Blades Are F***ing Great,” immediately put the company on the map, setting it on a rapid path to an eventual one-billion-dollar acquisition.

  One adage to keep in mind when you find yourself in a guerrilla warfare situation is that generals always fight the last war, meaning that armies by default use strategies, tactics, and technology that worked for them in the past, or in their last war. The problem is that what was most useful for the last war may not be best for the next one, as the British experienced during the American Revolution.

  The most effective strategies, tactics, and especially technologies change over time. If your opponent is using outdated tactics and you are using more modern, useful ones, then you can come out the victor even with a much smaller force. Essentially, you use your tactical advantage to change the game without their realizing it; they think they are still winning a war of attrition.

  On May 27 and 28, 1905, Japan’s navy decisively beat Russia’s navy in the Battle of Tsushima, sinking twenty-one ships, including seven battleships, with more than ten thousand Russian troops killed, injured, or captured, compared with just three torpedo boats sunk and seven hundred troops killed or injured for Japan.

  Admiral Tōgō Heihachirō of Japan used tactics that were advanced for the time, and his fleet easily overcame its Russian counterpart, which was clearly fighting the last war. Japan’s ships were twice as fast as Russia’s and equipped with much better guns that shot 50 percent farther and fired mostly high-explosive shells, causing significantly more damage with every hit. It was also the first naval battle in which wireless telegraphy was used, and while both sides had some form of it, the Japanese version functioned much better and was more useful in fleet formations.

  Decisive battles have been won on the back of superior technology like this many times in military history. Don’t bring a knife to a gunfight. This concept is far-reaching, describing any situation where circumstances have changed significantly, leaving the status quo unequipped to deal with new threats.

  In business, many well-known companies have lost out because they were focused on the old way of doing business, without recognizing rapidly evolving markets. IBM famously miscalculated the rise of the personal computer relative to its mainframe business, actually outsourcing its PC operating system to Microsoft. This act was pivotal for Microsoft, propelling it to capture a significant part of the profits of the entire industry for the next thirty years. Microsoft, in turn, became so focused on its Windows operating system that it didn’t adapt it quickly enough to the next wave of operating system needs on the smartphone, ceding most of the profits in the smartphone market to Apple, which is now the most profitable company in history.

 
