Super Thinking


by Gabriel Weinberg


  Speaking of privilege, we (the authors) often say we are lucky to have won the birth lottery. Not only were we not born into slavery, but we were also not born into almost any disadvantaged group. At birth, we were no more deserving of an easier run at life than a child who was born into poverty, or with a disability, or any other type of disadvantage. Yet we are the ones who won this lottery since we do not have these disadvantages.

  It can be challenging to acknowledge that a good portion of your success stems from luck. Many people instead choose to believe that the world is completely fair, orderly, and predictable. This view is called the just world hypothesis, where people always get what they deserve, good or bad, because of their actions alone, with no accounting for luck or randomness. This view is summed up as you reap what you sow.

  Ironically, belief in a just world can get in the way of actual justice by leading people to victim-blame: The sexual assault victim “should have worn different clothes” or the welfare recipient “is just lazy.” Victims of circumstance are actually blamed for their circumstances, with no accounting for factors of randomness like the birth lottery.

  The problem with the just world hypothesis and victim-blaming is that they make broad judgments about why things are happening to people that are often inaccurate at the individual level. You should also keep in mind the model of learned helplessness, which can make it hard for some people to strive for improvement without some assistance. Learned helplessness describes the tendency to stop trying to escape difficult situations because we have gotten used to difficult conditions over time. Someone learns that they are helpless to control their circumstances, so they give up trying to change them.

  In a series of experiments summarized in “Learned Helplessness” in the February 1972 Annual Review of Medicine, psychologist Martin Seligman placed dogs in a box where they were repeatedly shocked at random intervals. Then he placed them in a similar box where they could easily escape the shocks. However, they did not actually try to escape; they simply lay down and waited for the shocks to stop. On the other hand, dogs that had not previously been subjected to the inescapable shocks quickly jumped out of the box.

  Learned helplessness can be overcome when animals or people see that their actions can make a difference, that they aren’t actually helpless. A shining light in the reduction of chronic homelessness has been a strategy that directly combats learned helplessness, helping people take back control of their lives after years on the streets. The strategy, known as Housing First, involves giving apartments to the chronically homeless and, at the same time, assigning a social worker to help each person reintegrate into society, including finding work and managing day-to-day life in their apartment. Utah has been the leader in this strategy, reducing its chronically homeless population by as much as 72 percent. And the strategy actually saves on average eight thousand dollars per person in annual expenses, as the chronically homeless tend to use a lot of public resources, such as hospitals, jails, and shelters.

  Learned helplessness is not found only in dire situations. People can also exhibit learned helplessness in everyday circumstances, believing they are incapable of doing or learning certain things, such as public speaking or using new technologies. In each of these cases, though, they are probably capable of improving their area of weakness if guided by the right mentor, a topic we cover in more detail later in Chapter 8. You don’t want to make a fundamental attribution error by assuming that your colleague is incapable of doing something when they really just need the proper guidance.

  All the mental models in this section—from the third story to learned helplessness—can help you increase your empathy. When applying them, you are effectively trying to understand people’s actual circumstances and motivations better, trying as best you can to walk a mile in their shoes.

  PROGRESS, ONE FUNERAL AT A TIME

  Just as you can be anchored to a price, you can also be anchored to an entire way of thinking about something. In other words, it can be very difficult to convince you of a new idea when a contradictory idea is already entrenched in your thinking.

  Like many kids in the U.S., our sons are learning “Singapore math,” an approach to arithmetic that includes introducing pictorial steps in order to develop a deeper understanding of basic concepts. Even to mathematically inclined parents, this alternative way of doing arithmetic can feel foreign after so many years of thinking about it another way.

  Singapore Math: Addition

  Singapore math teaches addition using “number bonds” that break apart numbers so that students can add in groups of ten.
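  For a concrete illustration (our own example, not the one pictured above): to add 8 + 5, a student breaks 5 into a number bond of 2 and 3, pairs the 2 with the 8 to make 10, and then adds the remaining 3 to reach 13. The pictorial bond makes the “make ten” step explicit rather than leaving it in the student’s head.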

  In science, this phenomenon is documented in Thomas Kuhn’s book The Structure of Scientific Revolutions, which popularized the paradigm shift model, describing how accepted scientific theories change over time.

  Instead of a gradual, evolving progression, Kuhn describes a bumpy, messy process in which initial problems with a scientific theory are either ignored or rationalized away. Eventually so many issues pile up that the scientific discipline in question is thrown into a crisis mode, and the paradigm shifts to a new explanation, entering a new stable era.

  Essentially, the old guard holds on to the old theories way too long, even in the face of an obvious-in-hindsight alternative. Nobel Prize–winning physicist Max Planck explained it like this in his Scientific Autobiography and Other Papers: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it,” or, more succinctly, “Science progresses one funeral at a time.”

  In 1912, Alfred Wegener put forth the theory of continental drift that we know to be true today, in which the continents slowly drift across the Earth’s surface. Wegener noticed that the different continents fit together nicely like a jigsaw puzzle. Upon further study, he found that fossils seemed strikingly similar across continents, as if the continents indeed were put together this way sometime in the past.

  Distribution of Fossils Across the Southern Continents of Pangea

  We now know this to be the case—all of our continents were previously grouped together into one supercontinent now called Pangea. However, his theory was met with harsh criticism because Wegener was an outsider—a meteorologist by training instead of a geologist—and because he couldn’t offer an explanation of the mechanism causing continental drift, just the idea that it likely had taken place. It basically sat uninvestigated by mainstream geologists for forty years, until the new science of paleomagnetism started creating additional data in support of it, reviving the theory.

  The major theory that held during this time was that there must have been narrow land bridges (called Gondwanian bridges) sometime in the past that allowed the animals to cross between continents, even though there was never any concrete evidence of their existence. Instead of helping to investigate Wegener’s theory (which certainly wasn’t perfect but had promise), geologists chose to hold on to this incorrect land bridge theory until the evidence for continental drift was so overwhelming that a paradigm shift occurred.

  Gondwanian Bridges

  The work of Ignaz Semmelweis, a nineteenth-century Hungarian doctor, met a similar fate. He worked at a teaching hospital where doctors routinely handled cadavers and also delivered babies, without appropriately washing their hands in between. The death rate of mothers who gave birth in this part of the hospital was about 10 percent! In another part of the same hospital, where babies were mostly delivered by midwives who did not routinely handle cadavers, the comparable death rate was 4 percent.

  Semmelweis obsessed about this difference, painstakingly eliminating all variables until he was left with just one: doctors versus midwives. After studying doctor behavior, he concluded that it must be due to their handling of the cadavers and instituted a practice of washing hands with a solution of chlorinated lime. The death rate immediately dropped to match that in the other part of the hospital.

  Despite the clear drop in the death rate, his theories were completely rejected by the medical community at large. In part, doctors were offended by the idea that they were killing their patients. Others were so hung up on the perceived deficiencies of Semmelweis’s theoretical explanation that they ignored the empirical evidence that the handwashing was improving mortality. After struggling to get his ideas adopted, Semmelweis went crazy, was admitted to an asylum, and died at the age of forty-seven. It took another twenty years after his death for his ideas about antiseptics to start to take hold, following Louis Pasteur’s unquestionable confirmation of germ theory.

  Like Wegener, Semmelweis didn’t fully understand the scientific mechanism that underpinned his theory and crafted an initial explanation that turned out to be somewhat incorrect. However, they both noticed obvious and important empirical truths that should have been investigated by other scientists but were reflexively rejected by these scientists because the suggested explanations were not in line with the conventional thinking of the time. Today, this is known as a Semmelweis reflex.

  Individuals still hang on to old theories in the face of seemingly overwhelming evidence—it happens all the time in science and in life in general. The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.

  Unfortunately, it’s extremely easy to succumb to confirmation bias. Correspondingly, it is hard to question your own core assumptions. There is a reason why many startup companies that disrupt industries are founded by industry outsiders. There is a reason why many scientific breakthroughs are discovered by outsiders to the field. There is a reason why “fresh eyes” and “outside the box” are clichés. The reason is that outsiders aren’t rooted in existing paradigms. Their reputations aren’t at stake if they question the status quo. They are by definition “free thinkers” because they are free to think without these constraints.

  Confirmation bias is so hard to overcome that there is a related model called the backfire effect that describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it. In other words, when people try to change your mind with facts and figures, the attempt often backfires, having the opposite of its intended effect: you become more entrenched in the original, incorrect position, not less.

  In one 2008 Yale study, pro-choice Democrats were asked to give their opinions of Supreme Court nominee John Roberts before and after hearing an ad claiming he supported “violent fringe groups and a convicted [abortion] clinic bomber.” Unsurprisingly, disapproval went from 56 percent to 80 percent. However, disapproval stayed up at 72 percent even after participants were told that the ad had been refuted and withdrawn by the abortion rights advocacy group that created it.

  You may also succumb to holding on to incorrect beliefs because of disconfirmation bias, where you impose a stronger burden of proof on the ideas you don’t want to believe. Psychologist Daniel Gilbert put it like this in an April 16, 2006, article for The New York Times, “I’m O.K., You’re Biased”:

  When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.

  The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt by holding two contradictory (dissonant) beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!

  Once you start looking for confirmation bias and cognitive dissonance, we guarantee you will spot them all over, including in your own thoughts. A real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms. The meme on the next page perfectly illustrates how cognitive dissonance can make things we take for granted seem absurd.

  There are a couple of tactical mental models that can help you on an everyday basis to overcome your ingrained confirmation bias and tribalism. First, consider thinking gray, a concept we learned from Steven Sample’s book The Contrarian’s Guide to Leadership. You may think about issues in terms of black and white, but the truth is somewhere in between, a shade of gray. As Sample puts it:

  Most people are binary and instant in their judgments; that is, they immediately categorize things as good or bad, true or false, black or white, friend or foe. A truly effective leader, however, needs to be able to see the shades of gray inherent in a situation in order to make wise decisions as to how to proceed.

  The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts (which happens occasionally, but much less frequently than one might imagine). F. Scott Fitzgerald once described something similar to thinking gray when he observed that the test of a first-rate mind is the ability to hold two opposing thoughts at the same time while still retaining the ability to function.

  This model is powerful because it forces you to be patient. By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm! It can be difficult to think gray because all the nuance and different points of view can cause cognitive dissonance. However, it is worth fighting through that dissonance to get closer to the objective truth.

  A second mental model that can help you with confirmation bias is the Devil’s advocate position. This was once an official position in the Catholic Church used during the process of canonizing people as saints. Once someone is canonized, the decision is eternal, so it was critical to get it right. Hence this position was created for someone to advocate from the Devil’s point of view against the deceased person’s case for sainthood.

  More broadly, playing the Devil’s advocate means taking up an opposing side of an argument, even if it is one you don’t agree with. One approach is to force yourself literally to write down different cases for a given decision or appoint different members in a group to do so. Another, more effective approach is to proactively include people in a decision-making process who are known to hold opposing viewpoints. Doing so will help everyone involved more easily see the strength in other perspectives and force you to craft a more compelling argument in favor of what you believe. As Charlie Munger says, “I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”

  DON’T TRUST YOUR GUT

  You make most of your everyday decisions using your intuition, with your subconscious automatically intuiting what to do from instinct or encoded knowledge. It’s your common or sixth sense, your gut feeling, drawing on your past experiences and natural programming to react to circumstances.

  In his book Thinking, Fast and Slow, economics Nobel laureate Daniel Kahneman makes a distinction between this intuitive fast thinking and the more deliberate, logical thinking you do when you slow down and question your intuitive assumptions.

  He argues that when you do something frequently, it gradually gets encoded in your brain until at some point your intuition, via your fast thinking, takes over most of the time and you can do the task mindlessly: driving on the highway, doing simple arithmetic, saying your name. However, when you are in uncertain situations where you do not have encoded knowledge, you must use your slower thinking: driving on new roads, doing complex math, digging into your memory to recall someone you used to know. These are not mindless tasks.

  You can run into trouble when you blindly trust your gut in situations where it is unclear whether you should be thinking fast or slow. Following your intuition alone at times like these can cause you to fall prey to anchoring, availability bias, framing, and other pitfalls. Getting physically lost often starts with you thinking you intuitively know where to go and ends with the realization that your intuition failed you.

  Similarly, in most situations where the mental models in this book will be useful, you will want to slow down and deliberately look for how to best apply them. You may use intuition as a guide to where to investigate, but you won’t rely on it alone to make decisions. You will need to really take out the map and study it before making the next turn.

  You probably do not have the right experience intuitively to handle everything that life throws at you, and so you should be especially wary of your intuition in any new or unfamiliar situation. For example, if you’re an experienced hiker in bear country, you know that you should never stare down a bear, as it will take this as a sign of aggression and may charge you in response. Suppose now you’re hiking in mountain lion country and you come across a lion—what should you do? Your intuition would tell you not to stare it down, but in fact, you should do exactly that. To mountain lions, direct eye contact signals that you aren’t easy prey, and so they will hesitate to attack.

  At the same time, intuition can help guide you to the right answer much more quickly. For example, the more you work with mental models, the more your intuition about which one to use in a given situation will be right, and the faster you will get to better decisions working with these models.

  In other words, as we explained at the beginning of this chapter, using mental models over time is a slow and steady way to become more antifragile, making you better able to deal with new situations. Of course, the better the information you put into your brain, the better your intuition will be.

 
