
Super Thinking


by Gabriel Weinberg


  A frame-of-reference mental trap (or useful trick, depending on your perspective) is framing. Framing refers to the way you present a situation or explanation. When you present an important issue to your coworker or family member, you try to frame it in a way that might help them best understand your perspective, setting the stage for a beneficial conversation. For example, if you want your organization to embark on an innovative yet expensive project, you might frame it to your colleagues as a potential opportunity to outshine the competition rather than as an endeavor that would require excessive resources. With the latter framing, the project might be rejected out of hand.

  You also need to be aware that family members and coworkers are constantly framing issues for you as well, and your perception of their ideas can vary widely based on how they are framed. When someone presents a new idea or decision to you, take a step back and consider other ways in which it could be framed. If a colleague tells you they are leaving for another job to seek a better opportunity, that may indeed be true, but it also may be true that they want to leave the organization after feeling overlooked. Multiple framings can be valid yet convey vastly different perspectives.

  If you visit news sites on the internet, you probably know all about framing, or at least you should. For example, headlines have a framing effect, shaping the meaning people take away from stories. On August 31, 2015, three police officers responded to a 911 call about a burglary in progress. Unfortunately, the call did not specify an exact address, and the officers responded to the wrong house. Upon finding the back door unlocked, they entered and encountered a dog. Gunfire ensued, and the dog, the homeowner, and one of the officers were shot, all by police gunfire. The homeowner and officer survived. Two headlines framed the incident in dramatically different ways.

  Framing Effect

  In a study by Ullrich Ecker and others, “The Effects of Subtle Misinformation in News Headlines,” published in the December 2014 issue of the Journal of Experimental Psychology: Applied, students read an article about a small increase in burglary rates over the past year (0.2 percent), an anomaly within a much larger decline over the previous decade (10 percent). The same article came with one of two different headlines: “Number of Burglaries Going Up” or “Downward Trend in Burglary Rate.” The headline had a significant effect on which facts in the article were remembered:

  The pattern was clear-cut: A misleading headline impaired memory for the article. . . . A misleading headline can thus do damage despite genuine attempts to accurately comprehend an article. . . . The practical implications of this research are clear: News consumers must be [made] aware that editors can strategically use headlines to effectively sway public opinion and influence individuals’ behavior.

  A related trap/trick is nudging. Aldert Vrij presents a compelling example in his book Detecting Lies and Deceit:

  Participants saw a film of a traffic accident and then answered the question, “About how fast were the cars going when they contacted each other?” Other participants received the same question, except that the verb contacted was replaced by either hit, bumped, collided, or smashed. Even though the participants saw the same film, the wording of the question affected their answers. The speed estimates (in miles per hour) were 31, 34, 38, 39, and 41, respectively.

  You can be nudged in a direction by a subtle word choice or other environmental cues. Restaurants will nudge you by highlighting certain dishes on menu inserts, by having servers verbally describe specials, or by just putting boxes around certain items. Retail stores and websites nudge you to purchase certain products by placing them where they are easier to see.

  Nudging

  Another concept you will find useful when evaluating purchases is anchoring, your tendency to rely too heavily on first impressions when making decisions. You get anchored to the first piece of framing information you encounter. Businesses commonly exploit this tendency when making offers.

  Dan Ariely, behavioral economist and author of Predictably Irrational, brings us an illustrative example of anchoring using subscription offers for The Economist. Readers were offered three ways to subscribe: web only ($59), print only ($125), and print and web ($125).

  Yes, you read that right: the “print only” version cost the same as the “print and web” version. Who would choose that? Predictably, no one. Here is the result when one hundred MIT students reported their preference:

  Web only ($59): 16 percent

  Print only ($125): 0 percent

  Print and web ($125): 84 percent

  So why include that option at all? Here’s why: when the print-only option was removed, the results shifted dramatically:

  Web only ($59): 68 percent

  Print and web ($125): 32 percent

  Just having the print-only option—even though no one chooses it—anchors readers to a much higher value for the print-and-web version. It feels like you are getting the web version for free, causing many more people to choose it and creating 43 percent more revenue for the magazine by just adding a version that no one chooses!
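
  That 43 percent figure follows from simple arithmetic on the reported splits. Here is a minimal sketch in Python that checks it, assuming one hundred readers per scenario as in the experiment (the variable names are ours, purely for illustration):

    # Revenue per 100 readers, with and without the print-only decoy
    prices = {"web": 59, "print": 125, "both": 125}

    with_decoy = 16 * prices["web"] + 0 * prices["print"] + 84 * prices["both"]
    without_decoy = 68 * prices["web"] + 32 * prices["both"]

    print(with_decoy)     # 11444 -> $11,444 per 100 readers
    print(without_decoy)  # 8012  -> $8,012 per 100 readers
    print(with_decoy / without_decoy - 1)  # ~0.43, i.e., about 43% more revenue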

  Shoppers at retailers like Michaels or Kohl’s know that these stores often advertise sales where you can save 40 percent or more on selected items or departments. However, are those reduced prices a real bargain? Usually not. They’re reduced from the so-called manufacturer’s suggested retail price (MSRP), which is usually set very high. Being aware of the MSRP anchors you so that you feel you are getting a good deal at 40 percent off. Often, that reduction just brings the price down to a reasonable level.

  Anchoring isn’t just for numbers. Donald Trump uses this mental model, anchoring others to his extreme positions, so that what seem like compromises are actually agreements in his favor. He wrote about this in his 1987 book Trump: The Art of the Deal:

  My style of deal-making is quite simple and straightforward. I aim very high, and then I just keep pushing and pushing to get what I’m after. Sometimes I settle for less than I sought, but in most cases I still end up with what I want.

  More broadly, these mental models are all instances of a more general model, availability bias, which occurs when a bias, or distortion, creeps into your objective view of reality thanks to information recently made available to you. In the U.S., illegal immigration has been a hot topic with conservative pundits and politicians in recent years, leading many people to believe it is at an all-time high. Yet the data suggests that illegal immigration via the southern border is actually at a five-decade low, indicating that the prevalence of the topic is creating an availability bias for many.

  U.S. Southern Border Apprehensions at a Five-Decade Low

  Availability bias can easily emerge from high media coverage of a topic. Rightly or wrongly, the media infamously has a mantra of “If it bleeds, it leads.” The resulting heavy coverage of violent crime causes people to think it occurs more often than it does. The polling company Gallup asks Americans annually about their perception of changing violent crime rates; in 2014 it found that “federal crime statistics have not been highly relevant to the public’s crime perceptions in recent years.”

  U.S. Crime Rate: Actual vs. Perceived

  In a famous 1978 study, “Judged Frequency of Lethal Events,” from the Journal of Experimental Psychology, Sarah Lichtenstein and others asked people about forty-one leading causes of death. They found that people often overstate the risk of sensationally over-reported causes of death, like tornados, by fifty times and understate the risk of common causes of death, like stroke, by one hundred times.

  Mortality Rates by Causes: Actual vs. Perceived

  Availability bias stems from overreliance on your recent experiences within your frame of reference, at the expense of the big picture. Let’s say you are a manager and you need to write an annual review for your direct report. You are supposed to think critically and objectively about her performance over the entire year. However, it’s easy to be swayed by those really bad or really good contributions over just the past few weeks. Or you might just consider the interactions you have had with her personally, as opposed to getting a more holistic view based on interactions with other colleagues with different frames of reference.

  With the rise of personalized recommendations and news feeds on the internet, availability bias has become a more and more pernicious problem. Online this model is called the filter bubble, a term coined by author Eli Pariser in his book of the same name.

  Because of availability bias, you’re likely to click on things you’re already familiar with, and so Google, Facebook, and many other companies tend to show you more of what they think you already know and like. Since there are only so many items they can show you—only so many links on page one of the search results—they therefore filter out links they think you are unlikely to click on, such as opposing viewpoints, effectively placing you in a bubble.
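
  To make that filtering concrete, here is a minimal, purely hypothetical sketch. The candidate links, click scores, and two-slot cutoff are all invented for illustration; this does not describe any company’s actual ranking algorithm:

    # Purely illustrative: rank candidate links by a hypothetical
    # predicted-click score based on similarity to a user's past clicks.
    candidates = [
        ("gun control saves lives", 0.9),       # matches user's history
        ("climate policy roundup", 0.8),
        ("the case against gun control", 0.2),  # opposing view, low score
        ("climate skeptic op-ed", 0.1),
    ]

    TOP_K = 2  # only so many links fit on page one
    page_one = sorted(candidates, key=lambda c: c[1], reverse=True)[:TOP_K]

    print([title for title, _ in page_one])
    # ['gun control saves lives', 'climate policy roundup']
    # The opposing viewpoints never surface: that is the bubble.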

  In the run-up to the 2012 U.S. presidential election and again in 2018, the search engine DuckDuckGo (founded by Gabriel) conducted studies where individuals searched on Google for the same political topics, such as gun control and climate change. It discovered that people got significantly different results, personalized to them, when searching for the same topics at the same time. This happened even when they were signed out and in so-called incognito mode. Many people don’t realize that they are getting tailored results based on what a mathematical algorithm thinks would increase their clicks, as opposed to a more objective set of ranked results.

  The Filter Bubble

  When you put many similar filter bubbles together, you get echo chambers, where the same ideas bounce around the same groups of people, echoing through these connected filter bubbles. Echo chambers result in increased partisanship, as people have less and less exposure to alternative viewpoints. And because of availability bias, people inside them consistently overestimate the percentage of others who hold the same opinions.

  It’s easy to focus solely on what is put in front of you. It’s much harder to seek out an objective frame of reference, but that is what you need to do in order to be wrong less.

  WALK A MILE IN THEIR SHOES

  Most of the significant problems in the world involve people, so making headway on these problems often requires a deep understanding of the people involved. For instance, enough food is produced to feed everyone on the planet, yet starvation still exists because this food cannot be distributed effectively. People problems, such as government corruption, are a primary reason behind these distribution failures.

  However, it is very easy to be wrong about other people’s motivations. You may assume they share your perspective or context, think like you do, or have circumstances similar to yours. With such assumptions, you may conclude that they should also behave like you would or hold your beliefs. Unfortunately, often these assumptions are wrong.

  Consequently, to be wrong less when thinking about people, you must find ways to increase your empathy, opening up a deeper understanding of what other people are really thinking. This section explores various mental models to help you do just that.

  In any conflict between two people, there are two sides of the story. Then there is the third story, the story that a third, impartial observer would recount. Forcing yourself to think as an impartial observer can help you in any conflict situation, including difficult business negotiations and personal disagreements.

  The third story helps you see the situation for what it really is. But how do you open yourself up to it? Imagine a complete recording of the situation, and then try to think about what an outside audience would say was happening if they watched or listened to the recording. What story would they tell? How much would they agree with your story? Authors Douglas Stone, Bruce Patton, and Sheila Heen explore this model in detail in their book Difficult Conversations: “The key is learning to describe the gap—or difference—between your story and the other person’s story. Whatever else you may think and feel, you can at least agree that you and the other person see things differently.”

  If you can coherently articulate other points of view, even those directly in conflict with your own, then you will be less likely to make biased or incorrect judgments. You will dramatically increase your empathy—your understanding of other people’s frames of reference—whether or not you agree. Additionally, if you acknowledge the perspective of the third story within difficult conversations, it can have a disarming effect, causing others involved to act less defensively. That’s because you are signaling your willingness and ability to consider an objective point of view. Doing so encourages others involved to do the same.

  Another tactical model that can help you empathize is the most respectful interpretation, or MRI. In any situation, you can explain a person’s behavior in many ways. MRI asks you to interpret the other parties’ actions in the most respectful way possible. It’s giving people the benefit of the doubt.

  For example, suppose you sent an email to your kid’s school asking for information on the science curriculum for the upcoming year, but haven’t heard back in a few days. Your first interpretation may be that they’re ignoring your request. A more respectful interpretation would be that they are actively working to get back to you but haven’t completed that work yet. Maybe they are just waiting on some crucial information before replying, like a personnel decision that hasn’t been finalized yet, and that is holding up the response.

  The point is you don’t know the real answer yet, but if you approach the situation with the most respectful interpretation, then you will generally build trust with those involved rather than destroy it. With MRI, your follow-up email or call is more likely to have an inquisitive tone rather than an accusatory one. Building trust pays dividends over time, especially in difficult situations where that trust can serve as a bridge toward an amicable resolution. The next time you feel inclined to make an accusation, take a step back and think about whether that is really a fair assumption to make.

  Using MRI may seem naïve, but like the third story, this model isn’t asking you to give up your point of view. Instead, MRI asks you to approach a situation from a perspective of respect. You remain open to other interpretations and withhold judgment until necessary.

  Another way of giving people the benefit of the doubt for their behavior is called Hanlon’s razor: never attribute to malice that which is adequately explained by carelessness. Like Ockham’s razor, Hanlon’s razor seeks out the simplest explanation. And when people do something harmful, the simplest explanation is usually that they took the path of least resistance. That is, they carelessly created the negative outcome; they did not cause the outcome out of malice.

  Hanlon’s razor is especially useful for navigating connections in the virtual world. For example, we have all misread situations online. Since the signals of body language and voice intonation are missing, harmless lines of text can be read in a negative way. Hanlon’s razor says the person probably just didn’t take enough time and care in crafting their message. So the next time you send a message and all you get back is OK, consider that the writer is in a rush or otherwise occupied (the more likely interpretation) rather than being dismissive.

  The third story, most respectful interpretation, and Hanlon’s razor are all attempts to overcome what psychologists call the fundamental attribution error, where you frequently make errors by attributing others’ behaviors to their internal, or fundamental, motivations rather than external factors. You are guilty of the fundamental attribution error whenever you think someone was mean because she is mean rather than thinking she was just having a bad day.

  You of course tend to view your own behavior in the opposite way, which is called self-serving bias. When you are the actor, you often have self-serving reasons for your behavior, but when you are the observer, you tend to blame the other’s intrinsic nature. (That’s why this model is also sometimes called actor-observer bias.)

  For example, if someone runs a red light, you often assume that person is inherently reckless; you do not consider that she might be rushing to the hospital for an emergency. On the other hand, you will immediately rationalize your own actions when you drive like a maniac (“I’m in a hurry”).

  Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.”

  For example, you should not just consider your current position as a free person when contemplating a world where slavery is allowed. You must consider the possibility that you might have been born a slave, and how that would feel. Or, when considering policies regarding refugees, you must consider the possibility that you could have been one of those seeking refuge. The veil of ignorance encourages you to empathize with people across a variety of circumstances, so that you can make better moral judgments.

  Suppose that, like many companies in recent years, you are considering ending a policy that has allowed your employees to work remotely because you believe that your teams perform better face-to-face. As a manager, it may be easy to imagine changing the policy from your perspective, especially if you personally do not highly value remote working. The veil of ignorance, though, pushes you to imagine the change from the original position, where you could be any employee. What if you were an employee caring for an elderly family member? What if you were a single parent? You may find that the new policy is warranted even after considering its repercussions holistically, but putting on the veil of ignorance helps you appreciate the challenges this might pose for your staff and might even help you come up with creative alternatives.

 
