Think Again: The Power of Knowing What You Don't Know


by Adam Grant


  Yet by 2018, only 59 percent of Americans saw climate change as a major threat—and 16 percent believed it wasn’t a threat at all. Across many countries in Western Europe and Southeast Asia, higher percentages of the populations had opened their minds to the evidence that climate change is a dire problem. In the past decade in the United States, beliefs about climate change have hardly budged.

  This thorny issue is a natural place to explore how we can bring more complexity into our conversations. Fundamentally, that involves drawing attention to the nuances that often get overlooked. It starts with seeking and spotlighting shades of gray.

  A fundamental lesson of desirability bias is that our beliefs are shaped by our motivations. What we believe depends on what we want to believe. Emotionally, it can be unsettling for anyone to admit that all life as we know it might be in danger, but Americans have some additional reasons to be dubious about climate change. Politically, climate change has been branded in the United States as a liberal issue; in some conservative circles, merely acknowledging the fact that it might exist puts people on a fast track to exile. There’s evidence that higher levels of education predict heightened concern about climate change among Democrats but dampened concern among Republicans. Economically, we remain confident that America will be more resilient in response to a changing climate than most of the world, and we’re reluctant to sacrifice our current ways of achieving prosperity. These deep-seated beliefs are hard to change.

  As a psychologist, I want to zoom in on another factor. It’s one we can all control: the way we communicate about climate change. Many people believe that preaching with passion and conviction is necessary for persuasion. A clear example is Al Gore. When he narrowly lost the U.S. presidential election in 2000, one of the knocks against him was his energy—or lack thereof. People called him dry. Boring. Robotic. Fast-forward a few years: his film was riveting and his own platform skills had evolved dramatically. In 2016, when I watched Gore speak in the red circle at TED, his language was vivid, his voice pulsated with emotion, and his passion literally dripped off him in the form of sweat. If a robot had ever controlled his brain, it short-circuited and left the human in charge. “Some still doubt that we have the will to act,” he boomed, “but I say the will to act is itself a renewable resource.” The audience erupted in a standing ovation, and afterward he was called the Elvis of TED. So if his communication style isn’t what’s failing to reach people, what is?

  At TED, Gore was preaching to the choir: his audience was heavily progressive. For audiences with more varied beliefs, his language hasn’t always resonated. In An Inconvenient Truth, Gore contrasted the “truth” with claims made by “so-called skeptics.” In a 2010 op-ed, he contrasted scientists with “climate deniers.”

  This is binary bias in action. It presumes that the world is divided into two sides: believers and nonbelievers. Only one side can be right, because there is only one truth. I don’t blame Al Gore for taking that position; he was presenting rigorous data and representing the consensus of the scientific community. As a recovering politician, he must have found it second nature to see two sides to an issue. But when the only available options are black and white, it’s natural to slip into a mentality of us versus them and to focus on the sides over the science. For those on the fence, being forced to choose a side means the emotional, political, and economic pressures all tilt toward disengaging from or dismissing the problem.

  To overcome binary bias, a good starting point is to become aware of the range of perspectives across a given spectrum. Polls suggest that on climate change, there are at least six camps of thought. Believers represent more than half of Americans, though some are merely concerned while others are alarmed. The so-called nonbelievers actually range from cautious to disengaged to doubtful to dismissive.

  It’s especially important to distinguish skeptics from deniers. Skeptics have a healthy scientific stance: They don’t believe everything they see, hear, or read. They ask critical questions and update their thinking as they gain access to new information. Deniers are in the dismissive camp, locked in preacher, prosecutor, or politician mode: They don’t believe anything that comes from the other side. They ignore or twist facts to support their predetermined conclusions. As the Committee for Skeptical Inquiry put it in a plea to the media, skepticism is “foundational to the scientific method,” whereas denial is “the a priori rejection of ideas without objective consideration.”*

  The complexity of this spectrum of beliefs is often missing from coverage of climate change. Although no more than 10 percent of Americans are dismissive of climate change, it’s these rare deniers who get the most press. In an analysis of some hundred thousand media articles on climate change between 2000 and 2016, prominent climate contrarians received disproportionate coverage: they were featured 49 percent more often than expert scientists. As a result, people end up overestimating how common denial is—which in turn makes them more hesitant to advocate for policies that protect the environment. When the middle of the spectrum is invisible, the majority’s will to act vanishes with it. If other people aren’t going to do anything about it, why should I bother? When people become aware of just how many others are concerned about climate change, they’re more prepared to do something about it.

  As consumers of information, we have a role to play in embracing a more nuanced point of view. When we’re reading, listening, or watching, we can learn to recognize complexity as a signal of credibility. We can favor content and sources that present many sides of an issue rather than just one or two. When we come across simplifying headlines, we can fight our tendency to accept binaries by asking what additional perspectives are missing between the extremes.

  This applies when we’re the ones producing and communicating information, too. New research suggests that when journalists acknowledge the uncertainties around facts on complex issues like climate change and immigration, it doesn’t undermine their readers’ trust. And multiple experiments have shown that when experts express doubt, they become more persuasive. When someone knowledgeable admits uncertainty, it surprises people, and they end up paying more attention to the substance of the argument.

  Of course, a potential challenge of nuance is that it doesn’t seem to go viral. Attention spans are short: we have only a few seconds to capture eyeballs with a catchy headline. It’s true that complexity doesn’t always make for good sound bites, but it does seed great conversations. And some journalists have found clever ways to capture it in few words.

  A few years ago, news outlets reported on a study of the cognitive consequences of coffee consumption. Although their headlines were drawn from the same data, some praised the benefits of coffee while others warned about the costs.

  The actual study showed that older adults who drank a daily cup or two of coffee had a lower risk of mild cognitive impairment, relative to abstainers, occasional consumers, and heavier consumers. If they increased their consumption by another cup or more per day, they had a higher risk than those who stayed at or below a single cup a day. Each of the one-sided headlines took seven to twelve words to mislead the reader about the effects of drinking coffee. A more accurate headline needed just twelve words to serve up a jolt of instant complexity.

  Imagine if even this kind of minimal nod to complexity appeared in articles on climate change. Scientists overwhelmingly agree about its human causes, but even they have a range of views on the actual effects—and the potential remedies. It’s possible to be alarmed about the situation while recognizing the variety of ways to improve it.*

  Psychologists find that people will ignore or even deny the existence of a problem if they’re not fond of the solution. Liberals were more dismissive of the issue of intruder violence when they read an argument that strict gun control laws could make it difficult for homeowners to protect themselves. Conservatives were more receptive to climate science when they read about a green technology policy proposal than about an emissions restriction proposal.

  Featuring shades of gray in discussions of solutions can help to shift attention from why climate change is a problem to how we can do something about it. As we’ve seen from the evidence on the illusion of explanatory depth, asking “how” tends to reduce polarization, setting the stage for more constructive conversations about action. Here are examples of headlines in which writers have hinted at the complexity of the solutions:

  I WORK IN THE ENVIRONMENTAL MOVEMENT. I DON’T CARE IF YOU RECYCLE

  CAN PLANTING A TRILLION TREES STOP CLIMATE CHANGE? SCIENTISTS SAY IT’S A LOT MORE COMPLICATED

  SOME CAVEATS AND CONTINGENCIES

  If you want to get better at conveying complexity, it’s worth taking a close look at how scientists communicate. One key step is to include caveats. It’s rare that a single study or even a series of studies is conclusive. Researchers typically feature multiple paragraphs about the limitations of each study in their articles. We see them less as holes in our work and more as portholes to future discoveries. When we share the findings with nonscientists, though, we sometimes gloss over these caveats.

  That’s a mistake, according to recent research. In a series of experiments, psychologists demonstrated that when news reports about science included caveats, they succeeded in capturing readers’ interest and keeping their minds open. Take a study suggesting that a poor diet accelerates aging. Readers were just as engaged in the story—but more flexible in their beliefs—when it mentioned that scientists remained hesitant to draw strong causal conclusions given the number of factors that can affect aging. It even helped just to note that scientists believed more work needed to be done in this area.

  We can also convey complexity by highlighting contingencies. Every empirical finding raises unanswered questions about when and where results will be replicated, nullified, or reversed. Contingencies are all the places and populations where an effect may change.

  Consider diversity: although headlines often say “Diversity is good,” the evidence is full of contingencies. Although diversity of background and thought has the potential to help groups think more broadly and process information more deeply, that potential is realized in some situations but not others. New research reveals that people are more likely to promote diversity and inclusion when the message is more nuanced (and more accurate): “Diversity is good, but it isn’t easy.”* Acknowledging complexity doesn’t make speakers and writers less convincing; it makes them more credible. It doesn’t lose viewers and readers; it maintains their engagement while stoking their curiosity.

  In social science, rather than cherry-picking information to fit our existing narratives, we’re trained to ask whether we should rethink and revise those narratives. When we find evidence that doesn’t fit neatly into our belief systems, we’re expected to share it anyway.* In some of my past writing for the public, though, I regret not having done enough to emphasize areas where evidence was incomplete or conflicting. I sometimes shied away from discussing mixed results because I didn’t want to leave readers confused. Research suggests that many writers fall into the same trap, caught up in trying to “maintain a consistent narrative rather than an accurate record.”

  A fascinating example is the divide around emotional intelligence. On one extreme is Daniel Goleman, who popularized the concept. He preaches that emotional intelligence matters more for performance than cognitive ability (IQ) and accounts for “nearly 90 percent” of success in leadership jobs. At the other extreme is Jordan Peterson, writing that “There is NO SUCH THING AS EQ” and prosecuting emotional intelligence as “a fraudulent concept, a fad, a convenient band-wagon, a corporate marketing scheme.”

  Both men hold doctorates in psychology, but neither seems particularly interested in creating an accurate record. If Peterson had bothered to read the comprehensive meta-analyses of studies spanning nearly two hundred jobs, he’d have discovered that—contrary to his claims—emotional intelligence is real and it does matter. Emotional intelligence tests predict performance even after controlling for IQ and personality. If Goleman hadn’t ignored those same data, he’d have learned that if you want to predict performance across jobs, IQ is more than twice as important as emotional intelligence (which accounts for only 3 to 8 percent of performance).

  I think they’re both missing the point. Instead of arguing about whether emotional intelligence is meaningful, we should be focusing on the contingencies that explain when it’s more and less consequential. It turns out that emotional intelligence is beneficial in jobs that involve dealing with emotions, but less relevant—and maybe even detrimental—in work where emotions are less central. If you’re a real estate agent, a customer service representative, or a counselor, being skilled at perceiving, understanding, and managing emotions can help you support your clients and address their problems. If you’re a mechanic or an accountant, being an emotional genius is less useful and could even become a distraction. If you’re fixing my car or doing my taxes, I’d rather you didn’t pay too much attention to my emotions.

  In an effort to set the record straight, I wrote a short LinkedIn post arguing that emotional intelligence is overrated. I did my best to follow my own guidelines for complexity:

  Nuance: This isn’t to say that emotional intelligence is useless.

  Caveats: As better tests of emotional intelligence are designed, our knowledge may change.

  Contingencies: For now, the best available evidence suggests that emotional intelligence is not a panacea. Let’s recognize it for what it is: a set of skills that can be beneficial in situations where emotional information is rich or vital.

  Over a thousand comments poured in, and I was pleasantly surprised that many reacted enthusiastically to the complexified message. Some mentioned that nothing is either/or and that data can help us reexamine even our closely held beliefs. Others were downright hostile. They turned a blind eye to the evidence and insisted that emotional intelligence was the sine qua non of success. It was as if they belonged to an emotional intelligence cult.

  From time to time I’ve run into idea cults—groups that stir up a batch of oversimplified intellectual Kool-Aid and recruit followers to serve it widely. They preach the merits of their pet concept and prosecute anyone who calls for nuance or complexity. In the area of health, idea cults defend detox diets and cleanses long after they’ve been exposed as snake oil. In education, there are idea cults around learning styles—the notion that instruction should be tailored to each student’s preference for learning through auditory, visual, or kinesthetic modes. Some teachers are determined to tailor their instruction accordingly despite decades of evidence that although students might enjoy listening, reading, or doing, they don’t actually learn better that way. In psychology, I’ve inadvertently offended members of idea cults when I’ve shared evidence that meditation isn’t the only way to prevent stress or promote mindfulness; that when it comes to reliability and validity, the Myers-Briggs personality tool falls somewhere between a horoscope and a heart monitor; and that being more authentic can sometimes make us less successful. If you find yourself saying ____ is always good or ____ is never bad, you may be a member of an idea cult. Appreciating complexity reminds us that no behavior is always effective and that all cures have unintended consequences.

  [xkcd cartoon: xkcd.com]

  In the moral philosophy of John Rawls, the veil of ignorance asks us to judge the justice of a society by whether we’d join it without knowing our place in it. I think the scientist’s veil of ignorance is to ask whether we’d accept the results of a study based on the methods involved, without knowing what the conclusion will be.

  MIXED FEELINGS

  In polarized discussions, a common piece of advice is to take the other side’s perspective. In theory, putting ourselves in another person’s shoes enables us to walk in lockstep with them. In practice, though, it’s not that simple.

  In a pair of experiments, randomly assigning people to reflect on the intentions and interests of their political opposites made them less receptive to rethinking their own attitudes on health care and universal basic income. Across twenty-five experiments, imagining other people’s perspectives failed to elicit more accurate insights—and occasionally made participants more confident in their own inaccurate judgments. Perspective-taking consistently fails because we’re terrible mind readers. We’re just guessing.

  If we don’t understand someone, we can’t have a eureka moment by imagining his perspective. Polls show that Democrats underestimate the number of Republicans who recognize the prevalence of racism and sexism—and Republicans underestimate the number of Democrats who are proud to be Americans and oppose open borders. The greater the distance between us and an adversary, the more likely we are to oversimplify their actual motives and invent explanations that stray far from their reality. What works is not perspective-taking but perspective-seeking: actually talking to people to gain insight into the nuances of their views. That’s what good scientists do: instead of drawing conclusions about people based on minimal clues, they test their hypotheses by striking up conversations.

  For a long time, I believed that the best way to make those conversations less polarizing was to leave emotions out of them. If only we could keep our feelings off the table, we’d all be more open to rethinking. Then I read evidence that complicated my thinking.

  It turns out that even if we disagree strongly with someone on a social issue, when we discover that she cares deeply about the issue, we trust her more. We might still dislike her, but we see her passion for a principle as a sign of integrity. We reject the belief but grow to respect the person behind it.
