The Republican Brain

by Chris Mooney


  The Nyhan and Reifler study presents another piece of evidence suggesting that conservatives may defend their beliefs more strongly than liberals do in the face of challenge, and be less amenable to changing their minds based on the evidence—at least in the political realm.

  Another similar study gives some inkling of what may be going through people’s minds when they resist persuasion—and shows powerful evidence of conservative defensiveness in particular.

  Take the common insinuation during the George W. Bush years that Iraq and Al Qaeda were secretly collaborating in some way. Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge this belief among those most likely to hold it—Republican partisans from highly GOP-friendly counties in North Carolina and Illinois. So the researchers set up a study in which they directly challenged some of these Republicans in person, citing the findings of the 9/11 Commission as well as a statement by George W. Bush, in which the former president himself protested that his administration had “never said that the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

  As it turned out, not even Bush’s own words could change the minds of these Bush voters. Just one out of 49 partisans who originally believed the Iraq–Al Qaeda claim changed his or her mind about it upon being challenged and presented with new information. Seven more claimed never to have believed the claim in the first place (although they clearly had). The remaining 41 all came up with ways to preserve their beliefs, ranging from generating counterarguments to simply being immovable:

  INTERVIEWER: . . . the September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. [pause] This is what the commission said. Do you have any comments on either of those?

  RESPONDENT: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

  I didn’t choose these two studies of political misinformation and the Iraq war by accident. It is hard to think of many liberal-conservative divides over the facts that have held greater consequences for lives, economies, and international security than this one.

  The split over whether Iraq had the touted “WMD,” and whether Saddam and Osama were frat buddies, represented a true turning point in the relationship between our politics and objective reality. In case you missed it: Reality lost badly. Conservatives and Republicans were powerfully and persistently wrong, following a cherished leader into a war based on false premises—and then, according to these studies, finding themselves unable to escape the quagmire of unreality even after several years had passed.

  And still, I have not yet described what may be the most insidious side of motivated reasoning, particularly as it relates to conservative denial of the seemingly undeniable.

  Call it the “smart idiots” effect: The politically sophisticated or knowledgeable are often more biased, and less persuadable, than the ignorant. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Stony Brook’s Milton Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These counterarguments, because they are emotionally charged and become stored in memory and the brain, literally become part of us. They thus allow a person with more sophistication to convince him- or herself even more strongly about the correctness of an initial conviction.

  It was this “smart idiots” effect, and especially its recurrent appearance on the political right, that changed how I think about our disputes over science and the facts, and eventually set in motion the writing of this book. I even remember when I first became aware of it. It was thanks to a 2008 Pew report documenting the intense partisan divide in the U.S. over the reality of global warming—a divide that, maddeningly for scientists, has shown a tendency to widen even as the basic facts about global warming have become more firmly established.

  Those facts are these: Humans, since the industrial revolution, have been burning more and more fossil fuels to power their societies, and this has led to a steady accumulation of greenhouse gases, and especially carbon dioxide, in the atmosphere. At this point, very simple physics takes over, and you are pretty much doomed, by what scientists refer to as the “radiative” properties of carbon dioxide molecules (which trap infrared heat radiation that would otherwise escape to space), to have a warming planet. Since about 1995, scientists have not only confirmed that this warming is taking place, but have also grown confident that it has, like the gun in a murder mystery, our fingerprint on it. Natural fluctuations, although they exist, can’t explain what we’re seeing. The only reasonable verdict is that humans did it, in the atmosphere, with their cars and smokestacks.
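
  To make that “very simple physics” concrete: the standard zero-dimensional energy-balance calculation (a textbook illustration, not taken from Mooney’s text) sets the sunlight Earth absorbs equal to the infrared it radiates back to space:

  \[
    (1-\alpha)\,\frac{S}{4} = \sigma T_e^{4}
    \quad\Longrightarrow\quad
    T_e = \left(\frac{(1-\alpha)\,S}{4\sigma}\right)^{1/4} \approx 255\ \mathrm{K}
  \]

  with solar constant S ≈ 1361 W/m², planetary albedo α ≈ 0.3, and σ the Stefan-Boltzmann constant. Earth’s observed mean surface temperature is about 288 K; the missing 33 K or so is the greenhouse effect, supplied by infrared-trapping gases such as carbon dioxide and water vapor. Adding more CO2 widens that gap.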

  The Pew data, however, showed that humans aren’t as predictable as carbon dioxide molecules. Despite a growing scientific consensus about global warming, as of 2008 Democrats and Republicans had, like a couple in a divorce, cleaved over the facts stated above, so that only 29 percent of Republicans accepted the core reality about our planet (centrally, that humans are causing global warming), compared with 58 percent of Democrats. (The divide is, if anything, even bigger nowadays.)

  But that’s not all. Buried in the Pew report was a little chart showing the relationship between one’s political party affiliation, one’s acceptance that humans are causing global warming, and one’s level of education. And here’s the mind-blowing surprise: For Republicans, having a college degree didn’t make one any more open to what scientists have to say. On the contrary, better educated Republicans were more skeptical of modern climate science than their less educated brethren. Only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college-educated Republicans.

  For Democrats and Independents, precisely the opposite was the case. More education correlated with being more accepting of climate science—among Democrats, dramatically so. The difference in acceptance between more and less educated Democrats was 23 percentage points.

  This finding recurs, in a variety of incarnations, throughout the rapidly growing social science literature on the resistance to climate science. Again and again, Republicans or conservatives who know more about the issue, or are more educated, are shown to be more in denial, and often more sure of themselves too—and are confident they don’t need any more information on the issue.

  The same “smart idiots” effect also occurs on nonscientific but factually contested issues, like the claim that President Obama is a Muslim. Belief in this falsehood actually increased more among better educated Republicans from 2009 to 2010 than it did among less educated Republicans, according to research by George Washington University political scientist John Sides.

  Finally, the same effect has been captured in relation to the myth that the health care reform bill empowered government “death panels.” According to research by Brendan Nyhan, Republicans who thought they knew more about the Obama health care plan were “paradoxically more likely to endorse the misperception than those who did not.” Well-informed Democrats were the opposite—quite certain there were no “death panels” in the bill. (The Democrats also happened to be right, by the way.)

  What accounts for the “smart idiots” effect? For one thing, well-informed or well-educated conservatives probably consume more conservative news and opinion, such as by watching Fox News. Thus, they are more likely to know what they’re supposed to think about the issues—what people like them think—and to be familiar with the arguments or reasons for holding these views. If challenged, they can then recall and reiterate these arguments. They’ve made them a part of their identities, a part of their brains, and in doing so, they’ve drawn a strong emotional connection between certain “facts” or claims, and their deeply held political values.

  What this suggests, critically, is that sophisticated conservatives, like Andrew Schlafly, may be very different from unsophisticated or less-informed ones. Paradoxically, we would expect less informed conservatives to be easier to persuade, and more responsive to new and challenging information.

  The “smart idiots” effect generates endless frustration for many scientists—and indeed, for many well-educated, reasonable people.

  These people—and I know many of them—want to believe that the solution to the problem of resistance to science, or to accurate information in general, is more and better education—leading, presumably, to greater public Enlightenment (capital E). No less than President Obama’s science adviser John Holdren (a man whom I greatly admire, but disagree with in this instance) has stated, when asked how to get Republicans in Congress to accept the science of climate change, that it’s an “education problem.”

  But scientists must now acknowledge that science itself refutes this idea. In fact, Dan Kahan’s research team at Yale found a clever way to test it, and it failed badly.

  In another study, Kahan and his colleagues once again surveyed how the four cultural groups—egalitarians, communitarians, hierarchs, and individualists—respond to the issue of climate change. Only this time, they included two revealing new measurements in the analysis—ones that caught the smart idiots red-handed (or red-brained, if you’d prefer).

  This time, people weren’t just asked about their cultural worldviews and their views on how dangerous global warming is. They were also asked standard questions to determine their degree of scientific literacy (e.g., “Antibiotics kill viruses as well as bacteria—true or false?”) as well as their numeracy or capacity for mathematical reasoning (e.g., “If Person A’s chance of getting a disease is 1 in 100 in ten years, and Person B’s risk is double that of A, what is B’s risk?”). The latter attribute is particularly significant in light of what we’ve already said about the brain, because aptitude in mathematical reasoning requires the use of calmer and more deliberative “System 2” cognition. You can’t intuit or emote your answer to a math problem using “System 1.”
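
  For the record, the numeracy item takes exactly one deliberate step (worked here for clarity; the arithmetic is mine, not the survey’s):

  \[
    P_B = 2\,P_A = 2 \times \frac{1}{100} = \frac{2}{100},
  \]

  that is, a 2-in-100 chance over ten years.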

  Kahan’s group now had four sets of information for over 1,500 randomly selected Americans: their views on global warming, their political values, their degree of scientific literacy, and their capacity for mathematical reasoning. The relationships between them were stunning and alarming. The standard view that knowing more science, or being better at mathematical reasoning, ought to make you more accepting of mainstream climate science simply crashed and burned.

  Instead, here was the result: If you were already part of a cultural group predisposed to distrust climate science—e.g., a hierarchical-individualist—then more science knowledge and more skill in mathematical reasoning tended to make you even more dismissive, not more open to the science. Precisely the opposite happened with the other group—egalitarian-communitarians—who tended to worry more as they knew more science and math. The result was that, overall, more scientific literacy and mathematical ability led to greater political polarization over climate change.
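
  In statistical terms, what Kahan found is a crossover interaction: literacy and numeracy do not push everyone in the same direction, but amplify whatever direction a person’s worldview already points. Here is a toy sketch, with invented coefficients chosen purely to show the shape of the result, not Kahan’s actual estimates:

# Invented numbers illustrating a crossover interaction: as science
# literacy rises, the two cultural groups move in opposite directions,
# so the gap between them (polarization) grows.

def climate_concern(literacy, worldview):
    # worldview = +1 for egalitarian-communitarians,
    #             -1 for hierarchical-individualists
    return 0.5 + 0.2 * literacy * worldview

for lit in (0.0, 0.5, 1.0):
    gap = climate_concern(lit, +1) - climate_concern(lit, -1)
    print(f"literacy={lit:.1f}  polarization gap={gap:.2f}")
# the gap grows from 0.00 to 0.40 as literacy rises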

  So much for education serving as an antidote to politically biased reasoning.

  Kahan’s studies, I should note, are presented in an entirely even-handed fashion. Like many motivated reasoning researchers, he does not postulate that any of his cultural groups are more biased than any other—just that they’re biased in different directions.

  Still, it is hard to miss that in his studies, one group in particular, the hierarchical-individualists—which includes not only Republicans and conservatives but also right-wing authoritarians, who are very hierarchical and religious, and very defensive of their beliefs—not only starts out highly disconnected from scientific reality on climate change, but also becomes even more out of touch with greater scientific literacy and mathematical ability.

  By contrast, when I discuss the views of liberals concerning nuclear power, I will turn again to Kahan’s results—because they are not the mirror image of these findings on conservatives and global warming.

  By now, we’ve seen ample evidence of just how biased humans can be by their preexisting beliefs and convictions—and how this infects not only our relationships and our personal lives, but also our politics.

  It all leads to an overwhelming question—and one that’s very difficult to answer: How “irrational” is all this?

  On the one hand, it surely makes sense not to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital punishment decision based on real information that I arrived at over my life,’” explains Stanford social psychologist Jon Krosnick. Indeed, there’s a sense in which even right-wing science denial could be considered keenly “rational.” In certain conservative communities of the United States, explains Dan Kahan, “people who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

  Rational or otherwise, however, motivated reasoning poses a deep challenge to the ideal of Jeffersonian democracy, which assumes that voters will be informed about the issues—not deeply wedded to misinformation. We’re divided enough about politics as it is, without adding irreconcilable views about the nature of reality on top of that.

  And there’s an even bigger question looming in the background. It’s one we’ve already begun to consider: How can evolution explain all of this? But now it’s time to go farther.

  Even after what we’ve already learned about the brain and the emotions, it’s still hard to imagine why evolution would create a creature that is capable of reason, and yet performs so badly at it. One might think there would have been an absolute premium on accurately perceiving our environments, and a survival advantage accompanying this capacity that would be preserved by natural selection and passed on to offspring.

  Explaining why that is not the case is a fascinating question in evolutionary biology and evolutionary psychology right now. And it is going to be a difficult one to definitively answer, since we can’t reset the clock of evolution to see what actually occurred. Whatever its strengths or weaknesses, human reason has not yet given us the ability to create a time machine.

  Still, a few considerations may cast some light.

  First, from the perspective of an organism trying to keep itself alive, not all errors of perception or belief are equal. Some have much greater consequences. For instance, and as Michael Shermer argues in his recent book The Believing Brain, it is far better to be a little bit wrong and still alive—because you overreacted to defend yourself and ran the other way at the tiniest rustle in the leaves—than to be wrong and dead, because you didn’t think there was anything to worry about and didn’t run away fast enough.

  This distinction between what are called “Type 1” and “Type 2” errors—erring on the side of credulous belief (“false positive”) versus erring on the side of too much skepticism (“false negative”)—surely helps to explain why we have quick-fire, emotional, and defensive reactions to begin with. Evolution won’t let us commit the kinds of Type 2 errors that will rapidly get us killed. So it gave us the much-touted fight-or-flight response, which we share with other animals. (For this same reason, Shermer suggests, we have a default design that inclines us to believe things rather than to question them.)
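
  The asymmetry is easy to make concrete. A minimal sketch, with entirely hypothetical costs and probabilities (nothing here comes from Shermer’s book), compares a credulous policy with a skeptical one:

# Hypothetical numbers: a rustle in the leaves is a predator with
# probability p. Fleeing always wastes a little energy (the price of
# a Type 1 error); ignoring a real predator is catastrophic (Type 2).

p = 0.01               # chance the rustle really is a predator
cost_flee = 1.0        # energy wasted by running away
cost_caught = 1000.0   # cost of being caught unaware

expected_cost_credulous = cost_flee        # always flee: pays 1.0 every time
expected_cost_skeptical = p * cost_caught  # never flee: pays 10.0 on average

print(expected_cost_credulous < expected_cost_skeptical)  # True: credulity wins

  Credulity stays the better policy whenever cost_flee is less than p times cost_caught, which is why selection can tolerate a standing bias toward belief.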

  It’s equally important to recognize that our brains evolved in a very different context from the one in which we now find ourselves. They evolved with none of the media that we now consume, and none of the cognitively dazzling and sometimes exploitive stimuli—from advertisements to movies to blogs. So it is not at all clear that they should be suited for being particularly rational in the current context.

  None of this, though, explains our elaborate heights of rationalization—our argumentative creativity—and just how floridly idiotic we can be. We’re not only capable of being wrong; we make quite the show of it. We go to elaborate lengths to defend wrong beliefs; we come up with bizarre doctrines like Christian Science and Theosophy; we even write equations to refute Einstein. How do you explain that?

  One team of thinkers—philosopher Hugo Mercier of the University of Pennsylvania and cognitive scientist Dan Sperber of the Jean Nicod Institute in France—suggests an intriguing answer. They’ve proposed that we’ve been reasoning about reasoning all wrong—trying to fix what didn’t need fixing, if we’d only understood what its original purpose was. “People have been trying to reform something that works perfectly well,” writes Mercier, “as if they had decided that hands were made for walking and that everybody should be taught that.”

  Contrary to the claims of Enlightenment idealists, Mercier and Sperber suggest human reason did not evolve as a device for getting at the objective truth. Rather, they suggest that its purpose is to facilitate selective arguing in defense of one’s position in a social context—something that, we can hardly dispute, we are very good at.

  When thought about in the context of the evolution of human language and communication, and cooperation in groups, this makes a lot of sense. There would surely have been a survival value to getting other people in your hunter-gatherer group to listen to you and do what you want them to do—in short, a value to being persuasive. And for the listeners, there would have been just as much of a premium on being able to determine whether a given speaker is reliable and trustworthy, and should be heeded. Thus, everybody in the group would have benefited from an airing of different views, so that their strengths and weaknesses could be debated—regarding, say, where the best place to hunt might be today, or whether the seasons are changing.

 
