
The Intelligence Trap


by David Robson


  Schwarz is sceptical about whether we can protect ourselves from all misinformation through mere intention and goodwill, though: the sheer deluge means that it could be very difficult to apply our scepticism even-handedly. ‘You couldn’t spend all day checking every damn thing you encounter or that is said to you,’ he told me.*

  * Pennycook has, incidentally, shown that reflective thinking is negatively correlated with smartphone use – the more you check Facebook, Twitter and Google, the less well you score on the CRT. He emphasises that we don’t know if there is a causal link – or which direction that link would go – but it’s possible that technology has made us lazy thinkers. ‘It might make you more intuitive because you are less used to reflecting – compared to if you are not looking things up, and thinking about things more.’

  When it comes to current affairs and politics, for instance, we already have so many assumptions about which news sources are trustworthy – whether it’s the New York Times, Fox News, Breitbart, or your uncle – and these prejudices can be hard to overcome. In the worst scenario, you may forget to challenge much of the information that agrees with your existing point of view, and only analyse material you already dislike. As a consequence, your well-meaning attempts to protect yourself from bad thinking may fall into the trap of motivated reasoning. ‘It could just add to the polarisation of your views,’ Schwarz said.

  This caution is necessary: we may never be able to build a robust psychological shield against all the misinformation in our environment. Even so, there is now some good evidence that we can bolster our defences against the most egregious errors while perhaps also cultivating a more reflective, wiser mindset overall. We just need to do it more smartly.

  Like Patrick Croskerry’s attempts to de-bias his medical students, these strategies often come in the form of an ‘inoculation’ – exposing us to one type of bullshit, so that we will be better equipped to spot other forms in the future. The aim is to teach us to identify some of the warning signs, planting little red flags in our thinking, so that we automatically engage our analytical, reflective reasoning when we need it.

  John Cook and Stephan Lewandowsky’s work suggests the approach can be very powerful. In 2017, Lewandowsky and Cook (who also wrote The Debunking Handbook) were investigating ways to combat some of the misinformation around human-made climate change – particularly the attempts to spread doubt about the scientific consensus.

  Rather than tackling climate myths directly, however, they first presented some of their participants with a fact sheet about the way the tobacco industry had used ‘fake experts’ to cast doubt on scientific research linking smoking to lung cancer.

  They then showed them a specific piece of misinformation about climate change: the so-called Oregon Petition, organised by the biochemist Arthur B. Robinson, which claimed to offer 31,000 signatures of people with science degrees, who all doubted that human release of greenhouse gases is causing disruption of the Earth’s climate. In reality, the names were unverified – the list even included the signature of Spice Girl ‘Dr’ Geri Halliwell31 – and fewer than 1 per cent of those questioned had formally studied climate science.

  Previous research had shown that many people reading about the petition fail to question the credentials of the experts, and are convinced by its findings. In line with theories of motivated reasoning, this was particularly true of people who held more right-wing views.

  After learning about the tobacco industry’s tactics, however, most of Cook’s participants were more sceptical of the misinformation, and it failed to sway their overall opinions. Even more importantly, the inoculation had neutralised the effect of the misinformation across the political spectrum; the motivated reasoning that so often causes us to accept a lie, and reject the truth, was no longer playing a role.32 ‘For me that’s the most interesting result – inoculation works despite your political background,’ Cook told me. ‘Regardless of ideology, no one wants to be misled by logical fallacies – and that is an encouraging and exciting thought.’

  Equally exciting is the fact that the inoculation concerning misinformation in one area (the link between smoking and lung cancer) provided protection in another (climate change). It was as if participants had planted little alarm bells in their thinking, helping them to recognise when to wake up and apply their analytic minds more effectively, rather than simply accepting any information that felt ‘truthy’. ‘It creates an umbrella of protection.’

  The power of these inoculations is leading some schools and universities to explore the benefits of explicitly educating students about misinformation.33

  Many institutions already offer critical thinking classes, of course, but these are often dry examinations of philosophical and logical principles, whereas inoculation theory shows that we need to be taught about misinformation explicitly, using real-life examples that demonstrate the kinds of arguments that normally fool us.34 It does not seem to be enough to assume that we will readily apply those critical thinking skills in our everyday lives without first being shown the sheer prevalence of misinformation and the ways it could be swaying our judgements.

  The results so far have been encouraging, showing that a semester’s course in inoculation significantly reduced the students’ beliefs in pseudoscience, conspiracy theories and fake news. Even more importantly, these courses also seem to improve measures of critical thinking more generally – such as the ability to interpret statistics, identify logical fallacies, consider alternative explanations and recognise when additional information will be necessary to come to a conclusion.35

  Although these measures of critical thinking are not identical to the wise reasoning tests we explored in Chapter 5, they do bear some similarities – including the ability to question your own assumptions and to explore alternative explanations for events. Importantly, like Igor Grossmann’s work on evidence-based wisdom, and the scores of emotion differentiation and regulation that we explored in the last chapter, these measures of critical thinking don’t correlate very strongly with general intelligence, and they predict real-life outcomes better than standard intelligence tests.36 People with higher scores are less likely to try an unproven fad diet, for instance; they are also less likely to share personal information with a stranger online or to have unprotected sex. If we are smart but want to avoid making stupid mistakes, it is therefore essential that we learn to think more critically.

  These results should be good news for readers of this book: by studying the psychology of these various myths and misconceptions, you may have already begun to protect yourself from lies – and the existing cognitive inoculation programmes already offer some further tips to get you started.

  The first step is to learn to ask the right questions:

  Who is making the claim? What are their credentials? And what might be their motives to make me think this?

  What are the premises of the claim? And how might they be flawed?

  What are my own initial assumptions? And how might they be flawed?

  What are the alternative explanations for their claim?

  What is the evidence? And how does it compare to the alternative explanations?

  What further information do I need before I can make a judgement?

  Given the research on truthiness, you should also look at the presentation of the claims. Does it actually add any further proof – or does it just give the illusion of evidence? Is the same person simply repeating the same point – or are you really hearing different voices who have converged on the same view? Are the anecdotes offering useful information, and are they backed up with hard data? Or do they just increase the fluency of the story? And do you trust someone simply because their accent feels familiar and is easy to understand?

  Finally, you should consider reading about a few of the more common logical fallacies, since this can plant those ‘red flags’ that will alert you when you are being duped by ‘truthy’ but deceptive information. To get you started, I’ve compiled a list of the most common ones in the table below.

  These simple steps may appear to be stating the obvious, but overwhelming evidence shows that many people pass through university without learning to apply them to their daily life.37 And the over-confidence bias shows that it’s the people who think they are already immune who are probably most at risk.

  [Table of common logical fallacies.38]

  If you really want to protect yourself from bullshit, I can’t over-emphasise the importance of internalising these rules and applying them whenever you can, to your own beloved theories as well as those that already arouse your suspicion. If you find the process rewarding, there are plenty of online courses that will help you to develop those skills further.

  According to the principles of inoculation, you should start out by looking at relatively uncontroversial issues (like the flesh-eating bananas) to learn the basics of sceptical thought, before moving on to more deeply embedded beliefs (like climate change) that may be harder for you to question. In these cases, it is always worth asking why you feel strongly about a particular viewpoint, and whether it is really central to your identity, or whether you might be able to reframe it in a way that is less threatening.

  Simply spending a few minutes to write positive, self-affirming things about yourself and the things that you most value can make you more open to new ideas. Studies have shown that this practice really does reduce motivated reasoning by helping you to realise that your whole being does not depend on being right about a particular issue, and that you can disentangle certain opinions from your identity.39 (Belief in climate change does not have to tear down your conservative politics, for instance: you could even see it as an opportunity to further business and innovation.) You can then begin to examine why you might have come to those conclusions, and to look at the information in front of you and test whether you might be swayed by its fluency and familiarity.

  You may be surprised by what you find. Applying these strategies, I’ve already changed my mind on certain scientific issues, such as genetic modification. Like many liberal people, I had once opposed GM crops on environmental grounds – yet the more I became aware of my news sources, the more I noticed that I was hearing opposition from the same small number of campaign groups like Greenpeace – creating the impression that these fears were more widespread than they actually were. Moreover, their warnings about toxic side effects and runaway plagues of Frankenstein plants were cognitively fluent and chimed with my intuitive environmental views – but a closer look at the evidence showed that the risks are tiny (and mostly based on anecdotal data), while the potential benefits of building insect-resistant crops and reducing the use of pesticides are incalculable.

  Even the former leader of Greenpeace has recently attacked the scaremongering of his ex-colleagues, describing it as ‘morally unacceptable . . . putting ideology before humanitarian action’.40 I had always felt scornful of climate change deniers and anti-vaccination campaigners, yet I had been just as blinkered concerning another cause.

  For one final lesson in the art of bullshit detection, I met the writer Michael Shermer in his home town of Santa Barbara, California. For the past three decades, Shermer has been one of the leading voices of the sceptical movement, which aims to bring rational reasoning and critical thinking to public life. ‘We initially went for the low-hanging fruit – television psychics, astrology, tarot card reading,’ Shermer told me. ‘But over the decades we’ve migrated to more “mainstream” claims about things like global warming, creationism, anti-vaccination – and now fake news.’

  Shermer has not always been this way. A competitive cyclist, he once turned to unproven (though legal) treatments to boost his performance, including colonic irrigation to ease his digestion, and ‘rolfing’ – a kind of intense (and painful) physiotherapy which involves manipulating the body’s connective tissue to reinforce its ‘energy field’. At night, he had even donned an ‘Electro-Acuscope’ – a device, worn over the skull, that was designed to enhance the brain’s healing ‘alpha waves’.

  Shermer’s ‘road-to-Damascus moment’ came during the 1983 Race Across America, from Santa Monica, California, to Atlantic City, New Jersey. For this race, Shermer hired a nutritionist, who advised him to try a new ‘multivitamin therapy’ – which involved ingesting a mouthful of foul-smelling tablets. The end result was the ‘most expensive and colourful urine in America’. By the third day, he decided that enough was enough – and on the steep climb to Loveland Pass, Colorado, he spat out the mouthful of acrid tablets and vowed never to be duped again. ‘Being sceptical seemed a lot safer than being credulous’, he later wrote.41

  A stark test of his newfound scepticism came a few days later, near Haigler, Nebraska. It was nearly halfway through the race and he was already suffering from severe exhaustion. After waking from a forty-five-minute nap, he was convinced that he was surrounded by aliens, posing as his crew members, trying to take him to the mothership. He fell back asleep, and awoke clear-headed, realising that he had experienced a hallucination arising from physical and mental exhaustion. The memory remains vivid, however, as if it were a real event. Shermer thinks that if he had not been so self-aware, he could genuinely have mistaken the experience for a real abduction, as many others before him have done.

  As a historian of science, writer and public speaker, Shermer has since tackled psychics, quack doctors, 9/11 conspiracy theorists and Holocaust deniers. He has seen how intelligence can be applied powerfully either to discover or to obfuscate the truth.

  You might imagine that he would be world-weary and cynical after so many years of debunking bullshit, yet he was remarkably affable on our meeting. A genial attitude is, I later found out, crucial for putting many of his opponents off their guard, so that he can begin to understand what motivates them. ‘I might socialise with someone like [Holocaust denier] David Irving, because after a couple of drinks, they open up and go deeper, and tell you what they are really thinking.’42

  Shermer may not use the term, but he now offers one of the most comprehensive ‘inoculations’ available in his ‘Skepticism 101’ course at Chapman University.43 The first steps, he says, are like ‘kicking the tyres and checking under the hood’ of a car. ‘Who’s making the claim? What’s the source? Has someone else verified the claim? What’s the evidence? How good is the evidence? Has someone tried to debunk the evidence?’ he told me. ‘It’s basic baloney detection.’

  Like the other psychologists I have spoken to, he is certain that vivid, real-life examples of misinformation are crucial for teaching these principles; it’s not enough to assume that a typical academic education equips us with the necessary protection. ‘Most education is involved in just teaching students facts and theories about a particular field – not necessarily the methodologies of thinking sceptically or scientifically in general.’

  To give me a flavour of the course, Shermer describes how many conspiracy theories use the ‘anomalies-as-proof’ strategy to build a superficially convincing case that something is amiss. Holocaust deniers, for instance, argue that the structure of the (badly damaged) Krema II gas chamber at Auschwitz-Birkenau doesn’t match eye-witness accounts of SS guards dropping gas pellets through the holes in the roof. From this, they claim that no one could have been gassed at Krema II, therefore no one was gassed at Auschwitz-Birkenau, meaning that no Jews were systematically killed by the Nazis – and the Holocaust didn’t happen.

  If that kind of argument is presented fluently, it may bypass our analytical thinking; never mind the vast body of evidence that does not hinge on the existence of holes in Krema II, including aerial photographs showing mass exterminations, the millions of skeletons in mass graves, and the confessions of many Nazis themselves. Attempts to reconstruct the Krema gas chamber have, in fact, confirmed the presence of these holes, meaning the argument is built on a false premise – but the point is that even if the anomaly had been true, it wouldn’t have been enough to rewrite the whole of Holocaust history.
  The same strategy is often used by people who believe that the 9/11 attacks were ‘an inside job’. One of their central claims is that jet fuel from the aeroplanes could not have burned hot enough to melt the steel girders in the Twin Towers, meaning the buildings should not have collapsed. (Steel melts at around 1,510°C, whereas the fuel from the aeroplanes burns at around 825°C.) In fact, although steel does not turn into a liquid at that temperature, engineers have shown that it nevertheless loses much of its strength, meaning the girders would have buckled under the weight of the building. The lesson, then, is to beware of the use of anomalies to cast doubt on vast sets of data, and to consider the alternative explanations before you allow one puzzling detail to rewrite history.44

  Shermer emphasises the importance of keeping an open mind. With the Holocaust, for instance, it’s important to accept that there will be some revising of the original accounts as more evidence comes to light, without discounting the vast substance of the accepted events.

  He also advises us all to step outside of our echo chamber and to use the opportunity to probe someone’s broader worldviews; when talking to a climate change denier, for instance, he thinks it can be useful to explore their economic concerns about regulating fossil fuel consumption – teasing out the assumptions that are shaping their interpretation of the science. ‘Because the facts about global warming are not political – they are what they are.’ These are the same principles we are hearing again and again: to explore, listen and learn, to look for alternative explanations and viewpoints rather than the one that comes most easily to mind, and to accept you do not have all the answers.

 
