The Best Australian Science Writing 2012
From where we stand now it is clear that the swine flu of 2009 was a novel and serious virus but not ‘the big one’. Post hoc analyses are essential in the public health outbreak domain and you have to own up to your mistakes if you have made any. Errors are inevitable in the face of uncertainty, and in the world of infectious diseases it’s hard to think of anything as unpredictable as the behaviour of the influenza virus. Had the strain been more virulent and its death rate only twice as high as it actually was, those millions of dollars now seen as wasted on unused vaccine would have been just petty cash.
Why clever people believe in silly things
Craig Cormick
Why do so many otherwise clever people believe in paranormal events, or the benefits of fringe medicines and the dangers of infant vaccination – despite there being no real evidence to support their beliefs? According to some surveys, in Australia, about half the population believes in ESP (extra-sensory perception, such as telepathy) and one-third believes in UFOs as evidence of extraterrestrial visitation.
And a 2005 survey published in the Medical Journal of Australia stated that half of all Australians are using alternative medicines, and one in four are risking their health by not telling their doctor that they are doing so.
We’ve probably all met somebody at a party trying to convince us of the benefits of the latest alternative therapies, which is harmless enough. It becomes an issue of societal concern, however, when we see fringe beliefs, based on non-scientific values, leading to people dying from putting their trust in natural therapies or faith healing when Western medicine could have saved them.
A US National Science Foundation study found that almost nine in ten Americans agreed that there were some good ways of treating sickness that medical science did not recognise, while four in ten Americans had used alternative therapies. This is similar to Australian data, where such beliefs are more common among well-educated upper middle-class women.
* * * * *
The issue most under the spotlight here is infant vaccination and the belief in its link to autism or other nasty side effects – an erroneous belief that has persisted despite the original study by Andrew Wakefield, which linked vaccinations with autism, having been discredited and retracted by The Lancet in February 2010.
Furthermore, his co-authors withdrew support for the study’s interpretations, other researchers were unable to confirm or reproduce his results, and there have since been revelations about undisclosed financial conflicts of interest on Wakefield’s part.
The reasons for the persistence of this belief in vaccination being linked to autism – despite the evidence – are complex. It is very important to understand why it persists if we believe that there is a need to counter the growth in anti-science in society.
Ben Goldacre, the British doctor and author of the Guardian column, book and blog called Bad Science, coined a phrase that is crucial for us to examine: ‘Why clever people believe stupid things’.
In the US, where the anti-vaccination movement has really taken off, the Centers for Disease Control and Prevention in Atlanta, Georgia, estimates that one in five Americans believes that vaccines can cause autism, and two in five have either delayed or refused vaccines for their child.
And in Australia, according to the Australian General Practice Network, vaccination rates have been dropping over the past seven years to the point that only 83 per cent of four-year-olds nationally are covered – which is below the 90 per cent rate needed to assure community-wide disease protection and prevent outbreaks of fatal, but preventable, diseases.
In some areas, usually where there are pockets of people choosing alternative lifestyles – such as southeast Queensland, the northern rivers of New South Wales, the Adelaide Hills and the southwest of Western Australia – vaccination rates are as low as 70 per cent.
The problem is not just that non-scientific beliefs can be very strongly ingrained in people, but that such beliefs are unlikely to ever be influenced by scientific fact.
So should we be concerned? Well, only if we think that the dangers of non-science and pseudoscience are tangible, and that widespread support for non-scientific beliefs can impede a society's ability to function, or compete, in an ever more complicated, science- and technology-driven world.
* * * * *
If the answers to those questions are yes, then we need to better understand the factors that make otherwise rational people subscribe to irrational beliefs – and, importantly, what might be done to prevent a growth in anti-science thinking.
Fortunately, there is enough research in this area to provide a fairly clear overview of why this happens.
At the heart of the problem, as Goldacre outlines so well in Bad Science, is the way we are wired psychologically: we fall into common errors of thinking that in turn lead to distortions of perception, inaccurate judgments or illogical interpretations.
Social scientists call these ways of thinking ‘heuristics’: mental shortcuts we take as a way of responding to rapid and complex information being fired at us. We need to quickly sort new information into categories – and an easy way to do this is to sort it according to our existing belief systems or values.
This holds true for beliefs about genetically modified (GM) foods, the safety of nanotechnology, climate change, your favourite football club and so on – and the more complex the issue, the more likely it is that people will make decisions based on beliefs or values.
* * * * *
In an ideal world, we look at different information, analyse it carefully and make up our minds on a case-by-case basis. But that doesn't work when we lack the motivation or the ability to do so.
We are increasingly time-poor in an increasingly data-rich world; that forces us to make mental shortcuts more often, drawing upon whatever existing knowledge we have (all too often from the media rather than from formal education), or falling back on our basic beliefs. It’s not always one or the other, of course, as we tend to use a mix of emotion and logic, but we need to look at which is dominating our thinking.
Nobel prize winner Daniel Kahneman has coined the term ‘thinking fast and slow’ to describe our different ways of thinking, but he also points out two important things: firstly, that slow thinking does not always lead to better conclusions, and secondly, that while we can recognise errors of thinking in others, we can rarely recognise them in ourselves. So everybody else uses faulty thinking, except us!
And this is all made more complicated by the fact that in the age of the internet, the information and communication flows are entirely different from what we may have been used to even only a decade ago.
We all know that the promise of the internet to provide us with a wealth of information to make us smarter was akin to the early hopes that television would make us more educated and could teach us many languages and so on.
Instead, it turns out that we are most likely to be watching people dance and sing and cook on TV, and watching talking babies and satires of the Hitler bunker scene in the film Downfall on the internet. And among the tsunami of irrelevant data on the web, we invariably end up hunting down data that supports our existing beliefs.
It’s not the internet that is fully to blame – it’s just a channel for information – but the sheer amount of data of dubious credibility that is available and that doesn’t readily distinguish between comment and research, or blog and news has changed the relationship between information and attitude formation.
Where as kids we might have started with the germ of a wacky idea and sought to check its validity with experts such as teachers, or even by reading an encyclopaedia, we now have the ability to almost instantly find a community of people somewhere in the world with similar wacky ideas, never tested by an expert.
And through the internet, we can also reinforce each other's wackiness to the point where it becomes a solid value that ain't shifting for nobody, no how. Just Google 'sexually abused by aliens' or 'sin causes cancer' to see what I mean.
* * * * *
Access to the enormous breadth of opinions on the internet has revealed that when swamped with information, people use mental shortcuts, and follow that up with 'motivated reasoning'. This means acknowledging only information that accords with our beliefs, and dismissing information that does not.
So if you believe UFOs are evidence of alien visitations, you would acknowledge every bit of data you found that supported that, and would dismiss everything that argued against it, and as a result you would tend to find only information that supported your beliefs, thus increasingly reinforcing them.
Likewise with climate change, as we are seeing played out over and over again in public debates. Those who reject climate change as being human induced – or even happening at all – are not swayed by any scientific evidence, and cling tenaciously to the sporadic data that might seem to support their views. Again, the reasons for this appear to be about the way we are wired.
It has been well documented in surveys that those who are politically conservative tend to reject human-induced climate change, while those who are more politically left-leaning tend to support it.
* * * * *
But it is not a person’s politics that are the key drivers of attitudes; it is our underlying beliefs and values that affect our political alignment.
If your underlying belief system is that humans should dominate or tame nature (anthropocentrism), that economic growth is inherently good for society and should be maintained at all costs, and that an individual's rights are more important than the public good – then the idea that individual actions are actually causing damage will conflict so strongly with that belief system that you will instinctively reject it.
Likewise, if you believe that humanity must live in equilibrium with the planet (geocentrism), that we need to put the brakes on material progress to be more sustainable, and that public good is more important than individual rights – then the concept of human-induced climate change aligns well with your belief system and you will accept it very easily.
People then shop around for the data that most supports their existing values. And if you can’t find it in scientific studies, rest assured you will find it somewhere else, such as on the internet.
An interesting statistic from a 2003 PhD study by Cathy Fraser at the Australian National University into vaccinating and non-vaccinating parents, all of whom had access to the standard Health Department publications on vaccinations, shows that while only 1.6 per cent of vaccinating parents used the internet for more information, 36.2 per cent of non-vaccinating parents sought data from it.
So is getting more good facts out there the answer? Maybe not. Brendan Nyhan, at the University of Michigan, undertook a study which found that when people were shown information proving that their beliefs were wrong, they actually became more set in those beliefs. This is known in the business as ‘backfire’.
And what’s more, highly intelligent people tend to suffer backfire more than less intelligent people do. The adage that attitudes that are not formed by logic and facts cannot be influenced by logic and facts holds true here.
So what about providing the public with more balanced and factual information?
Well, that can be a problem too. When you present the public with both sides of a story, giving them the arguments for and against, research shows that again, people’s existing attitudes tend to become more entrenched. This research, conducted by Andrew Binder at North Carolina State University, found that most people, when faced with an issue related to science and technology, fairly quickly adopted an initial position of support or opposition, based on their own personal combination of mental shortcuts and previously held beliefs.
And the more people with opposing points of view talked together about divisive science and technology issues – such as GM food, nanotechnology, stem cells, take your pick – the less likely the different camps were to agree on any issue or even see it the same way.
Binder stated, ‘This is problematic because it suggests that individuals are very selective in choosing their discussion partners, and hearing only what they want to hear during discussions of controversial issues.’
This means that the media’s tendency to aim for balance in their stories, particularly on contentious topics, giving first one side of the argument and then the other, can actually exacerbate this problem of polarised extreme opinions.
* * * * *
The next thing we need to know is that the dismissal of facts and figures intensifies when somebody is highly emotive about a topic. So look around and see who is playing the 'scare card' and whipping up emotional concern. The more agitated, scared, upset or angry we are, the more receptive to emotive messages – and the less receptive to facts – we become.
Which brings us to the fear factor. Former US President Franklin D. Roosevelt once said, 'The only thing we have to fear is fear itself.' If only.
According to Frank Furedi, professor of sociology at the University of Kent and author of The Precautionary Principle and the Crisis of Causality, we are losing our capacity to deal with the unknown because we increasingly believe that we are powerless to deal with the perils confronting us.
He says that one of the many consequences of this is a growth in policies designed to deal with threats or risks being based on feeling and intuitions, rather than on evidence and facts.
Jenny McCarthy, celebrity leader of the anti-vaccination movement in the US, says she bases her rejection of vaccines on intuition. Likewise, many alternative therapists advocate that people put their trust in their own intuition to justify their choices.
* * * * *
So when people choose not to vaccinate, it’s not because they are stupid – it’s because their fear of the harm from vaccination has become stronger than their fear of the harm from not vaccinating, even though the evidence shows that they are wrong.
The diseases that we vaccinate against are these days unknown and unseen – we no longer see children dying from whooping cough or suffering from polio. However, what we do see are stories of children suffering autism and other conditions supposedly as a result of vaccinations.
No matter how small a risk this might be, it is one that is visible and known – and therefore given a higher priority.
A serious outbreak of whooping cough or measles might change all this, of course, and at the moment this is a dangerous possibility in some parts of Australia and the US: in California alone there were over 7800 cases of whooping cough in 2010, with ten deaths, due to lack of vaccination.
US health officials, when facing a large public rejection of smallpox vaccinations at the turn of the 20th century, talked about the need for a ‘fool killer’ – an outbreak of smallpox devastating enough to convince people of the need for vaccinations and overturn their intuitive mistrust of them.
Furedi argues that this reliance on intuition, which has served us well for tens of thousands of years – by stopping us from doing things like stepping out of the safe cave into the dangerous dark of night – can lead to superstition and belief in paranormal phenomena and pseudoscience.
And according to Bruce Hood from the University of Bristol, humans have evolved to be susceptible to supernatural beliefs. He has postulated that the human mind is adapted to reason intuitively and to understand unobservable properties, such as what makes something alive, or people’s motivations. On the plus side, it is this intuitive thinking that has led to many scientific theories and revelations, such as gravity; but it also leaves us prone to irrational ideas.
Furedi has stated that misconceptions about the workings of the world around us, such as astrology and other supernatural beliefs, are due to naïve intuitive theories. Psychologists Marjaana Lindeman and Kia Aarnio, from the University of Helsinki, have gone one step further and described these as 'immature errors of reasoning', on a par with those of children still learning about the natural world.
They say there are three major sorts of knowledge that determine children’s understanding of the world: intuitive physics, which is an understanding of the physical world; intuitive psychology, which is an understanding of how people think and behave; and intuitive biology, which is an understanding of the principles and forces of life.
When we mix these up, they argue – such as by investing physical objects such as crystals with healing powers – we are suffering from ‘ontological confusion’. This confusion underpins many alternative health treatments that are based on the belief that thought can alter health outcomes, or that touch can convey healing power.
Similarly, they state that cognitive errors underlie homeopathy, reiki, healing by touch, distance healing and birth affirmations, which are often based on attributing some sort of ‘life forces’ to physical events.
* * * * *
Which brings us to the next thing we need to better understand – the impact of uncertainty and control. At the heart of many of our non-scientific beliefs is a need for more control.
We live in an always uncertain and ever more out-of-control world, but superstitious beliefs and pseudoscience can give people a sense of control and certainty, providing simple answers that reduce their levels of stress – and stress reduction is itself an adaptive mechanism that we are wired to seek out.