I Think You'll Find It's a Bit More Complicated Than That


by Ben Goldacre




  Copyright

  Fourth Estate

  An imprint of HarperCollinsPublishers

  77–85 Fulham Palace Road,

  Hammersmith, London W6 8JB

  www.4thestate.co.uk

  First published in Great Britain by Fourth Estate in 2014

  Copyright © Ben Goldacre 2014

  Ben Goldacre asserts the moral right to be identified as the author of this work

  ‘Dr Goldacre Doesn’t Make Everything Better’ by Jeremy Laurance © the Independent/www.independent.co.uk

  A catalogue record for this book is available from the British Library

  All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the non-exclusive, non-transferable right to access and read the text of this e-book on-screen. No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage and retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins.

  Source ISBN: 9780007462483

  Ebook Edition © October 2014 ISBN: 9780007505159

  Version: 2014-09-18

  Dedication

  To whom it may concern.

  And Archie.

  And Alice.

  Contents

  Cover

  Title Page

  Copyright

  Dedication

  Intro

  HOW SCIENCE WORKS

  Why Won’t Professor Susan Greenfield Publish This Theory in a Scientific Journal?

  Cherry-Picking Is Bad. At Least Warn Us When You Do It

  Being Wrong

  Kids Who Spot Bullshit, and the Adults Who Get Upset About It

  Existential Angst About the Bigger Picture

  The Glorious Mess of Real Scientific Results

  Nullius in Verba

  Is It OK to Ignore Results from People You Don’t Trust?

  Foreign Substances in Your Precious Bodily Fluids

  How Myths Are Made

  Publish or Be Damned

  Academic Papers Are Hidden from the Public. Here’s Some Direct Action

  BIOLOGISING

  Neuro-Realism

  The Stigma Gene

  Pink, Pink, Pink, Pink. Pink Moan

  STATISTICS

  Guns Don’t Kill People, Puppies Do

  Datamining for Terrorists Would Be Lovely If It Worked

  Benford’s Law: Using Stats to Bust an Entire Nation for Naughtiness

  The Certainty of Chance

  Sampling Error, the Unspoken Issue Behind Small Number Changes in the News

  Scientific Proof That We Live in a Warmer and More Caring Universe

  Drink Coffee, See Dead People

  Voices of the Ancients

  BIG DATA

  There’s Something Magical About Watching Patterns Emerge from Data

  Give Us the Data

  Care.data Can Save Lives: But Not If We Bungle It

  Care.data Has Been Bungled

  SURVEYS

  The Huff

  A New and Interesting Form of Wrong

  ‘Hello Madam, Would You Like Your Children to Be Unemployed?’

  EPIDEMIOLOGY

  Beau Funnel

  When Journalists Do Primary Research

  Confound You!

  Bicycle Helmets and the Law

  Screen Test

  How Do You Know?

  Anecdotes Are Great. If They Really Illustrate the Data

  The Strange Case of the Magnetic Wine

  What Is Science? First, Magnetise Your Wine …

  BAD ACADEMIA

  What If Academics Were as Dumb as Quacks with Statistics?

  Brain-Imaging Studies Report More Positive Findings Than Their Numbers Can Support. This Is Fishy

  ‘None of Your Damn Business’

  Twelve Monkeys. No … Eight. Wait, Sorry, I Meant Fourteen

  Medical Hypotheses Fails the Aids Test

  Observations on the Classification of Idiots

  More Crap Journals?

  GOVERNMENT STATISTICS

  If You Want to Be Trusted More: Claim Less

  Is This the Worst Government Statistic Ever Created?

  Anarchy for the UK. Ish.

  More Than Sixty Children Saved from Abuse

  Home Taping Didn’t Kill Music

  Is This a Joke?

  EVIDENCE-BASED POLICY

  I’d Expect This from UKIP, or the Daily Mail. Not from a Government Leaflet

  Andrew Lansley and His Imaginary Evidence

  Why Is Evidence So Hard for Politicians?

  Politicians Can Divine Which Policy Works Best by Using Their Special Magic Politician Beam

  Pornography in Hospitals

  The Power of Ideas

  ‘Exams Are Getting Easier’

  Over There! An Eight-Mile-High Distraction Made of Posh Chocolate!

  As Far as I Understand Thinktanks …

  Meaningful Debates Need Clear Information

  Minority Retort

  Building Evidence into Education

  DRUGS

  A Rock of Crack as Big as the Ritz

  The Least Surrogate Outcome

  Heroin on Prescription

  LIBEL

  NMT Is Suing Dr Peter Wilmshurst. So How Trustworthy Is This Company? Let’s Look at Its Website …

  ‘We Are More Possible Than You Can Powerfully Imagine’

  Science Is About Embracing Your Knockers

  The Return of Dr McKeith

  QUACKS

  The Noble and Ancient Tradition of Moron-Baiting

  How Do You Regulate Wu?

  Blame Everyone But Yourselves

  MAGIC BOXES

  ADE 651: WTF?

  After Madeleine, Why Not Bin Laden?

  Who’s Holding the Smoking Gun on Bioresonance?

  AIDS

  House of Numbers

  Aids Denialism at the Spectator

  ELECTROSENSITIVITY

  Wi-Fi Wants to Kill Your Children … But Alasdair Philips of Powerwatch Sells the Cure!

  Why Don’t Journalists Mention the Data?

  POST-MODERNISM

  Archie Cochrane: ‘Fascist’

  IRRATIONALITY

  The Golden Arse-Beam Method

  Illusions of Control

  Empathy’s Failures

  Blind Prejudice

  Yeah, Well, You Can Prove Anything with Science

  Superstition

  Evidence-Based Smear Campaigns

  Why Cigarette Packs Matter

  All Bow Before the Mighty Power of the Nocebo Effect

  So Brilliantly You’ve Presented a Really Transgressive Case Through the Mainstream Media

  BAD JOURNALISM

  Asking for It

  Jab ‘as Deadly as the Cancer’

  Health Warning: Exercise Makes You Fat

  The Caveat in Paragraph Number 19

  Why Don’t Journalists Link to Primary Sources?

  A Fishy Friend, and His Friends

  MMR: The Scare Stories Are Back

  Prevention Is Better than Cure When It Comes to Health Scares

  Dodgy Academic PR

  Suicide

  Roger Coghill and the Aids Test

  BRAINIAC

  Ka-Boom! Science! COOL!!?!

  Who’s the Daddy?

  STUFF

  Here’s My … Foreword to the Romney, Hythe and Dymchurch Railway Guidebook

  How I Stalked My Girlfriend
  EARLY SNARKS

  Staying Beautiful Is Easy to Do

  Because You’re Worth It

  More Than Water?

  ‘Nanniebots’ to Catch Paedophiles

  Nanniebots and Neverland

  Artificial Intransigence

  BOOKENDS

  Be Very Afraid: The Bad Science Manifesto

  What Eight Years of Writing the Bad Science Column Has Taught Me

  Footnotes

  Notes

  List of Illustrations

  Index

  Acknowledgements

  About the Author

  Also by Ben Goldacre

  About the Publisher

  Intro

  This is a collection of my most fun fights, but the fighting is just an excuse. There is nothing complicated about science, and people can understand anything, if they’re sufficiently motivated. Coincidentally, people like fights. That’s why I’ve spent the last ten years lashing science to mockery: it’s the cleanest way I know to help people see the joy of statistics, and the fascinating ways that evidence can be distorted or ignored.

  But these aren’t personal attacks, and I’m not an angry person. All too often, people hoping to make science accessible fall into the trap of triumphalism, presenting science as a canon, and a collection of true facts. In reality, science is about the squabble. Every fight you will read in this book, over the meaning of some data, is the story of the scientific process itself: you present your idea, you present your evidence, and we all take turns to try and pull them both apart. This process of close critical appraisal isn’t something we tolerate reluctantly, in science, with a grudge: far from it. Criticism and close examination of evidence is actively welcomed – it is the absolute core of the process – because ideas only exist to be pulled apart, and this is how we spiral in on the truth.

  Away from the newspapers and science TV shows you can see that process, very clearly, in the institutions of science. The question-and-answer session at any academic conference, after someone presents their scientific research, is often a bloodbath: but nobody’s resentful, everyone expects it, and we all consent to it, as a kind of intellectual S&M activity. We know it’s good for our souls. If the idea survives, then great; if it needs more evidence, we decide what studies are needed next and do them. Then we all come back next year, tear the evidence apart again, and have another think. Real scientists know this. Only the fakers cry foul.

  In short, this book has a manifesto: check the evidence and fight back against anyone who tries to stop you. Along the way, you will get a grounding in statistics, study design, evidence-based policy and much more, in bite-size chunks. Because while my last two books – Bad Science and Bad Pharma – were polemics with a shape, this is a racing collection of short pieces. As such, I hope it works as a kind of statistics toilet book, bringing satisfaction in short bursts, with a fight and an idea in each one.

  So in the section of this book on surveys, we laugh at the stupidity of the nuclear power industry, some silly anti-abortionists and Stonewall (whom I actually adore). Or, if you prefer: we learn about the distortions of ‘participant bias’, misleading question design and a sticky problem involving a complex time-dependent variable. In the first piece of the book we cover some surprisingly unprofessional behaviour from a Baroness, Professor and former Director of the Royal Institution. Or, if you prefer: we cover post-publication peer review and why the conventions of academic journals are helpful.

  These pieces cover two decades of work. There are lots of Guardian columns, but also academic papers, a report for the UK education minister, my work in the Romney, Hythe and Dymchurch Railway Guidebook, the odd undergraduate essay and more. If I’m honest, it’s pretty soulful (for me, not you) looking back over two decades and seeing what has changed. I was in my twenties and barely out of medical school when I started writing a column in the Guardian. As time passed, the targets got bigger, my day job took me through postgraduate qualifications and grown-up battles, and I think I got better at pulling claims apart. There was also discipline from outside: writing about other people’s misdeeds, collecting ever greater numbers of increasingly powerful enemies – and all under British libel law – is like doing pop science with a gun to your head. So for that, thanks.

  At the end I might tell you a little about how I work, why I do what I do, who made me, and how things have changed over the past two decades. For now, let’s just say I’m very grateful to all the many companies and people who, by their optimistically bad behaviour under fire, have given narrative colour to what might otherwise have been some very dry explanations of basic statistical principles.

  What’s in this book

  I’ve written 500,000 words in the last decade, so there is no repetition, and the corpses of folk like Gillian McKeith, the homeopaths and Big Pharma are left in my previous two books (although these characters fight on, like zombies, in the real world). My academic work on statins and Big Data is saved for a fun project that will be launching shortly. Lastly, most of my writing on randomised trials in education, policing and everywhere else is held back, as my book on this topic will come out in due course.

  There is, however, some structure to this school reunion. In HOW SCIENCE WORKS we cover peer review, how research is unpicked and critiqued after publication, how we deal with contradictory research, the importance of methods and results being freely available, whether it matters who a researcher is, how cherry-picking harms science, and how myths are made when inconvenient results are ignored.

  In BIOLOGISING we cover crass reductionism, including the peculiar beliefs that pain is only real when we can see it on a brain scanner, that misery is best thought of as molecular, and that girls like pink ‘because they evolved to look for berries’. In STATISTICS we start with easy maths and accelerate painlessly to some fairly advanced notions. We cover why the odds of three siblings sharing a birthday are not 48,627,125 to 1, why spying on us all to spot the occasional terrorist is highly unlikely to work, how statistical tools for detecting fraud helped catch Greece faking its national economic data, what you can tell from a change in abortion rates for Down’s syndrome, the many ways you can slice data to get the answer you want, the hazards of looking for spatial patterns on maps, and the most fundamental statistical skill of all: how to distinguish a true signal from everyday variation in background noise.
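  A quick aside on that birthday figure, as a worked example rather than anything taken from the chapter itself: 48,627,125 is just 365 × 365 × 365, the odds you get if you insist that all three birthdays land on one pre-specified day. But the first sibling’s birthday can fall on any day at all; only the second and third need to match it, which gives odds of roughly 1 in 133,225 (365 × 365). A minimal sketch of that arithmetic, assuming 365 equally likely birthdays and ignoring leap years:

```python
from fractions import Fraction

# Naive (wrong) figure: insist that all three siblings are born on one
# pre-specified day -> (1/365)^3, i.e. 1 in 48,627,125.
naive = Fraction(1, 365) ** 3

# Better figure: the first birthday can be anything; only the second and
# third siblings need to match it -> (1/365)^2, i.e. 1 in 133,225.
matching = Fraction(1, 365) ** 2

print(f"naive:    1 in {naive.denominator:,}")     # 1 in 48,627,125
print(f"matching: 1 in {matching.denominator:,}")  # 1 in 133,225
```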

  Then we go on to the glory of BIG DATA, the battles with government to get hold of it, the risks of sharing medical records with all and sundry, and the magical way that patterns emerge from the formless static of everyday life when you have huge numbers. In SURVEYS we learn the tricks of a sticky trade, and then we shift up a gear to cover EPIDEMIOLOGY, my day job, the science of spotting patterns in disease. Here we see how clever things called funnel plots can help to show whether one area’s healthcare really is any worse than another’s, whether an increase in antidepressant prescriptions really does mean more people are depressed (or even whether more people are taking antidepressants), and the core skill of all epidemiology: how to correct for ‘confounding variables’, or rather: how to make sure that apparent correlations in your data are real. In an overview of bicycle helmet research, we review every epidemiological error in the textbooks, and a grand claim about the benefits of screening for diseases helps show that doing something – even something small – can often be worse than doing nothing at all. We see why different study designs are needed to research common and rare diseases, and how frail memories can distort the findings, why we should never assume that laboratory tests are correlated with real patients’ suffering, and how simple blinded experiments can spot if a £70 wine magnetiser really does change the flavour.

  In the section on BAD ACADEMIA, we see how whole fields have been undermined by the simple misuse of statistics. We find one simple statistical error made in half of all neuroscience papers, and, by using forensic methods, we can see that brain-imaging researchers must be up to no good, because collectively they are publishing far more positive findings than the overall numbers of participants in their research could possibly, plausibly, statistically, sustain. We see bad behaviour around journals retracting papers, and appallingly poor standards in animal research, alongside academic journals publishing wildly crass papers on how, for example, people with Down’s syndrome really are a bit like the Chinese.

  In GOVERNMENT STATISTICS we see ludicrous overclaiming around public and private sector salaries (where commentators fail to compare like with like), Home Office figures on child abuse pulled almost from thin air, a government figure on the cost of piracy that assumes everyone in the country should be spending £9,700 each on DVDs and music every year, crime prevention numbers to support a national DNA database that simply do not add up, and a headline figure on local council overspending, from the Department for Communities and Local Government, whose derivation is so offensively stupid it almost defies belief. We also see there is no evidence that hosting events like the Olympics has any health benefit for the host nation.

  EVIDENCE-BASED POLICY is a slightly different fish: is there really good evidence for the policies that governments choose? Here we see that the evidence supporting the redisorganisation of the NHS is weak and that the figures on poor performance in the NHS used to justify it are over a decade old, and when the minister tries to argue back, he digs a very deep hole. We see how a historic failure to run simple randomised trials on policy issues has left us ignorant on basic questions about what works, and then whizz through a few simple questions, showing how evidence can be checked for each one: is porn in sperm donor clinics a good idea, is organic food really better, is it wise for the Catholic Church to campaign against condoms, and are exams really getting easier? We see a thinktank report on maths, promoted by a TV maths professor, that gets its own maths catastrophically wrong, and a select committee misleading, and being misled. After all this carping, in a report for the Department for Education I set out how the teaching profession could have its own evidence-based practice revolution to mirror what we’ve seen in medicine (and review, along the way, how senior doctors as late as the 1970s fought back, to defend that favourite of the old and powerful: eminence-based medicine).

  Recreational DRUGS are a magnet for bad policy, because ideology often conflicts with the evidence, so the temptation to distort the data is powerful. Here we see wildly inflated government figures for crop captures in Afghanistan (with a minister claiming that peasant farmers receive the entire street price from every £10 bag of heroin sold in London), and ask why death was quietly dropped from the government’s measures of drug-policy success, before an essay explaining why the UK prescribed heroin for heroin addicts from the 1920s onwards, why we stopped and why we should start again.

 
