The Scientific Attitude
Meanwhile, a few stunning facts about Wakefield’s original study came to light. In 2004, it was discovered that Wakefield had been on the payroll of an attorney who was planning a massive lawsuit against an MMR vaccine manufacturer. Worse, it turned out that almost half the children who had been reported on in Wakefield’s study had been funneled to him through the lawyer. Finally, it was learned that just before Wakefield published his study, he had filed a patent for a competing vaccine to the classic MMR shot.32 Far from mere selection bias, this was a massive undisclosed conflict of interest that raised serious questions about Wakefield’s motives. Within days, ten of Wakefield’s coauthors took their names off the study.
But by this point it was too late. The public had already heard the rumors and vaccination rates had begun to drop. In Ashland, Oregon, there was a 30 percent vaccination exemption rate. In Marin County, California, the exemption rate was more than three times that of the rest of the state.33 With such pockets of vaccine resistance, doctors began to worry about the loss of “herd immunity”: when the vaccination rate falls too low, the unvaccinated can no longer count on the “free rider” benefit of living in a community where most others have been vaccinated. And the results were devastating. After being beaten to a standstill, measles, whooping cough, diphtheria, and other diseases began to make a comeback:
[Measles] is the most infectious microbe known to man and has killed more children than any other disease in history. A decade after the World Health Organization (WHO) declared the virus effectively eradicated everywhere in the Americas save for the Dominican Republic and Haiti, declining vaccination rates have led to an explosion of outbreaks around the world. In Great Britain, there’s been more than a thousandfold increase in measles cases since 2000. In the United States, there have been outbreaks in many of the country’s most populous states, including Illinois, New York, and Wisconsin.34
It didn’t help that many in the media were whipping up the story, trying to tell “both sides” of the vaccine “controversy.”35 Meanwhile, many parents of autistic children didn’t care about any alleged irregularities in Wakefield’s work. He continued to speak at autism conferences worldwide, where he was treated as a hero. When the Lancet finally retracted his paper (in 2010) and Wakefield was stripped of his medical license in Britain, conspiracy theories began to run wild. Why was his work being suppressed? Angry parents (including a number of Hollywood celebrities) were already organized and furious at what they saw as a cover-up. If thimerosal wasn’t dangerous, why had it been removed?
Then in 2011, definitive word came: Wakefield’s work was a fraud. In addition to the severe conflict of interest noted above, Brian Deer (an investigative journalist who had broken many of the earlier revelations in 2004) finally had a chance to interview the parents of Wakefield’s patients and examine their medical records. And what he found was shocking. “No case was free of misreporting or alteration.”36 Wakefield had misreported or altered the medical history of every single child in the study.
Three of nine children reported with regressive autism did not have autism diagnosed at all. Only one child clearly had regressive autism.
Despite the paper claiming that all 12 children were “previously normal,” five had documented pre-existing developmental concerns.
Some children were reported to have experienced first behavioural symptoms within days of MMR, but the records documented these as starting some months after vaccination. …
The parents of eight children were reported as blaming MMR, but 11 families made this allegation at the hospital. The exclusion of three allegations—all giving times to onset of problems in months—helped to create the appearance of a 14 day temporal link.
Patients were recruited through anti-MMR campaigners, and the study was commissioned and funded for planned litigation.37
The British Medical Journal (perhaps the second-most prestigious medical journal in Britain, after the Lancet) took the unprecedented step of accepting Deer’s work as definitive evidence of fraud and, after it had been peer reviewed, published his paper alongside their own editorial, which concluded that “clear evidence of falsification of data should now close the door on this damaging vaccine scare” and called Wakefield’s work an “elaborate fraud.”38 They concluded:
Who perpetrated this fraud? There is no doubt that it was Wakefield. Is it possible that he was wrong, but not dishonest: that he was so incompetent that he was unable to fairly describe the project, or to report even one of the 12 children’s cases accurately? No. A great deal of thought and effort must have gone into drafting the paper to achieve the results he wanted: the discrepancies all led in one direction; misreporting was gross.39
A few months later another commentator called Wakefield’s fraud “the most damaging medical hoax of the last 100 years.”40 Four years later, in early 2015, there was a measles outbreak with over a hundred confirmed cases across fourteen states in the US.41
As we can see, scientific fraud is ugly and the fallout can be massive.42 Yet one of the most interesting parts of the story is the enormous scorn heaped on Wakefield’s work by the scientific community (juxtaposed, unfortunately, against public confusion and willful ignorance enabled by the media) before he was proven to be a fraud. Why did this occur? If fraud must be intentional, how did the scientific community seem to reach a consensus in advance of seeing proof that Wakefield had manipulated data? The answer is that although fraud is perhaps the most egregious form of intentional misconduct, it is not the only kind of cheating one can do. Once it had come to light that Wakefield had an enormous undisclosed conflict of interest, his intentions were suspect. Even though no one had yet proven that his financial interests had colored his scientific work, where there was so much smoke, few in the scientific community could see how there was not a huge fire behind it. Since Wakefield had already forsaken a core principle of scientific practice—that one must disclose in advance all possible conflicts of interest—many concluded that he didn’t deserve the benefit of the doubt. And they were right. Yet one mourns that the scientific community’s self-correction in this case still has not made its way to all corners of the general population.43
On a Happier Note
I would like to end on a brighter note. In this chapter, we have encountered perhaps the ugliest face of science. But what should a scientist do if his or her theory isn’t working, when the time and career pressures are massive and the data are coming out all wrong?
A few years before Andrew Wakefield’s paper, a little-known British astronomer named Andrew Lyne stood before several hundred colleagues at the American Astronomical Society meeting in Atlanta, Georgia. He had been invited to give a paper on his stunning discovery of a planet orbiting a pulsar. How could that happen? A pulsar is the remnant of a star that has exploded in a supernova, which theoretically should have destroyed anything orbiting close to it. And yet, when Lyne rechecked his results, the planet remained, so he published his paper in the prestigious journal Nature. But now there was a problem. A few weeks before his trip to Atlanta, Lyne discovered a crucial error in one of his calculations: he had forgotten to account for the fact that the Earth’s orbit is elliptical rather than circular. This was a mistake from first-year physics. When he made the correction, “the planet disappeared.” But, standing there that day in front of his colleagues, Lyne made no excuse for himself. He told the audience what he had found and then told them why he had been wrong, after which they gave him a standing ovation. It was “the most honorable thing I’ve ever seen,” said one astronomer who was present. “A good scientist is ruthlessly honest with him- or herself, and that’s what you’ve just witnessed.”44
That is the true spirit of the scientific attitude.
Notes
1. Although there are different ways of defining it, the federal standard for research misconduct is intentional “fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results.” Of special note is the further statement that “research misconduct does not include honest error or differences of opinion.” See https://www.aps.org/policy/statements/upload/federalpolicy.pdf. This matches most university policies. For a thoughtful history of one university’s attempt to come up with its own definition, in conformity with the federal guidelines, see David Goodstein, On Fact and Fraud: Cautionary Tales from the Front Lines of Science (Princeton: Princeton University Press, 2010), 67. Of particular note in Goodstein’s account is the subtle issue of why it may not be a good thing for a university to have an overly broad definition of prohibited behaviors, or to lump fraud in with all other forms of research misconduct. Caltech’s full policy can be found in Goodstein, On Fact and Fraud, 136.
2. See text accompanying chapter 2, note 29.
3. Note that as a consequence of this very precise logical definition we have defined not just fraud but also what is not fraud. If one has not intentionally fabricated or falsified data, then one has not committed fraud, and if one has not committed fraud, then one has not intentionally fabricated or falsified data.
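For readers who like the logic spelled out, the point can be put in symbols (a minimal sketch; the letters F and D are my shorthand, not part of the federal definition). Let F stand for “has committed fraud” and D for “has intentionally fabricated or falsified data.” The definition asserts a biconditional,
\[
F \leftrightarrow D,
\]
and contraposing each direction yields exactly the two claims in this note:
\[
\neg D \rightarrow \neg F \qquad \text{and} \qquad \neg F \rightarrow \neg D.
\]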
4. And of course it does not follow from thesis (3).
5. In some special cases one probably could make a case that p-hacking constituted fraud—for instance, if one knew for a fact that there was no underlying correlation.
6. Naturally, we could change the definition, but see Goodstein, On Fact and Fraud, on the problems with including “questionable research practices” in the definition of fraud.
7. Again some of these might be fraud, depending on the individual circumstances (see Trivers, “Fraud, Disclosure, and Degrees of Freedom in Science,” Psychology Today, May 10, 2012), and there is also the fascinating question of whether things like this can evolve into fraud.
8. Goodstein was Caltech’s vice provost for nearly twenty years, during which time he oversaw all cases of scientific misconduct.
9. Goodstein, On Fact and Fraud, 2.
10. It is interesting to note that later in his book, Goodstein gives an account of what “self-correction” means that squares nicely with my account of the scientific attitude. He argues that self-correction can hardly mean that we expect individual researchers to question the validity of their own work. Rather, we mean something more like the methods of empirical scrutiny that are implemented by the “scientific community as a whole.” Goodstein, On Fact and Fraud, 79.
11. “Injecting falsehoods into the body of science is rarely, if ever, the purpose of those who perpetrate fraud. They almost always believe that they are injecting a truth into the scientific record … but without going through all the trouble that the real scientific method demands.” Goodstein, On Fact and Fraud, 2.
12. Although some philosophers of science have recently taken up the question of fraud, most have not yet graduated from the stereotypical view that all fraud is characterized by “knowingly enter[ing] falsehoods into the information stream of science.” See Liam Bright, “On Fraud,” Philosophical Studies 174, no. 2 (2017): 291–310.
13. See especially Plato, Theaetetus, 200d–201c; Meno, 86b–c.
14. In his book The Folly of Fools (New York: Basic Books, 2011), Robert Trivers offers a stirring discussion of self-delusion, suggesting that perhaps we learn to fool ourselves because it makes us more capable of fooling others.
15. Meno, 98b.
16. Perhaps they are worried about litigation, or fear that it will tarnish the reputation of the school where the fraudster worked. Goodstein proposes that one difficulty created by this is that in cases where someone is exonerated they get lots of press, but in cases of actual fraud there is untoward pressure to maintain confidentiality and protect the guilty (On Fact and Fraud, xii). (If true, one wonders whether this violates the spirit of science and makes it harder to police the line between mistakes and misconduct.) Goodstein notes, however, one recent case at Stanford where school officials pledged before the investigation was even complete to make the results public (99).
17. In one salient example, Iowa State University scientist Dong-Pyou Han was sentenced to over four years in prison for faking his AIDS research. See Tony Leys, “Ex-Scientist Sentenced to Prison for Academic Fraud,” USA Today, July 1, 2015, http://www.usatoday.com/story/news/nation/2015/07/01/ex-scientist-sentenced-prison-academic-fraud/29596271/. In another case, South Korean scientist Hwang Woo-suk was sentenced for fraudulent work on stem cells. See Choe Sang-Hun, “Disgraced Cloning Expert Convicted in South Korea,” New York Times, Oct. 26, 2009, https://www.nytimes.com/2009/10/27/world/asia/27clone.html.
18. Though this can happen. Consider the case of Thereza Imanishi-Kari, who was investigated by the federal government and charged with falsifying laboratory data; the case collapsed and she was exonerated. “The Fraud Case That Evaporated,” New York Times (Opinion), June 25, 1996, http://www.nytimes.com/1996/06/25/opinion/the-fraud-case-that-evaporated.html.
19. See Carolyn Y. Johnson, “Ex-Harvard Scientist Fabricated, Manipulated Data, Report Says,” Boston Globe, Sept. 5, 2012, https://www.bostonglobe.com/news/science/2012/09/05/harvard-professor-who-resigned-fabricated-manipulated-data-says/6gDVkzPNxv1ZDkh4wVnKhO/story.html.
20. Goodstein, On Fact and Fraud, 65.
21. Goodstein, On Fact and Fraud, 60–61.
22. Once again, consider Trivers’s work on the possible connection between deception and self-deception.
23. Robert Park, Voodoo Science: The Road from Foolishness to Fraud (Oxford: Oxford University Press, 2000), 10.
24. Goodstein, On Fact and Fraud, 129.
25. Goodstein, On Fact and Fraud, 70.
26. Is there a possible analogy here between fraud and pseudoscience? Is self-delusion what we condemn in pseudoscientists? See text accompanying chapter 5, note 53, where I claim that the difference between techniques used by scientists and pseudoscientists is only a matter of degree.
27. Of course, it might depend on why the data were not stored correctly. Deleting one’s original data sets is an extremely bad practice, but if it is done to thwart an inquiry into irregularities in one’s research, it can constitute fraud.
28. Goodstein, On Fact and Fraud, xiii–xiv.
29. Some might argue, though, that this sort of preoccupation with the question of “intent” could have the opposite effect and make the distinction between good and bad science even more impenetrable. Yet do we not already do this in cases of fraud? Sometimes intent has to be inferred from action, but this is no excuse for focusing only on a proxy, like article retraction.
30. Seth Mnookin, The Panic Virus: The True Story Behind the Vaccine–Autism Controversy (New York: Simon and Schuster, 2011), 109.
31. Michael Specter, Denialism (New York: Penguin, 2009), 71.
32. Mnookin, Panic Virus, 236.
33. Mnookin, Panic Virus, 305.
34. Mnookin, Panic Virus, 19.
35. Some were getting the word out: Jennifer Steinhauer, “Rising Public Health Risk Seen as More Parents Reject Vaccines,” New York Times, March 21, 2008.
36. Brian Deer, “How the Case against the MMR Vaccine Was Fixed,” British Medical Journal 342 (2011): c5347.
37. Deer, “How the Case against the MMR Vaccine Was Fixed,” c5347.
38. F. Godlee et al., “Wakefield Article Linking MMR Vaccine and Autism Was Fraudulent,” British Medical Journal 342 (2011): c7452.
39. Godlee et al., “Wakefield Article,” 2.
40. D. K. Flaherty, “The Vaccine–Autism Connection: A Public Health Crisis Caused by Unethical Medical Practices and Fraudulent Science,” Annals of Pharmacotherapy 45, no. 10 (2011): 1302–1304.
41. Mark Berman, “More Than 100 Confirmed Cases of Measles in the U.S., CDC Says,” Washington Post, Feb. 2, 2015.
42. All of his coauthors claimed not to have known of Wakefield’s deep conflict of interest or manipulation of data, though two of them were later investigated by the General Medical Council in Britain, and one was found guilty of misconduct.
43. This has been exacerbated by continued publicity for Wakefield’s discredited hypothesis from public figures such as Robert F. Kennedy, Jr., in his Thimerosal: Let the Science Speak: The Evidence Supporting the Immediate Removal of Mercury—a Known Neurotoxin—from Vaccines (New York: Skyhorse Publishing, 2015), and by actor Robert De Niro’s 2016 decision to screen (and then pull) the film Vaxxed: From Cover-Up to Catastrophe at the Tribeca Film Festival. De Niro later said that he regretted pulling the film.
44. Michael D. Lemonick, “When Scientists Screw Up,” Science, Oct. 15, 2002.
8 Science Gone Sideways: Denialists, Pseudoscientists, and Other Charlatans
We turn now from fraud—which is a matter of accepting the standards of science, then intentionally violating them—to the case of denialists and pseudoscientists, who may misunderstand the standards of scientific evidence or not care about them at all, or at least not care enough to modify or abandon their ideological beliefs.
Many scientists have found it incredible in recent years that their conclusions about empirical topics are being questioned by those who feel free to disagree with them based on nothing more than gut instinct and ideology. This is irrational and dangerous. Denialism about evolution, climate change, and vaccines has been stirred up by those who have an economic, religious, or political interest in contradicting certain scientific findings. Rather than merely wishing that particular scientific results weren’t true, these groups have resorted to a public relations campaign that has made great strides in undermining the public’s understanding of and respect for science. In part, this strategy has consisted of attempts to “challenge the science” by funding and promoting questionable research—which is almost never subject to peer review—in order to flood news outlets with the appearance of scientific controversy where there is none. The result has been a dangerously successful effort to subvert the credibility of science.