Credulity doesn’t have to be a permanent trait; not everyone who starts off gullible remains that way for life. While some people may simply be more gullible and others more inclined to doubt and look for evidence, there are always ways to get better at being skeptical. One concept that might help you hone your skeptical skills is known as the zebra rule. If you’ve ever been to medical school (or watched Grey’s Anatomy or House, M.D.), you know that Dr. Theodore Woodward of the University of Maryland School of Medicine told his interns that, when they hear hoof beats, they should not expect zebras.71 This analogy is commonly used in the medical field to ensure new doctors look for common illnesses (horses) presenting in uncommon forms before diagnosing a rare disorder (a zebra), but it can also be helpful when learning to be skeptical.
Do you hear a loud noise coming from the attic at night? If you assume a ghost is the culprit, you’re falling into the “zebra” trap or—perhaps more appropriately in this supernatural instance—the unicorn trap. You should consider some horses first. Is it possible that you have a rodent infestation? Could there be a person in your attic? Could something have fallen onto your roof? These are all perfectly reasonable explanations—common horses that should be ruled out before you start hunting for unicorns. Still, this zebra and horse issue isn’t always black and white. It’s important to note that, especially in medicine, zebras, or rare diagnoses, do exist and shouldn’t be ignored. It all depends on the circumstances. That’s why I like to say: when you hear hoof beats, think of horses, not zebras … unless you are in Southern Africa or another region where there are no native horses. In that case, it’s probably a zebra, so act accordingly. This method of deduction is related to Occam’s razor and the parsimony principle, which holds that, when competing explanations fit the evidence equally well, the simplest is most likely the correct one (see chapter 10).72
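To put the zebra rule in numbers, here is a minimal sketch in Python (the base rate and likelihoods are invented purely for illustration, not taken from any medical source) showing how a small prior probability keeps a rare explanation unlikely even when the evidence fits it well:

# Toy Bayes calculation for the 'zebra rule.' All numbers are invented
# for illustration; only the structure of the reasoning matters.

def posterior(prior_rare, likelihood_rare, likelihood_common):
    """P(rare cause | evidence) when the evidence fits both causes."""
    prior_common = 1.0 - prior_rare
    numerator = prior_rare * likelihood_rare
    return numerator / (numerator + prior_common * likelihood_common)

# Hoof beats fit a zebra perfectly (likelihood 1.0) and a horse nearly as
# well (0.9), but suppose only 1 in 10,000 local hoofed animals is a zebra.
print(posterior(prior_rare=0.0001, likelihood_rare=1.0, likelihood_common=0.9))
# ~0.0001 -- evidence that fits perfectly still barely moves a tiny prior.

Swap in the opposite prior, as in a region where zebras are common and horses are absent, and the same arithmetic points the other way.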
Remembering helpful maxims can certainly make it easier to practice skepticism in daily life, but it’s even more crucial to understand and adjust for our many known and unknown biases. We’ve already learned how the feeling of cognitive dissonance can discourage attempts to learn new things, but there are a lot of other mechanisms that distort our perception and actively cause false beliefs. For starters, there’s confirmation bias, which can be considered a result of cognitive dissonance and describes our tendency to agree with those who agree with us and to read mostly material that supports our positions.73 Sheltering ourselves from others’ ideas doesn’t help us get closer to the truth, but for some, it can be comforting. There’s also the human impulse to believe things that are already believed by others, which is part of the bandwagon effect that broadly applies to beliefs and even large-scale group preferences, such as fads and fashion trends. This effect, which can cause people to accept popular beliefs regardless of underlying support, is closely related to groupthink (see chapter 10). We could also be affected by the gambler’s fallacy, the faulty perception that the probabilities of future occurrences are changed by events in the past. In other words, you roll five dice five times, and not one of them turns up a six. If you roll them a sixth time, you’re due for a six, right? Wrong. The odds are exactly the same with each roll (see the short simulation below).74 People are prone to all sorts of biases, and learning to avoid or compensate for them is key to thinking like a scientist. Canadian bioethicist George Dvorsky, who defines a cognitive bias as “a genuine deficiency or limitation in our thinking” or “a flaw in judgment that arises from errors of memory, social attribution, and miscalculations,” says these perceptual errors reveal the brain’s major limitations.
“The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless—plus, we’re subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions,” Dvorsky wrote. “Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes.”
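Dvorsky’s point about the lowly calculator suggests a remedy: when intuition and arithmetic disagree, run the arithmetic. As promised above, here is a short simulation, a sketch in Python rather than anything from Dvorsky or the sources cited here, that checks the gambler’s fallacy directly by counting how often a six appears on a roll, both overall and immediately after five six-less rolls:

import random

# Gambler's-fallacy check: is a six more likely after five rolls with no
# six? Independence says no; the simulated frequencies agree.
random.seed(42)  # fixed seed so the run is reproducible
TRIALS = 1_000_000

sixes_overall = 0
droughts = 0
sixes_after_drought = 0

for _ in range(TRIALS):
    rolls = [random.randint(1, 6) for _ in range(6)]
    if rolls[5] == 6:
        sixes_overall += 1
    if 6 not in rolls[:5]:  # five six-less rolls in a row
        droughts += 1
        if rolls[5] == 6:
            sixes_after_drought += 1

print(f"P(six on any roll)       ~ {sixes_overall / TRIALS:.4f}")
print(f"P(six after five misses) ~ {sixes_after_drought / droughts:.4f}")
# Both hover around 1/6 (about 0.1667): past rolls don't change the odds.

Both printed frequencies land within a fraction of a percent of 1/6, which is exactly what independence predicts.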
Carl Sagan, in The Demon-Haunted World, discussed how humans in general are susceptible to many different kinds of deception. To make people think more like scientists and prevent them from being duped, Sagan put forth a set of cognitive tools called the “baloney detection kit.” He said the kit, if adopted and utilized regularly, can help people avoid buying into false ideas, even when they are reassuring by their very nature. Here are Sagan’s nine techniques that make up the baloney detection kit, excerpted from the book:
1. Wherever possible there must be independent confirmation of the “facts.”
2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
3. Arguments from authority carry little weight—“authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
7. If there’s a chain of argument, every link in the chain must work (including the premise)—not just most of them.
8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle—an electron, say—in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
This list of tools, while extensive and helpful to anyone examining potentially dubious claims, isn’t (and will likely never be) complete. There is just too much baloney in the world, and there are too few high-quality detectors. Science writer Michael Shermer, author of The Believing Brain and other titles, recognized this need for an expanded baloney detection kit that could shield people from additional lies and manipulated truths. He added ten questions to accompany Sagan’s nine pieces of advice:
1. How reliable is the source of the claim?
2. Does the source make similar claims?
3. Have the claims been verified by somebody else?
4. Does this fit with the way the world works?
5. Has anyone tried to disprove the claim?
6. Where does the preponderance of evidence point?
7. Is the claimant playing by the rules of science?
8. Is the claimant providing positive evidence?
9. Does the new theory account for as many phenomena as the old theory?
10. Are personal beliefs driving the claim?75
There are a lot of tools and detection kits that help people distinguish fact from fiction, but no worksheet or script is guaranteed to make you think like a scientist. What’s most important when practicing skepticism is that you ask questions, test answers, think of alternative solutions, and follow the hard evidence.
MAKING SCIENCE POPULAR
Science is an incredibly important process, leading to world-changing discoveries and life-extending medical advancements, but there are millions of people who actively oppose it and even more who ignore it entirely. This problem hasn’t gone unrecognized. In 2014, comedian and TV show host Stephen Colbert asked Neil deGrasse Tyson what would most surprise Carl Sagan, who died in 1996, about the world today. Tyson’s response was, “I think that what would surprise him the most is that we still have to argue that science is something important in society.”
So, how do we get everyone to realize the importance of science and take it seriously? In my opinion, we have to show them that the scientific method isn’t just for people with PhDs and lab coats, and that it can be used to make all sorts of decisions in life. Consider corporal punishment, for instance. If you have a young child and you’re considering methods of discipline, you have two options: you could rely on your personal experiences, perhaps justifying spankings by telling yourself you got them and turned out fine, or you could look at the data to determine which method is actually best in the long run. In this case, the evidence suggests deliberately inflicting pain on children as retribution doesn’t work very well as a behavioral modification method. It may at times make a child immediately comply, but it can also have negative effects on moral internalization, quality of the parent-child relationship, and mental health, and it has been associated with future aggression and criminal or antisocial behavior. That is not to say that everyone who is spanked suffers lasting negative consequences, but the science shows that physical punishments can have some pretty severe repercussions. This isn’t my opinion, though; it’s the conclusion of a thorough analysis of 88 high-quality scientific studies on the topic of corporal punishment.76 Now, will there still be people who hit their children? Of course. But that decision won’t be based on the best scientific information available.
Most people probably don’t look at scientific data when making personal choices, whether about corporal punishment or anything else, and that’s the real problem. It’s not that they look at the information available and—interpreting data in their own way—come to their own conclusions. That would still be an informed decision, which is always better than one made out of ignorance. Instead, they don’t think science comes into play at all. Maybe they don’t have the curiosity, the know-how, or an appreciation for the importance of verifiably true answers, but whatever the reason, the scientific spirit that exists within all of us from childhood has wilted for them. We need to reignite that innate curiosity about the world, encourage scientific pursuits wherever they’re possible (and practical), and tell everyone who will listen about the importance of discovery. Vocal activism is important, but the best way to accomplish this task, in my opinion, is through quality, science-centered education. Author John Green has explained that public education isn’t just for the benefit of the students or their parents; it exists “for the benefit of the social order.”
“We have discovered as a species that it is useful to have an educated population. You do not need to be a student or have a child who is a student to benefit from public education. Every second of your life, you benefit from public education,” Green said. “So let me explain why I like to pay taxes for schools, even though I don’t personally have a kid in school: it’s because I don’t like living in a country with a bunch of stupid people.”
Scientific advancement is still happening. We see new discoveries all the time despite the fact that many people simply don’t care, but such advancement is not enough. In my opinion, what we need is a cultural paradigm shift, as opposed to a scientific one. This shift would change how most citizens perceive science and education, and it would create a societal urgency toward technological advancement. People would revise their priorities, preferring to spend their time learning about the world and contributing to its greatness instead of doing it harm. Soap opera and NASCAR viewership might decrease, but book sales would skyrocket! It would be an entirely different world, but one of which I would be proud to be a part. Writer and journalist Walter Isaacson would likely agree. He is quoted as saying, “I hope that someday scientists can be considered heroes again, instead of Paris Hilton.”
“What do you think science is? There’s nothing magical about science. It is simply a systematic way for carefully and thoroughly observing nature and using consistent logic to evaluate results. Which part of that exactly do you disagree with? Do you disagree with being thorough? Using careful observation? Being systematic? Or using consistent logic?”
—Steven P. Novella