No Sacred Cows

by David G. McAfee


  Not having faith in extraordinary claims that lack substantial scientific evidence should be common sense, but for too many people it is not. In fact, some believers would argue that someone with an evidence-based worldview “worships” science and that scientism is their “religion.” For example, Jamie Holmes, author of Nonsense: The Power of Not Knowing, argues that a love of science “looks a lot like religion.”6

  Holmes, who defines scientism as “the notion that science has exclusive access to the truth,” pointed to a 2013 study in the Journal of Experimental Social Psychology.7 In that study, researchers said they found that stressed subjects were more likely to agree with statements expressing scientism, such as “the scientific method is the only reliable path to knowledge.”

  “When people felt anxious, they esteemed science more highly than calmer subjects did, just as previous experiments have shown to be the case with religious ideals,” Holmes explained, suggesting deep faith in science can be just another form of irrational extremism. “In these cases, beliefs about science may be defended emotionally, even if they are false, as long as they provide a reassuring sense of order. That is to say, beliefs about science may be defended thoughtlessly—even unscientifically.”

  I understand that there may be people who feel this way about science, but for me, this assertion is untrue. In fact, I think science is so amazing specifically because it is fluid and not a clear, all-encompassing answer to every question. Contrary to what some have suggested, I don’t worship or even believe in science; I just find it easier to accept outcomes that are observed repeatedly under controlled conditions. Even if you disagree with me on one or more topics, or you dislike me as a person, I think most would agree that looking for verifiable facts is important and that the scientific method and reasonable thinking should be spread to everyone in the world. Frankly, I think it’s sad that I even have to say I have an “evidence-based worldview.” For any person who suggests scientific evidence isn’t important in shaping their beliefs, I’d just ask, “Why?” Shouldn’t we all strive to believe that which is supported by verifiable facts? What does anyone think they could gain by having false beliefs? Belief without evidence is how blind faith is born, and how people start to unquestioningly accept any idea that sounds remotely possible or aligns with their preconceived ideas.

  To me, rational skepticism seems like a reasonable position because, to date, everything that has ever been discovered and quantified has been linked to a natural cause—with ideas based on the supernatural being rendered completely unnecessary and speculative at best, and harmful at worst. I “believe in” what’s been shown to exist and I’m on an ongoing mission to purge my mind of any unjustified beliefs. I’ve taken on this quest for facts not due to scientism, but because I don’t see the value in believing in something that could easily be untrue. Intellectual honesty is too important to me. Daniel Dennett shot back at those who would accuse him of scientism in his book, Darwin’s Dangerous Idea: Evolution and the Meanings of Life.8

  “It is not ‘scientism’ to concede the objectivity and precision of good science, any more than it is history worship to concede that Napoleon did once rule in France and the Holocaust actually happened,” he wrote. “Those who fear the facts will forever try to discredit the fact-finders.”

  To think like a scientist, you have to stop yourself from accepting rumors as facts and look for scientific evidence only when appropriate. Perhaps most importantly, however, you have to be willing to be proven wrong about anything at any moment.

  WHEN YOU’RE WRONG, ADMIT IT AND MOVE ON

  We all have false beliefs. Yes, even you, the one reading this. You’re wrong about something, some assumption or idea, because everyone is—it’s impossible to escape. Perhaps you were taught as a child that blood inside our bodies is blue and never researched it further.9 Or maybe you bought into the well-studied and debunked (yet incredibly pervasive) myth that full moons cause increased patient volume in hospitals and emergency rooms.10,11,12 You might be clinging to the false notion that humans only utilize 10 percent of our brains,13,14,15 or that it is always impossible to prove a negative claim.16 Or it could be something much simpler, like believing you locked a door when you did not. It’s not important at the moment to nail down your particular false beliefs, but it is crucial that you accept the fact that they exist so you can more easily correct them when the time arises. As author and biochemistry professor Isaac Asimov famously noted, when you are wrong and you want to be right, changing your opinions is the only real option.

  “So the universe is not quite as you thought it was,” Asimov wrote.17 “You’d better rearrange your beliefs, then. Because you certainly can’t rearrange the universe.”

  It is truly an important virtue to be able to admit when you’ve been misled and move past a firmly held belief to become a stronger person who is that much closer to the truth. But the process of separating facts from falsehoods isn’t something that takes place in a day; it’s a lifetime endeavor on which you can embark as long as you begin with an honest and objective approach.18 You have to be ready to follow the evidence wherever it goes—even if that means admitting everything you’ve ever been taught was a lie. Stephen Toulmin, the British philosopher and author, once noted that a person’s rationality can actually be measured by his or her willingness to admit errors.

  “A man demonstrates his rationality, not by a commitment to fixed ideas, stereotyped procedures, or immutable concepts, but by the manner in which, and the occasions on which, he changes those ideas, procedures, and concepts,” Toulmin wrote in his book, Human Understanding.19

  I’ve been wrong more times than I can recall, but it doesn’t upset me like it does many others I’ve met. I actually like to look at each new instance of enlightenment as a win, rather than a loss. After all, my intention is not to be right all the time, but to acquire as much knowledge as I can. So, if someone corrects a misconception of mine, I don’t get angry; I thank them for helping me. This is, in my opinion, the healthy and reasonable way to handle errors in our own thinking. My ability and desire to change my stance when confronted with new information is so important to me that I began compiling a list of all the times I did just that. This is only a small sample, but here are a couple of instances in which evidence reshaped my prior outlook:

  Medicinal Cannabis: I was raised by parents who, for much of my childhood (though not before I was born), were addicted to methamphetamine. They used heavily and often, and I saw the effects the drug had on them. Once they started trying to quit, I found myself attending a lot of Narcotics Anonymous (N.A.) meetings,20 which was a drastic improvement for me when it came to how I spent my nights. In those rooms, I learned about the (perceived) dangers of all drugs, and how, to many, marijuana was just as terrifying as meth. Shortly thereafter, I took a hardline stance against all mind-altering substances, including cannabis, and frequently lectured my friends who used it. I was basing my belief in its inherent danger on testimonies and experiences from N.A. members and not on scientific evidence. It wasn’t until the end of college, after suffering from chronic back pain for years, that I allowed my physician to prescribe medical cannabis in pill form. It helped relieve some of the pain, so I wanted to research its risks now that I was old enough to make decisions for myself. I soon discovered that, unlike with the pain pills I was taking before, it is virtually impossible to fatally overdose on marijuana.21 I also learned that, while there are some side effects to cannabis use, as is the case with any medicine, they don’t compare in severity or frequency to those of traditional pain-relieving pharmaceuticals.22 I effectively changed my position and became an advocate for the legalization of cannabis, especially for medicinal purposes.

  Routine Infant Circumcision: When I was growing up, I never really thought much about the issue of routine infant circumcision. If you asked me my opinion, I probably would have said I approved of it, but my reasoning would have been flawed. I had never experienced any problems as a man who had been circumcised at birth, so I didn’t think it was an issue. That was “normal” to me, and I didn’t give it much more thought than that. I knew that there were some nonreligious arguments in favor of the practice—such as the potential for reduced transmission of HIV and sexually transmitted infections23—and that the first circumcisions were likely more closely linked to preventative healthcare than faith, so I understood it wasn’t just a cut-and-dried case of religious barbarism, and I promoted that idea. At the same time, however, I was conflicted when it came to elective newborn circumcision because I cared deeply about bodily autonomy, consent, and individual liberty. Over the course of a few years, I learned more about circumcision, including that newborns may actually feel more pain than older patients and that about 115 children die during routine circumcisions in the United States each year.24,25 After exposing myself to the positions and statements of international science and medical groups, I also discovered that much of the developed world opposes the American practice of performing regular newborn circumcisions that aren’t medically necessary.26,27 I ultimately concluded that the decision to perform cosmetic surgery should be left up to the patient himself when he is old enough. There are exceptions, however: in rare cases, it is medically beneficial to circumcise a baby’s penis, and a physician may recommend the procedure for legitimate medical reasons. We can’t ignore that fact, but we can’t pretend those conditions are representative of the majority, either.

  In conclusion, I was entirely opposed to marijuana use because of my own experiences with family members and drugs, but my mind was changed by new information, and now I support cannabis legalization—particularly when it is used as a medicine. I also used to think that, if I had a son, I would have him circumcised because I was circumcised and didn’t have any problems. This was a flawed argument from personal experience, and I now think that it should be the child’s choice unless there is a legitimate medical need. To me, these changes are positive because good ideas evolve over time. These are just two examples of me admitting I was wrong and changing my position, but there are many more. I find it’s beneficial, and maybe even a little fun, to keep a list of times that my ideas evolved with new data. It helps me more fully understand the importance of fluid opinions and, at the same time, it gives me a window through which I can see my previous self. According to Malcolm Gladwell, journalist and author of Blink: The Power of Thinking Without Thinking, this process of adjusting our beliefs is a human “responsibility.”

  “I feel I change my mind all the time. And I sort of feel that’s your responsibility as a person, as a human being—to constantly be updating your positions on as many things as possible,” Gladwell said. “And if you don’t contradict yourself on a regular basis, then you’re not thinking.”

  Correcting your own misconceptions isn’t just intellectually honest and, as Gladwell says, a responsibility. It can also be a personally rewarding experience. In fact, recent scientific research suggests failure and the ability to learn from it can actually be good for the brain.28 A University of Southern California magnetic resonance imaging (MRI) study,29 conducted alongside researchers from University College London, Pierre and Marie Curie University, the École Normale Supérieure, and the University of Lyon, quizzed 28 subjects and analyzed how their ventral striatum—the reward circuit of the brain—activated when the participants showed they had learned from their incorrect answers.

  “We show that, in certain circumstances, when we get enough information to contextualize the choices, then our brain essentially reaches towards the reinforcement mechanism, instead of turning toward avoidance,” said Giorgio Coricelli, an associate professor of economics and psychology at the University of Southern California and coauthor of the study, in a statement upon its publication in August 2015.

  He added that what we see with failure-based learning is similar to what might occur when a person experiences regret.

  “With regret, for instance, if you have done something wrong, then you might change your behavior in the future,” Coricelli said.

  If you’re asked whether you could possibly be mistaken about something and you say “No,” you might be affected by perception biases. If it is impossible, in your mind, for you to be wrong, then you have an inherently unscientific mindset and you likely won’t be able to uncover (or accept) the truth if it contradicts your opinion.

  THE IMPORTANCE OF IGNORANCE

  As important as it is to admit when you are wrong, simply acknowledging that you don’t know in the first place can be even more beneficial … and more difficult. People often use “ignorant” as an insult—sometimes as a synonym for dumb or unwilling to learn—but not every uneducated person is stupid and not everyone with a wealth of information is intelligent. As a result of this common misuse, ignorance itself has an undeserved negative stigma, which means fewer people recognize that “I don’t know” is always an acceptable answer, and that sometimes it’s the only honest and reasonable one. It’s perfectly acceptable not to know, especially if you are still looking for answers. What is dangerous is blindly believing something that’s false because you’re too afraid to declare your ignorance.

  It’s important to note that accepting the fact that we still don’t know a lot of things doesn’t necessarily mean we must disregard what we really do understand. We don’t have to toss away everything we’ve learned; we just need to recognize that there may be much more to come. Isaac Asimov had a lot to say about this particular issue, and about how there is a “relativity of wrong.” He often told a story about a letter he received from an English literature major who was critical of the scientist’s statement that he was glad to live in a century in which people finally discovered the basic rules governing the universe, as well as those governing subatomic particles. The student (rightly) told Asimov that people who lived in every other century thought they were right yet were proved wrong, and that the same was likely to happen to him.30 In his reply, Asimov explained that “right” and “wrong” are not necessarily absolutes and are often “fuzzy concepts.”31

  “John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong,” Asimov wrote in response to the student. “But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.”

  Don’t get me wrong. I’m not saying ignorance is good. I am, however, saying we should work to cure ignorance, as opposed to running from it or stigmatizing it. If you lack knowledge about something, that just means you don’t have the necessary data at that time—nothing more, nothing less. We are all ignorant when it comes to a great number of issues because we all lack certain information, so ignorance itself isn’t always a bad thing. What is bad, however, is claiming to know all the answers and/or remaining ignorant by choice (willful ignorance). Neil deGrasse Tyson explained this quite well when he said, “It’s okay not to know all the answers; it’s better to admit our ignorance than to believe answers that might be wrong. Pretending to know everything closes the door to finding out what’s really there.”

  A critic of my work once asked me, “Why are you an ignorant?” and I wrote a short, rhyming response that I think helps demonstrate the importance of admitting our own ignorance:

  I AM AN IGNORANT.

  I DON’T KNOW ABOUT QUANTUM PHYSICS AND YOU AREN’T AN EXPERT IN ENGLISH GRAMMAR, BUT WE BOTH PROBABLY KNOW ABOUT MC HAMMER.

  AND WHILE I MIGHT KNOW MORE ABOUT RELIGION AND SCIENCE AND REPTILES, YOU’RE THE ONLY ONE WHO KNOWS HOW TO BRING YOUR FAMILY SMILES.

  YOU SEE, SOME OF US KNOW SOME THINGS AND OTHERS KNOW OTHERS, BUT THAT DOESN’T MEAN WE CAN’T WORK TOGETHER AND ACT LIKE BROTHERS.

  BECAUSE IF AN IGNORANT IS SOMEONE WHO DOESN’T KNOW SOMETHING, AND YOU CAN’T FIX A CAR AND I CAN’T COOK A DUMPLING, THEN IT’S CLEAR TO ME THAT YOU ARE AN IGNORANT AND I AM AN IGNORANT.

  THIS IS NOT AN INSULT, HOWEVER, IT IS JUST MY TWO CENTS.

  I don’t claim to know all the answers—in fact, I don’t claim to “know” much of anything—but I do have a lot of questions. And when there aren’t satisfactory answers to those questions, I am more comfortable saying, “I don’t know,” than I am proclaiming belief in that which is easy or comforting.

  BUT SCIENCE IS WRONG SOMETIMES!

  Since admitting that you’re wrong can be an important piece of individual advancement, it stands to reason that beliefs once considered to be scientifically sound by large groups of educated people can likewise be shown to be flawed—and this is absolutely true. When this happens on a large scale, and a prevailing idea is replaced by a newer model within the mainstream scientific community, it is called a paradigm shift. These monumental changes don’t happen every day, but when they do, they have the potential to change how we view specific fields and even the process of science as a whole. American physicist and historian Thomas Kuhn, who coined the term paradigm shift in his 1962 book The Structure of Scientific Revolutions, pointed to the emergence of Copernican astronomy as a particularly relevant case of such a change. When its predecessor, the Ptolemaic system, was developed, it was “admirably successful in predicting the changing positions of both stars and planets,” according to Kuhn.32

  “No other ancient system had performed so well; for the stars, Ptolemaic astronomy is still widely used today as an engineering approximation; for the planets, Ptolemy’s predictions were as good as Copernicus’,” he wrote. “But to be admirably successful is never, for a scientific theory, to be completely successful. With respect both to planetary position and to precession of the equinoxes, predictions made with Ptolemy’s system never quite conformed with the best available observations.”

 
