
A Manual for Creating Atheists


by Peter Boghossian


  Socrates and Nietzsche prescribed a different kind of interpretative experience, one in which we’re not just finding and confirming our existing biases, but also attacking them. Whenever we have a chance to peek at our prejudices and see our own biases and underexamined assumptions, we have an opportunity to attack those assumptions and to rid ourselves from the presumption of knowledge. Examining and thoughtfully criticizing our biases, our interpretations, and what we think we know, is an opportunity for wonder to reemerge.

  As a Street Epistemologist, one of your primary goals is to help people reclaim the desire to know—a sense of wonder. You’ll help people destroy foundational beliefs, flimsy assumptions, faulty epistemologies, and ultimately faith. As Austrian philosopher Ludwig Wittgenstein says in Philosophical Investigations (§118): “What we are destroying is nothing but houses of cards and we are clearing up the ground of language on which they stand.” When you destroy a house of cards you have a view of reality that’s no longer obstructed by illusions.

  Helping rid people of illusion is a core part of the Street Epistemologist’s project and an ancient and honorable goal. Disabusing others of warrantless certainty, and reinstilling their sense of wonder and their desire to know, is a profound contribution to a life worth living.

  REASONING AWAY THE UNREASONABLE

  “Faith and reason are often—and justly—treated as irreconcilable opposites, despite Pope John Paul II’s famous argument (in the aptly entitled Fides et Ratio) that the two are alternative ways of arriving at the same truths. After all, faith is by definition the belief in something regardless or even in spite of evidence, while as David Hume famously put it: ‘A wise man proportions his belief to the evidence.’

  It would seem to follow that someone’s faith couldn’t possibly be moved by reason. Thankfully, it turns out that this is empirically not the case. There is both ample anecdotal and occasional systematic evidence of people shedding their faith through—in part—a process of reasoning. For instance, in their book, Amazing Conversions: Why Some Turn to Faith & Others Abandon Religion, Bob Altemeyer and Bruce Hunsberger examine how the typical time course unfolds in both directions (i.e., acquiring or losing one’s faith) and find striking asymmetries. Non-religious people who convert often do so as a result of a sudden, highly emotional event, either a personal one (e.g., the death of a loved person) or a societal one (e.g., the 9/11 attacks). However, the most likely path to un-faith is slower, taking years to unfold, and going through a lot of readings, conversations, and reflection.

  When I was living in the Bible Belt I knew several people who illustrated this latter path. Often the initial spark was provided by reading a secular author who wrote in a non-threatening manner (the typical example was Carl Sagan), or by being exposed to small but nonetheless disconcerting cracks in one’s own religious teachings (e.g., being told by your preacher that your friends will go to hell because of such a trivial thing as singing in church).

  There is clearly a need for more systematic psychological and sociological studies of the relation between faith and reason, but the evidence so far is clear: people can and do change their mind in response to reasonable argument. The problem is, it takes a long time, repeated exposure to similar ideas by different sources, and possibly also a particular personality that includes a propensity to reflect on things.”

  —Italian-American biologist and philosopher Massimo Pigliucci

  One of the premises of this book is that people can be reasoned out of unreasonable beliefs. Not all scholars agree. In “Why Is Religion Natural?” French anthropologist Pascal Boyer argues against the idea that people have religious beliefs because they fail to reason properly (Boyer, 2004).2 Ending his article with the famous quotation from Irish writer Jonathan Swift, “You do not reason a man out of something he was not reasoned into,” Boyer argues that it is unlikely that religious beliefs can be argued away.

  I disagree. Here’s the evidence and several counterarguments:

  Individuals have been argued away from religion. Many people who have recovered from religion have self-reported that they’ve been reasoned out of their religious belief. Former preachers have even gone on to become evangelical atheists: Hector Avalos, Dan Barker, Kenneth W. Daniels, Jerry DeWitt, Joe Holman, John W. Loftus, Teresa MacBain, Nate Phelps, Robert Price, Sam Singleton, etc. These individuals now successfully use lessons from their past, alongside reason and argument, to help others leave religion.

  If the focus is on religion, as opposed to faith, Boyer may be partially correct in stating that religion can’t be “reasoned away.” Trying to reason away religion would be like trying to reason away one’s social support, friends, hobbies, comforting songs, rituals, etc. This is why Street Epistemologists shouldn’t attempt to separate people from their religion, but instead focus on separating them from their faith. Reasoning away faith means helping people to abandon a faulty epistemology, but reasoning away religion means that people abandon their social support network.

  Subsequent to much of Boyer’s work, an interesting 2012 study, Analytic Thinking Promotes Religious Disbelief, showed, as the title states, that analytic thinking does in fact lead to religious disbelief (Gervais & Norenzayan, 2012). While mechanisms of religious disbelief and various factors that contribute to disbelief are not entirely understood at this time, the authors demonstrated that improvements in analytic processing translate into an increased likelihood of religious disbelief. In other words, if one gains a proficiency in certain methods of critical reasoning there is an increased likelihood that one will not hold religious beliefs.

  Finally, many apologists (especially American theologian William Lane Craig) have had considerable success reasoning people into holding unreasonable beliefs (Craig, n.d.). This is a despairing statement about the effectiveness of the faithful’s tactics. There are entire bodies of apologist literature detailing how to reason and persuade unbelievers into faith.

  Boyer’s criticisms notwithstanding, the problem of faith is at least partially a problem of reasoning. People can be reasoned out of unreasonable beliefs.3 In fact, people frequently change their religious beliefs independent of reason, moving with abandon from one faith tradition to another.

  BELIEVING THE PREPOSTEROUS

  “I believe because it is absurd.”

  —Tertullian (197–220)

  “I mean that we do not infer that our faith is true based on any sort of evidence or proof, but that in the context of the Spirit of God’s speaking to our hearts, we see immediately and unmistakably that our faith is true. God’s spirit makes it evident to us that our faith is true.”

  —William Lane Craig, Hard Questions, Real Answers (2003, p. 35)

  There are five reasons why otherwise reasonable people embrace absurd propositions: (1) they have a history of not formulating their beliefs on the basis of evidence; (2) they formulate their beliefs on what they thought was reliable evidence but wasn’t (e.g., the perception of the testament of the Holy Spirit); (3) they have never been exposed to competing epistemologies and beliefs; (4) they yield to social pressures; and (5) they devalue truth or are relativists.

  Most people like to think that in their epistemic lives they accord beliefs to reason and evidence. That is, the less reason and evidence they have, the less confident they are about their conclusions and what they believe. But sometimes reason and evidence are not closely connected to belief. That is, individuals formulate their beliefs on the basis of other influences like parochial education, peer pressure, or community expectations—all potent forces not subject to the pressure of evidence.

  In some cases, individuals have damaged their thinking not only because they’ve habituated themselves to not proportioning their beliefs to the evidence, but also because they actually celebrate the fact that they don’t do so. For example, in matters relating to religion, God, and faith, believers are often told ignorance is a mark of closeness to God, spiritual enlightenment, and true faith. (The Street Epistemologist
should spend considerable time working within these contexts. This is where you’re needed most. These interventions will be challenging but can be profoundly rewarding.)

  Over time, you’ll develop diagnostic tools that will enable you to quickly place someone in one of the above five categories. You’ll then be able to tailor the intervention accordingly.

  DOXASTIC CLOSURE

  The word “doxastic” derives from the Greek doxa, which means “belief.” I use the phrase “doxastic closure,” which is esoteric even among seasoned epistemologists and logicians, in a different and less technical way than it’s used in philosophical literature. I use the term to mean that either a specific belief one holds, or that one’s entire belief system, is resistant to revision.4 Belief revision means changing one’s mind about whether a belief is true or false.

  There are degrees of doxastic closure. At the most extreme degree of closure, one has a belief and/or a belief system that is fixed, frozen, and immutable, and therefore is less open to revision. The less one is doxastically closed, the more one is willing and capable of changing one’s belief.

  One can become doxastically closed with regard to any belief, regardless of the content of the belief. One can be closed about a moral belief (“We shouldn’t torture small children for fun”), an economic belief (“Markets don’t need regulation”), a metaphysical belief (“I am not a brain in a vat”), a relational belief (“My boyfriend loves me”), a scientific belief (“Global climate change is anthropogenic”), a faith-based belief (“A woman without a husband is like a dead body,” Śrīmad Bhāgavatam 9.9.32), etc.

  A Recipe for Closure

  In The Big Sort, American sociologist Bill Bishop argues that we cluster in politically like-minded communities (Bishop, 2008). That is, we seek out people and groups with ideologies similar to our own—we like to be around people who value what we value. One consequence of clustering is to further cement the process of doxastic closure; when surrounded by “ideological likes,” even far-fetched beliefs become normalized. It is assumed, for example, “It’s normal to believe what I believe about polygamy. Everyone believes this about polygamy, and those who don’t are just wackos.” Clustering thus increases the confidence value that we implicitly assign to a belief—we become more certain our beliefs are true. Further complicating this clustering phenomenon is what American online organizer Eli Pariser terms “filter bubbles” (Pariser, 2012). A “filter bubble” describes the phenomenon of online portals—like Google and Facebook—predicting and delivering customized information users want based upon algorithms that take preexisting data into account (e.g., previous searches, type of computer one owns, and geographical location).

  Consequently, and unbeknownst to the user, the information users see is in ideological conformity with their beliefs. For example, if you’ve been researching new atheism by reading or watching Horsemen Hitchens and Dawkins, and you Google “Creationism,” the search algorithm takes into account your previous searches, then gives you very different search results from someone who’s previously visited Creationist Web pages, researched Christian apologist videos, or lives in an area of the country with high rates of church attendance (e.g., Mississippi).

  This puts users in a type of bubble that filters out ideologically disagreeable data and opinions. The result is exclusive exposure to skewed information that reinforces preexisting beliefs. This is doxastic entrenchment. “It’s all over the Internet,” or “I’m sure it’s true, I just Googled it this morning and saw for myself,” gains new meaning as one is unwittingly subject to selective information that lends credence to one’s beliefs as confirming “evidence” appears at the top of one’s Google search.

  Combine clustering in like-minded communities with filter bubbles, then put that on top of a cognitive architecture that predisposes one to belief (Shermer, 2012) and favors confirmation bias, then throw in the fact that critical thinking and reasoning require far more intellectual labor than acceptance of simple solutions and platitudes, then liberally sprinkle the virulence of certain belief systems, then infuse with the idea that holding certain beliefs and using certain processes of reasoning are moral acts, and then lay this entire mixture upon the difficulty of just trying to make a living and get through the day with any time for reflection, and voilà: Doxastic closure!5

  DOXASTIC OPENNESS AND THE SELF-CONSCIOUSNESS OF IGNORANCE

  Doxastic openness, as I use the term, is a willingness and ability to revise beliefs.6 Doxastic openness occurs the moment one becomes aware of one’s ignorance; it is the instant one realizes one’s beliefs may not be true. Doxastic openness is the beginning of genuine humility (Boghossian, under review).

  Awareness of ignorance is by definition doxastic openness. Awareness of ignorance makes it possible to look at different alternatives, arguments, ways of viewing the world, and ideas, precisely because one understands that one does not know what one thought one previously knew. The tools and allies of faith—certainty, prejudice, pretending, confirmation bias, irrationality, and superstition—all come into question through the self-awareness of ignorance.7

  In your work as a Street Epistemologist you’ll literally talk people out of their faith. Your goal is to help them by engendering doxastic openness. Only very rarely will you help someone abandon their faith instantly. More commonly, by helping someone realize their own ignorance, you’ll sow seeds of doubt that will blossom into ever-expanding moments of doxastic openness.

  IMMUNE TO STREET EPISTEMOLOGY?

  As a Street Epistemologist, you will encounter individuals whose beliefs seem immune to reason. No matter what you say, it will appear as if you’re not breaking through—never creating moments of doxastic openness.

  This section will unpack the two primary reasons for this appearance of failure: either (1) an interlocutor’s brain is neurologically damaged, or (2) you’re actually succeeding. In the latter case, an interlocutor’s verbal behavior seems to indicate that your intervention is failing—for example, they’re getting angry or raising their voice, or they seem to become even more entrenched in their belief. Such protests may actually indicate a successful treatment. (Of course, it’s possible that the believer has an argument that she has not yet raised in the conversation, but there’s no way to address an unvoiced argument.)

  1. Delusion and Doxastic Closure

  Some delusions are not beliefs (Bortolotti, 2010). For example, some people who have experienced a traumatic head injury suffer from Capgras delusion—they believe that familiar people, like their husband, and sisters and brothers, are really imposters. Other individuals are afflicted by Cotard delusion—they believe they are dead (literally). It is not possible to talk people out of these delusions.

  In instances of damage to the brain, no dialectical intervention will be effective in eliciting cognitive and attitudinal change. These and other conditions like some strokes, intracranial tumors, or Alzheimer’s disease affect the brain and are beyond the reach of nonmedical interventions. In short, if someone is suffering from a brain-based faith delusion your work will be futile.

  Street Epistemologists should set the realistic goal of helping the faithful become more doxastically open. Sow the seeds of doubt. Help people to become less confident in what they claim to know, and help them to stop pretending to know things they don’t know. In time, with more interventions behind you, you’ll hone your skills and increase your ambitions. Ultimately, in your wake you’ll have created not only people freed from the prison of faith, but also more Street Epistemologists.

  2. Primum non nocere (“First, do no harm”)

  “[Faith] causes us to distort or even ignore objective data [as such] we often ignore all evidence that contradicts what we want to believe.”

  —John W. Loftus, The Outsider Test for Faith (2013)

  When people are presented with evidence that contradicts their beliefs, or are shown that they don’t have sufficient evidence to warrant beliefs, or learn that there’s a contradiction in their beliefs (the trees could not both come before Adam, Genesis 1:11–12 and 1:26–27, and after Adam, Genesis 2:4–9), or come to understand that their reasoning is in error, they seem to cling to their beliefs more tenaciously.

  Doxastic pathology is especially evident in faith-based beliefs. That is, faith-based beliefs occupy a special category of beliefs that are particularly difficult to revise. Helping people revise a faith-based belief, or to abandon faith entirely, presents a host of challenges not usually encountered in other belief domains; even with politics, which trades in competing ideologies, a belief change can be facilitated more readily. This is because many factors are working to cement doxastic closure with regard to faith-based belief systems: society treats faith as a virtue, religious organizations actively spread faith, faith has evolved mechanisms to shield it from analysis, there are cultural taboos with regard to challenging people’s faith, and faith communities actively support members’ beliefs. (Tax-exempt status has allowed faith to become big business, but unlike faith, big business is always in the spotlight and under constant criticism.)

  Does this mean your intervention has backfired? Have you unintentionally made their epistemic situation worse? Have you cemented doxastic closure? No.

  Interesting lines of research by Sampson, Weiss, and colleagues illustrate this phenomenon in the context of psychotherapy (Curtis, Silberschatz, Sampson, Weiss, & Rosenberg, 1988; Gassner, Sampson, Weiss, & Brumer, 1982; Horowitz, Sampson, Siegelman, Weiss, & Goodfriend, 1978; Norville, Sampson, & Weiss, 1996; Sampson, 1994; Silberschatz, Curtis, Sampson, & Weiss, 1991; Weiss & Sampson, 1986). Researchers posited that short-term psychotherapy helps individuals escape from pathogenic beliefs. A pathogenic belief is a belief that directly or indirectly leads to emotional, psychological, or physical pathology; in other words, holding a pathogenic belief is self-sabotaging and leads one away from human well-being. Examples of such beliefs are, “I’m unlovable, I’ll always fail in romance,” “I’m pathetic, weak, and worthless, and without Christ’s love I couldn’t quit drinking on my own,” and “Without Scientology and auditing, I’ll never be able to limit the effects of the trauma ruining my life.”

 
