Another type of doxastic closure involves ego and narcissism. For example, there are times when we’re too narcissistically involved in our conclusions, when our egos have been involved in a judgment, when we’ve spat out an opinion we’ve held for a long time, or when we don’t want to consider objections because we like to win arguments. It’s important not just to be aware, but also to be sincere when asking ourselves why we hold the beliefs we do. Sincerity is indispensable not just in leading an examined life, but also in having meaningful relationships.
The less pedantic-sounding terms “snapping” (Dubrow-Eichel, 1989, p. 195) and “unfreezing” (Kim, 1979, p. 210) have been used to describe the moment of successful deprogramming, when subjects cognitively leave religious cults.
In an interesting study, Orenstein relates religious belief and church attendance to belief in the paranormal (Orenstein, 2002). He makes this often-overlooked observation: “A particularly intriguing comparison in these data is between religious variables and educational attainment. There have been numerous calls for upgrading science education in order to combat paranormal beliefs… . However, the effects of education are so small that it appears that values and faith rather than rationality are the driving factors in paranormal belief. Moreover, if paranormal beliefs are as closely attached to religious beliefs as these data indicate, were the schools to present a skeptical position regarding the paranormal, they would run the risk of arousing a religiously-based opposition. Some observers suggest that the legitimacy of science itself is under attack by supporters of the paranormal” (Orenstein, 2002, p. 309). In The Believing Brain, American author Michael Shermer discusses how science education does next to nothing to undermine belief in the paranormal (Shermer, 2011). Shermer demonstrates that this effect is related to science content, not process. This is why, Shermer argues and I agree, we need to teach people how to think like scientists (see Shermer’s Skepticism 101 program: http://www.skepticblog.org/2011/08/30/skepticism101/) and not just have them memorize science facts (Shermer, personal correspondence, August 16, 2012).
CHAPTER 4
INTERVENTIONS AND STRATEGIES
“There’s evidence that religious belief is something that people go into quite quickly, but come out of rather more slowly. True, almost no one is instantly reasoned out of belief. But that’s not to say people cannot be reasoned, or cannot reason themselves, out of a particular religious belief. I have talked to many who have left religious belief behind, and it turns out that a willingness to think critically and independently has almost always played a pivotal role.”
—Stephen Law
“As Christian teachers, students, and laymen, we must never lose sight of the wider spiritual battle in which we are all involved and so must be extremely wary of what we say or write, lest we become the instruments of Satan in destroying someone else’s faith.”
—William Lane Craig, Hard Questions, Real Answers (2003, p. 34)
This chapter will provide you with tools and intervention strategies to begin your work as a Street Epistemologist. It covers basic principles of effective dialectical interventions designed to help people abandon their faith. These tools and strategies are pulled from diverse bodies of peer-reviewed literature, including those dealing with exiting cults, effective treatments for alcoholics and drug addicts, and even salient pedagogical interventions. In my work as a Street Epistemologist, I deploy the general strategies described in this chapter in conjunction with my principal treatment modality, the Socratic method, discussed in the following chapter. Ultimately, you’ll need to personalize and tailor these techniques and strategies to account for the person with whom you’re speaking, the context, and your own personality and style.
In the United States alone, we have a standing “army” of more than half a million potential Street Epistemologists ready to be empowered, given the tools, and let loose to separate people from their faith. Approximately 312 million people live in the United States. Five percent of the U.S. population does not believe in God (CBS News, 2012). If only five percent of these 15.6 million nonbelievers become Street Epistemologists and actively try to rid the faithful of their faith affliction, then 780,000 Street Epistemologists can be informally deployed to deliver millions of micro-inoculations (of reason) to the populace on a daily basis.
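The arithmetic behind these figures can be checked directly. A minimal sketch in Python (the population figure and the two five-percent rates are the author’s estimates from the paragraph above, not independently verified data):

```python
# Author's estimates: ~312 million U.S. residents, 5% nonbelievers,
# and 5% of those nonbelievers becoming active Street Epistemologists.
us_population = 312_000_000
nonbelievers = us_population * 0.05            # 5% of the population
street_epistemologists = nonbelievers * 0.05   # 5% of the nonbelievers

print(int(nonbelievers))             # 15600000 (15.6 million)
print(int(street_epistemologists))   # 780000
```

This confirms the chain of figures in the text: 15.6 million nonbelievers, of whom five percent would be 780,000 potential Street Epistemologists.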
PART I: INTERVENTIONS
Your New Role: Interventionist
“The deprogrammer is like a coach, or a ‘horse whisperer’ who convinces the wary animal that crossing a creek to leave an enclosed area is not so dangerous.”
—Joseph Szimhart, “Razor’s Edge Indeed” (2009)
“Deprogramming, a methodology of inducing apostasy, relies heavily on this need for alternatives to the cultic interpretation of reality. After dissonance has been induced, or even as a method for inducing it, deprogrammers typically present a brainwashing model of conversion and membership in religious cults. This is a type of medical model which essentially absolves individuals of responsibility for making their original commitment, for staying with the movement… . It also holds out … the promise of a viable existence apart from the movement in which the individual will come to again experience independence and intellectual freedom. This facilitates apostasy similar to the way the adoption of a competing religious world view does. Such a model or paradigm provides a cognitive structure with which individuals can reinterpret the cultic world view and their respective experiences in it as well as anticipate a life outside it.”
—L. Norman Skonovd, Apostasy (1981, p. 121)
Your new role is that of interventionist. Liberator. Your target is faith. Your pro bono clients are individuals who’ve been infected by faith.
Street Epistemologists view every conversation with the faithful as an intervention. An intervention is an attempt to help people, or “subjects” as they’re referred to in a clinical context, change their beliefs and/or behavior. Subjects start with a faith-based belief or a faith-based epistemology. You administer a dialectical treatment with the goal of helping them become less certain and less confident in their faith commitment (or perhaps even cured of faith entirely).
You will, in a very real sense, be administering a dialectical treatment to your conversational partners in a similar way that drug addicts receive treatment for drug abuse. Drug addicts come into the detox center in state X, undergo treatment, and then leave the facility in state Y, hopefully improved. You will not be treating drug addicts—you will be treating people who have been infected with the faith virus.
I view nearly every interaction as an intervention.1 I am intervening in my interlocutor’s thought process to help him think more clearly and reason more effectively. Socrates says thought is a silent conversation of the soul with itself. This means that when I’m intervening in someone else’s thinking, it’s not that different from thinking things out in my own head.
Talking to myself and talking to other people are alike—they are both interventions and opportunities. Even if my interactions are only three or four minutes, they still present an opportunity to help someone jettison faith and live a life free of delusion.
When you view your interactions as interventions, as opposed to confrontations or debates, you gain the following:
- It helps you step back and exhibit more objectivity in your interventions. Your passions won’t drive the treatment, and you’ll be more likely to model behaviors you want others to emulate, such as being trustful of reason and willing to reconsider a belief.
- You’re more likely to succeed if you view what you’re doing as an intervention and regard a person of faith as someone who needs your help, as opposed to someone to pass judgment on. A positive, accepting attitude translates into an increased likelihood of treatment effectiveness.
- You’re less likely to be perceived as an “angry atheist,” upset with the believer because you—and by extension all atheists—are “angry without God(s).” Apologists have gone to great lengths to advance this narrative, and if your subject senses even a hint of anger it could complicate their treatment, significantly slow their progress, and even calcify or feed the faith virus. American author Greta Christina writes about atheism and anger in her book, Why Are You Atheists So Angry? 99 Things That Piss Off the Godless (Christina, 2012).
- Viewing conversations as interventions also helps you listen closely and learn from each intervention. In turn, this increases the likelihood that the subject will change her behavior. It can also make you a better Street Epistemologist because you’ll be able to improve future interventions.
- Anyone who witnesses the intervention sees the proper treatment modality in action, and may go on to help others. Everyone is a potential Street Epistemologist.
- Interventions are not about winning or losing; they’re about helping people see through a delusion and reclaim a sense of wonder. On a personal level, you’ll likely find deeper satisfaction in helping people than in winning a debate.
Model Behavior
“If we could change ourselves, the tendencies in the world would also change.”
—Mahatma Gandhi
“Don’t just tell me what you want to do, show me.”
—Matt Thornton, community activist
If you are reading this book you probably already possess attitudes that predispose you to rationality, like a trustfulness of reason (American Philosophical Association, 1990, p. 2) and a willingness to reconsider (American Philosophical Association, 1990, p. 2). (For a list of the attitudinal dispositions and a definition of the ideal critical thinker, please see appendix A.) These are the core attitudinal dispositions, along with the creation of nonadversarial relationships (Muran & Barber, 2010, p. 9), which Street Epistemologists should model for the faithful.2
Don’t portray the universe in binary terms: you’re either with us or against us. Helping people to think clearly and to reject unreliable epistemologies is not another shot across the bow in the culture wars. Your discussions with the faithful are a genuine opportunity for you to help people reason more reliably and feel less comfortable pretending to know things they don’t know. They also present an opportunity for you to further develop a disposition conducive to anchoring beliefs in reality.
Keep in mind the possibility the faithful know something you don’t, that they may have a reliable method of reasoning that you’ve overlooked, that there’s a miscommunication, or that they can somehow help you to think more clearly. As long as you keep in mind the possibility someone may know something you don’t, and as long as you’re open to changing your mind based upon evidence and reason, you’ll eliminate much of the potential for creating adversarial relationships, and avoid becoming that against which you struggle.
In the middle to latter stages of one’s journey to reason, many people often ask themselves, “What now? What do I do now that my faith has been shown to be false?” The Street Epistemologist’s attitude, language, and behavior model what former believers can do: trust reason, stop pretending to know things they don’t know, be open to saying, “I don’t know,” be comfortable with not knowing, and allow for the possibility of belief revision.
DOXASTIC OPENNESS
“If you are a person of the same sort as myself, I should be glad to continue questioning you: If not, I can let it drop. Of what sort am I? One of those who would be glad to be refuted if I say anything untrue, and glad to refute anyone else who might speak untruly; but just as glad, mind you, to be refuted as to refute, since I regard the former as the greater benefit.”
—Socrates in Gorgias
Whenever one is arguing against X, the danger is that one becomes unreflectively counter-X. One of the most insufferable things in discussions with the faithful is the experience of speaking to someone who’s doxastically closed. When someone suffers from a doxastic pathology, they tend to not really listen to an argument, to not carefully think through alternatives, and to lead with their conclusion and work backward (this is called “confirmation bias”).
The moment we’re unshakably convinced we possess immutable truth, we become our own doxastic enemy. Our epistemic problems begin the moment we’re convinced we’ve latched on to an eternal, timeless truth. (And yes, even the last two sentences should be held as tentatively true.) Few things are more dangerous than people who think they’re in possession of absolute truth. Honest inquirers with sincere questions and an open mind rarely contribute to the misery of the world. Passionate, doxastically closed believers contribute to human suffering and inhibit human well-being.
Street Epistemologists enter into discussions with an open and genuine attitude from the start—even if there’s no reciprocity.3 If someone knows something you don’t know, acknowledge that you don’t know. The Street Epistemologist never pretends to know something she doesn’t know. Often the faithful will attempt—intentionally or otherwise—to make you feel “less” because you don’t know what they’re pretending to know.
There is one piece of advice I can offer to help you overcome this feeling of social or personal inadequacy, the kind some beginning Street Epistemologists may experience in their initial interventions with the faithful: become comfortable with not knowing and not pretending to know, even though others may ridicule you or attempt to make you feel inadequate for not pretending to know something they themselves are only pretending to know.
PART II: STRATEGIES
Avoid Facts
“Facts don’t necessarily have the power to change our minds. In fact, quite the opposite … when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts … were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.”
—Joe Keohane, “How Facts Backfire” (2010)
People dig themselves into cognitive sinkholes by habituating themselves to not formulate beliefs on the basis of evidence. Hence the beliefs most people hold are not tethered to reality.4 For an individual with a personal history of not placing a high value on the role of evidence in belief formation, or not scrutinizing evidence, it is extremely difficult to subsequently engender a disposition to believe on the basis of evidence. Thus, it is of little use to bring in facts when attempting to disabuse those in the precontemplative stage of their faith-based beliefs. If people believed on the basis of evidence then they wouldn’t find themselves in their current cognitive quagmire.
When I teach beginning Street Epistemologists how to help rid the faithful of their affliction and anchor their beliefs in reality, one of the most difficult strategies to get across is: do not bring particular pieces of evidence (facts, data points) into the discussion when attempting to disabuse people of specific faith propositions. Many rational, thoughtful people think that somehow, magically, the faithful don’t realize they are not basing their beliefs on reliable evidence—that if they were only shown solid evidence then voilà, they’d be cured! This is false. Remember: the core of the intervention is not changing beliefs, but changing the way people form beliefs—hence the term “epistemologist.” Bringing facts into the discussion is the wrong way to conceptualize the problem: the problem is with epistemologies people use, not with conclusions people hold.5
The futility of trying to persuade the faithful by way of evidence is particularly conspicuous in fundamentalists and in people who experience severe doxastic pathologies. For example, if a fundamentalist believes the planet is 4,000 years old, there’s absolutely no evidence, no set of facts, no data one can show her to disabuse her of this belief.6 The belief that the planet is 4,000 years old rests on another belief. That is, one doesn’t believe the Earth is 4,000 years old without a supporting belief structure, for example, the Bible. The supporting belief structure acts as the soil in which individual beliefs are germinated and eventually grow roots.
The introduction of facts may also prove unproductive because this usually leads to a discussion about what constitutes reliable evidence.7 This is a reasonable and important issue, but one not often encountered in the context of a Street Epistemologist’s intervention.
Nearly all of the faithful suffer from an acute form of confirmation bias: they start with a core belief first and work their way backward to specific beliefs. For example, if one starts with a belief in Christ as divine, any discussion of evidence—tombs, witnesses, etc.—will almost always be futile. Any piece of contradictory evidence one brings into the discussion will never be sufficient to warrant a change in belief. Contradictory evidence will be discarded as anomalous, offensive, irrelevant, preposterous, or highly unlikely.
Every religious apologist is epistemically debilitated by an extreme form of confirmation bias.8 9 Gary Habermas, for example, exemplifies this cognitive malady. Habermas (Habermas, 1996, 2004) alleges to believe—and I think he actually does believe—that there’s sufficient evidence to warrant belief in an historical Jesus, and the miracles attributed to him (Habermas, 1997), and that Jesus rose from the dead. Yet when confronted by basic, rudimentary objections (people lied, someone ransacked the tomb, the witnesses were unreliable), he takes the most remote logical possibility and turns that into not just a probability but an actuality. He does this because he starts with a foundational belief first—Christ’s divinity and the truth of scripture—then conveniently sidesteps logic and reasons backward from his belief. By starting with a belief first and working backward, his beliefs make perfect sense to him as well as those who begin with the same belief.
Another example of confirmation bias occurs when someone tells their pastor that they’re having doubts about their faith. The pastor in turn tells them to read the Bible and pray about it. This is asking someone to start with their belief first and see what happens; what will happen is that their belief will strengthen. Similar advice is given to Muslims in the practice of dhikr or zikr, which translates to remembering Allah in one’s heart. Muslims “achieve” this by continuously repeating a phrase, like “Allahu Akbar” (God is Great), “Subhan’ Allah” (Glory be to Allah), or other such phrases, to strengthen their devotion.
A Manual for Creating Atheists Page 7