Cults Inside Out: How People Get in and Can Get Out

by Rick Alan Ross


  In this light, some cults have claimed that they have changed and ceased to behave destructively. The founding leader may have died or been deposed, and the organization claims it has gone through a reformation. Critics of such groups, however, have often expressed skepticism and at times urged caution when evaluating such claims.

  When commenting on one group’s claims of reformation, Michael Langone, executive director of the International Cultic Studies Association, said, “It seems very unlikely to me that the psychological abuse of members will end without eliminating the cultic dynamics that underlie it.” Langone specifically points out the dilemma posed by “organizational leaders” whom the founder trained and who are part of the group’s historical hierarchy. Though responsible for abuses, they remain in positions of authority. In this sense Langone sees a kind of group “pathology,” which a group founder or past leader develops and nurtures. This pathology remains embedded in the group even after its originator is gone. Langone offers as an example of such an embedded pathology a “foundational structure of secrecy [that] probably set the stage for the manipulation and abuse.”525 Speaking of one group, Langone says that the “tree was rotten from inception. No amount of pruning will eliminate the poison in the seed.”526

  What we can see, based on the various definitions offered for groups called “cults,” is that a definitional nucleus emerges from the perspectives of researchers and experts. This nucleus has three central categories of characteristics, as Lifton identifies them, which determine whether a group is a destructive cult. The lengthier definitions experts offer essentially expand upon those three central themes: the group’s type of leadership; the group process of coercive persuasion, which largely shuts down critical thinking; and the inherent destructive nature of the group, which is directly related to its mandates and unchecked leadership.

  CHAPTER 5

  “CULT BRAINWASHING”

  How do cult leaders convince people to become compliant and obey them? Is there some secret social skill they employ that renders their followers docile and suggestible? News reports about destructive cults often leave out a crucial element: How were people persuaded to join the group in the first place? And how did the cult subsequently convince them to comply with its teachings and corresponding behavior?

  News reports about cult tragedies may use the word brainwashed as a means of explaining the unsettling and seemingly mindless obedience of some cult members. This often seems to satisfy disturbing questions about the frequently self-destructive nature of cult behavior—for example, why parents in some faith healing groups have allowed their children to suffer and die needlessly, or why followers of David Koresh preferred to be burned to death rather than peacefully surrender to authorities at Waco.

  Why do some cult members act against their own best interests while consistently behaving in accordance with the wishes of cult leaders? Again, the glib answer often reported is that they were somehow “brainwashed.” Otherwise how could the people connected to a reported cult tragedy be so completely persuaded to set aside their common sense, compassion, and priority of self-preservation?

  Benjamin Zablocki, a professor of sociology at Rutgers University, conducted a study of cultic coercive persuasion in which he expressed concern regarding “the polarization that has occurred amongst scholars of new religious movements,” which have often been called “cults.” Zablocki argues that an ongoing effort has been exerted to “block attempts to give the concept of brainwashing a fair scientific trial.”527 The researcher laments, “This campaign has resulted in distortion of the original meaning of the concept so that it is generally looked at as having to do with manipulation in recruitment of new members to religious groups.” As Zablocki points out, however, the historic understanding of the term brainwashing, “on the contrary, should be in connection with the manipulation of exit costs for veteran members.”528

  Brainwashed or brainwashing is an imprecise and ambiguous description; nevertheless, it still has currency in contemporary popular culture. Journalist Edward Hunter first used the word brainwashing in an article he wrote for the Miami News in 1950.529 Hunter worked as a propaganda specialist for the Office of Strategic Services (OSS) during World War II and later reported about psychological warfare used during the Cold War. Despite its ambiguity, brainwashing persists as the most universally understood generic means of expressing the sort of undue influence some extreme forms of leadership may exercise over others. Brainwashing has been used to explain why people enthralled with a particular cause, group, or leader will apparently act against their own interests and consistently in the best interest of those who have influenced them. This metaphorical description implies the existence of a process that can potentially wash the brain clean of its original individual thinking and then supplant and suffuse it with a new mind-set a group, movement, or leader prescribes.

  The process is not, however, that simple. And there is a need to go beyond the catchphrases of popular culture to gain a better and more detailed understanding of the far more subtle process destructive cults have typically used to gain undue influence over members. What’s also important to note is that brainwashing isn’t used exclusively in a religious context. Zablocki says, “I do not mean to imply that there is anything about religion per se that is especially conducive to brainwashing or that brainwashing is not also to be found in political, psychotherapeutic, military or other totalist collectives.”530

  Researchers in the fields of mental health and sociology have developed a more precise terminology to describe the principal process of conversion destructive cults have used to engender obedience and conformity. Sociologist Zablocki defines brainwashing as “an observable set of transactions between a charismatically-structured collectivity and an isolated agent of the collectivity with the goal of transforming the agent into a deployable agent. Brainwashing is thus a process of ideological resocialization carried out within a structure of charismatic authority.”531 Psychiatrist Robert Jay Lifton specifically called the process he observed “thought reform” in his seminal book Thought Reform and the Psychology of Totalism.532 Psychologist and eminent cult expert Margaret Singer categorized the extreme methods cults have used to gain compliance as a thought-reform process, which she said was most often inherently deceptive.533

  What these researchers have hypothesized, and in many instances confirmed, is that the human mind is far more fragile, persuadable, and malleable than we would like to think. Especially when people are in a state of distress or depression, are experiencing hardships, or are passing through major transitions in their lives, they are typically more vulnerable to the persuasion and other techniques of those who offer appealing answers and a seeming way out of their difficulties.

  Authors Flo Conway and Jim Siegelman presented a communication perspective based on the sudden personality changes and other cognitive alterations associated with cult mind-control techniques, described in their book Snapping: America’s Epidemic of Sudden Personality Change.534 They explained how many cults completely distort, manipulate, and control the process of communication in ways that may have a lasting impact on cult members’ minds and give rise to a new category of cognitive disorders they termed “information disease.” In their later book, Holy Terror: The Fundamentalist War on America’s Freedoms in Religion, Politics, and Our Private Lives,535 Conway and Siegelman went on to examine the ritual practices of emotional control that cults and many religious groups use. They explained how specific messages and ritualized instructions tied to cult beliefs, scriptures, and images can be manipulated to suppress a person’s bedrock emotional responses and everyday feelings in a systematic effort to promote obedience and compliance within the group. Conway and Siegelman’s concepts of information disease and emotional control will be discussed in more detail later in this chapter.

  This kind of information control was evident in the polygamist compound Warren Jeffs ran, where school and television were banned and cult members’ communication was carefully regulated. Jeffs dictated everything his followers read and heard, including their music. He created a somber, cocooned, and controlled world where even the color red and the word fun were prohibited.536 Another example of information control was the isolated community of Colonia Dignidad, which Paul Schaefer led. Within this self-contained compound, built at the foot of the Andes Mountains in Chile, no one was allowed to listen to the radio, read newspapers, or walk alone.537

  Professor of psychology Robert Cialdini researched the basic techniques commonly used to influence people in everyday life.538 Cialdini’s groundbreaking book Influence specifically identified these techniques and explained how they could be used through venues as varied as advertising and political propaganda. A destructive cult can also use the same principles of influence in more deceptive and manipulative ways to gain undue influence over its adherents. Later in this chapter Cialdini’s principles will be detailed and correlated more broadly with influence techniques cults use. Some cult leaders have researched influence techniques, including Jim Jones, who studied the methods of mind control George Orwell described in his book 1984.539 But most cult leaders appear to assemble and refine their methods through a process of trial and error.

  Together programs of thought reform, coercive persuasion, and information control can produce the intensified modes of influence we see in destructive cults. This process of systematically applying manipulative techniques of influence, persuasion, and communication to produce persisting states of impaired thinking, feeling, and decision making in cult members has been widely recognized as one of mind control.

  Not all forms of influence and persuasion are the same. Psychologist Margaret Singer made distinctions between various types of persuasion, such as education and advertising, and more manipulative coercive methods, such as propaganda, indoctrination, and thought reform.540 Singer saw the process of thought reform as uniquely rigid and distinctly different from other modes of persuasion. For example, Singer pointed out that thought reform effectively precludes any genuine or meaningful exchange of ideas; it is instead “one sided” and expresses no sincere respect for differences.541

  Perhaps the most notable distinction between the thought-reform schemes destructive cults perpetrate and other types of persuasion, such as education and advertising, is that thought reform is frequently deceptive. Singer said such programs center “on changing people without their knowledge.” She further explained that the structure of this coercive persuasion process takes an “authoritarian” and “hierarchical stance,” with no full awareness on the part of the “learner.”542 And unlike advertising, which is persuasive but regulated, cultic thought-reform methods are unregulated, unaccountable, and devoid of respect for the individual.

  The deceptive nature of such persuasion, combined with the group’s hidden agenda of “changing people without their knowledge,” means that people are often deceptively recruited into destructive cults without informed consent.

  Michael Lyons, who went by the name Mohan Singh, lured followers into his group, Friends of Mohan, by posing as a “naturopathic” healer. He claimed he was “chiropractor to the Queen” and an osteopath.543 Instead of healing, however, Lyons reportedly subjected his victims to “psychological and emotional control, brainwashing and isolation from families.”544 The Unification Church, commonly called the “Moonies” and once led by Rev. Sun Myung Moon, has frequently been cited for its “deceptive tactics in recruiting followers.”545 Often targeting students on college campuses, the church operated through a number of front organizations, such as the Creative Community Project, Students for an Ethical Society, and the Collegiate Association for the Research of Principles. Students initially approached might not even realize that the group’s agenda is religious.

  Coercive Persuasion

  The pioneering work of MIT professor Edgar Schein identified three basic stages of “coercive persuasion.”546 These stages are first “unfreezing” the person, then “changing” his or her perceptions, and finally “refreezing” the individual in the changed state. They parallel what many groups identify as the necessity of breaking people down before they can make them over or build them up again. This process can often be accomplished by creating an acute sense of urgency and/or crisis, frequently through confrontational tactics and group pressure. This is one of the methodologies used to persuade the individual that change is necessary or imperative.

  Later, sociologist Richard Ofshe sought to draw attention to and distinguish the persuasion techniques cults commonly employ. He expanded upon Schein’s earlier work in his own study of coercive persuasion547 and built upon Schein’s three stages548 to identify “four key factors that distinguish coercive persuasion from other training and socialization schemes.”549

  The reliance on intense interpersonal and psychological attack to destabilize an individual’s sense of self and promote compliance

  The use of an organized peer group

  The application of interpersonal pressure to promote conformity

  The manipulation of the totality of the person’s social environment to stabilize behavior once it has been modified

  We can see an example of this process in practice in the rehabilitation community known as Synanon, led by Charles Dederich. Synanon used its seminars, which evolved into highly confrontational “attack therapy,” to coerce its members. This coercion was called “the game.”550 During such sessions participants were surrounded by peers, who barraged them with criticism and berated them. This attack ultimately caused people caught in the game to collapse and readily accept suggested change. Subsequently Synanon members became submissive and compliant in conforming to the group’s norms. In this way the community coercively controlled everything, including relationships and even infants in what was called the “hatchery.”551 Nothing was immune from Synanon’s process of coercive persuasion. Hundreds of couples were even induced to divorce on demand.552 In a ritual practice he called the “squeeze,” this same group process was used as a tool to purge members who harbored doubts about Dederich.553

  Singer expanded on the three basic steps of coercive persuasion that Schein had outlined.554

  “Unfreezing,” or what Singer described as “the destabilizing of a person’s sense of self.” This process often includes “keeping the person unaware of what is going on and the changes taking place. Controlling the person’s time and if possible their physical environment. Creating a sense of powerlessness, covert fear, and dependency. And suppressing much of the person’s old behavior and attitudes.”

  “Changing,” or what Singer explained as “getting the person to drastically reinterpret his or her life’s history and radically alter his or her worldview and accept a new version of reality and causality.”

  “Refreezing,” or as Singer said, “Put forth a closed system of logic; allow no real input or criticism.” Ultimately this culminates in what Singer described as a person frozen or “dependent upon the organization…a deployable agent.”

  Thought Reform

  Robert Jay Lifton first explained these basic building blocks of the cult-control process in detail in his eight criteria for identifying the existence of a thought-reform program.555 Lifton said, “In identifying, on the basis of this study of thought reform, features common to all expressions of ideological totalism, I wish to suggest a set of criteria against which any environment may be judged, a basis for answering the ever-recurring question: ‘Isn’t this just like “brainwashing”?’”556

  Lifton’s eight defining criteria are as follows:

  “Milieu Control,” which Ofshe later interpreted as “the control of communication” within an environment.557 This may include virtually everything a person might potentially see, hear, or read—as well as his or her personal associations. By controlling whatever enters the mind, destructive cults can largely control the mind itself.

  “Mystical Manipulation,” which Ofshe interpreted as “emotional and behavioral manipulation” that is typically accomplished in the guise of group beliefs and practices and often uses elements of deception in the controlled environment. This step could potentially be done by manipulating the daily news, anecdotal information, meditation, or religious writings in an effort to influence the thinking and feelings of group members.

  “The Demand for Purity” or what Ofshe described as “demands for absolute conformity to behavior as prescribed and derived from the group ideology.”558 People subject to such demands may in a sense take personal inventory and relentlessly categorize their thoughts, emotions, and behavior per the group or leader’s dictates. Lifton called this a “spurious cataloguing of feelings,” which reflects the “peculiar aura of half-reality” within a “totalist environment” such as a destructive cult. The net result of this criterion is often “black and white thinking,” pushing people into a position where they feel they must choose between what the group labels “good” and “evil.”

  “The Cult of Confession” or what Ofshe sees as “obsessive demands for confession.” This confession may be done on an individual basis, group basis, or both. The underlying assumption or premise is that there really is no individual right to privacy and that whatever is known must be disclosed. This also constitutes what Lifton calls “symbolic self-surrender” to the absolute authority of the leadership. In groups where confession is strongly emphasized, people may confess in an exaggerated sense to what they have done or, as Lifton says, “to crimes one has not committed.” In the drug-rehabilitation community known as Synanon, group confession was formalized in what was called “the game.” An individual was seated in the center of the group on what was known as the “hot seat” and bombarded with confrontational questions and criticism until he or she was broken. Many destructive cults have historically used a format of breaking sessions based on confession.
