Lifton also describes how people influenced by a thought-reform program strip themselves of anything objectionable or at variance with the preferred prototype of the true believer. He says, “Yet one underlying assumption makes this arrogance mandatory: the conviction that there is just one path to true existence, just one valid mode of being, and that all others are perforce invalid and false. Totalists thus feel themselves compelled to destroy all possibilities of false existence as a means of furthering the great plan of true existence to which they are committed.” 613
Yeakley's book warned of the potential consequences that could be linked to the "falsification of psychological type." He opined that it might produce detrimental results such as a "serious midlife crisis" and "major burn-out problems, serious depression, and a variety of other psychological…problems."614
One of the most reported cult personality transformations in history was Leslie Van Houten, a former follower of Charles Manson. Van Houten grew up in a comfortable middle-class neighborhood and was a high school cheerleader. But under Manson’s influence, at the age of nineteen in 1969, she stabbed one of the cult’s targeted victims fourteen to sixteen times in the back.615 After decades of imprisonment, the former cult member said, “Everything that was good and decent in me I threw away.” It was only through her imposed separation from the group through imprisonment and her father’s visits that she apparently came to realize “what had happened.”616 Repeatedly denied parole, Van Houten tried to explain her alleged rehabilitation before a parole board hearing in 2003. She said, “I was raised to be a decent human being. I turned into a monster and I have spent these years going back to a decent human being.” During her incarceration Leslie Van Houten became a model prisoner, earned college degrees, and worked for a prison supervisor.617
Less stigmatized and more sympathetic examples of radical cult transformations include Patty Hearst and Elizabeth Smart. Both women were kidnapped and radically transformed by their captors. Only after leaving that influence did they return to their normal lives. The cult persona each had once been manipulated to embrace then became a historical anomaly.
Cognitive Dissonance
In his description of "Milieu Control," Lifton notes that such an environment "never succeeds in becoming absolute, and its own human apparatus can, when permeated by outside information, become subject to discordant 'noise.'"618 The source of that outside information, which can permeate a controlled environment, could potentially be the Internet, television, radio, newspapers, or people outside the group, such as old friends and family. This influence could include anything that might provide an outside frame of reference.
How then do destructive cults bent on psychological and emotional control shut out or deal with uncontrollable “discordant noise,” which can create profound internal conflict in the minds of their adherents?
Cognitive dissonance is a term used in psychology and is often included in the paradigm of cultic persuasion and control. This theory is frequently the basis for understanding how cult-involved individuals can continue to cling to beliefs even when facts contradict them. Cognitive dissonance theory explains that cult members can resolve such conflicts by essentially spinning or accepting rationalizations. It is this spinning process that then reconciles the dissonance between their cultic beliefs and reality. For example, a mother in a faith healing group may reconcile the needless death of her child by proclaiming that it was somehow “God’s will” rather than admit the death was, in fact, the result of medical neglect.
Leon Festinger first used the description of cognitive dissonance in his book When Prophecy Fails. 619 The book tells the story of a UFO cult led by a woman named Dorothy Martin (1900–1992), also known as “Marian Keech,” who predicted the end of the world but foretold that aliens from another planet would rescue her followers on a precise date (December 21, 1954). When her prophecy failed, members of the group, who had given up everything to follow Keech, nevertheless remained loyal and committed believers.
Festinger proposes five factors that allow cognitive dissonance to be resolved in the face of such a failure, which he calls "unequivocal disconfirmation."620
1. There must be a deep conviction concerning the belief.
2. There must be commitment to this conviction.
3. The belief must be amenable to unequivocal disconfirmation.
4. Such unequivocal disconfirmation must occur.
5. Social support must be available after the disconfirmation.
Deep conviction is a common attribute among cult members. This commitment can be expressed through years of hard work, surrender of assets, isolation from family and old friends, and the renouncing of previously held goals and ambitions in favor of a group or leader’s agenda. What this means is that cult members may have a considerable amount of literal and emotional equity in the group and its beliefs. It is because of this considerable personal investment that deeply committed members are likely to accept whatever rationalization or explanation is offered to obviate the “unequivocal disconfirmation” or some failure in the group.
Members frequently cling to such rationalizations rather than accept the more alarming alternative, which is that all their efforts and sacrifices may have been for nothing. Others in similar circumstances within the group will join them in their willingness to accept convenient apologies for failure and offer whatever social support is necessary to protect and secure the sense of equity and mutual sacrifice they share in common. This can also be seen as a refusal to bear the “exit cost” of leaving the group. Sociologist Benjamin Zablocki generally defines such “exit costs” cult members consider as “all disincentives for leaving.” Zablocki includes “costs ranging from financial penalties, to relational commitments to various sorts of cognitive and emotional dependencies.”621
In what is one of the most poignant examples of cognitive dissonance, Waco Davidian survivors Clive Doyle and Sheila Martin continue to cling to their beliefs about David Koresh despite the fact that the cult leader's prophecies led to death and destruction, not heavenly fulfillment. Eighteen years after the fire that claimed the lives of seventy-six Davidians, including Doyle's daughter, who was one of Koresh's "wives," as well as Martin's husband and four of her seven children, their loyalty remains unshaken.
The two aging cult members support each other in their continuing commitment. Sheila Martin says God wanted the fire and destruction. “I don’t expect you to understand,” she told a reporter. But she admits, “We didn’t have a plan for death. I wondered: Did someone change the plan without telling me?” Nevertheless Martin still insists, “David is the messiah, and he’s coming back…Now we just wait for the kingdom.”
Clive Doyle explains, "When people ask why we still believe in David and what he preached, after everything, I think they are asking because they really do want to understand." Martin concurs. "I think they'll realize someday everything is under his order, and they'll understand that it's not really a choice."622 Despite the historical facts, the two cultists cannot face the disconfirmation of Koresh's demise and have decided instead to interpret the end in a way that allows them to continue as true believers. This response is most probably linked to the devastating losses they both suffered and the corresponding emotional equity they share. Admitting that David Koresh was a delusional cult leader and fraud would mean that all their sacrifices were for nothing. Rather than bear those exit costs, they have instead embraced what can be seen as cognitive dissonance.
“Psychology of the Pawn”
All these interlocking schemes based on information and emotional control, personality change, and influence techniques lead to an ultimate end result, which Lifton calls "the psychology of the pawn." Margaret Singer defined six conditions necessary to be "changed one step at a time to become deployable to serve the leaders" of a destructive cult.623 She said, "The degree to which these conditions are present increases the level of restrictiveness enforced by the cult and the overall effectiveness of the program."624
“Keep the person unaware that there is an agenda to control or change the person.” Few cultic groups or leaders readily or willingly admit their agenda or ultimate purpose. They may conceal certain teachings from new members and generally endeavor to rationalize anything that may potentially seem negative. No one intentionally joins a destructive cult.
“Control time and physical environment (contacts, information)”—These vary from group to group. Minimally they may mean becoming cocooned within a social environment that monopolizes time and constrains associations, but they can become as extreme as an isolated compound like Jonestown.
“Create a sense of powerlessness, fear and dependency”—This has been called “learned helplessness.” Members are afraid to leave and become dependent on the group for a sense of security, safety, and purpose.
“Suppress old behavior and attitudes”—This is the breaking-down process of an individual personality.
“Instill new behavior and attitude”—This is a changing process to adopt more acceptable personality traits and corresponding characteristic behavior approved by the group.
“Put forth a closed system of logic”—This is based on what Lifton calls a “Sacred Science” that cannot be questioned. And this then becomes the basis of all value judgments.
In this role of “pawn” in a destructive cult, the victim of thought reform may largely feel helpless. As previously mentioned, that response has been called a type of “learned helplessness.” This condition is often associated with battered women who develop a diminished “sense of mastery and self-esteem,” and correspondingly this hinders their “ability to take active steps to change their situation.”625 Victims of both domestic violence and cults are often made to feel they can never be “good enough.” This feeling renders them largely dependent on their controlling influence for validation of their self-worth and contributes to a belief that without that controlling influence, they would be largely “helpless.” Lifton describes the helpless pawn, controlled by thought reform: “Unable to escape from forces more powerful than himself, he subordinates everything to adapting himself to them. He becomes sensitive to all kinds of cues, expert at anticipating environmental pressures, and skillful in riding them in such a way that his psychological energies merge with the tide rather than turn painfully against himself. This requires that he participate actively in the manipulation of others, as well as in the endless round of betrayals and self-betrayals which are required.”626
Psychologist Margaret Singer sees this as the primary purpose of destructive cults—that is, through their “tactics of thought reform” they cast those under their sway in the role of pawns. She explains that this tactical process is used to “develop in the person a dependence on the organization, and thereby turn the person into a deployable agent of the organization.”627 Zablocki defines a deployable agent as “one who is uncritically obedient to directives perceived as charismatically legitimate. A deployable agent can be relied on to continue to carry out the wishes of the collectivity regardless of his own hedonic interests and in the absence of any external controls.” He explains, “Deployability can be operationalized as the likelihood that the individual will continue to comply with hitherto ego-dystonic demands of the collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child abuse, murder) when not under surveillance.”628
Many cults repeatedly claim that their members are “free to leave whenever they want.” But the insidious tactics of thought reform often render its victims psychologically and emotionally unable to escape. This is what sociologist Benjamin Zablocki describes as “the paradox of feeling trapped in what is nominally a voluntary association.”629
Zablocki has also answered the question: why isn't everyone exposed to cultic coercive persuasion and influence techniques trapped? He explains, "Often it is assumed that a demonstration that not all a cult movement's members were brainwashed is equivalent to proof that none were brainwashed. But why does this follow? No leader needs that many deployable agents. The right question to ask is not whether all Moonies [Unification Church members] are brainwashed, but whether any Moonies were brainwashed."630
Zablocki adds, “A common misconception about turnover data in cults stems from confusion between the efficiency of brainwashing and the efficacy of brainwashing…But nothing in the brainwashing conjecture predicts that it will work on everybody, and it says nothing at all about the proportion of recruits who will become agents.” Zablocki points out, “For the system to perpetuate itself, the yield need only produce a sufficient number of deployable agents to compensate it for resources required to maintain the brainwashing process.” Rather than seeing the dropout rate of destructive cults as somehow providing proof that brainwashing doesn’t work, Zablocki observes, “In general, dropout rates tell us only about the rigor of the program, not about its effectiveness for those who stick it through to the end.”631
What is commonly called “cult brainwashing” is actually a composite of coercive persuasion and undue influence techniques used in an interconnecting process. The various facets of this process and how its pieces fit together and enable each other are best understood through the detailed explanation of thought reform and coercive persuasion. Both information and emotional control are powerful tools used in this context. The foundation of the process is built on basic principles of influence and is continually reinforced in large part by the human tendency to defer to authority.
This chapter includes some of the seminal research that forms this framework, but there is much more evidence reflecting an innate human vulnerability to such schemes when people are ignorant or unaware that the techniques are being used. Again, some cult leaders have actually studied coercive persuasion and influence techniques in an effort to copy and craft their own version, while others have simply learned from experience how to control and manipulate their followers. A common operating principle seems to be that destructive cult leaders like what they control and don't like what they cannot.
Those outside the world of destructive cults often become unsettled and at times even horrified by the strange and seemingly mystical power cult leaders appear to possess. As previously pointed out, cults often use convenient masks to hide their true intentions and ultimate objectives. They may pose behind the facade of religion, philosophy, politics, business schemes, therapies, pseudoscience, and miscellaneous forms of exercise, nutrition, meditation, and martial arts. But the truth about destructive cults is that if you remove their outer veneer, they are in fact quite similar. That is, they most often have the same basic structure of authority and rely on virtually an identical process of persuasion and corresponding group dynamics to gain undue influence.
As outlined in a previous chapter regarding the definition of a destructive cult, it is not what the group believes but rather how it behaves that defines it. Destructive cults behave badly. They deceptively employ tactical thought reform, which quells independent thinking and engenders dependency in an effort to attain compliance. This is all done with little regard for collateral damage, which destructive cults rationalize as a necessary evil. Cult leaders instead seem to be relentlessly focused on their own needs and fulfillment; that is the hidden agenda of most, if not all, destructive cults.
CHAPTER 6
HISTORY OF CULT-INTERVENTION WORK
Professionals engaged in cult-intervention work have historically used many titles. Beginning in the 1970s, the first title used was "cult deprogrammer." This title is specifically linked to the process of deprogramming, or unraveling, the program of behavioral, emotional, and psychological control instilled by a destructive cult. The job description of the cult deprogrammer seems to be permanently etched in popular culture and may remain the most common label the general public uses and understands to describe cult-intervention work.
Ted Patrick
Ted Patrick was the first cult deprogrammer, and he originated the term deprogramming in the early 1970s.632 Patrick said, "Deprogramming is like taking a car out of the garage that hasn't been driven for a year. The battery has gone down, and in order to start it up you've got to put jumper cables on it. It will go dead again. So you keep the motor running until it builds up its own power. This is what rehabilitation is. Once we get the mind working, we keep it working long enough so that the person gets in the habit of thinking and making decisions again."633 This vivid analogy identifies the essential elements of cult-intervention work. Patrick also said that cults often use "fear, guilt, poor diet and fatigue" to recruit and retain members.634
The activities of a notorious group called the Children of God (COG), now known as Family International, founded by David Berg, initially drew Ted Patrick to the issue of destructive cults. The group, which is known for the sexual abuse of minor children,635 tried to recruit Patrick’s son.636 As the head of community relations for San Diego and Imperial counties in California, Patrick also received complaints from distressed families about the group. He infiltrated COG in the summer of 1971 to investigate how it worked.637
Patrick, who attended public school for only ten years, developed his deprogramming approach by “trial and error,” which he refined through many cases.638 According to cult researchers Flo Conway and Jim Siegelman, who wrote about cults in the book Snapping: America’s Epidemic of Sudden Personality Change, Patrick “probed with questions…until he found the key point of contention at the center of each member’s encapsulated beliefs. Once he found that point, Patrick hit it head on, until the entire programmed state of mind gave way.”639 A man Patrick successfully deprogrammed confirmed, “Ted took me to the limits with a series of questions.”640 But Ted Patrick said he didn’t promote a particular belief system in the context of his intervention work. “When I deprogram people I don’t make any mention of a church or whether or not I even believe in God. That’s beside the point. My intention is to get their minds working again and to get them back out in the world,” he said.641 Eventually others copied or adapted Patrick’s method of intervention and began doing deprogramming across the United States.642
Cults Inside Out: How People Get in and Can Get Out Page 17