The Cult of Trump


by Steven Hassan


  WE AREN’T REALLY THAT RATIONAL—A BRIEF HISTORY OF PSYCHOLOGY

  What makes the human mind so vulnerable to such manipulations? To answer that question requires going back to the turn of the twentieth century and the work of Sigmund Freud. Until Freud, many viewed human beings as rational creatures, a view epitomized in Descartes’s famous maxim, “I think, therefore I am.” Freud theorized that below the surface of conscious awareness lies a well of urges and feelings—often sexual and aggressive—that may be latent or repressed but that, under the right circumstances, can erupt. Freud saw World War I, with its horrific acts of inhumanity, as a battle waged by dark forces within us that we didn’t know we possessed. So the assault on rationality began. Human beings couldn’t be trusted to make rational decisions.2

  Some thinkers of the time argued that a new elite was needed to control the public—the “bewildered herd,” in the words of leading political writer Walter Lippmann.3 During World War I, the U.S. government, wanting to sway public opinion in favor of the war, called in Freud’s nephew, Edward Bernays, who had previously worked as a theatrical press agent, to help promote the war and its message of making “the world safe for democracy.” After the war, Bernays, fascinated—like his uncle—by what drove people’s thoughts and actions, realized that the best way to get people to buy something—a war, but also an idea or a product—was to appeal to their emotions and desires. He would be the first to apply psychological principles to the field of public relations. In his 1928 book, Propaganda, he spelled out in stark detail techniques for scientifically shaping and manipulating public opinion—what he would later call “the engineering of consent.”4

  The following year, hired by the tobacco industry to promote cigarette smoking among women, he paid women to light and smoke cigarettes—he called them “torches of freedom”—as they walked in the New York Easter Parade, touting it as a bold act of defiance. Bernays’s powerful theory of selling—that products should be sold not as necessities but as fulfilling human desires—spawned the modern era of consumerism. Ultimately, it would help define an American ideology, one that equated success with material objects—a fancy home, car, makeup, and clothes.

  When Wall Street crashed on October 24, 1929, so too did Bernays’s approach to selling—people could barely afford to buy food. Then came World War II, with acts of inhumanity that eclipsed those of World War I. Millions of people—Jews, blacks, gays, gypsies, communists—were killed in Nazi concentration camps that were run by “ordinary” Germans. This mass collaboration provoked great interest among psychologists.5 How, they asked, could ordinary people help carry out murders on such a scale? Freud’s belief that, deep down, humans are more carnal, even savage, and need to be controlled, was for many as good an explanation as any.

  Rather than true peace, the end of World War II was followed by the Cold War, which pitted the United States and its European allies against communist countries like the Soviet Union and China. By the late 1950s, both the United States and the Soviet Union were ramping up nuclear testing. Images of another war culminating in a world-ending mushroom cloud scared every American. To allay these fears and to promote its own interests, the U.S. government once again called on Bernays, who would, among his many public relations campaigns, spin the 1954 coup in Guatemala as the “liberation of a country from the jaws of Communism.”6

  THE CIA AND MIND CONTROL

  Whether he knows it or not, Trump—the salesman—owes many of his techniques to Bernays. But the work on influence and mind control intensified dramatically in the 1950s. With the rise of communism, and fearing that the Soviets were devising techniques to alter people’s minds, the U.S. government, and in particular the Central Intelligence Agency, set up secret experiments to explore the limits of human behavioral control. As described by John Marks in his book The Search for the Manchurian Candidate and by Alan W. Scheflin and Edward M. Opton Jr. in The Mind Manipulators, the CIA conducted mind control research from the late 1940s through the early 1960s. Code-named MK-ULTRA, it was a clandestine and illegal program of experiments on human subjects in a quest to find ways to manipulate people’s mental states, alter their brain functions, and control their behavior. The techniques used in these experiments ranged from LSD and other psychotropic drugs to brain surgery, electroshock therapy, sensory deprivation, isolation, hypnosis, and sexual and verbal abuse. Other researchers attempted to follow up on this work, but the CIA, in violation of many federal laws, destroyed almost all of its relevant files, claiming the research had not been productive.7

  SOCIAL PSYCHOLOGY RESEARCH

  The U.S. government also raced to uncover the secrets of mind control by helping to fund academic research by social psychologists, who were realizing that our thoughts, feelings, and behaviors can be deeply influenced by the actual, imagined, or implied presence of another person or persons. Their work would yield surprising insights about the power of group conformity, authority, and human suggestibility.

  Among the most remarkable discoveries is that people are hardwired to respond to social cues. Consider these key experiments, which I cite when I am counseling and teaching:

  The Asch Conformity Experiments. In 1951, Solomon Asch conducted his first conformity experiment with a group of eight Swarthmore College students. All but one were “actors.” The students were shown a card with three lines of different lengths and then asked to say which line was closest in length to a target line on another card. The actors agreed in advance which line they would choose, even if it was obviously not the correct answer. Asch ran eighteen trials with the group. In twelve of them, the actors intentionally gave the wrong answer. Even when their answers were blatantly incorrect, the unwitting student would occasionally agree with the rest of the group. Asch repeated this experiment with multiple groups of eight students. Overall, 75 percent of the students conformed to the group consensus at least once, while 25 percent never conformed.8 The results demonstrated that most people will conform when placed in a situation of social pressure.

  The Milgram Experiment. In 1961, inspired by the concentration camp horrors of World War II, in which ordinary Germans carried out horrific acts, Yale University psychologist Stanley Milgram undertook an experiment to test the limits of obedience to authority. He did not believe that “authoritarian personalities” alone were to blame for conscienceless obedience, and he was curious to see whether ordinary Americans could be made obedient as German citizens had been. He recruited male volunteers and paired each of them with another subject, actually an actor, for what they thought was a memory and learning experiment. The volunteers were instructed to teach a task to their partner and to administer what they thought was an electric shock, increasing in increments from 15 to 450 volts, each time the learner made a mistake. At the higher shock levels, tape recordings of the learner feigning pain and even screaming were played. If the subject refused to administer a shock, the experimenter would order them to do so. Milgram found that all of the subjects administered shocks of at least 300 volts, though some were visibly uncomfortable doing so. Two-thirds continued to the highest level of 450 volts. Milgram wrote, “The essence of obedience consists in the fact that a person comes to view himself as the instrument for carrying out another person’s wishes, and therefore no longer regards himself as responsible for his own actions.” The experiment showed how people will follow orders from someone they think is a legitimate authority figure, even against their conscience.

  The Stanford Prison Experiment. In 1971, Dr. Philip Zimbardo conducted a world-famous prison experiment in the basement of the psychology building at Stanford University. He wanted to explore the psychological effects of roles and perceived power, as might exist in a prison setting. Twenty-four healthy young men were randomly divided into two groups: prisoners and prison guards. Prisoners were mock “arrested” at their homes and brought to the so-called prison, where they encountered the guards, who were dressed in uniforms, including mirrored sunglasses, and equipped with batons. In very little time, the subjects adopted their roles, with disturbing results. The experiment was supposed to last two weeks but had to be called off after only six days because some of the guards had become sadistic and some of the prisoners had suffered psychological breakdowns. Good people started behaving badly when put in a bad situation, unaware of the mind control forces at work. Even Zimbardo got pulled into the power of the situation. It took graduate student Christina Maslach, later Zimbardo’s wife, to shock him out of his role as prison superintendent and make him realize that young men were suffering—and that he was responsible.

  Both the Milgram and Zimbardo studies led to the establishment of strict ethical review board requirements for doing experiments with human subjects.

  Why do we bow to social pressure? According to Nobel Prize–winning psychologist Daniel Kahneman, when it comes to making choices, we have two systems in our brains. As he writes in his 2011 book, Thinking, Fast and Slow, the first system is fast and instinctive and the second more deliberate. The fast system relies on unconscious heuristics and makes decisions based on instinct and emotion—a kind of “sensing”—without consulting the more analytic, critical “slow” system. It’s the part of the mind that you use when you’re “thinking with your gut,” that looks to others in your environment when it gets confused, and that defers to authority figures. When a person is unsure, they do what the tribe is doing—they conform. We unconsciously look to someone who promises security and safety. In short, we are unconsciously wired to adapt, conform, and follow to promote our survival.

  FUNDAMENTAL ATTRIBUTION ERROR

  Other discoveries were showing the limits of human rationality. In 1967, two researchers, Edward Jones and Victor Harris, conducted an experiment in which subjects were asked to read essays either for or against Fidel Castro. When the subjects believed that the writers had freely chosen their positions, they rated them as being correspondingly pro- or anti-Castro. But even when subjects were told that the writers had been directed to take their stance, they still rated the pro-Castro writers as being in favor of Castro, and the anti-Castro writers as being against him.

  This psychological bias—known as the fundamental attribution error—is important to understand before we go any further in explaining the science of mind control. When we see a negative behavior in another person (for example, joining a cult), we might explain it as an expression of a personality defect in that person (they are weak, gullible, or need someone to control them). When we see such a behavior in ourselves, we tend to attribute it to an external situation or contextual factor (I was lied to or pressured). The fundamental attribution error refers to this tendency to interpret other people’s behavior as resulting largely from their disposition while disregarding environmental and social influences.9 When I was in the Moonies, people probably assumed that I was weak, dumb, or crazy to join such a group. I thought I was doing something good for myself and the planet. At the same time, I might have looked at a Hare Krishna devotee and thought they were weird. The truth is, we were both being lied to and manipulated. We see the same bias at play in our country—between Trump supporters and anti-Trumpers, each assuming the other is dumb or crazy. We are all affected by situational factors, including our exposure to influence. Understanding the fundamental attribution error encourages us not to blame other people but to learn about the influences that led them to adopt their positions, and to work to expand the sharing of information and perspectives.

  COGNITIVE DISSONANCE

  In their classic book, When Prophecy Fails, Leon Festinger and his colleagues Henry Riecken and Stanley Schachter describe their studies of a small Chicago UFO cult called the Seekers. The leader of the group had predicted that a spaceship would arrive on a particular date to save them, the true believers, from a cataclysm. The big day came—without a spaceship. To Festinger’s surprise, rather than become disillusioned, members of the group claimed that through their faith, the catastrophe had been averted. “If you change a person’s behavior,” the authors observed, “[their] thoughts and feelings will change to minimize the dissonance”10—a phenomenon Festinger called “cognitive dissonance.”

  Dissonance is psychological tension that arises when there is conflict between a person’s beliefs, feelings, and behavior. We think of ourselves as rational beings and believe that our behavior, thoughts, and emotions are congruent. We can tolerate only a certain amount of inconsistency and will quickly rationalize to minimize the discrepancy. This often happens without our conscious effort or awareness. What this means is that when we behave in ways we might deem stupid or immoral, we change our attitudes until the behavior seems sensible or justified. This has implications for our ability to accurately perceive the world. People who hold opposing views will interpret the same news reports or factual material differently—each sees and remembers what supports their views and glosses over information that would create dissonance. Trump campaigned on the promise of a wall along the border with Mexico and even guaranteed that the Mexicans would pay for it. Today the wall has not been built and Mexico has made it clear that it will not pay for any such thing. Here is how Trump dealt with the cognitive dissonance: “When, during the campaign, I would say, ‘Mexico is going to pay for it,’ obviously, I never said this and I never meant they’re going to write out a check. I said, ‘They’re going to pay for it.’ They are. They are paying for it with the incredible deal we made, called the United States, Mexico, and Canada (USMCA) Agreement on trade.”11 Apparently, many of his supporters believe him. People tend to look for congruence and avoid discordance, which can create emotional distress. Beliefs often shift to fall more in line with a person’s emotional state.

  THE PSYCHOLOGY OF MIND CONTROL

  In 1961, Massachusetts Institute of Technology psychologist Edgar Schein wrote his classic book Coercive Persuasion. Building on a model developed in the 1940s by influential social psychologist Kurt Lewin, he described psychological change as a three-step process. Schein, like Lifton, Singer, and others, had studied the Chinese communist programs and applied this model to describe brainwashing. The three steps are: unfreezing, the process of breaking a person down; changing, the indoctrination process; and refreezing, the process of building up and reinforcing the new identity. It’s a model that could apply to the millions of Americans who have fallen under the sway of Trump and his administration.

  UNFREEZING

  To ready a person for a radical change, their sense of reality must first be undermined and shaken to its core. Their indoctrinators must confuse and disorient them. Their frames of reference for understanding themselves and their surroundings must be challenged and dismantled.

  One of the most effective ways to disorient a person is to disrupt their physiology. Not surprisingly, sleep deprivation is one of the most common and powerful techniques for breaking a person down. Altering one’s diet and eating schedule can also be disorienting. Some groups use low-protein, high-sugar diets, or prolonged underfeeding, to undermine a person’s physical integrity. Former Trump inner circle member Omarosa Manigault Newman reports that while working closely with Trump, she adopted many of his habits—working all hours of the night, eating fast food, often at Trump’s insistence.

  Unfreezing is most effectively accomplished in a totally controlled environment, like an isolated country estate, but it can also be accomplished in more familiar and easily accessible places, such as a hotel ballroom. When they were not all in the White House, Manigault Newman and other Trump aides would vacation together at Trump-owned resorts, either at Mar-a-Lago or at his golf course in Bedminster, New Jersey.12

  Hypnotic techniques are among the most powerful tools for unfreezing and sidestepping a person’s defense mechanisms. One particularly effective hypnotic technique involves the deliberate use of confusion to induce a trance state. Confusion usually results when contradictory information is communicated congruently and believably with an air of certitude. For example, if a hypnotist says in an authoritative tone of voice, “The more you try to understand what I am saying, the less you will never be able to understand it. Do you understand?” the result is a state of temporary confusion. If you read it over and over again, you may conclude that the statement is simply contradictory and nonsensical. However, if someone is kept in a controlled environment long enough, they will feel overwhelmed with information coming at them too fast to analyze. If they are fed disorienting language and confusing information, they will zone out, suspend judgment, and adapt to what everyone else is doing. In such an environment, the tendency of most people is to doubt themselves and defer to the leader and the group, as in the Asch and Milgram experiments.

  Trump is a master of confusion—presenting contradictory information convincingly. We saw it earlier in the way he explained the funding for his wall. He will say something, “Mexico will pay,” then say he never said it: “I never said Mexico will pay.” And then say it again, “Mexico will pay,” but in a new context: “with the incredible deal we made.” He does it all in an assured tone of voice, often with other people present—for example, at rallies—who nod in agreement. And so do you.

 
