
Trust Us, We're Experts


by Sheldon Rampton


  Using the assumed name “George Jetson,” for example, we plugged in an estimate that he currently spends $24,166,666 per year on gasoline, electricity, heating oil, and natural gas. (After all, it takes a lot of energy to propel those flying cars.)

  The computer promptly generated messages to our elected officials. “I am proud to be a worker which you represent,” Mr. Jetson stated. “Estimates suggest I will personally see my cost for electricity, for natural gas, and for gasoline go up by $24,239,987.52 a year!”

  It’s nice to know that the democratic system works. Thanks to the miracles of modern computer technology and sophisticated PR, even cartoon characters can do their part to save America from the eco-wackos and their newfangled scientific theories.

  11

  Questioning Authority

  I know of no safe depository of the ultimate power of the society but the people themselves; and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion.

  —Thomas Jefferson1

  When psychologists have explored the relationship between individuals and authority figures, they have found that it can be disturbingly easy for false experts to manipulate the thinking and behavior of others. One of the classic experiments in this regard was conducted in the early 1960s by Stanley Milgram, who tried to see how far people would go in following orders given by a seemingly authoritative scientist. The subjects of Milgram’s research were taken into a modern laboratory and told that they would be helping conduct an experiment that involved administering electric shocks to see how punishment affected the learning process. The subjects were seated at a machine called a “shock generator,” marked with a series of switches ranging from “slight shock” to “severe shock.” Another person was designated as a “learner” and was hooked up to receive a jolt each time he gave the wrong answer on a test. A third individual, the “scientist,” stood over the experiment giving instructions and supervision. Unbeknownst to the real subjects of the experiment, both the “learner” and the “scientist” were actors, and no actual electricity was used. As each fake shock was administered, the “learner” would cry out in pain. If the subject administering the shocks hesitated, the “scientist” would say something like, “Although the shocks may be painful, there is no permanent tissue damage, so please go on,” or “It is absolutely essential that you continue.” The result was that many subjects continued to administer shocks, even when the “learner” claimed heart trouble, cried out, or pleaded to be set free. “With numbing regularity,” Milgram observed, “good people were seen to knuckle under the demands of authority and perform actions that were callous and severe. Men who are in everyday life responsible and decent were seduced by the trappings of authority, by the control of their perceptions, and by the uncritical acceptance of the experimenter’s definition of the situation, into performing harsh acts.”2

  In another famous experiment, known as the “Doctor Fox Lecture,” a distinguished-looking actor was hired to give a meaningless lecture, titled “Mathematical Game Theory as Applied to Physical Education.” The talk, deliberately filled with “double talk, neologisms, non sequiturs, and contradictory statements,” was delivered before three audiences composed of psychiatrists, social workers, psychologists, educators, and educational administrators, many of whom held advanced degrees. After each session, audiences received a questionnaire asking them to evaluate the speaker. None of the audience members saw through the hoax, and most reported that they were favorably impressed with the speaker’s expertise.3

  The rich and powerful seem to be no better at seeing through bogus experts than anyone else. In September 1999, the Wall Street Journal announced the arrest of Martin A. Armstrong, charged with bilking Japanese investors out of $950 million. “For decades,” the Journal reported, “Armstrong sold himself to investors as expert on anything of precious value, from coins minted by the Egyptian pharaohs to turn-of-the-century U.S. stamps, not to mention current-day markets for stocks, bonds, commodities and currencies. Now, Mr. Armstrong . . . stands accused in a federal indictment of using this market-wizard image to conduct one of the most common frauds in the history of finance: making big promises to investors that he couldn’t deliver.”

  Armstrong’s “self-confident forecasting style” had made him a hit at conferences in which he addressed hundreds of Japanese corporate chieftains. Even as his currency deals were losing hundreds of millions of their dollars, “Armstrong continued to confidently sell himself as a forecaster of market trends, often in language in which he mocks others’ mistakes,” the Journal noted. “Mr. Armstrong’s reams of investing treatises, many posted on his website, range from the monetary history of Persia to the ‘Panic cycle in global capital flows.’ The historical data maintained by his Princeton Economic Institute has been used by many media outlets. The ‘first and most important rule about investing is to “Know what you are buying and why!” ’ he warned in a July 1997 report. . . . He wasn’t shy about promotion, jumping at the chance to have his picture taken with heavyweights in any market in which he was playing. Princeton Economics’ website, which is filled with Mr. Armstrong’s essays on the market, shows a photo of Mr. Armstrong with former United Kingdom Prime Minister Margaret Thatcher at one of the firm’s conferences in 1996.”4

  It is tempting to look at these examples and despair. If people are this easily duped, how can anyone hope to expert-proof themselves? The answer, of course, is that no one can, but there are some things we can all do to improve our chances.

  Recognizing Propaganda

  Between World Wars I and II, the rise of the public relations industry in the United States and the growing use of propaganda by fascist and communist governments prompted a group of social scientists and journalists to found a remarkable organization called the Institute for Propaganda Analysis. The IPA published a periodic newsletter that examined and exposed manipulative practices by advertisers, businesses, governments, and other organizations. Fearlessly eclectic, it hewed to no party lines and focused its energies on studying the ways that propaganda could be used to manipulate emotions. It is best known for identifying several basic types of rhetorical tricks used by propagandists:

  1. Name-calling. This technique, in its crudest form, involves the use of insult words. Newt Gingrich, the former Speaker of the U.S. House of Representatives, is reported to have used this technique very deliberately, circulating a list of negative words and phrases that Republicans were instructed to use when speaking about their political opponents—words such as “betray,” “corruption,” “decay,” “failure,” “hypocrisy,” “radical,” “permissive,” and “waste.” The term “junk science,” which we discussed in Chapter 9, is an obvious use of this same strategy. When name-calling is used, the IPA recommended that people should ask themselves the following questions: What does the name mean? Does the idea in question have a legitimate connection with the real meaning of the name? Is an idea that serves my best interests being dismissed through giving it a name I don’t like?

  2. Glittering generalities. This technique is a reverse form of name-calling. Instead of insults, it uses words that generate strong positive emotions—words like “democracy,” “patriotism,” “motherhood,” “science,” “progress,” “prosperity.” Politicians love to speak in these terms. Newt Gingrich advised Republicans to use words such as “caring,” “children,” “choice,” “commitment,” “common sense,” “dream,” “duty,” “empowerment,” “freedom,” and “hard work” when talking about themselves and their own programs. Democrats, of course, use the same strategy. Think, for example, of President Clinton’s talk of “the future,” “growing the economy,” or his campaign slogan: “I still believe in a place called Hope.”

  3. Euphemisms are another type of word game. Rather than attaching positive or negative connotations, euphemisms merely try to obscure the meaning of what is being talked about by replacing plain English with deliberately vague jargon. Rutgers University professor William Lutz has written several books about this strategy, most recently Doublespeak Defined. Examples include the use of the term “strategic misrepresentations” as a euphemism for “lies,” or the term “employee transition” as a substitute for “getting fired.” Euphemisms have also transformed ordinary sewage sludge into “regulated organic nutrients” that don’t stink but merely “exceed the odor threshold.”

  4. Transfer is described by the IPA as “a device by which the propagandist carries over the authority, sanction, and prestige of something we respect and revere to something he would have us accept. For example, most of us respect and revere our church and our nation. If the propagandist succeeds in getting church or nation to approve a campaign in behalf of some program, he thereby transfers its authority, sanction, and prestige to that program. Thus, we may accept something which otherwise we might reject.” In 1998, the American Council on Science and Health convened what it called a “blue-ribbon committee” of scientists to issue a report on health risks associated with phthalates, a class of chemical additives used in soft vinyl children’s toys. People familiar with ACSH’s record on other issues were not at all surprised when the blue-ribbon committee concluded that phthalates were safe. The committee’s real purpose, after all, was to transfer the prestige of science onto the chemicals that ACSH was defending.

  5. Testimonial is a specific type of transfer device in which admired individuals give their endorsement to an idea, product, or cause. Cereal companies put the pictures of famous athletes on their cereal boxes, politicians seek out the support of popular actors, and activist groups invite celebrities to speak at their rallies. Sometimes testimonials are transparently obvious. Whenever they are used, however, the IPA recommends asking questions such as the following: Why should we regard this person (or organization or publication) as a source of trustworthy information on the subject in question? What does the idea amount to on its own merits, without the benefit of the testimonial?

  6. Plain folks. This device attempts to prove that the speaker is “of the people.” Even a geeky multibillionaire like Bill Gates tries to convey the impression that he’s just a regular guy who enjoys fast food and popular movies. Politicians also use the “plain folks” device to excess: George Bush insisting he eats pork rinds; Hillary Clinton slipping into a southern accent. Virtually every member of the U.S. Senate is a millionaire, but you wouldn’t know it from the way they present themselves.

  7. Bandwagon. This device attempts to persuade you that everyone else supports an idea, so you should support it too. Sometimes opinion polls are contrived for this very purpose, such as the so-called “Pepsi Challenge,” which claimed that most people preferred the taste of Pepsi over Coca-Cola. “The propagandist hires a hall, rents radio stations, fills a great stadium, marches a million or at least a lot of men in a parade,” the IPA observed. “He employs symbols, colors, music, movement, all the dramatic arts. He gets us to write letters, to send telegrams, to contribute to his cause. He appeals to the desire, common to most of us, to follow the crowd.”

  8. Fear. This device attempts to reach you at the level of one of your most primitive and compelling emotions. Politicians use it when they talk about crime and claim to be advocates for law and order. Environmentalists use it when they talk about pollution-related cancer, and their opponents use fear when they claim that effective environmental regulations will destroy the economy and eliminate jobs. Fear can lead people to do things they would never otherwise consider. Few people believe that war is a good thing, for example, but most people can be convinced to support a specific war if they believe that they are fighting an enemy who is cruel, inhuman, and bent on destroying all that they hold dear.

  The IPA disbanded at the beginning of World War II, and its analysis does not include some of the propaganda devices that came to light in later years, such as the “big lie,” based on Nazi propaganda minister Joseph Goebbels’s observation that “the bigger the lie, the more people will believe it.” Another device, which the IPA did not mention but which is increasingly common today, is the tactic of “information glut”—jamming the public with so many statistics and other information that people simply give up in despair at the idea of trying to sort it all out.

  To get an idea of how sophisticated modern propaganda has become, compare the IPA’s list of propaganda techniques with another list—the 12 points that consultant Peter Sandman advises his clients to bear in mind when attempting to minimize public outrage over health risks. Like the IPA, Sandman is primarily interested in emotional factors that influence the public rather than what he and his clients consider the “rational, real” issues related to risk and public harm. His points, however, bear little surface similarity to the points on the IPA’s list:

  1. Voluntary vs. coerced. Sandman observes that people are less likely to become outraged over risks that they voluntarily assume than over risks that are imposed upon them against their will. “Consider,” he suggests, “the difference between getting pushed down a mountain on slippery sticks and deciding to go skiing.”

  2. Natural vs. industrial. People tend to trust what can be promoted as natural: organic food or natural means of pest control.

  3. Familiar vs. exotic. “Exotic, high-tech facilities provoke more outrage than familiar risks (your home, your car, your jar of peanut butter),” Sandman observes.

  4. Not memorable vs. memorable. If you want to minimize outrage, not memorable is preferable. “A memorable accident—Love Canal, Bhopal, Times Beach—makes the risk easier to imagine,” Sandman explains. A memorable symbol or image can do the same thing. This is why evidence of genetically modified crops harming colorful Monarch butterflies prompted more concern than similar evidence of harm to other insects.

  5. Not dreaded vs. dreaded. For example, diseases like cancer, AIDS, plague, and tuberculosis create a great deal more public concern than others, such as heart disease.

  6. Chronic vs. catastrophic. Thousands of people are killed each year in highway accidents, but rarely in large groups. Plane crashes are much rarer and cause fewer deaths overall, but because a single crash can kill many people at once, air travel is much more widely feared than car travel.

  7. Knowable vs. unknowable. People tend to be less apprehensive about risks that are known and measurable than about risks that cannot be measured. The unknowable aspects of some risks make them more upsetting.

  8. Individually controlled vs. controlled by others. Individuals can decide whether they smoke cigarettes, exercise, or drive cars. They often can’t decide whether a factory emits pollution in their community.

  9. Fair vs. unfair. “People who must endure greater risks than their neighbors, without access to greater benefits, are naturally outraged,” Sandman says, “especially if the rationale for so burdening them looks more like politics than science.”

  10. Morally irrelevant vs. morally relevant. Arguing that a risk is small will fall on deaf ears if creating the risk is morally wrong in the first place. “Imagine a police chief insisting that an occasional child molester is an ‘acceptable risk,’ ” Sandman says.

  11. Trustworthy sources vs. untrustworthy sources. The “third party technique,” which we have discussed throughout this book, is a PR strategy built around the effort to put industry messages in the mouths of seemingly trustworthy sources.

  12. Responsive process vs. unresponsive process. “Does the agency come across as trustworthy or dishonest, concerned or arrogant?” Sandman asks. “Does it tell the community what’s going on before the real decisions are made? Does it listen and respond to community concerns?”

  At his best, Sandman is advising companies to listen to the public and respond to its concerns. In practice, however, his advice often lends itself to manipulation. One way that industry and government bodies try to make it appear that their activities are being accepted “voluntarily” rather than “coerced,” for example, is to create so-called community advisory panels (CAPs) to seek the advice of people who live where their facilities are located. One of Sandman’s clients, the U.S. Department of Energy, used this tactic in trying to overcome the objections of Nevada residents over the DOE’s efforts to establish a national dump site for high-level nuclear waste at Yucca Mountain, Nevada. “The Secretary of Energy announced that there would be a ‘citizen advisory panel’ to discuss the Yucca Mountain project,” recall Judy Treichel and Steve Frishman, who have led the state’s campaign to block the project. “However, the real purpose of the panel was to invite opponents of the site such as ourselves to draft standards that would make the Yucca Mountain program acceptable. We were also invited to workshops in which government, industry and public representatives were supposed to ‘prioritize your values.’ Then we were supposed to ‘trade off’ our values in order to reach an acceptable compromise. Our response was to ‘just say no.’ We were then told that we were being ‘unreasonable.’ ”5

 
