Fault Lines


by Kevin M. Kruse and Julian E. Zelizer


  In the early 1980s, the company introduced a number of innovations that revolutionized the field of personal computing. To distinguish its products from IBM’s more successful desktop models, Apple moved beyond the simple keyboard and command-line interface to introduce the mouse as a means of navigating an on-screen graphical desktop. In January 1984, Apple unveiled its cutting-edge Macintosh computer with a $1.5 million Super Bowl ad that revealed the company’s ambitions. “I want something that will stop people in their tracks,” Jobs had insisted. “I want a thunderclap.” He got his wish. John Sculley, a major advertising executive whom Apple had recruited from Pepsi (famous for the “Pepsi Generation” campaign that brought the soft drink to younger Americans), had a massive budget. Directed by Ridley Scott, the Hollywood legend behind the sci-fi film Blade Runner, the Super Bowl ad depicted a bleak Orwellian future: dressed in drab uniforms, workers tromped through grim industrial hallways to assemble before the ominous image of a Big Brother–type figure droning on a screen. In sharp contrast, a defiant blonde woman in brightly colored clothing ran into the huge hall, escaping faceless security forces and hurling a giant hammer at the screen, shattering it and sending a shockwave through the crowd. “You’ll see why 1984 won’t be like ‘1984,’ ” the narrator intoned. An instant sensation, the Super Bowl ad helped transform the image of the computer from an engine of corporate and government dominance to an instrument of personal liberation.34 Forty-three million Americans watched the ad; stunned by what they had just seen, the legendary CBS football announcers John Madden and Pat Summerall asked each other: “Wow, what was that?” 35

  The original Macintosh offered a model for the new era of personal computing in its ease and simplicity of use. Priced at $2,495, it took less than an hour to learn to operate, compared to the twenty to one hundred hours needed to navigate other computers. “The mouse is the Macintosh’s biggest selling point,” one analyst noted, “because it takes almost no prior knowledge of computers to run the machine. As a result, Apple hopes the computer will appeal to a new market: executives, students and home users who want to use a computer without having to understand its inner workings.” 36 Jobs argued that the difference between IBM and Apple was “like the difference between the telegraph and the telephone. Back in the days of the telegraph, there were people who talked about putting a telegraph on every desk. But it would never have happened, because people weren’t willing to spend the time necessary to learn to use them.” 37 Throughout the decade, Apple continued to innovate, first expanding the limited memory of the original model and then introducing the Macintosh II, which offered color graphics and even greater expansion potential, in 1987. Despite such advantages, Apple still lagged behind IBM in sales for the remainder of the 1980s. Corporations, still the biggest buyers of computers, were more willing to purchase from an established company like IBM than from an upstart without much history.

  Other technological developments complemented the personal computer. Though few Americans knew about it, the federal government had already begun developing an interlocked network of computers, the internet, which would vastly increase the speed of communication. For ordinary Americans, the next best thing was dialing in to commercial information services. CompuServe, founded by Jeffrey Wilkins and John Goltz, two graduate students in electrical engineering at the University of Arizona, emerged as an early innovator. The company’s primary mission had been to provide computing resources to Golden United Life Insurance, but it also offered a time-sharing system for individual users. Based in Ohio, CompuServe launched its network service in 1979, offering access nationwide. By the end of the decade it was the first company to offer services such as electronic messages called “email,” home shopping, news and travel information, and “bulletin boards” devoted to discussing various issues. From the start, eleven newspapers, including the Washington Post and the Los Angeles Times, offered electronic versions of their print editions.38

  More than simply replicating print media, CompuServe pioneered new forms of what would come to be known as “social media.” Its most popular feature, for instance, was the “CB Simulator,” which allowed people to communicate in real time, simulating the citizens’ band (CB) radio that had been popularized by truckers in the 1970s. Connectivity to the outside world became a major selling point for CompuServe. “Someday,” its first print advertisement read, “in the comfort of your home, you’ll be able to shop and bank electronically, read instantly updated newswires, analyze the performance of a stock that interests you, send electronic mail across the country, then play Bridge with three strangers in LA, Chicago, and Dallas.” Beneath the photograph of a couple dressed in futuristic white outfits sitting with a computer in their living room, the caption read: “Welcome to someday.” The system offered subscribers access to a vast new world of information, but initially the costs were prohibitive. Consumers who had already splurged on a personal computer now needed to invest in additional equipment, as well as pay monthly access fees. “It only took a couple of days to realize that the biggest single obstacle I faced in trying to tap the enormous potential of my new home computer wasn’t my ignorance of computer languages or my lack of imagination,” one critic wrote. “Nope. More mundane than that. It was my personal checking account.” 39 Despite the costs, CompuServe made its mark immediately. By 1981, the company had roughly 10,000 subscribers.40

  The new computer networks immediately gave rise to the problem of “hacking.” The term went mainstream on September 5, 1983, when the cover of Newsweek magazine featured a photograph of Neal Patrick, a seventeen-year-old boy who was part of a group that had broken into several major computer systems, including those at Sloan-Kettering Cancer Center and Los Alamos National Laboratory. The cover story, “Computer Capers,” explained that these talented prodigies had broken in simply because they could. “It’s terribly unethical for computer centers and networks to have the low level of security that they do,” one expert warned. “It’s like leaving the keys in the ignition of an unlocked car.” 41 Hackers broke into businesses and institutions, sometimes for the thrill of it, other times for data theft and financial fraud. In one incident, the “Inner Circle,” a collective of fifteen teen hackers living in eight cities, including Detroit and New York, accessed the GTE Telemail network in Vienna, Virginia, which handled electronic communications for corporations like Raytheon as well as NASA.42

  The phenomenon of hacking soon intersected with pop culture. In the hit film WarGames (1983), Matthew Broderick played a young hacker who broke into a NORAD military supercomputer and nearly initiated World War III. The film climaxed with the computer running every possible scenario for global thermonuclear war before concluding that “the only winning move is not to play.” The movie so unnerved Reagan that he asked his advisors whether its scenario could really happen. General John Vessey Jr., chairman of the Joint Chiefs of Staff, reported that “the problem is much worse than you think.” The fears generated by the movie triggered NSDD-145, the first major directive to deal with cybersecurity.43 But the film also inspired more Americans to get involved in hacking. “The scene exploded,” one observer explained. “It seemed that every kid in America had demanded and gotten a modem for Christmas. Most of these dabbler wannabes put their modems in the attic after a few weeks, and most of the remainder minded their Ps and Qs and stayed well out of hot water. But some stubborn and talented diehards had this hacker kid in WarGames figured for a happening dude.” In real life, however, hacking often carried a personal price. One of the Inner Circle whiz kids, a sixteen-year-old with a 163 IQ named Bill Landreth (known as “the Cracker”), had hacked into approximately one hundred corporate computer systems since 1979, first using a Tandy Corporation TRS-80 and then the Apple II his parents purchased as a gift. In 1984, he was convicted in federal court of computer fraud. Placed on three years’ probation, he wrote a memoir of the shadowy hacking culture called Out of the Inner Circle (1985), which introduced many Americans to this subterranean world. Then, in 1986, he mysteriously went missing himself.44

  The chaotic open world of computing was soon replicated across the realm of telecommunications. Most significantly, the Federal Communications Commission, with President Reagan’s support, brought an end to the “fairness doctrine” in 1987. The policy, in effect since 1949, had been based on the notion that the three major networks were “public trustees” licensed by the government and therefore needed to serve the entire public by airing competing perspectives on controversial issues. Because broadcasting space was scarce, the Supreme Court ruled in 1969 that the federal government could indeed impose requirements on the networks and radio stations in exchange for their access to the airwaves. That said, the doctrine had been far from perfect. FCC enforcement was notoriously weak. Many radio stations, particularly conservative religious broadcasters, had flouted the rule without punishment. Nonetheless, the doctrine served as a strong and constant disincentive for most major broadcasters to air openly political shows. Though it was intended to encourage fair debate, in practice the fairness doctrine led networks to avoid employing anchors or reporters with obvious biases and instead to play most issues down the middle.45

  By the mid-1980s, the availability of many more channels through cable undercut the notion that the networks were solely responsible for disseminating information and, as a result, weakened support for the fairness doctrine as well. Many conservatives, both inside and outside the Reagan administration, argued that the government did not have the right to regulate broadcasts. At the same time, they complained that the networks, despite their claims of moderation, were actively pushing a liberal agenda. In 1987, in Meredith Corp. v. FCC, a federal court ruled that the FCC had no obligation to adhere to the fairness doctrine and could end it at any time. Accordingly, the FCC, now packed with Reagan appointees, dropped it. “Our action should be cause for celebration,” insisted FCC chairman Dennis Patrick. “By it, we introduce the First Amendment to the twentieth century.” The Democratic Congress tried to override the FCC’s decision with legislation mandating a version of the fairness doctrine, but Reagan vetoed the bill. Suddenly, there were new possibilities on television and radio, as hosts could now openly make political arguments from one side of the spectrum with no requirement for balance.46

  In all, the 1980s witnessed a revolutionary transformation in the realm of communications. As old institutions from the telephone to the television became fragmented, individual consumers entered a new era in how they could interact with one another and with the world around them. Changes in government policy opened the door for programs that promoted specific political perspectives, while cable television and the new internet provided new homes for such programs, and much more. While many consumers marveled at the dizzying array of choices in communications and culture that now stood before them, some conservatives worried that these new opportunities threatened what they regarded as traditional values, and they sought to push back.

  Backlash

  The fault lines cracked open by this new era of telecommunications and technology triggered a political backlash. In the summer of 1985, a pair of well-connected Washington spouses—Tipper Gore, the wife of Democratic senator Al Gore Jr. of Tennessee, and Susan Baker, the wife of Republican treasury secretary Jim Baker—launched a new group called the Parents Music Resource Center (PMRC). Despite their different political affiliations, the two shared a common identity as concerned mothers who had been horrified by their children’s exposure to sexually explicit rock songs by the performers Prince and Madonna, two prominent stars on MTV. A sympathetic newspaper columnist explained that Gore and Baker were “no blue-nose record smashers. They are mothers who are distressed that their children are being exposed to filth, violence, sado-masochism and explicit sex whenever they switch on their favorite radio station or watch televised videos.” Insisting that they were not calling for such music to be banned, the PMRC leaders instead framed themselves as consumers seeking a greater degree of command over the chaos of cultural choices now becoming available to them and their children, whether they wanted them or not. “It’s simply gone too far, and it has to be stopped,” Gore said; “at least we have a right to know what’s on an album so we can exercise some control.” 47

  Sensing a growing backlash, executives in the music industry responded to the PMRC complaints with remarkable speed, quickly agreeing to self-regulation before the government could impose rules of its own (taking a lesson from the movie industry, which had adopted a similar strategy with its ratings system when faced with complaints earlier in the twentieth century). At the PMRC’s request, for instance, the president of the National Association of Broadcasters asked forty-five record companies to start providing written copies of lyrics so radio and television stations could know precisely what they were broadcasting. Meanwhile, Stan Gortikov, president of the Recording Industry Association of America, personally traveled from New York to Washington to meet with PMRC leaders and hear out their demands. Although he balked at some of the more excessive proposals—which ranged from requiring record stores to hide albums with explicit covers to having record companies pressure broadcasters not to air their own artists’ explicit songs and videos—he quickly agreed to the PMRC proposal to place warning labels on albums with “explicit lyric content.” Within days, all of the largest record companies backed the plan.48

  Even with the recording industry’s decision, the PMRC pressed ahead in a campaign to mandate such warning labels by law. Sensing that there could be a political payoff to taking on the industry, the Senate Commerce, Science, and Transportation Committee scheduled formal hearings to consider the proposal. Many of the committee’s members, including Al Gore, were receptive to the idea of making the warnings a federal requirement, while others pressed for even greater action. “It is outrageous filth and we must do something about it,” insisted Senator Ernest Hollings of South Carolina, the ranking Democrat. “If I could find some way constitutionally to get rid of it, I would.” Members of the music industry, however, insisted that such intervention was wholly unconstitutional. Danny Goldberg, chairman of a smaller record company and president of the newly formed activist group Musical Majority, argued that warning labels were “absolutely a move toward censorship.” Many musicians agreed, warning of a slippery slope ahead. “Right now, it’s sex and violence,” noted singer John Cougar Mellencamp; “before long, it’ll be ‘That’s just too political.’ ” 49

  In September 1985, the two sides in this increasingly heated debate came face-to-face in the Senate hearings. Hundreds of spectators gathered for the show, with hundreds more waiting outside hoping to secure a seat. To make their concerns clear, PMRC leaders showed the assembled crowd some popular videos from MTV, including Van Halen’s “Hot for Teacher” (which showed schoolboys ogling a bikini-clad teacher) and the heavy-metal group Twisted Sister’s “We’re Not Gonna Take It” (which critics insisted promoted violence). The PMRC claims, however, did not go unchallenged. Three prominent musicians—country artist John Denver, avant-garde performer Frank Zappa, and Dee Snider, the lead singer of Twisted Sister—were on hand to push back. Although their wildly different styles made them what the Los Angeles Times called “the weirdest pop music trio since Alvin and the Chipmunks,” the three performers presented a determined and united front against the proposals. Denver noted that even his wholesome songs had often been misunderstood. Zappa, dressed conservatively in a dark suit with his own teenage children in tow, claimed that the lyrics in question represented only a small minority of music and the proposal to crack down on the whole industry was therefore “the equivalent of treating dandruff by decapitation.” But Snider stole the show. Though he arrived wearing sunglasses and a cut-off denim vest, with long, wiry bleached hair, he insisted that the PMRC critics had badly misunderstood him and his music. Presenting himself as a devout Christian and the father of a three-year-old boy, he claimed his personal example was one that critics should praise. “I do not drink, I do not smoke, and I do not do drugs,” he insisted. His songs reflected his conservative values, he said, forcefully rebutting the PMRC charges point by point. “Ms. Gore claimed that one of my songs, ‘Under the Blade,’ had lyrics encouraging sadomasochism, bondage and rape,” Snider noted. “The lyrics she quoted have absolutely nothing to do with these topics. On the contrary, the words in question are about surgery and the fear that it instills in people. . . . I can say categorically . . . that the only sadomasochism, bondage and rape in this song is in the mind of Ms. Gore.” 50 Though the Senate ultimately decided not to act, its hearings served both to publicize and polarize the issue across the nation.51 The television show American Bandstand prevented the singer Sheena Easton from performing “Sugar Walls” because the PMRC had complained about the lyrics, while Marvin Gaye’s record company forced him to retitle a song from “Sanctified Pussy” to “Sanctified Lady” to avoid a likely controversy.52

 
