
Fault Lines


by Kevin M. Kruse


  At first, Turner and Schonfeld tried to pressure AT&T into lowering its rates so that they could broadcast through the same landlines as the networks, but the telephone monopoly sided with its biggest customers. Instead, Turner and Schonfeld purchased space on RCA’s satellite, following the path taken by ESPN. Though they alternately described their plans as replicating an old-fashioned newspaper or all-news radio stations like 1010 WINS in New York, their ultimate goal was for CNN to beat the networks in both the speed and the depth of its coverage. It aimed not only to break news before the “big three” but also to handle a broader range of stories, from human-interest tales to coverage of weather catastrophes. They believed doing so would offer a product fundamentally different from the short nightly newscasts of the networks. “We have to let ’em know we’re here,” Schonfeld explained as he formulated the programming plan. “We’re gonna match ’em and then we’re gonna go beyond—to show ourselves to be something.” 10

  CNN debuted at 6:00 p.m. on June 1, 1980. With modest resources, it had only six domestic bureaus and a few foreign correspondents. Two hundred of Turner’s guests were invited to watch the start of this channel, which its founder promised would continue to broadcast “until the world ends.” Turner dedicated the channel, on air, with a speech about his commitment to improving the world through knowledge, followed by the national anthem. Anchors David Walker and Lois Hart, a married couple, then began the first broadcast. The rest of the original on-air talent featured veteran journalists like Daniel Schorr, formerly of CBS News, as well as a host of established figures, ranging from political pundits like Rowland Evans and Robert Novak to popular personalities like pop psychologist Dr. Joyce Brothers. But behind the scenes, CNN’s small headquarters in Atlanta had a production staff filled with twenty- and thirty-year-olds who had little experience in the industry.11

  To compensate for its meager resources, the network purchased footage from reporters at independent local stations, a strategy that let CNN keep its staff small. Due to these limitations, CNN’s original format was fairly straightforward. The daily schedule was divided into segments that covered a variety of topics, including entertainment, politics, business, sports, and more. In addition, there was a two-hour evening news show, an expansive format that contrasted with the half-hour network news programs. Meanwhile, thanks to its twenty-four-hour format, CNN was able to go to live breaking news at a moment’s notice, unlike the networks, which were reluctant to break into their regularly scheduled programming except in extreme circumstances. CNN quickly demonstrated this advantage when it was the only network broadcasting Ronald Reagan live at the Washington Hilton event that culminated in John Hinckley’s assassination attempt in March 1981. As CNN took news in directions previously unseen in print or network television, its subscriber base rose considerably, from 2 million to 4.3 million in its first year.12

  The networks immediately saw the threat posed by CNN. “It’s only the broadcasting establishment that doesn’t like me,” Turner said. “They’re pea-green with envy.” 13 And indeed, the networks soon sought to imitate Turner’s model. They launched one- or two-hour “news magazine” shows that focused on crime, crises, and scandals, expanding on the successful models of Nightline and 60 Minutes. But they tried to compete with CNN on cable, too. Within a year of CNN’s launch, ABC and Westinghouse announced a Satellite News Channel of their own in June 1981. The station, funded by advertising, broadcast eighteen-minute shows, with another five minutes of each half hour devoted to regional news stories. In January 1982, Turner responded by introducing CNN-2, later restyled “Headline News,” an offshoot that provided thirty-minute news blocks throughout the day. Importantly, it was offered free of charge to subscribers of the original CNN. Unable to compete, ABC shut down its operation, selling Turner the rights for $25 million.14 With no real competitors on cable, CNN quickly became the major source of breaking news for many people around the nation. Within two years the station reached about 13.9 million homes and attracted over 200 advertisers. In short order, it fundamentally restructured the American news cycle, making it ongoing and instantaneous.15

  The growing competition from televised news stimulated changes in the print industry, which struggled to keep up. On September 15, 1982, the Gannett Company launched a new national newspaper called USA Today. With color print, attractive visual graphics, and shorter, punchier articles, the newspaper sought to capture readers who would otherwise turn on the tube for their news. In response to criticism that the press only covered scandal and tragedy, the editors self-consciously sought to put more “good news” into stories. The front page of the first issue included traditional stories about politics and a plane crash in Minnesota (though the paper distinguished itself with a color photograph of the burning airplane). But it also carried softer stories, such as a piece called “Your Kid REALLY may be sick of school” and another on the death of Princess Grace in Monaco, which the editors selected instead of a piece on the assassination of the president-elect of Lebanon.16 Other newspapers would eventually follow USA Today’s lead, including the esteemed New York Times, which instituted weekly special sections on issues such as science and “living” and, finally, in 1997, started using color photographs as well.17

  As CNN reshaped the news world, another innovation in cable—Music Television (MTV)—did the same for entertainment. The idea came from Robert Pittman, a former NBC official who had produced a music video show for the network in the 1970s. Believing the show could be expanded to a full channel, Pittman forged a partnership between Warner Communications, which sought to be on the cutting edge in technology and entertainment, and American Express, the credit card company that saw cable and computers as its next great markets. The Warner Amex Satellite Entertainment Company set up the operation and got it running. On August 1, 1981, at 12:01 a.m., cable viewers saw the image of astronaut Buzz Aldrin from the 1969 Apollo 11 moon landing, with the MTV logo superimposed on the American flag. “Ladies and gentlemen,” an announcer intoned, “rock and roll.” The first video aired on the network, by the rock band the Buggles, was aptly titled “Video Killed the Radio Star.” “In my mind and in my car, we can’t rewind, we’ve gone too far,” the band sang. “Pictures came and broke my heart.” 18

  Playing music videos all day long, MTV pioneered a faster, more provocative format for entertainment, one that was visually splashy and rapid-fire. The videos, according to one of the best studies of the channel, distinguished themselves with “aggressive directorship, contemporary editing and FX, sexuality, vivid colors, urgent movement, nonsensical juxtapositions, provocation, frolic, all combined for maximum impact on a small screen.” Older musicians were surprised by the new style. Billy Gibbons, one of the guitarists from the rock band ZZ Top, first thought he’d stumbled onto the broadcast of a concert. “Twelve hours later, we were still glued to the TV,” he recalled. “Finally, somebody said, ‘No, it’s this twenty-four-hour music channel.’ I said, ‘Whaaat?’ MTV appeared suddenly—unheralded, unannounced, un-anything.” 19 The innovative form elevated new bands as varied as the synthesized English duo the Eurythmics and the L.A. glam metal band Mötley Crüe, acts that learned how to adapt to and exploit the new format. Some older bands adjusted as well. ZZ Top became a popular fixture on the channel, pairing their old style with new videos that highlighted fast cars and attractive women. Dire Straits, another established rock band, found the greatest success of its career with a 1985 album fueled by slick videos. Notably, the video for “Money for Nothing” featured cutting-edge computer animation and guest vocals by Sting, lead singer of the Police, who offered the cable network’s slogan as the song’s refrain: “I want my, I want my, I want my MTV!” The song, intended as a tongue-in-cheek critique of MTV, nevertheless became a hit on the channel.20

  MTV followed a strategy from radio advertising known as “narrowcasting.” In contrast to the “broadcasting” approach of traditional television networks, which sought to build large audiences, MTV targeted smaller specialized markets.21 “Rock music is not just a form of entertainment,” Pittman explained. “It also represents a lifestyle, a value system, to that age group. If you’re 50 years old, you might ask a new acquaintance what church they go to. But if you’re 30, you’d ask what kind of music they like.” 22 Accordingly, MTV’s programming revolved around video jockeys (“VJs”) introducing blocks of three-to-four-minute videos, much as a radio DJ would. The station was only available in a few cities when it started, but the phenomenon quickly caught on. “The buzz in this town for MTV is incredible,” a radio station manager in Tulsa, Oklahoma, told Billboard. “We added two records [to our rotation] due to MTV airplay.” As the VJs toured America to meet with cable operators and sell their product, their influence became clear. “Within six months,” VJ Mark Goodman recalled, “we started getting these stories back from small towns in the Midwest and in the South where people were going into record stores and asking for the Buggles, who had been off the shelves for about three years by 1981. I also remember doing an appearance in Cheyenne, Wyoming at a record store where thousands of people showed up. I said, ‘What’s going on?’ They said, ‘You.’ I was completely blown away, and I said, ‘Okay, it’s working.’ ” 23

  While successful, MTV’s narrowcasting approach was soon criticized as being a bit too narrow. In a live interview on the network in 1983, the icon David Bowie noted that MTV had “a lot going for it,” then added, “I’m just floored by the fact that there are so few black artists featured on it. Why is that?” When Mark Goodman defensively noted that “the company is thinking in terms of narrow-casting,” Bowie responded coolly: “That’s evident. It’s evident in the fact that the only few black artists that one does see are on about 2:30 in the morning to around 6. Very few are featured predominantly during the day.” Chastened by the criticism, the network soon began broadening its lineup to feature more popular black artists. Michael Jackson’s “Billie Jean” video, which debuted on March 10, 1983, opened the door to black performers on a station whose primary audience had been middle-class white kids. The video, along with two others from the smash album Thriller, weakened the race barrier and brought the still-struggling channel what it had long sought: ratings and profits. MTV now reached 24 million homes and, more importantly, a quarter of teens who watched television daily. Notably, when MTV held its first annual Video Music Awards in 1984, the biggest winners were both African Americans. Keyboardist Herbie Hancock won five awards; Jackson took home three. (Bowie, meanwhile, won Best Male Video.)24

  MTV’s narrowcasting model was replicated in larger cable systems. The goal was to develop stations that appealed to distinct portions of the viewership, each of which would focus on its favorite channels. “Cable has not so much won its audience as broadened to meet it,” one reporter explained. “In part this makes the absurd little competitive duels on commercial TV . . . seem quite quaint. Because it is all-inclusive, the cable systems must, it is true, cater to different tastes and here you will find a practically continuous obbligato of nowhere Clint Eastwood shoot-’em-ups, of religious barkers exhorting me to send $15 a month to some missionary in Elyria, Ohio, of reruns of ‘Get Smart’ and ‘Petticoat Junction’ or worse.” With each channel seeking small segments of the market, there was no need for a commons anymore.25

  Technology Splitting

  The fragmentation of consumer culture was also fueled by the spread of the videocassette recorder (VCR). The technology had been around for decades but only started to reach a significant market share during the early 1980s. With Japanese companies such as Sony, Toshiba, and JVC pioneering new products, the VCR market grew from 10 percent of homes in 1982 to over 30 percent by 1985.26 Initially, there were two rival formats for videocassettes: Betamax and VHS. Though Betamax was the more sophisticated technology, VHS won out. The pornography industry was one of the first to capitalize on the technology and become a leading force in home entertainment; its executives decided to produce films solely for VHS. While the VCR revolution allowed Americans to watch movies of all kinds at home, they soon found that they could record television programs as well. In a landmark 1984 decision, Sony v. Universal City Studios, the Supreme Court ruled that home recordings of network broadcasts did not violate existing copyright laws as long as the tapes were not resold for profit. The decision was a huge victory for those who produced the machines, which would greatly enhance the ability of viewers to control when they watched certain shows and to gain access to movies and programming that were not being aired on the networks.27

  That same year, the telephone monopoly AT&T also fell. The Department of Justice had initiated an antitrust suit in 1974 and, after eight years of litigation, the company reached a settlement in January 1982. Under the landmark agreement, AT&T promised to divest its local companies. Starting on January 1, 1984, the communications giant was divided into seven independent Regional Bell Operating Companies, or “Baby Bells.” For many consumers, the adjustment to the new fragmented telecommunications landscape was initially confusing. “When there was one [AT&T] account group handling us and we had a problem, we could turn the whole thing over to them,” noted a frustrated executive at Xerox. “Ma Bell’s end-to-end responsibility no longer exists.” But the breakup also spurred competition. The new freedoms of deregulation, noted the chairman of one Baby Bell, meant “we’re fleeter afoot.” Most significantly, the new marketplace led to breakthroughs in phone technology as competitors sought to gain advantages with improvements on the traditional landline.28

  The cellular phone was one of the first of these innovations to hit the market. The FCC had loosened the regulations that served as barriers to this kind of innovation back in the 1960s, but it took time for the technology to develop. Following FCC approval for carriers to develop cellular systems in 1982, Motorola finished work on the DynaTAC 8000X, the first major cell phone available for consumer purchase. When it debuted in March 1984, the phone sold for almost $4,000, offered only half an hour of talk time, and weighed nearly two pounds. Despite its cost, “the Brick” became a huge success. “We didn’t design them for teenagers—well, unless it was a teenager with $4,000,” one executive explained, “but we couldn’t build them fast enough. Businesses started taking them on and it became something else, a part of business—not a convenience, but a necessity.” 29 The technology for carrying calls steadily improved, and the size and price of the phones dropped as well. By 1987, there were almost a million cell phone subscribers nationwide.

  The greatest technological innovation of the decade, however, came with the arrival of personal computers. Although computers had become a fixture of life in postwar America, their considerable size, complexity, and cost meant that they were largely confined to the workplace. In August 1981, industry leader IBM introduced its first desktop computer designed for home use. IBM had benefitted from a partnership it launched in 1980 with the upstart computer software company Microsoft, which had been founded five years earlier by Bill Gates and Paul Allen. As ordinary consumers rushed to buy the new machines, many predicted that personal computing would revolutionize the way people lived. In January 1983, Time ran a cover story on “The Computer: Machine of the Year.” “By the millions,” the article began, “it is beeping its way into offices, schools and homes. . . . A personal computer . . . can send letters at the speed of light, diagnose a sick poodle, custom-tailor an insurance program in minutes, test recipes for beer.” 30

  Although IBM led the way, it faced competition from an upstart. Since the mid-1970s, Steve Jobs’s Apple Computers had been working to make computers cheaper and easier to use. Jobs, a college dropout who had been taught to fix cars, radios, and televisions by his adoptive father, had become interested in computing at a young age and quickly found friends who shared his new hobby. While computing had previously been reserved for massive institutions like the federal government or large corporations, Jobs’s generation set its sights on creating a truly “personal computer,” a machine that was small, compact, and simple enough for an individual to use in the home. Jobs caught the attention of Robert Noyce, who had co-founded Fairchild Semiconductor, the first major business in what would become known as Silicon Valley, and created the first silicon microchip there.31 Noyce and another scientist at Fairchild founded the Intel Corporation in 1968, helping turn the semiconductor industry into the engine that made the region a hub of economic growth. “The Mayor of Silicon Valley,” as Noyce was known, soon left his daily role as president at Intel and started to spend more time nurturing new talent in the region, including Jobs. “Steve would regularly appear at our house on his motorcycle,” Noyce’s wife, Ann Bowers, recalled; “he and Bob were disappearing into the basement, talking about projects.” 32

  Jobs began working out of a garage in Los Altos, California, along with his high school friend Stephen Wozniak, another college dropout who had worked as an engineer for Hewlett-Packard. Together, Jobs and Wozniak developed a prototype in the summer of 1975. To their delight, it worked. “It was the first time in history,” Wozniak remembered, “that anyone had typed a character on a keyboard and seen it show up on their computer screen right in front of them.” One year later, the pair had created the first personal computer, a small device with a keyboard that could connect to a television monitor. After the technology companies they had worked for expressed no interest in the product, Jobs and Wozniak decided to build the computers themselves through a new company called Apple. They soon received their first order—for fifty of their crude “Apple I” computers—from a local store and frantically began building them in their garage. As sales increased, they poured the profits into developing the more sophisticated “Apple II,” a much smaller version that came with a standardized keyboard and power supply. The pair incorporated the company in 1977, releasing the Apple II to first-year sales of $2.7 million. The company’s ads proclaimed the dawn of a bold new era, ushered in by “the home computer that’s ready to work, play, and grow with you.”33

 
