The twenty-four-year-old Jobs now looked nearly as natty as his PR man, having trimmed his hair, put on a sport coat and tie, and assumed the breezy confidence of a new-generation California capitalist. McKenna steered his client into one-on-ones with key reporters he’d gotten to know over his years in the business, not only the usual suspects from the science and technology beat, but also the general-interest business reporters who covered Wall Street. McKenna was particularly attentive to seeding good coverage in The Wall Street Journal, which the team considered an especially important “positioning/educational tool” because it would readily print the quotes of company executives and supportive analysts, especially in industries—such as personal computing—where the reporters knew little. Over lunch or dinner, Jobs not only gave the reporters his pitch about Apple, but also provided a crash course on the history and philosophy of computing.15
“Man is a toolmaker [and] has the ability to make a tool to amplify the inherent ability he has,” Jobs liked to say. “We are building tools that amplify human ability.” First there had been ENIAC, he explained. Then came commercial mainframes and time-sharing. Now the march of progress would deliver a computer to every desk.16
In his accessible retelling of how computers evolved, and where they were going, Steve Jobs never mentioned government’s role. The war work that had inspired ENIAC, and the supercomputers that cascaded in its wake, never figured in the story. Although Jobs often cited Bill Hewlett and Dave Packard as Apple’s entrepreneurial inspiration, he never talked about the military-industrial complex that had helped HP and the rest of the Valley grow. He didn’t mention the moon shots that had propelled the microchip business or the research grants that built Stanford into an engineering powerhouse. He and his micro brethren didn’t mention that Lockheed still hummed away as Silicon Valley’s largest employer.
Steve Jobs, the adopted son of a man with a high school education, was indeed a product of the American Dream—and a testament to what was possible amid the postwar abundance that, particularly in Northern California, came out of considerable public investment in people, places, and companies. But if the government was anywhere in his story, it lurked in the background as a dull menace, responsible for nuclear anxiety, misguided foreign wars, enforcement of an outdated status quo.
Although too young for the counterculture, Jobs confessed that the tumult of the 1960s had been personally transformative. “A lot of ideas that came out of that time focused on really thinking for yourself, seeing the world through your own eyes and not being trapped by the ways you were taught to see things by other people.” And what Jobs found, as he told his rapt listeners, was that the way to change the world was through business, not politics. “I think business is probably the best-kept secret in the world,” he offered. “It really is a wonderful thing. It’s like a razor’s edge.”17
The ability to tell a story—and to frame America’s extraordinary high-tech history—made Steve Jobs a new-model business celebrity, the premier evangelist for the personal-computer business. “He was one of us in a bigger, alien world, explaining our immature little industry and products to a much broader public than we could reach on our own,” explained Esther Dyson, a junior Wall Street analyst who was one of the few tracking the personal-computer business at the time. “Our small industry had lots of its own stars, but only Steve had the charm and eloquence to be a star to the outside world.”18
COMPUTER COWBOYS
Steve Jobs’s tale of the high-tech world wasn’t new, of course. He was just reflecting back what he had heard his entire life, from the halls of Homestead High School to the night shift at Atari and the show-and-tells of Homebrew. It was the latest iteration of the free-market narrative that had propelled the Valley from the start.
The mythos had only intensified as the semiconductor generation matured, and the personal-computer industry gained market velocity. Now the independent zeitgeist was topped with a dollop of high-net-worth self-satisfaction. Reporters mingling among a trade-show crowd in 1979 remarked on the Valley’s “singular sense of frontierism, its self-awareness that says, ‘Hey, we’re the people making it all happen.’” Tech was the domain of rebels, of cowboys, of revolutionaries. It was business with an authentic soul.
The trade press for the microcomputer industry became a critical amplifier of this message in the last years of the 1970s. As the consumer base grew, there now was serious money to be made in this kind of publishing. How-to textbooks with titles like BASIC Computer Games sold in the hundreds of thousands, providing pragmatic and friendly navigation through the often-bewildering world of personal computing for a new set of users and neophyte programmers.
One textbook impresario was Adam Osborne, a British chemical engineer who had caught the hobbyist bug after moving to California in the early 1970s. Standing out among the shaggy Californians with his trim mustache and posh accent, Osborne became legend for his how-to manual, An Introduction to Microcomputers. Unable to find a trade publisher interested in so esoteric a subject, Osborne published the text himself, which proved to be a lucrative decision after eager hobbyists snatched up more than 300,000 copies. With that, he launched an entire publishing imprint devoted to the field, which he sold to textbook giant McGraw-Hill for a healthy sum.
The humble tabloids of the microcomputer tribe had morphed into much glossier trade magazines, enlarging their subscription bases along the way. West Coast Computer Faire impresario Jim Warren’s Intelligent Machines Journal became InfoWorld. Adam Osborne wrote a column for it, where his acerbic product reviews and blunt takes on company prospects earned him the moniker of “the Howard Cosell of the industry.” Computer-club stalwarts like Sol Libes and Lee Felsenstein became regular contributors. New titles like Personal Computing and Compute! joined Byte on the newsstands, all bulging with full-page ads for the newest micro models. The shelf devoted to computer specialist publications got longer. Nearly every major personal-computer company had at least one magazine or newsletter dedicated to its fans and users. These special-interest magazines also showed readers how to charge up their computing power through peripherals and software, further driving sales.19
It was clear from reading these magazines that personal computing was growing up. Gone were the strings of DIY software code. Instead there were stock forecasts and product announcements. But InfoWorld wasn’t quite BusinessWeek yet: the micro-generation evangelism was still on display in how reporters and editors talked about the computer and its potential, further binding machine to myth. “The personal computer operator is the Electronic Man on Horseback riding into the (sinking) Western sun,” declared InfoWorld columnist William Schenker in early 1980. “He is the last of the rugged individualists, and the personal computer is his only effective weapon.”20
Such big ideas also gained exposure and velocity courtesy of the popular subgenre of business books that alternately despaired about the state of the “old” American economy and made optimistic forecasts about its tech-driven future. Looming largest was Alvin Toffler, whose Future Shock had become a touchstone for information-overloaded Americans during the volatile 1970s. Toffler’s ardent fans included Regis McKenna, who readily admitted to reading Toffler “over and over.”21
The ideas in the dog-eared pages of Future Shock simmered just below the surface of Apple’s cheerful ad copy and Steve Jobs’s techno-evangelism. “Important new machines,” Toffler had written back in 1970, “suggest novel solutions to social, philosophical, even personal problems.” A decade later, Jobs shared similar ruminations: “I think personal computers will promote much more of a sense of individualism, which is not the same as isolation. It will help someone who is torn between loving his or her work and loving the family.”22
By that point, the admiration had become mutual. Toffler, seeing the personal computer surge as evidence that his predictions were right on the mark, issued a follow-up volume, The Third Wave, in early 1980. In 500-plus pages bursting with gloriously operatic Toffler-isms like “techno-sphere” and “info-sphere,” he declared the coming of an entirely new era in political economy, fueled by computer technology. Giant institutions of the Industrial Age—including American-style big government—would become “de-massified” and diversified. The market would become one of personal autonomy, of near-infinite consumer choice. It was a future, he wrote, “more sane, sensible and sustainable, more decent and democratic than any we have ever known.”23
THE ROSEN LETTER
McKenna could get reporters to reprint the gospel of Steve Jobs, and Toffler could persuade curious readers to buy books about a marvelous techno-future. But convincing corporate managers and investors to be bullish on personal computing required credible voices from inside the financial industry as well. It was all well and good for a computer to be a tool of personal empowerment, but the real money in the industry came when a computer was an essential tool for business. Could Apple be another Digital? Another Wang? Another IBM? Wall Street wasn’t sure.
Enter Ben Rosen. Despite investors’ skepticism about microcomputing, the geeked-out analyst had kept up his careful tracking as the industry improved on early models with more-powerful machines. He noted with interest when Texas Instruments jumped into personal computing in November 1979, releasing the TI-99/4, a natty desktop designed to go head-to-head with the Apple II. When you added up all the peripherals, it would cost about $2,000 to turn the TI computer into something that could do a decent job of calculating your personal tax returns. Yet it showed that micros might be getting one step closer to taking a bite out of the minicomputers’ small-business base.
Rosen watched closely as Tandy/Radio Shack upped its game as well, releasing a higher-end TRS-80 Model II, and following up within the year with three more product releases in quick succession, each with more computational oomph and wider market appeal. “Companies began to view a favorable report from Ben Rosen as the key to success in the personal computer business,” observed Regis McKenna. “A bad review from Rosen was the kiss of death.”24
Happily for McKenna, Ben Rosen reserved his five-star reviews for Apple. One of the semiconductor guys that Rosen had gotten to know through his conferences was Mike Markkula, and one of the first things that Markkula did once he joined Apple was to introduce Rosen to Steve Jobs and to the Apple II. Rosen was immediately hooked, lugging the machine Markkula had given him between home and work, because the Morgan Stanley IT department refused to buy him one. He soon became, in his words, “the self-anointed evangelist of personal computers in general and Apple in particular.”25
Rosen brought his Apple II along when he visited his investor clients. He provided demonstrations to visiting financial journalists, earning Apple—and Rosen—valuable publicity among business readers. “Apple for Ben Rosen,” read a headline in an August 1979 edition of Forbes magazine, atop an article that showed how the analyst used an Apple II to do his job. He talked up the company and its products inside Morgan Stanley—an establishment broker if there ever was one, representing some of the largest, best-known brands in the country. The Apple II wasn’t a toy, he’d tell his colleagues, and Apple was a serious business. In payment for his loyalty, Rosen got world-class customer support. “When he didn’t understand some feature of the Apple,” Time reporter Michael Moritz noted, “he called Jobs or Markkula at home.”26
Then, two months after Rosen’s appearance in Forbes, a small Boston-based software company released VisiCalc, a full-blown business application for a personal computer—and they made it for the Apple II. The duo behind this electronic spreadsheet program were another example of the magic that could happen when you married engineering and finance: Dan Bricklin was a Digital veteran and Harvard MBA, and Bob Frankston was a computer scientist with multiple degrees from MIT. In a personal-computer software world dominated by space-invader games and rudimentary educational programs, Bricklin and Frankston delivered a piece of software that proved the micro could be a serious business machine. Adam Osborne proclaimed grandly that VisiCalc had finally produced something “that allows these silly little boxes to do something useful.”27
By the time Apple’s “new kind of bicycle” ad splashed on the pages of the Journal late that summer, Ben Rosen was sounding positively triumphant about where both semiconductors and personal computers would take Wall Street. “The market is looking ahead, well ahead, over the recession’s valley,” he declared. “The salutary effect of the Golden Age of Electronics will be dramatic and long term—certainly extending through the rest of this century and probably well into the next.” Further boosting Rosen’s optimism, fusty old Morgan Stanley eagerly signed on to underwrite the coming Apple IPO. However, the revolution had not won over all converts—yet. Regulators in Massachusetts still deemed Apple stock “too risky” and barred residents from buying it. Tellingly, the denizens of Route 128 would not get a piece of Silicon Valley’s most glorious deal to date.28
GONE PUBLIC
On December 12, 1980, Apple made its stock market debut, offering 4.6 million shares at $22 apiece. Hambrecht & Quist joined Morgan Stanley as the deal’s underwriters. Within weeks, the company had a valuation of nearly $2 billion: larger than those of Ford Motor Company, Colgate Palmolive, and Bethlehem Steel. Burt McMurtry hadn’t seen anything like it. “We are living in a goofy time,” he mused to a reporter.29
Apple’s IPO enriched the Silicon Valley ecosystem up and down the line. Early venture investors like Arthur Rock and David Morgenthaler made multiple millions, as did the two Steves and their founding team. (A few years later, a fanciful drawing of the perennially media-shy Rock, wearing a suit made of money, appeared on the cover of Time magazine.) Morgan Stanley’s delight at the results prompted the firm to dive wholeheartedly into the technology business. The broker that wouldn’t buy Ben Rosen an Apple II became the favored dealmaker for Silicon Valley companies, brokering some of the biggest IPOs of the next two decades.30
Being bullish on personal computers—and Apple—paid off handsomely for Rosen as well. The next year, he handed off his newsletter and conference business to his young deputy Esther Dyson and teamed up with chip entrepreneur L. J. Sevin to start a new venture capital operation based out of Dallas. Sevin Rosen went on to have giant paydays with hits like Compaq Computer and the pioneering spreadsheet maker Lotus, turning a $25 million fund into $120 million in three years. Amid this stunning wealth creation, other evangelists of tech couldn’t bear to stay on the sidelines either. Time reporter Michael Moritz parlayed his early acquaintance with Jobs into unparalleled insider access to Apple, and then left journalism altogether to become a venture capitalist, joining Don Valentine’s Sequoia Capital in 1986.31
Other journalists turned into entrepreneurs. In 1981, Adam Osborne took the earnings from his McGraw-Hill sale and started his own computer company. The machines weren’t the prettiest or the most powerful, but they had one asset few possessed—they were small enough for a business traveler to fit under an airplane seat. It was the first portable personal computer. “I saw a truck-sized hole in the industry,” preened Osborne, “and I plugged it.”
Capitalizing on his personal brand, Osborne put himself at the center of his company’s story, and didn’t hesitate to draw grandiose, Jobsian comparisons. “Henry Ford revolutionized personal transportation,” proclaimed one ad. “Adam Osborne has done the same for personal business computing.” Clocking in at over twenty-three pounds, the “luggable” Osborne computer didn’t quite live up to the revolutionary hype. Yet tech insiders appreciated its patrimony. Knowing very little about technology—he was trained as a chemist, after all—Osborne had sought out one of the Valley’s best technical minds to design the machine: Lee Felsenstein. The countercultural computer had come full circle.32
The earlier generation of tech entrepreneurs were engineers by training, men possessing multiple degrees and many thousands of hours logged in research labs and machine shops. So too were many of the original Valley VCs. A company founder like Adam Osborne—textbook author, magazine scribe, showman—would have been unimaginable in the earlier era. But now tech was no longer only about engineering. It had become a business of storytelling and salesmanship.
It also had become something more than just computers and electronics. It was shorthand for the new economy itself. Steve Jobs landed on countless magazine covers during his lifetime. The first major one was Inc. magazine, October 1981. Next to Jobs’s beatifically bearded face, the headline read: “This Man Has Changed Business Forever.”33
CHAPTER 14
California Dreaming
Three things happening in quick succession in the second half of 1980—the euphoria over Apple, the stunning biotech debut of Genentech, and the election of Ronald Reagan to the U.S. presidency—marked the start of a new, and even more intense, phase of America’s long fascination with California, a place of new starts, new ideas, and dreams coming true.
Reagan was another maverick turned mainstream, a candidate written off as far too conservative for the White House, but who won the day with an energetic, telegenic presence that shook off the Seventies malaise and looked eagerly forward. “Someone once said that the difference between an American and any other kind of person,” Reagan declared in his first speech of the campaign, “is that an American lives in anticipation of the future because he knows it will be a great place.”1
Silicon Valley came to embody that future. It was still one of several major tech regions in the U.S., but the explosive growth and media buzz at the turn of the 1980s had made its business culture so influential that even Bill Gates in soggy Seattle and Ken Olsen in snowy Boston could get swept up into the all-purpose journalistic descriptor of “Silicon Valley entrepreneur.” Its variation on the California dream attained velocity and altitude not only because the technology had reached a critical inflection point—a computer on every desk! A video game in every living room!—but also because so many Americans wanted and needed to dream anew. For a society whose idols had been felled by assassins’ bullets and sullied by corruption and scandal, the new heroes had arrived.