A basketball scholarship got Millines to her local school, Monmouth College, whose campus sprawled along the boardwalks of the Jersey Shore. She intended to major in electrical engineering, but the required lab classes happened at the same time as basketball practice. So she decided to go with computer science instead.
Trish Millines’s entry into the tech world was far different from that of the bright young things who breathed the rarefied air of Stanford and MIT, but it was a path followed by thousands of Americans in the 1970s and early 1980s—white and black, male and female, immigrant and native-born. Computer science programs had sprouted up at plenty of mid-range public institutions like Monmouth, responding to an ever-growing demand for skilled programmers. There were more courses than there were university mainframes; to get her punch cards analyzed, Millines had to shuttle over to the more well-resourced computer labs of Rutgers. Her tech scene wasn’t that of the homebrewers and computer faires. It was one of working-class kids at ordinary schools, learning a skill that could get them out of Asbury Park or Wilkes-Barre or Utica. It remained a world of mainframes and teletypes and management information systems and programming in COBOL, of IT jobs in big companies or month-to-month contracts as coders for hire.
The personal-computer revolution was catching fire by the time she graduated in 1979. But Trish Millines lived in the world of the big machines, a young person without Valley connections. She was female, black, and gay. She had never had an internship or a summer job that might open doors and lead to a real paycheck. She scanned the trade magazines, put her résumé in where she could, and ultimately found her first jobs. First came a defense contractor in the Philadelphia suburbs. Then, a few months later, came an offer from Hughes Aircraft in Tucson, Arizona.
Packing her suitcases and boxing up all her worldly possessions, Trish Millines flew west to the desert. She’d never live in New Jersey again.1
WALL STREET, 1980
Although he had never worked in Silicon Valley, Ben Rosen’s career had intersected with the place since the start. Born in New Orleans at the bottom of the Great Depression, Rosen revealed technical smarts that propelled him westward to Caltech’s prestigious electrical engineering program by the early 1950s. First-year graduate student Gordon Moore was his freshman chemistry instructor. Rosen later joked about his feeble grade in the class from the co-founder of Fairchild Semiconductor and Intel, “who obviously had never heard of grade inflation.” The next stop was Stanford, for a master’s in engineering, then off to Columbia for an MBA.
Unlike most of his straight-arrow compatriots, Ben Rosen had a hard time figuring out what he wanted to be when he grew up. He did tours of duty at a couple of big electronics companies. Then he dropped out for a bit to work on his tan and sell Frisbees on the beaches of the French Riviera. He at last returned to the world of suits and ties in the early 1970s, ending up at Wall Street broker Morgan Stanley. There, he forged a reputation as the Street’s foremost electronics analyst, his star rising along with the semiconductor industry.2
Those companies needed someone like Rosen, desperately. The California chipmakers might be making incredible technical products, but Wall Street wasn’t all that interested in hearing about it. “In our opinion,” a Merrill Lynch analyst report noted breezily in 1978, “future developers of promising technologies, new products and new services are likely to be well-financed divisions of major corporations.” That view was typical. Analysts not only didn’t promote the Valley in the 1970s, Regis McKenna remembered, “they bet against it.” In contrast, Rosen was consistently bullish about microchips, even when the industry’s growth had its mid-decade hiccups. By the end of the decade, he was turning his analysis into a business empire, publishing a must-read industry newsletter and throwing an annual semiconductor conference that had become an essential event for anyone involved in the microchip business.3
So when entrepreneurs out in California started putting microprocessors on motherboards and building computers around them, Rosen was among the first people on Wall Street to notice. There was a new industry here, Rosen realized, and he organized a “Personal Computer Forum” in early 1978 to showcase personal computers to his investor clients. His high hopes quickly came back down to earth. Only about twenty people showed up. Speakers nearly outnumbered attendees. “Ben,” one of Rosen’s banking clients told him, “when you have a conference on an industry with real investments, let me know.”4
Then, rather suddenly, things started to get real.
CHAPTER 13
Storytellers
“When we invented the personal computer, we invented a new kind of bicycle.” The boldfaced headline blazed out at readers opening their copies of The Wall Street Journal the morning of August 13, 1980, sitting atop a full-page advertisement for Apple Computer. Below came a page crowded with print, accompanied by a portrait of its credited author, a professorially bearded Steven P. Jobs, “talk[ing] about the computer, and its effect on society.”
The copy repeatedly referred to Jobs as “the inventor” of the personal computer, an artful fabrication that glossed over the fact that the elegant innards of the Apple came from the inventive mind of Jobs’s media-shy co-founder, Steve Wozniak. No insider technical specs here. The ad used simple, evocative language. “Think of the large computers (the mainframes and the minis) as the passenger trains and the Apple personal computer as a Volkswagen,” Jobs wrote. A Beetle might not be as powerful as a passenger train, but it could take you anywhere you wanted to go, on your own schedule. It brought liberation; it unleashed creativity. It was the future.1
Here was the same story that the countercultural computer folk had been hyping for years. Now, Steve Jobs was delivering it to a much broader audience of Wall Street dealmakers who thought of computers as IBM behemoths, considered word processing the work of secretaries, and had never wielded a soldering iron or darkened the door of a hobby shop. Even though the financial world didn’t understand the technology, however, its inhabitants salivated at Apple’s sales figures. The company now had upwards of 1,000 employees. Its 1980 revenue was about to hit $200 million—double what the entire microcomputer industry had been only three years before. Apple was readying for its IPO, and brokers were keen to get in on the action. “It’s like bears attracted to very sweet honey,” remarked one.2
WALL STREET AWAKENS
Much to Jimmy Carter’s election-year distress, recession still gripped the American economy, but the flush times were returning to Wall Street, and tech was a big driver of the boom. At the start of 1980, analysts had greeted the new decade with some uncertainty—the year could be “either the beginning of the long-awaited recession or the beginning of the long-touted Electronic Eighties,” speculated Ben Rosen.
It turned out to be the latter. Median stock gains that year ultimately topped out at 40 percent, and electronics-industry performance proved even more impressive, with high-flying stock prices that averaged 65 percent gains. Enthusiasm for tech stocks started spiking so high that some analysts were getting worried about a repeat of the irrational exuberance of the late 1960s. For the semiconductor industry, the stock surge defied economic logic. Chips were getting cheaper and the surge of demand from established industries had eased up, but, as Rosen observed, “the prices of technology stocks have done nothing but soar.”3
A number of factors contributed to this boom. One was the logic-defying nature of the chip business itself, as the Moore’s Law flywheel delivered products that became vastly more powerful as they became less pricey. Another was the river of capital flowing into the tech industry, accelerated by the confidence boost of capital gains tax cuts and newly loosened rules on pension-fund investing. It suddenly seemed as if everyone was starting a new venture fund: brokerages, established corporations, old and new VCs, and entrepreneurs themselves. Tech now made up the majority of all VC investments. “We’ve gone from one extreme to another,” Gordon Moore remarked, a little anxiously.
The rush of newcomers spawned a sardonic, and hard-to-shake, nickname for the industry: “vulture capitalists.”4
All of this drove the buzz around personal computing, and Apple in particular. Frenetic traders ran up the stock prices of Apple’s chief rivals even though no one had new product releases on the horizon. Further whetting their appetite was the emergence of a fresh set of offerings from the growing field of biotechnology, notably the Northern California–based Genentech, a firm co-founded by Stanford and University of California scientists and Silicon Valley VCs.
Although Genentech had made a profit in only one year of its four-year history, dealmakers salivated at the prospect of getting in on the ground floor of a new and hugely lucrative field. New Issues, a stock-industry newsletter, called the company “the Cadillac, Mercedes and Rolls-Royce of the industry rolled into one.” Swirling into the soaring valuation was the U.S. Supreme Court’s June 1980 decision in Diamond v. Chakrabarty, which ruled that new life forms created in laboratories were patentable inventions. Another factor was legislation on the cusp of being signed by Jimmy Carter, the Bayh-Dole Act, that allowed universities and their researchers to commercialize inventions that sprang from government-funded research—a move of particular benefit to the health sciences. The prospects for biotechnological commercialization made Genentech seem like only the tip of a very large iceberg.5
Biotechnology was profoundly different from computer hardware and software—it was far more anchored in basic research, was more tightly regulated, and had much slower product development cycles—but investors rightly recognized that the two sectors shared the same venture-capital DNA. Genentech in fact owed its existence to Eugene Kleiner and his venture partner Tom Perkins, who had adapted an “incubation” model of recruiting young associates, then giving them a mandate to hunt down promising tech and build new companies around it. When Genentech went public on October 14, trading opened at $35 a share and shot upward to a peak of $88 only an hour later. It was the biggest run-up in Wall Street history. Yet the spike was brief, and the stock ended up at only a few more dollars than its initial valuation. The IPO had made Genentech’s founders rich, but it hadn’t been as good to other investors.6
Observing disapprovingly that most Wall Street brokerages still lacked analysts with enough knowledge to properly understand either tech or biotech, “or to put proper valuations on these issues,” BusinessWeek quickly pronounced Genentech “the perfect example of how investors can overreact to a stock.” Yet company co-founder Bob Swanson recognized something else at work as well. “The market was ready for a risky, small company,” he observed. Biotech “was the kind of technology that can capture people’s imaginations. You had people say, ‘We’ll put this stock in the drawer for our grandkids. This is something for the future.’”7
The fever for Apple could have easily been interpreted as just another market overreaction. The personal-computer industry was younger than a kindergartener. While sales were roaring, it still wasn’t clear whether these devices would be merely a passing fad. None of the newcomers had yet managed to crack the big-money office market dominated by IBM mainframes and Digital and Data General minis. They weren’t pumping out commodity products like Texas Instruments or Intel.
Plus, Apple was still a one-hit wonder at the end of 1980. The II had sold by the tens of thousands, but its most recent product release, the Apple III, had fallen short of expectations. More-seasoned investors doubted it could keep up its astounding rate of growth. If Wall Street was ready for a risky little company, there were plenty of other candidates out there with comparable, if not better, prospects.
But Apple—and Silicon Valley—had something the established computer companies, and many other new-era competitors, did not. It had a great story. Straight out of Northern California came a tale that fit right in to American legend: a story of invention, of scrappy do-it-yourself entrepreneurship, of thinking different. At the same time, the story was delivered through a polished, multi-pronged campaign that positioned Apple as a serious business enterprise. It was a countercultural message that capitalists could love.
SELLING APPLE
Despite the splash of the Apple II’s 1977 debut, the tech world took a little while to warm up to the company, its founders, and the business potential of micros. Digital’s Ken Olsen dismissed the notion that there would be “any need or any use for a computer in someone’s home.” Venture backers were skeptical in 1977 too. “It may have been a while since he had a bath,” thought Arthur Rock upon meeting Steve Jobs. Nonetheless, Rock was impressed by the hordes crowding around Apple’s computer-fair booths as well as the fact that Mike Markkula was providing adult supervision. He agreed to invest $60,000.8
David Morgenthaler thought microcomputers were an iffy business proposition, and when it came to the Steves, “I was turned off by the fact that the two were pretty much kids.” But Rock’s involvement sparked his curiosity, and he dispatched one of his junior associates to hear Apple’s pitch. It didn’t go well. “They kept me waiting for a half hour,” the colleague reported back. “And they were really arrogant.” Morgenthaler had little patience for that nonsense. He walked away. A year later, after researching twenty-five personal-computer companies, the Clevelander realized that “those two guys were going to win” and joined a second round—at a much higher price per share—in 1978. Although the deal made a fortune and sealed his reputation as one of the industry’s premier dealmakers, Morgenthaler never stopped regretting that he missed his first chance. “That’s cost me a lot of money.”9
Apple wasn’t the market leader, either. It was Tandy/Radio Shack and Commodore—two companies from far outside the Silicon Valley and computer-club circuit, with no venture backing—that became the mass market’s gateways into personal computerdom, initially generating far larger sales numbers than the II. Price and distribution were big reasons why. Tandy’s TRS-80 was much cheaper and was widely available through the deep-pocketed Texas company’s national network of Radio Shack stores. True computer nerds scoffed at the stripped-down machine, calling it the “Trash-80,” but the depth of the market “knocked us off the wall,” one Radio Shack executive remarked. “We’re still catching up.” So was Commodore, which had a backlog of thousands of orders for its humble PET, a computer that purists also dismissed as little more than a spiffed-up calculator with Chiclet-like keys.10
Regis McKenna knew that if Apple was going to enlarge the market for its more expensive product, the company needed to appeal to the heart, not the head. Those curious buyers wandering into ComputerLand and the Byte Shop needed to be convinced that Apples were not cheap playthings, but indispensable home appliances, well worth the sticker price. The start-up founded by two college dropouts needed to look like “a large, stable computer manufacturer.” So out rolled the slick, four-color magazine ads, promising potential buyers a whole new world of household efficiency and discovery. “Other personal-computer companies emphasized the technical specifications of their products,” noted McKenna. “Apple stressed the fun and potential of the new technology.”11
The two-page spread heralding Apple II’s debut—that one with computer-pecking husband and admiring, dish-washing wife—was just the beginning of a stream of print ads and corporate brochures using friendly headlines to drill right into the psyches of consumers who were pragmatic, price-conscious, and more than a little apprehensive about how to use computer technology. “PR was an educational process,” said McKenna firmly, “not a promotional process.” Apple was here to educate.12
“Sophisticated design makes it simple,” reassured one headline. “How to buy a personal computer,” invited another. “We’re looking for the most original use of an Apple since Adam,” ran a tagline above a winsome photo of a handsome surfer type, naked except for a strategically positioned Apple II. It wasn’t just about a better method of home bookkeeping. It was about freedom, about creation, about revolution. The ads also indulged in a little creative license, calling Apple “the best-selling personal computer,” which only worked if you considered the TRS-80 and the PET glorified calculators. Simple, non-technical copy was essential. “Don’t sell hardware,” Apple reminded its partner retailers. “Sell solutions.”13
Ad placement further trumpeted the company’s bold ambitions. In addition to the usual computer-geek outlets, Apple bought spreads in Playboy, The New Yorker, and airline in-flight magazines. Other companies marketed to the small demographic sliver of Americans who’d spent boyhoods in basement workshops and science fair booths. Apple went bigger: “Men 25–54, $35,000+ Household income, College Graduate+.” (Even though there soon were senior women both in Apple’s marketing ranks and in the offices of Regis McKenna Inc., it still did not occur to anyone to pitch the friendly little product specifically to female consumers.)14
Over time, the company became even bolder in its product claims as it chased the hearts and minds of upwardly mobile American males. By the time the decade was out, its magazine ads featured actors dressed as great men of history, with both imagery and ad copy hammering home the message that Apple was changing the world. “What kind of man owns his own personal computer?” asked one, accompanied by a photo of an actor dressed as Ben Franklin, delightedly gasping at the wonders of an Apple II. “If your time means money,” the ad continued, “Apple can help you make more of it.” Others in the series featured Thomas Jefferson, Thomas Edison, the Wright Brothers, and Henry Ford. Apples were tools for legendary men of ingenuity and innovation, and they were tools for men like you. “Don’t let history pass you by,” read the copy.
The Code Page 22