A great admirer of Intel’s Bob Noyce, Jobs wanted to build a campaign for the Apple II that was as jazzy as the one that had propelled the Intel 8080 into the stratosphere. In a replay of his audacious call to Bill Hewlett a decade earlier, Jobs dialed up the Intel switchboard, where someone connected him with the man who’d crafted that marketing campaign, Regis McKenna Himself.
McKenna was unfazed by Apple’s garage setting and the co-founders’ scraggly looks. He’d worked with “lots of strange people” in the Valley already, and he was familiar with the Homebrew scene and the intriguing little enterprises bubbling up from it. The first meeting, however, was a bust. The Steves wanted help placing a Woz-authored article on the Apple II in Byte. It turned out that Steve Wozniak was much better at building elegant motherboards than crafting accessible prose; the piece was a rambling mess better suited for the hobbyist crowd back over at Dr. Dobb’s. McKenna told them it would have to be rewritten, and an offended Woz refused. Then I have nothing to offer you, replied McKenna.9
But Steve Jobs was not one to take no for an answer. Ever persistent, he called McKenna “about forty times” to persuade him to take Apple on as a client. Here’s the deal, McKenna told Jobs. Good corporate marketing involves lots of different elements, and all of it costs money. Apple needed venture capital to make it work. Jobs already had been making futile attempts to fund-raise: pitches to Wozniak’s bosses at HP, then to computer maker Commodore, then to his old boss Nolan Bushnell, who had just hit a $10 million payday by selling Atari to Warner Communications. They’d come to nothing.
Both McKenna and Bushnell pointed Jobs toward a financier who might be willing to help: Don Valentine, the hard-driving National Semiconductor executive who’d now become a venture capitalist. Valentine agreed to come by the garage. There, the Mercedes-driving Republican in a rep tie found a skinny, bearded kid who looked “like Ho Chi Minh” and his nerdy, equally shaggy business partner. “Why did you send me these renegades from the human race?” Valentine called to ask McKenna after the meeting was done. The VC wouldn’t invest—yet—but he thought enough of Apple’s potential to connect the two Steves to Mike Markkula, a Fairchild and Intel veteran who recently had retired as a microchip millionaire in his early thirties. Markkula was intrigued. These two founders were very young and kind of strange, but the Valley was full of strange tech people, and the Apple II was exciting.10
With Markkula on board as an advisor, Apple had enough credibility and money for Valley VCs and marketers to start taking it seriously. By December 1976, Regis McKenna had taken on Apple as a client and drafted a comprehensive marketing plan, recorded in tight cursive in a narrow-ruled spiral notebook. (Among the possible distribution channels noted by McKenna in 1976: “Apple Stores.” He was thinking big.) The next order of business was some better branding. McKenna’s art director Rob Janoff replaced the hippy-dippy Newton etching with the iconic once-bitten apple. Local designer Tom Kamifuji, known for his ability to adapt the psychedelic graphics of a 1960s rock poster into corporate branding campaigns, added the rainbow colors. Clean high-tech lines infused with a countercultural vibe: a perfect new image for a microcomputer company with market-altering ambitions.11
McKenna’s team orchestrated a similar transformation in the company’s print advertising, which morphed from misspelled newsprint into sleek color spreads with clean typography and maximum visual punch. “The people we were trying to reach was very specific,” McKenna explained. “The hobbyist looking to the next level, affordable computer, people who had programming skills and built their own computers from kits.” Yet the Apple II was also for those who weren’t already homebrewing: “professionals such as teachers, engineers, or people who would put in the time and effort to learn how to use this new computer.”
In 1977, nearly everyone in these target groups was male. The debut spread for the Apple II, which ran in general-interest magazines like Scientific American as well as more specialist publications, pictured a young husband at the kitchen table, checking stocks on his computer as his smiling wife looked over from her work at the kitchen sink. The ad copy actually had been written by a woman, but the picture was worth a thousand words. “Within just a few weeks,” the copywriter confessed, “Steve [Jobs] received a letter from a woman in Oregon, complaining that the ad was sexist—which it very clearly was.” Whether their ads were homey or slick, all of the personal-computer companies were like Apple: they marketed their products to the men who already were inclined to use them. Girls and electronics still didn’t mix.12
By the time Jobs, Wozniak, and Markkula hit the 1977 trade-show circuit, the company was starting to get a professional polish that belied the fact that this was a tiny little operation of about a dozen employees. The Steves donned collared shirts, ran combs through their hair, and pinned nametags to their chests. Their first stop was Jim Warren’s first annual West Coast Computer Faire, a landmark in tech history all on its own. Apple landed a coveted booth near the entrance.13
Despite the proliferation of new entrepreneurs, the first Faire had a program and a vibe that was more Whole Earth Catalog than Wall Street Journal. Panels focused on the change-the-world potential of computing, with titles like “If ‘Small is Beautiful,’ is Micro Marvelous? A Look at Micro-Computing as if People Mattered” and “Computer Power to the People: The Myth, the Reality, and the Challenge.” There were sessions on computers for the physically disabled, and four panels on using personal computers in education (Liza Loop appeared on one of them). Novices were welcomed with their own panel on “An Introduction to Computing to Allow you to Appear Intelligent at the Faire.” Business uses were rarely mentioned, aside from considering “Computers and Systems for Very Small Businesses.” For $4, attendees could buy an official conference t-shirt that read, “Computer Phreaques Make Exacting Lovers.”14
But this was no PCC potluck. The times were, at last, a-changing. “Here we are, at the brink of a new world,” Ted Nelson proclaimed in his keynote speech at the Faire. “Small computers are about to remake our society, and you know it.” He continued:
The dinky computers . . . will bring about changes in the society as radical as those brought about by the telephone or the automobile. . . . The rush will be on. The American manufacturing industry will go ape. American society will go out of its gourd. And the next two years will be unforgettable.15
Nelson’s exuberance didn’t seem all that irrational as soon as a conference-goer traveled from the breakout rooms to the trade show floor, and saw the crowds clustering around the booths of the Valley’s new little computer firms. The booth with the biggest crowd of all was Apple, where the spruced-up Steves proudly showed off the new Apple II. The device finally delivered what the tech evangelists had been promising for so long: a self-contained unit. You plugged it into a wall, hooked up an ordinary TV and tape recorder, and started typing. It had BASIC installed so that you didn’t need to write your own software. It had eight—count ’em, eight—expansion slots that allowed the user to add on applications and memory. The Apple II wasn’t cheap—at $1,300, it cost twice as much as the Apple I and more than three times as much as the Altair—but it was the friendly little machine that the microprocessor finally made possible. As Apple’s first print ad said, “You’ve just run out of excuses for not owning a personal computer.”16
The shows unveiled the Apple II to the world, and they also introduced Steven P. Jobs to the national press. As business reporters trolled the trade show halls in Boston and Dallas and New York over the frenetic months of 1977, McKenna’s team made sure that the “Vice President—Marketing” was available to talk. A fresh-faced Luke Skywalker who could speak a consumer-friendly language, Jobs reliably provided comments with enough color and interest to make it into the story. He usually was the only personal-computer entrepreneur quoted.
Steve Jobs may not have built the Apple’s motherboard, but he knew how to explain it in evocative language, a rare talent in the engineering world. The chip industry brimmed with hard-charging, take-no-prisoners personalities, so Jobs’s mercurial temper wasn’t at all remarkable. But his skill set was. Here was someone who combined Andy Grove’s gimlet-eyed understanding of product with Bob Noyce’s charisma, in a youthful package with countercultural appeal. The barefoot vegan with the straggly beard also understood that the sales job at hand involved more than slick logos and zippy sloganeering. “Inventions come from individuals,” observed Regis McKenna, “not from companies.”17
The large-computer industry was legible and familiar to these reporters and their readers. The microcomputer industry was mysterious, and it still was difficult to understand why anyone would drop $1,300 on an Apple II that couldn’t do much more than balance a checkbook. Jobs keenly understood this dilemma. “Most people are buying computers not to do something practical but to find out about computers,” he told The New York Times. “It will be a consumer product, but it isn’t now. The programs aren’t here yet.”18
Ah, the programs: Apple and others succeeded in developing personal-computer hardware, but there remained a huge shortage of software programs to run on it. This was a big stumbling block to home and office users who didn’t have the appetite to learn BASIC and build apps of their own. For all the hype, the personal-computer business remained tiny. Its total 1977 sales came to about $100 million in a more than $22 billion computer industry. To scale up, it needed software makers to catch the same entrepreneurial spirit that had seized the kit-builders.19
Software was a very different business proposition, however. Electronics hardware was something that hacker-hobbyists had been used to buying ever since they were kids playing with their first crystal radio kits. To get parts, you went to a hobby store and paid for them. But code wasn’t something you could find on a retail-store shelf. It was know-how, something you learned by doing. If you didn’t write the software yourself, you could borrow code from others. The only entities that sold software were behemoths like IBM, who packaged it with their large, expensive computers. In the moral calculus of hackerdom, stealing software from big computer companies was equivalent to using blue boxes to make free long-distance phone calls. No one got hurt, just corporations that already made far too much money.20
By the time Homebrew began and Dr. Dobb’s published the code for Tiny BASIC, it was common understanding that personal-computer software was to be shared, “liberated” from corporations, and given away for free. Less than a year into Homebrew’s happy swap meets, however, the organizers received a sharply worded missive from someone who was trying to turn software into a business. “An Open Letter to Hobbyists” wasted no time in laying the blame for the software gap at the feet of the hackers themselves. Paying for hardware, but not for software, kept good programs from being written. “Who can afford to do professional work for nothing?” the author asked. “The thing you do is theft.”21
The letter came from Albuquerque, penned by another relentless twenty-year-old college dropout. His name was Bill Gates.
JET CITY
William Henry Gates III was a Cold War kid from Seattle, born and bred in a place and time where opportunities abounded for a curious boy to learn about computers. His childhood coincided with the apex of Seattle’s Boeing boom, when the aerospace giant employed hundreds of thousands and produced successive generations of innovative commercial airliners at an astonishingly rapid pace. Seattle was also home to the University of Washington and its multiple federally funded science and engineering programs, including in the new academic discipline of computer science. The city of modest bungalows sprawled into a metropolis of millions, as highways snaked through neighborhoods and floating bridges unfurled across the lake to connect the Eastside suburbs.
In 1962, Seattle’s space-age boosterism spawned the Century 21 Exposition, a futuristic World’s Fair featuring a soaring Space Needle and exhibits of the latest innovations in computing—a UNIVAC that contained a library’s worth of information, a “TV telephone” that broadcast pictures along with sound, and the Freedom 7 capsule that had just carried NASA astronaut Alan Shepard into space. Gates was only six when he visited, and his favorite thing was riding the fair’s sleek new monorail. But by the time he hit middle school, he’d become a die-hard math and science guy, so much so that he didn’t bother to earn decent grades in other subjects. From an early age, all Bill Gates cared about was technology, and he was quite certain that he understood it better than anyone else.
Luckily, his parents had sent him to the private, all-boys Lakeside School, in the hopes that its reputation for academic rigor would rub off on their son. Exhorted by a resourceful teacher who believed that computers were about to become the future of education, Lakeside parents had raised money to buy the school its own teletype machine connected to a time-sharing system. The computer room became Gates’s hangout from eighth grade on. It was where he learned BASIC, and where he met a quiet tenth grader named Paul Allen. Soon the two could be found haunting the University of Washington computer labs, learning the ins and outs of a Digital PDP-10 and becoming notorious for their ability to skillfully hack into places they weren’t supposed to be. In Gates’s last year of high school, he and Allen embarked on their first business venture, selling a microprocessor-based device they programmed to analyze traffic volume on city streets. They called the company Traf-O-Data.
Gates started his freshman year at Harvard in September 1973, the same month that Don Lancaster’s TV typewriter graced the cover of Radio-Electronics. His buddy and business partner Allen came to Boston as well, to take a job at Honeywell. Gates didn’t stay in college long, but left an impression. “He was a hell of a good programmer,” remembered his faculty advisor, but he was “a pain in the ass.” Gates was far more interested in extracurricular hacking and video gaming than in his Harvard coursework. (At one point, he became obsessed with the Atari game Breakout—a game that had been built by the two Steves before they started Apple.)22
Then, halfway through his sophomore year, came the Altair. The kit was all hardware, no software. Gates and Allen immediately sensed an opportunity. Even though they had no connections or particular entrée to MITS, they sent a letter to Ed Roberts. We’ll write BASIC for the Altair, and you give us office space and royalties in exchange. Roberts, skeptical but desperate, told them he’d take a look. Six weeks of all-nighters later, Allen flew out West, paper tape in hand, to demonstrate their new software. It worked, and the two Seattleites had a deal. Within months, the two had decamped from Boston to Albuquerque, and Traf-O-Data had a new name: Micro-soft.23
Bill Gates had been like most hackers, with little compunction about exploiting the backdoors and gray areas that allowed him to get things for free. He’d gotten in trouble at Lakeside and Harvard for hogging free computer time and for using school and university computers for his own side projects. But now that he and Allen were devoting every waking moment to building a software business, he grew outraged at those who thought they could get code for nothing. Only a few months into Gates and Allen’s MITS adventure, a spool of their Altair BASIC tape got into the hands of a Homebrewer, who, in classic hacker fashion, made fifty copies of it to distribute to other members. Those folks made copies for their friends, and on and on. It became, observed Gates biographers Stephen Manes and Paul Andrews, “the world’s first and most pirated software.”24
The young entrepreneur was furious, and fired off the angry note that would go down in tech history as “The Gates Letter.” In it, Gates drew enduring battle lines in the tech world. On the one side, there were the people who believed information—software—should be proprietary data, protected and paid for. On the other side were those who believed in a software universe like Homebrew: where people shared and swapped, iterated and improved, and didn’t charge a cent. Two decades later, open-source software evangelist Eric Raymond famously framed the divide as “the cathedral” versus “the bazaar.”25
As the personal computer revolution gained speed and financial velocity, and as Jobs and Gates morphed from oddball kids into two of the richest and most celebrated business leaders in the world, the space between the cathedral and the bazaar—and the entrepreneurs and the idealists, the mercenaries and the missionaries—grew even wider.
* * *
Boys had been tinkering with electronics since the dawn of the radio age. The hackers might have stayed in their basements and bedrooms if the enabling technology of the mass-produced microprocessor hadn’t come on the scene, and a cadre of ferociously innovative chip companies hadn’t created new markets for computerizing nearly everything. The microcomputer might have remained a funny little toy if it hadn’t been for the educators and evangelists and marketers who showed how it could work in the classroom, in the office, and as home entertainment.
This did not occur in a vacuum. The same conditions that had made Homebrew possible in the first place—microprocessors and miniaturization; a turn away from the military-industrial complex toward other applications for tech; time-sharing and the spread of programming languages like BASIC; a desire for a more inspiring and interactive relationship with technology—created avenues for Silicon Valley hackers to turn their ideas into viable businesses. A bleak economy upped the desire for escapist, tech-fueled fantasy, from Star Wars to video games to Jerry Brown’s Space Day.
What’s more, the electronics hobbyists came together at a moment when the inhabitants of a prosperous and complacent postwar world were experiencing a period of wrenching, bewildering change. America and Western Europe were awash in data, and increasingly distrustful of the institutions and experts who managed that data. The personal computer held the promise of reclaiming control.