Jobs moved back in with his parents and, as noted earlier, took a job at Atari. It was during his time at Atari that he began working again with Woz. Nolan Bushnell, founder of Atari, had challenged Steve to develop a single-player version of Pong, with a catch: if the design used fewer than fifty chips, he would earn a bonus for every chip saved. Jobs recruited Woz to help him, as Woz was by far the better engineer. Woz would work on the design while Jobs implemented it. The pair finished the project in four days, using only forty-five chips. This success reaffirmed what the blue box experience had taught the young men a few years earlier: Jobs’s ambition and design skills paired with Wozniak’s engineering genius made a powerful combination. Jobs was bold and visionary; Wozniak provided the exceptional intellectual and technological resources to deliver on that vision. They were primed to seize the next big opportunity, and they wouldn’t have to wait long.
In January 1975, Popular Electronics put the first personal computer kit, the Altair, on its cover. The Altair was invented by Henry Edward “Ed” Roberts, a computer engineer who had previously created and sold model rocketry and calculator kits. Although the Altair was little more than a series of switches that could be flicked in a sequence that would generate a pattern of lights, it was an object of desperate desire to hobbyists and would-be software programmers. Before 1975, unless you were a computer engineer working for a large company with a mainframe computer, you had no access to a computer of any kind. The Altair, as simple as it was, was almost a computer, and anyone could buy the kit for about $500. When that January issue came out, Roberts’s company, Micro Instrumentation and Telemetry Systems (known as MITS), was flooded with orders and sold thousands of kits within the first month. The Altair was an inspiration to Jobs and Wozniak. Wozniak, in fact, knew he could do better than Roberts had done. The Altair used the Intel 8080 microprocessor, which at around $180 per unit was too expensive for Jobs and Woz. Instead, they found a $20 microprocessor made by MOS Technology, the MOS 6502, which was nearly identical to the Motorola 6800—a $360 microprocessor. Woz immediately set about assembling and coding a computer based on this microprocessor, and just a few months later he had created a prototype that enabled him to type letters on a keyboard and have them show up on a monitor—the first time in history this had been achieved. Jobs made some calls to Intel and was able to obtain some dynamic random-access memory chips for free. Jobs also began accompanying Woz to the Homebrew Computer Club, a hobbyist club that met at Stanford, where many of personal computing’s “firsts” got their start.12 Woz’s instinct was to give his schematics away for free: “It never crossed my mind to sell computers. It was Steve who said, ‘Let’s hold them in the air and sell a few.’”13 There was no business plan and no hunt for investors. There was no rented office or manufacturing space, and no corporate logo—none of the trappings of business that today’s would-be entrepreneurs assume are crucial for their start-up. It was just two guys building computers in the Jobs family garage with $1,300 raised by selling Woz’s calculator and Jobs’s Volkswagen bus.
The Apple I enjoyed a modicum of success—selling a few hundred units—but it was merely a computer circuit board and required hours of laborious assembly. Jobs soon realized that very few people wanted to assemble their own computer: “It was clear to me that for every hardware hobbyist who wanted to assemble his own computer, there were a thousand people who couldn’t do that but wanted to mess around with programming… just like I did when I was 10. My dream for the Apple II was to sell the first real packaged computer.… I got a bug up my rear that I wanted the computer in a plastic case.”14 In September 1976, Wozniak and Jobs began developing the Apple II, placing it in an elegant molded plastic case so that it had the friendly aesthetic appeal of a high-end kitchen appliance rather than an industrial device. Jobs also insisted that it have a power supply that did not require a fan, because the noise of a fan was not Zen. However, these design decisions required money, which Woz and Jobs didn’t have. They also needed managerial expertise to help run the growing company. Fortunately, being in the heart of Silicon Valley, with its dense network of information technology companies and investors, gave them access to both. Jobs approached Nolan Bushnell, his former boss at Atari. Bushnell declined Jobs’s offer to sell him a one-third stake in the company for $50,000 (Bushnell later remarked, “It’s kind of fun to think about that, when I’m not crying”15) but referred him to Don Valentine, the founder of Sequoia Capital, a Menlo Park venture capital firm. Valentine also declined to invest, but he gave Jobs a referral to Mike Markkula Jr., a former marketing manager at Fairchild Semiconductor and Intel, who had made millions on stock options and retired at the age of thirty-three. Markkula agreed to join as a partner and provide capital, and in April 1977 they launched the Apple II, a $1,298 product that could be used straight out of the box. Markkula’s time at Fairchild and Intel had given him deep insight into the pricing, marketing, and distribution of technology products. He also had a large network of personal connections in the industry and could help Jobs and Wozniak access the people, technology, and money they needed. He would prove to be an invaluable resource to Apple, and to Jobs personally, over the next two decades.
The Apple II was an unprecedented success; by the end of 1978, Apple’s sales were about $15 million, but this was just the beginning.16 By itself, the personal computer was still mostly a toy for hobbyists. The breakthrough to a much bigger market came in 1979, when Daniel Bricklin and Bob Frankston launched a software program called VisiCalc (for “visible calculator”). This spreadsheet program came to be known as the first “killer app” of the personal computer era.17 VisiCalc could do in an instant what would formerly take accountants hours to do by hand with a ledger and a desktop calculator, and it transformed the personal computer into a crucially important business tool. Luckily for Jobs and Woz, VisiCalc had been created on, and for, Apple computers.
By 1980, Apple was generating over $100 million in annual revenues and had more than 1,000 employees. Less than five years after its founding, Apple Computer went public, and when trading of its shares closed on the first day, Apple had a valuation of $1.8 billion—a bigger debut than any other initial public offering since Ford went public in 1956.18 Suddenly IBM, Hewlett-Packard, and others realized they had seriously underestimated the potential role of the personal computer. This was particularly jolting for IBM, for decades the 800-pound gorilla of the computing industry. Respect for IBM bordered on reverence. A well-known piece of folk wisdom in the business world was that “nobody ever got fired for buying IBM.” As IBM executive Jack Sams reminisced, “The worry was that we were losing the hearts and minds.… So the order came down from on high: ‘Give me a machine to win back the hearts and minds.’”19
In its race to bring a personal computer to market quickly, IBM decided to use many off-the-shelf components from other vendors, including Intel’s 8088 microprocessor and Microsoft’s software. IBM was not worried about imitators because IBM’s proprietary basic input/output system (BIOS), the computer code that linked the computer’s hardware to its software, was protected by copyright. While other firms could copy the BIOS code, doing so would violate IBM’s copyright and incur the legendary wrath of IBM’s legal team. IBM thus believed its personal computer was protected from direct imitation. This couldn’t have been more wrong. Getting around IBM’s copyright turned out not to be so difficult after all. Copyright protected the written lines of code, but not the functions that code performed. Compaq exploited this weakness by having a team of programmers document every function the IBM computer would perform in response to a given command, without recording the code that performed the function. This list of functions was then given to another team of “virgin” programmers who could prove they had never been exposed to IBM’s BIOS code.20 These programmers went through the list of functions and wrote code to create identical functions. As a result, Compaq was able to reverse engineer the BIOS in a matter of months without violating IBM’s copyright.
Compaq sold a record-breaking 47,000 IBM-compatible computers in its first year (it would go on to become one of the world’s largest computer manufacturers and was acquired by Hewlett-Packard in 2002). Once Compaq had reverse engineered the BIOS, other clones were quick to follow.21 In order to emulate the IBM personal computer perfectly, the clones all adopted the operating system and microprocessor used in the IBM models: Microsoft’s MS-DOS and Intel’s 8088. So, while IBM had been ineffective in securing proprietary control of its own computer design, it had inadvertently created dominant positions for Microsoft and Intel.
Meanwhile, in December 1979 a group of Apple engineers, including Jobs, had been permitted to tour the fabled Xerox Palo Alto Research Center (PARC). Xerox knew that computers were a threat to its printing and copying business and had set up the center to find a way to dominate the paperless office of the future. Xerox thus gave a group of brilliant young computer researchers free rein. According to Larry Tesler, a former Xerox PARC researcher who later became chief scientist at Apple, “The management said ‘Go create the new world. We don’t understand it.’”22 There Xerox had developed the Alto, the first computer with a graphical user interface (GUI), which enabled the user to interact with the computer using a mouse, and it was connected to other computers by means of Ethernet, one of the first local area networks. However, the Xerox executives were hesitant to move forward on any of these leading-edge technologies—they just didn’t understand them. As put by former PARC researcher John Warnock, “There was a tremendous mismatch between the management and what the researchers were doing… they had no mechanisms for turning those ideas into real life products. That was really the frustrating part of it, because you were talking to people who didn’t understand the vision.”23 Steve Jobs, however, understood the vision. As he recalled,
And they showed me, really, three things. But I was so blinded by the first one that I didn’t even really see the other two. One of the things they showed me was object-oriented programming. They showed me that, but I didn’t even see that. The other one they showed me was really a networked computer system. They had over a hundred Alto computers all networked, using email, etc. I didn’t even see that. I was so blinded by the first thing they showed me, which was the graphical user interface. I thought it was the best thing I’d ever seen in my life.… Within ten minutes it was obvious to me that all computers would work like this someday.24
Being able to witness the stunning software that Xerox had created, before Xerox itself had figured out how to commercialize it, was yet another example of Jobs being in the perfect place at the perfect time. It is hard to overstate how important Jobs’s exposure to the developments at PARC was to the success of Apple. PARC had hired an army of brilliant engineering talent that spent years creating revolutionary technologies and then, in a single afternoon, gave them away. From the moment Jobs saw those revolutionary technologies, he knew they would change the world. He picked up the ball that Xerox had dropped and ran with it.
Astounded that Xerox had not commercialized the GUI for the personal computer market, Jobs and his team immediately set about not only executing Xerox’s ideas but also improving them. With Bill Atkinson as principal design engineer, they created a virtual desktop where an individual could pick up a file by clicking and dragging its icon with the mouse, or access other features from a menu bar at the top of the screen. People would no longer need to know commands or be taught how to use a computer—the computer would be intuitive.
The interface was first incorporated into the Apple Lisa, a high-end computer being developed for the business market. The computer was named after the daughter Steve Jobs had fathered with Chrisann Brennan. (This was an interesting choice given that he had initially denied that he was her father.) Unfortunately, Jobs’s demanding and volatile nature led to frequent clashes on the team, so Mike Markkula and Michael Scott (whom Markkula had insisted be hired as president of the company because neither Wozniak nor Jobs had the experience required) removed Jobs from running the Lisa project. The frustrated Jobs responded by taking over the Macintosh project—a much more affordable personal computer being developed by Jef Raskin. He moved the Macintosh project into a separate building across the street from Apple, flying a pirate’s flag overhead to signify the project’s rebellious autonomy, and told his team, “It’s better to be a pirate than to join the Navy!”25 Jobs instinctively understood that in order to create something really special, the Macintosh people needed to be both physically and psychologically distanced from the rest of Apple. This distance would enable them to have their own standards and norms and to create their own dream.
Jobs’s perfectionism bordered on fanaticism. Fonts had to have perfect proportional spacing, and window icons had to have beautiful curves. According to one Macintosh team member, “He hounded the people on the Macintosh project to do their best work. He sang their praises, bullied them unmercifully, and told them they weren’t making a computer, they were making history. He promoted the Mac passionately, making people believe that he was talking about much more than a piece of office equipment.”26 Perfectionism was part of Jobs’s idealism; the Mac was supposed to revolutionize personal expression. Its designers weren’t just engineers; they were artists as well. As Andy Hertzfeld recalled of his time on the Macintosh team, “The goal was never to beat the competition, or to make a lot of money. It was to do the greatest thing possible, or even a little greater.”27 Compromises were not to be tolerated.
Jobs demanded that the Mac launch by early 1982.28 This was an impossibly tight schedule, but Jobs insisted it could be done and bent others to this belief with his “reality distortion field,” as described in Chapter 1. As Hertzfeld noted, “If one line of argument failed to persuade, he would deftly switch to another. Sometimes, he would throw you off balance by suddenly adopting your position as his own, without acknowledging that he ever thought differently.”29 It’s worth noting that Elon Musk makes similarly “impossible” demands of his employees, although his style is more brute force than reality distortion field. On one occasion, for example, Musk gave a speech at Tesla noting that the team (including himself) would need to work Saturdays and Sundays, sleeping under their desks, in order to meet a deadline. When an employee argued that everyone had been working very hard and needed a break to see their families, Musk fired back, “I would tell those people they will get to see their families a lot when we go bankrupt.”30
The Macintosh ultimately launched in January 1984 at a price of $2,495. It was smaller and lighter than most personal computers, and during its hundred-day introductory period it included a free word processing program and a graphics package.31 It was a remarkable technical achievement, combining ease of use with advanced graphics capabilities, and it quickly secured a stronghold in desktop publishing. By now, however, it was facing significant competition from the IBM PC, IBM clones, the Commodore 64, the Atari 400/800, and others. Worse still, Bill Gates announced that Microsoft was developing a graphical user interface for IBM personal computers and IBM clones that would enable them to emulate the ease of use of the Macintosh. This interface would be known as Windows.
The Microsoft announcement infuriated Jobs, who felt betrayed. He had signed Gates on to develop graphical versions of a spreadsheet program and a word processing program, along with the BASIC programming language for the Macintosh; thus, Gates and his engineering team had frequent exposure to the Macintosh development project. Hertzfeld had begun to worry about Microsoft’s intentions, noting that his contact at Microsoft had been asking many detailed questions about how the Mac’s operating system worked: “I told Steve that I suspected that Microsoft was going to clone the Mac.”32 Gates had agreed that Microsoft would not develop graphical software for any other computer companies until after the Macintosh launched, but at the time of that agreement the launch was scheduled for January 1983 (already a year past Jobs’s initial launch date for the computer). Because the Macintosh’s launch had slipped, Gates had not technically violated the agreement with Apple.
Jobs insisted Gates come down to face him; remarkably, he did. Hertzfeld recalled that Gates was in a room with ten Apple employees, including Steve Jobs, when Jobs assailed him: “You’re ripping us off! I trusted you, and now you’re stealing from us!” But Gates remained cool, looked Jobs directly in the eye, and responded, “Well, Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”33
As IBM personal computers and their clones gained ground, pressure mounted on Apple to allow cloning of the Macintosh as well—that is, to license the Macintosh operating system and hardware specifications to other companies so that they could produce Macintosh-like machines. Without cloning, many feared, Apple would increasingly lose share in what was starting to look like a winner-take-all market. An operating system with more users attracts more developers of complementary software applications; the operating system with the most compatible software applications, in turn, has an advantage in attracting users. This self-reinforcing cycle, known as “network externalities,” can cause a single standard to rise to overwhelming dominance in a market. Licensing the Macintosh operating system to other computer hardware producers might bring lower-cost Macintosh-like products to market, helping to increase sales of the Macintosh operating system and turning the tide back in Apple’s favor.