Quirky

by Melissa A Schilling


  However, Jobs was staunchly opposed to cloning. He believed strongly that the quality of the hardware and the simplicity and beauty of the user experience could be protected only by keeping the entire computer proprietary and integrated. As he would later articulate, “We’re the only company that owns the whole widget—the hardware, the software and the operating system. We can take full responsibility for the user experience. We can do things the other guy can’t do.”34 He wanted instead to boost sales of the Macintosh by cutting its price. Unfortunately for Jobs, he no longer had the last word at Apple after the company went public. As disagreements between Steve Jobs and John Sculley, who had been brought in to help manage the firm, became increasingly heated, the board of directors sided with Sculley. Jobs was ousted from running the Macintosh team and given the ceremonial role of chairman. Feeling shocked and betrayed, he turned down the job and resigned from Apple:

  What had been the focus of my entire adult life was gone, and it was devastating. I really didn’t know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down—that I had dropped the baton as it was being passed to me. I met with David Packard [cofounder of HP] and Bob Noyce [cofounder of Intel] and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me—I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over. I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.35

  One of the most important traits of a successful breakthrough innovator, as we previously noted, is persistence in the face of criticism or failure. The most profoundly successful innovators are tenacious, persevering long after most other people would walk away. Jobs was such an innovator. Within a few months he founded a new computer company called NeXT, and the next year he also funded the spinoff of a computer animation division from George Lucas’s Industrial Light & Magic that was led by Edwin Catmull and Alvy Ray Smith and came to be known as the animated film company Pixar. NeXT made high-end workstations based on object-oriented software. The computers were technically elegant and visually striking—black die-cast magnesium cubes, with each side exactly one foot long. The NeXT won accolades and awards for its innovative design, but with a $6,500 price tag and few compatible software applications, the hardware was doomed to commercial failure. Pixar, on the other hand, became one of the most successful film production companies of all time. Jobs’s visionary perfectionism was a perfect fit for the ambitions of Catmull and Smith; the three shared the dream of using computers to push the state of the art in animation as far as possible. The cutting-edge technology they developed made it possible to use computer animation to make full-length feature films with stunning graphics quality.

  During Jobs’s years away, Apple Computer didn’t fare well. Sculley had tried to gain share by introducing lower-cost products and initiated a program to develop a version of the Mac operating system that would run on Intel-based personal computers. However, with profit margins at their lowest ever, the board of directors decided to replace him in 1993 with Mike Spindler, who had been running Apple’s international operations at the time and had formerly worked for Digital Equipment Corporation and Intel. Spindler didn’t fare much better. He canceled the program to put the Mac operating system on Intel-based personal computers and instead licensed a handful of companies to make Macintosh clones. He also made efforts to slash costs, cutting 16 percent of Apple’s workforce and reducing R&D spending.36 Profits nonetheless continued to slide, and after the company posted a $69 million loss in early 1996, the board replaced him with Gilbert Amelio, a former CEO of National Semiconductor who had previously worked at both Bell Labs and Fairchild Semiconductor. Amelio decided to streamline Apple’s product line and focus on higher-margin products. He also hoped that a new advanced operating system would restore Apple to a position of technological leadership. However, work on the next generation of the Mac operating system wasn’t going well. The company had spent $500 million on R&D to develop the operating system, had incurred multiple delays, and was still far from where it needed to be. Sculley, Spindler, and Amelio were all experienced and respected executives, but none of them seemed able to revive Apple, and by the fall of 1996 the company appeared to be months away from bankruptcy. Wired even ran an article titled “101 Ways to Save Apple,” with suggestions that included “Sell yourself to IBM or Motorola” and “License the Apple name to appliance manufacturers.”37

  In December 1996, in a truth-is-stranger-than-fiction moment, Amelio canceled the Macintosh operating system development program and announced his plan to buy NeXT Software instead.38 Apple bought NeXT Software for $429 million in January 1997; its operating system, NeXTSTEP, would become the foundation for the new Mac operating system, and Jobs would be brought on as a part-time adviser.39 Many in the industry had already been murmuring about the possibility of bringing Jobs back to run Apple. Despite the many criticisms one could levy against Jobs for his management style, many people had begun to realize that much of the “magic” that had made Apple special and had inspired a zealously loyal following had come from his intense idealism and passion. Amelio’s decision to bring Jobs in as an adviser was a tacit admission of this, and years later Sculley would tell the press that Jobs had been the greatest CEO of all time. Within nine months of the acquisition, the board asked Jobs to take the position of CEO. He declined, agreeing instead to serve as interim CEO to help get the company back on track until someone else could be found. He was, after all, already CEO of Pixar.

  With Jobs as interim CEO, Apple began to turn around. First, he killed the Mac clone program. Then he slashed almost all of the fifty or so development projects that were under way. Apple would focus on making just four great products: a desktop and a notebook for consumers (the iMac and iBook) and a desktop and a notebook for businesses (the Power Mac and PowerBook). As Jobs described the situation, “People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas that there are. You have to pick carefully. I’m actually as proud of the things we haven’t done as the things I have done. Innovation is saying no to 1,000 things.”40 Whereas companies like Sony and Nike had become famous for using incremental innovation to rapidly spawn hundreds of different versions of their products—a phenomenon referred to as “mass customization”—Jobs preferred to invest all of the company’s money and energy in a few blockbuster products. In 1998, when UCLA strategy professor Richard Rumelt pointed out that it could be difficult to survive as a niche computer maker and asked Jobs what he planned to do next, Jobs just smiled and replied that he was “waiting for the next big thing.”41 As it turned out, he wouldn’t have to wait very long.

  All of the products were successful, but the most pivotal would be the iMac. Jobs had put industrial designer Jonathan Ive in charge of creating the look of the product, which would be remarkable for its rounded shape and boldly colored translucent case. Jobs, who always had a preference for simplicity, insisted that the iMac not have a floppy drive and that all the old input/output ports common on other personal computers be replaced with USB ports. The iMac turned out to be one of Apple’s most successful products ever, selling two million units in its first two years.42 The iBook, which had a design theme similar to the iMac’s, with rounded shapes and bold colors, and an attractive $1,599 price, was also a strong seller.43 NeXTSTEP was also successfully integrated into a new Mac operating system called Mac OS X. From 1997 to 2000, Apple went from posting a $1 billion loss to a $786 million profit, and in 2000 Jobs acknowledged what was already obvious to everyone—that he would stay at Apple—and dropped the “interim” from his title.

  If anyone had ever doubted the role that Jobs played in creating breakthrough innovation at Apple, the next decade would stamp out all uncertainty. In the late 1990s, the music industry was undergoing turbulent change. An algorithm developed by Fraunhofer IIS of Germany had enabled digital audio files to be compressed to approximately one-tenth of their original size. This format was later dubbed MP3. Digital files could now be stored on the hard drive of a computer and shared over the Internet. In the late 1990s several companies introduced portable audio devices to store and play the files, such as Diamond’s Rio, Compaq’s Personal Jukebox, and Creative’s NOMAD Jukebox, but none gained much traction. Then in 1999, Shawn Fanning, a student at Northeastern University, released Napster, a software program that offered a user-friendly way of finding and sharing music online, becoming one of the first widely adopted “peer-to-peer” applications.44 Napster was free, and by March 2000, five million copies had been downloaded.45 The great majority of music downloaded through Napster was copyrighted—commercial records and songs. The Recording Industry Association of America (RIAA), the trade group that represents the leading music business entities in the United States, became increasingly alarmed and sought a way to stem the flow of pirated music. The RIAA initiated legal action against Napster and its users, and in July 2001 the courts ruled that the Napster service had to be taken offline. However, the genie could not be put back in the bottle. Other MP3 exchange services began sprouting up online, and it was clear that if the record labels wanted to stop the illegal exchange of music, they needed to come up with a better option. Warner Music, BMG, EMI, and RealNetworks teamed up to introduce a subscription service called MusicNet, and Sony Entertainment partnered with Universal to create its own service called Pressplay. However, both were harder to use and offered fewer selections than the illegal exchange services did. The music industry giants needed a better solution, and Jobs was about to offer it to them.
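The size reduction described above is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch in Python (the bitrate figures are standard specifications for CD audio and a common MP3 setting of the era, not drawn from the book):

```python
# Back-of-envelope check of MP3's "one-tenth" compression claim.
# CD audio is 16-bit stereo sampled at 44,100 Hz; a typical MP3
# encoding of the late 1990s used a 128 kbit/s bitrate.

CD_BITRATE = 44_100 * 16 * 2   # bits per second of uncompressed CD audio
MP3_BITRATE = 128_000          # bits per second at a common MP3 setting

ratio = CD_BITRATE / MP3_BITRATE
print(f"CD bitrate: {CD_BITRATE:,} bit/s")
print(f"Compression ratio: {ratio:.1f}:1")
```

At roughly 11:1, the standard 128 kbit/s encoding matches the chapter’s “approximately one-tenth” figure.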

  In October 2001, Apple launched the iPod, a portable music player with a sleek shape, an easy-to-use interface, and a hard drive big enough to put “1,000 songs in your pocket.”46 It was a very surprising move for a company whose entire history had been in computers. Analysts and press were openly skeptical; even Apple fans were dubious of the move.47 However, Jobs had a vision for the Mac as the center of a “digital hub” in the home, and a digital hub had to provide, among other things, music. Jobs didn’t like any of the existing portable digital audio players of the time. They either held too few songs or were too heavy or cumbersome to use. The company would have to come up with its own, better design. It took many talented engineers, some external components, and the design talent of Jonathan Ive to come up with what emerged as the iPod. Given the previous lack of success of portable digital audio players and Apple’s inexperience in consumer electronics, most people thought the device would fail. They would be proven wrong. Even at $399, a price well above other competing products, the iPod had strong initial sales. The original iPod was compatible only with the Mac, but after a version was released in mid-2002 that was compatible with Windows (a move suggesting that Jobs realized that the iPod could be more successful than Mac computers could ever be), sales took off, reaching over a million iPods sold within a year.
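The “1,000 songs in your pocket” slogan can likewise be checked with rough arithmetic. A sketch, assuming the original iPod’s 5 GB drive and a typical 4-minute track encoded at 128 kbit/s (the per-track figures are illustrative assumptions, not from the book):

```python
# Rough check of the iPod's "1,000 songs in your pocket" claim.

DRIVE_BYTES = 5 * 1000**3   # the original iPod's 5 GB drive (decimal GB)
SONG_SECONDS = 4 * 60       # a typical 4-minute track (assumption)
MP3_BITRATE = 128_000       # bits per second at a common MP3 setting

song_bytes = SONG_SECONDS * MP3_BITRATE / 8   # bytes per song
songs = DRIVE_BYTES / song_bytes
print(f"Per-song size: {song_bytes / 1e6:.1f} MB")
print(f"Songs that fit: {songs:.0f}")
```

At about 3.8 MB per song, the drive holds some 1,300 tracks, comfortably covering the advertised thousand.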

  In April 2003 Apple opened its iTunes Music Store. Jobs had struck agreements with the five major record labels (Sony, Universal, BMG, Warner Music Group, and EMI), and he launched iTunes with an initial catalog of 200,000 songs, offered for $.99 each.48 Although several of the record labels initially balked at selling individual songs or at selling them all for the same price, Jobs countered with “We don’t see how you can convince people to stop being thieves, unless you can offer them a carrot—not just a stick. And the carrot is: We’re gonna offer you a better experience… and it’s only gonna cost you a dollar a song.”49 Later, when Jobs was asked why he thought the music industry had been willing to trust Apple, he responded:

  Apple is the most creative technology company out there—just like Pixar is the most technologically adept creative company.… Also, almost all recording artists use Macs and they have iPods, and now most of the music industry people have iPods as well. There’s a trust in the music community that Apple will do something right—that it won’t cut corners—and that it cares about the creative process and about the music. Also, our solution encompasses operating system software, server software, application software, and hardware. Apple is the only company in the world that has all that under one roof. We can invent a complete solution that works—and take responsibility for it.50

  Apple’s cool image, the attractive price point for the songs, and the fact that it could offer music from all five of the major record labels made for a recipe for success. iTunes reached 50 million downloads within the first year and quickly became the leading distributor of music online.51

  In January 2007 Apple made an even more stunning move by announcing the iPhone. To understand how surprising a move this was, it is useful to understand just how brutally competitive the mobile handset industry had become. Although sales of handsets were growing rapidly, the market was also extremely consolidated (Nokia, Motorola, and Samsung collectively controlled about 70 percent of the market), and extreme competition on both price and innovation was the norm. It was almost impossible for smaller companies to compete. This was especially true in the United States, where a large share of mobile phones were sold to a few giant phone service carriers such as AT&T, Verizon, and Sprint, which wielded considerable power to negotiate deep discounts. Furthermore, while on the one hand the emergence of smartphones had prompted consumers to begin to expect much more advanced technologies in their phones, the fact that carriers typically subsidized the purchase price of the phones by building the cost into service contracts meant that most consumers did not have a good sense of the cost and worth of the technology. Most people did not expect to pay more than $200 for a cell phone because most never had—at least not directly. By narrowing the gap in price between basic cell phones and the most sophisticated cell phones, carriers had (probably unwittingly) made it harder for handset manufacturers to charge higher prices for more-sophisticated phones.

  Ericsson, once a leading handset maker, gave up the battle in October 2001 and exited the handset business. Palm, the maker of what is considered to be the first successful personal digital assistant, and its offspring Handspring had both evaporated by 2006. Why would Apple want to enter such a competitive, consolidated industry that appeared so unrelated to its core area of expertise? Ed Colligan, the former CEO of Palm, remarked that it had taken the company a few years of struggle to figure out how to make a decent phone, and “PC guys are not going to just figure this out. They’re not going to just walk in.”52 He could not have been more wrong.

  In my many years of teaching innovation and counseling entrepreneurs and innovators, I’ve come to realize that when entrepreneurs and innovators come up with breakthrough ideas, it is often difficult for others (including myself) to understand them. Other people don’t see the vision, and they don’t feel the thrill felt by the innovator. They often react with deep skepticism. Sometimes the innovator can explain it well enough to bring others on board, but even if she cannot, it doesn’t mean the idea isn’t a good one. The very things that make an innovator capable of generating and pursuing a breakthrough idea are the reasons that others won’t initially understand. Their ability to challenge assumptions and their extreme self-efficacy make them able to conceive of, and commit to, an idea that sounds absurd to others. They are willing to pursue an idea even when everybody else says it’s crazy precisely because they don’t need the affirmation of others—they believe they are right even if you don’t agree. Thus, I now try not to judge the potential of an idea—I’m too often wrong—and instead I advise innovators not to expect everyone to understand. I just try to give them the tools they need to make better decisions based on their own conception of what is possible.

  When Apple launched the iPhone in June 2007, the reason for the move suddenly became clear. Not only was the iPhone a thing of beauty, with the clean, smooth lines that Apple had come to be known for, but it also had a remarkable range of functionality accessed through an exquisitely intuitive interface. Many people clearly remember the first time they stroked a finger across the screen of an iPhone and saw the applications scroll by. They didn’t slide mechanically; they sailed smoothly, with acceleration and deceleration, like a coaster flicked across a polished table. It was captivating. Furthermore, the interface was so intuitive that toddlers could immediately use it and then expect the television and other electronic devices to work the same way. That interface instantly elevated our expectations about how things should work, which must have put an enormous grin on Steve Jobs’s face. Furthermore, there were dozens of applications that became indispensable to people’s lives, and the quality of the applications was very carefully controlled so that the user experience would always be seamless. Ryan Block, writing for Engadget at the time, stated, “To date no one’s made a phone that does so much with so little.… It’s totally clear that with the iPhone, Apple raised the bar not only for the cellphone, but for portable media players and multifunction convergence devices in general.”53 The iPhone was, in essence, an evolutionary form of the Mac—a new and beautiful species of the Mac that users would keep with them all day long and that would enhance myriad aspects of their lives. It was a bicycle for the mind that fit in your pocket. It was so in line with Jobs’s vision that when we look back now it seems an inevitable move, although it certainly was not obvious to most people in 2006.
