Return to the Little Kingdom


by Michael Moritz


  When Sculley was fired, Apple was in peril. Windows 3.0, introduced by Microsoft in 1990, was not as elegant as the Macintosh software, but it was good enough. By the time Sculley returned to the East Coast, Apple’s market share had eroded, its margins had collapsed, and the best young engineers were inclined to apply for openings at companies such as Microsoft, Silicon Graphics or Sun Microsystems.

  By the time Sculley departed, Apple’s board had degenerated. The people who had been major owners of the company and had a vested interest in its success had been replaced by an odd cast. This troupe was almost certainly bolted together by a nominating committee eager to demonstrate political correctness by assembling a board composed of people with different experiences and backgrounds. Over the course of forty-eight months in the mid-1990s, the board included the company’s own Chief Financial Officer, a person who had built a riverboat gaming company, the CEO of an enormous European packaging company, the head of National Public Radio, and an executive from Hughes Electronics and StarTV. None of these people had experience in the personal computer industry, none had worked for any time in Silicon Valley, none knew the others well and none, with the exception of Markkula, had a major economic or emotional interest in Apple. It is hard to imagine that they thought of themselves, let alone acted, as owners. If there were any bond tying them together it was probably the desire to avoid embarrassment. It’s little wonder they made two terrible selections, each of whom was more suited to be a corporate undertaker than an imaginative leader.

  The first, Michael Spindler, was a European whose business life, prior to Apple, had consisted of stints at DEC and Intel, where he had been a marketing strategist. As CEO he continued efforts begun by Sculley to sell Apple—with IBM, Sun Microsystems and Philips as his principal targets—and continued to debate whether to license the Macintosh operating system to other manufacturers. An alliance with IBM and Motorola—the sort of convoluted corporate lash-up which in the world of technology never amounts to anything—was supposed to slow Microsoft by marrying Apple software to microprocessors made by the other two companies. In 1996, after less than three years as CEO, Spindler was ushered to the exit. The eight-person board, without surveying outside candidates, turned to a fellow director, Gil Amelio, and charged him with rejuvenating Apple. Though Amelio liked to be referred to as Doctor (for his PhD in physics), it was obvious, even before his appointment, that he was not the sort of medicine-man the patient needed.

  While Apple shriveled, Steve Jobs endured his wilderness years—an arduous, painful journey that, in retrospect, was probably the best thing that ever happened to him. After being banished from Apple, he sold all but one share of his stock and, aged 30, cast about for a new beginning. In 1986 he bought Pixar, a 44-person company owned by the creator of Star Wars, George Lucas, which had developed a small reputation for making computer-aided animation systems. Jobs was largely interested in the influence Pixar’s technology could have on personal computers. But Pixar was not the main event. In 1985 he had formed a new computer company which, with characteristic elegance and symbolism, he named NeXT. There began a tortured tale which culminated at the end of 1996 with the most unlikely of endings: an acquisition by Apple.

  Between NeXT’s formation and its sale lay many sagas. The company showed how difficult it is for anyone to start another company after enjoying extraordinary success with a first. Jobs was a victim of his fame and notoriety and, instead of recalling all the lessons of Apple’s first year (when money was tight, resources strained, survival always a question, and a workbench in his parents’ garage the production line), his first acts at NeXT seemed like a continuation of life at a $1 billion company. Paul Rand designed NeXT’s corporate logo just as he had done for IBM, ABC and UPS; I.M. Pei, the high priest of modernist architecture, was commissioned to build a floating staircase (echoes of which showed up years later in many Apple stores); and Ross Perot, Stanford University and others (including Jobs) supplied the start-up capital at a valuation roughly equivalent to what was awarded Microsoft at the time of its 1986 IPO.

  As NeXT evolved into a maker of corporate workstations, Jobs was taken out of his natural milieu. Instead of conjuring up ideas for products millions of consumers could use, he found himself consigned to a market in which purchasing decisions are made by committees that aren’t rewarded for making adventurous choices; where competitors such as Sun, Silicon Graphics, IBM, Hewlett-Packard and, of course, Microsoft wasted no opportunity to heap scorn; and where an expensive sales force is required to make inroads with customers. The black, cube-shaped computer, which fell victim to terrible delays—one of the many curses that imperil an over-financed start-up—soon found a place alongside other Jobs-inspired products in New York’s Museum of Modern Art. Customers were less impressed. NeXT’s founding team was gradually burned out by the strain and the scent of failure. In 1993, Jobs threw in the towel on the systems business and attempted to convert NeXT into a software business—a strategy that invariably is the harbinger of doom for any computer company.

  By 1996, both NeXT and Apple had petered out. Jobs had been relegated to a cameo act in the computer business, although his tenacity and patience at Pixar had paid off. Nine years after his purchase of the company, the release of the animated picture Toy Story and a subsequent IPO gave the company the financial staying power to cope with the muscle of its exclusive distribution partner, Disney (which, a decade later, bought the company for $7.4 billion, thereby making Steve Jobs its largest individual shareholder since Walt Disney himself).

  Then, almost like a chapter out of a nineteenth-century Victorian romance, Jobs, getting wind of Apple’s interest in purchasing Be, a company started by a former Apple executive, convinced Amelio that he was better advised to purchase NeXT and use its prowess with the UNIX operating system as the software foundation of Apple’s future. Amelio voted for NeXT, bought the company for $430 million in cash, gave Jobs 1.5 million shares of stock and thus, unwittingly, issued his own exit visa. There followed an awkward period during which Jobs announced that he was interested only in advising Amelio and taking care of Pixar. His sale of all but one of his recently awarded Apple shares registered his real opinion of Amelio. Less than three quarters after NeXT became part of Apple, Amelio was replaced by Jobs, who was appointed interim CEO. This provoked cackling and headlines that sounded like obituary notices: “How did this mess happen? The untold story of Apple’s demise” and “Rotten to the Core” were just two of the messages splattered across the front of national magazines. Michael Dell, then one of the darlings of the personal computer industry, posed a rhetorical question about Apple in the fall of 1997: “What would I do? I’d shut it down and give the money back to the shareholders.”

  Steve Jobs had been weathered by his years in the wilderness. His battle with NeXT had taught him to cope with dire circumstances, and his years at Pixar had given him experience as CEO of the world’s most technologically advanced creative company. The Apple he inherited in the fall of 1997 had lost its creative zest and its leadership position in the technology industry, was almost out of cash, was unable to recruit bright young engineers, was drowning in inventory of unsold computers and had nothing imaginative in the works. Jobs was unromantic. The marketing department, eager to announce a change for the better, wanted to run ads that said “We’re back!” Jobs would brook none of it.

  Instead, he rolled out an advertising campaign labeled “Think different,” based on a series of black-and-white photographs of remarkable individuals. A couple of iconoclastic businessmen were billboarded, but they were heavily outnumbered by the artistic and inventive. There were musicians (Bob Dylan, Maria Callas and Louis Armstrong), artists (Picasso and Dalí), an architect (Frank Lloyd Wright), charismatic leaders (Mahatma Gandhi and Martin Luther King), scientists (Einstein and Edison), a movie maker (Jim Henson), a dancer (Martha Graham), and an adventurer (Amelia Earhart). The campaign was a rallying cry, but it was also a keen expression of the artistic, sensuous, romantic, mystical, inquisitive, seductive, austere and theatrical side of Jobs—adjectives not usually associated with the leader of a technology company. It was these attributes that eventually came to be expressed in Apple’s products, which Jobs turned into objects of desire.

  The advertising campaign was simple and direct, which perhaps had more significance inside the company than outside. It said, in plain terms, that the company could not afford to mimic others but needed to forge its own path. Jobs also forced the company to act differently. He cut costs, forced substantial layoffs, and killed entire product lines he deemed worthless, undifferentiated or insipid, such as printers and the Newton. He stopped licensing the Macintosh operating system to other manufacturers and cut off distribution of Apple products to all but the most rabid of its dealers. He installed five of his executives from NeXT as pillars of his management team while keeping Fred Anderson as Chief Financial Officer, and he jettisoned most of the discredited board of directors, replacing them with practical, hard-knuckled people he trusted. He engineered a $150 million investment from Microsoft that simultaneously ended years of legal bickering between the two companies, beefed up Apple’s cash position, and ensured the Internet Explorer browser a prominent place on Macintosh computers. And, within ten months, he introduced a fresh line of Macintosh computers with his customary flirtatious flair. One year later, in the fall of 1998, Apple reported annual sales of almost $6 billion and a profit of more than $300 million, compared to sales of $7.1 billion and a loss of $1 billion at the time he took the helm.

  Despite Jobs’ helmsmanship, the pop of the dotcom bubble, the recession of 2001 and the Mac’s small market share meant that Apple was still battling against the tide. Leaks had been caulked, useless crew members had been made to walk the plank, worthless cargo had been tossed overboard, but the vessel’s course had not been altered. This was reflected in the losses of 2001, the first red ink in three years. It was against this perilous backdrop that the iPod and Apple’s stores were conceived—both born of necessity and of the sense that the company could not count on the kindness of others to foster its growth. Independent software developers, including companies such as Adobe, which had helped Apple create the desktop publishing market, were beginning to abandon the Macintosh; retailers, particularly the large stores, were either ignoring or neglecting Apple’s products. Jobs and his management team resisted the temptation to make large acquisitions—the normal way large companies try to escape testing times, which almost always starts with slide presentations promising the moon and ends with write-offs and recrimination. If Apple’s leaders uncovered a small product or promising team that could be quickly made productive and folded into an existing or nascent effort, they pounced. But for real growth they relied on their own wit and ingenuity.

  The first example of Apple’s desire to fend for itself came with iMovie, software designed to help consumers manage and edit video, an application that, hitherto, might have been supplied by Adobe. Jobs’ conviction that video was the company’s freedom ticket meant Apple was almost blind-sided by the dawn of the digital music business. While Apple was presenting consumers with video software, tens of millions of people were discovering that music was available all over the internet. Services such as Napster and Kazaa roused the ire of music publishers and record companies, but these corners of the internet, when combined with hundreds of models of portable mp3 players, spelled a new chapter for the distribution of entertainment.

  It was against this change in consumers’ habits that the iPod was conceived and rushed to market with a speed reminiscent of the way Sony’s Walkman had arrived a generation earlier. It went from start to store shelves in less than eight months—a madcap effort to buoy Apple’s flagging sales during the 2001 holiday season. The iPod, which at first worked only with Macintosh computers, had a novel user interface—a dial that helped people sift through their music libraries—and a much longer battery life than most mp3 players. Buried inside was its most important feature: a compact version of a UNIX operating system, which meant this innocent-looking device contained as much computing power as many laptops. In 2003, while the music labels bickered and dallied, Apple introduced the first legal online music service and replaced the notion of an album with the reality of a track.

  In the same year the iPod was introduced, Apple opened its first retail store a few miles from the Atlantic coast in Tysons Corner, Virginia. Later that same day, the second store opened near the Pacific coast in Glendale, California. The stores were another expression of the need for Apple to take charge of its destiny. To most, this looked like a desperate measure, particularly since there were so few examples, beyond the worlds of fashion and cosmetics, of manufacturers becoming successful retailers. Apple’s approach to retailing was influenced by the success of another Northern Californian company, The Gap. Jobs had become a director of that company and, in turn, Mickey Drexler, the merchant who led The Gap during its decade-long rise, joined the Apple board. The first Apple store revealed a merchant’s virtuosity. Computers, software and consumer electronic devices were displayed in an atmosphere that was like a breath of fresh California coastal air.

  The iPod and Apple’s stores struck chords with consumers, and the company’s management pounced on the opportunities with the thirst and relish of indefatigable and experienced travelers finally reaching an oasis. Variants of the iPod were introduced as quickly as possible, and within forty-eight months it had been transformed from a 5GB monochrome device into a 60GB color player. When any model showed signs of wear, Apple’s management resisted the urge to milk the last drop of sales and, instead, replaced it with something better.

  The same sort of touch was applied to the stores with their Genius Bars and roving sales assistants armed with wireless credit-card terminals. Apple’s U.S. flagship store, which opened on Manhattan’s Fifth Avenue in 2006, five years to the day after the first two stores opened, was the apogee. Here, on the plaza in front of what was once another symbol of American success, the GM Building, floated a glass cube in which Apple’s illuminated logo was suspended. The frame for the cube was made from hand-blasted Japanese steel, pietra serena stone lined the floors and there, twenty-four hours a day, throngs of people of all ages and backgrounds came to wander, ogle, browse and shop. Apple’s stores reached $1 billion in retail sales faster than any other company’s, and by 2007 its sales per square foot—the shorthand measure of any retailer’s health—were more than ten times those of Saks, four times those of Best Buy, and handily outstripped even Tiffany’s.

  Inevitably, Apple’s progress was marred by blemishes. There were occasional product miscues: a Macintosh housed in a clear plastic cube that developed hairline surface cracks, lame versions of the iPod introduced with Motorola and Hewlett-Packard, battery packs in laptop machines that overheated, and an occasional product, like the first version of Apple TV, that fell far short of expectations. Later a contretemps erupted over stock options—in particular, two large grants made to Steve Jobs in 2000 and 2001, which he surrendered in 2003. These grants, and others made to other executives, attracted the attention of the SEC, stirred up the whiff of impropriety and occupied the business press. But nothing came to threaten Apple as much as Steve Jobs’ health when it was disclosed in 2004 that he had been operated on for pancreatic cancer.

  Rumors about Apple’s encore to the iPod had been in the air long before Steve Jobs used one of his hallmark solo shows in San Francisco in 2007 to introduce the company’s riff on hand-held computing. Though it was dubbed the iPhone, the device Jobs introduced was not a conventional cell phone or mp3 player, was far removed from a personal digital assistant and bore little resemblance to a portable game device. The same day the iPhone was introduced, the word “Computer” was dropped from the company’s name, which was shortened to Apple Inc.—a sign of how far the company had traveled in the previous years.

  The iPhone was a glorious expression of Apple’s approach to product design. It did not start with laborious research, focus groups or the acquisition of another company with a hot product. It began with a few people trying to design a product they would want to use and be proud to own. Like many previous products conceived under Jobs’ leadership, this required taking a close look at the shortcomings of existing products, adapting ideas from others and melding them into something that, by 2007, could only have come from Apple. Such was the allure and romance of being associated with Apple that AT&T management signed up, on draconian terms, to be the exclusive U.S. distributor for the iPhone with barely a glimpse of the product. While advertisements, commercials and press reports frequently employed the term “revolutionary” to describe the iPhone and other Apple products, they were evolutionary—exquisite refinements of the half-baked ideas and compromise-ridden products that other companies had prematurely rushed onto store shelves.

  The iPhone appeared simple. It fired up immediately. It was housed in a case less than 12 millimeters thick and could connect to any machine—from a supercomputer to a smoke detector—that was hooked up to the internet. But simplicity, especially elegant simplicity, is deceptively difficult. Jobs’ magisterial achievement, one that has few, if any, precedents, was to ensure that a technology company employing tens of thousands of people could make and sell millions of immensely complicated yet exquisite products that were powerful and reliable while also containing a lightness of being. This is Apple’s triumph. It is one thing for an individual—Matisse with a line, Henry Moore with a shape, W.H. Auden with a phrase, Copland with a bar, Chanel with a cut—to express themselves. It is another matter entirely for the germ of an idea to be developed, refined, reshaped, molded, tuned, altered and rejected again and again before it is considered perfect enough to be reproduced in the millions. It is another matter too to steer, coax, nudge, prod, cajole, inspire, berate, organize and praise—on weekdays and weekends—the thousands of people all around the world required to produce something that drops into pockets and handbags or, in the case of a computer, rests on a lap or sits on a desk.

 
