Return to the Little Kingdom

by Michael Moritz


  But the triumph of technology was displayed late one night when three men were setting up a demonstration of a television satellite dish. They used a microcomputer to calculate the pitch and tilt needed to find a satellite floating twenty-five thousand miles high and monitored the results of their efforts on a color television. They adjusted the dish, skipping from one invisible satellite to another, until they found what they wanted: a Los Angeles porn television channel bouncing a signal more than fifty thousand miles so that three men in the California desert could watch a naked black woman perform cunnilingus on her equally bare, white, female partner. It was, at least, a marriage of community and technology and the festival organizers would not have been surprised to learn that the women worked well together.

  Many of the two hundred thousand or so people (nobody was all that sure of the numbers) who spilled into the beer gardens, soaked under outdoor showers, sprayed each other with plastic mister-bottles, and wallowed in the drenching drafts of a water cannon, seemed to enjoy themselves. Those who had a way with words called the US Festival one big party, though they made it sound like purdee. Those who liked adjectives thought it one big, fuggin purdee. Many said they had come to purdee, to have a good time, a ball, and a blast. The US Festival was: Neat. Great. Incredible. Fantastic. Unbelievable. Amazing. Like Wow.

  The altar of this vast affair was a stage cast in empyreal proportions that would have done justice to Cecil B. De Mille. A pair of three-story-high video screens served as the outer panels of the triptych. The crowning glory was another screen, the sort used for instant replays in baseball stadiums, perched half a skyscraper high above the well of the stage. In the bowels of the stage road hands and stage crew operated elevators and movable platforms and bounced up steep flights of steps tugging racks of guitars, portable wardrobes, and steel trunks crammed with the paraphernalia of rock groups. Out in the desert bowl, black banks of loudspeakers sent four hundred thousand watts bouncing around the San Bernardino and San Gabriel mountains, and cameras filmed the action for cable television syndication. Laser beams streaked across the night sky, arrogantly sketching electronic patterns across sleepy black clouds. All the gadgetry and noise seemed like cosmographic versions of the videotape recorders, wide-screen televisions, stereo systems, and video games that had crept into Wozniak’s own home. Wobbling alongside the stage was the Apple balloon that in the hubbub looked strangely innocent and quietly forgotten.

  Presiding over the entire concert was Bill Graham, a San Francisco rock promoter: a chunk of irascible threats who, dressed in denim cutoffs, T-shirt, and basketball boots, yelled until the veins in his neck bulged and the spittle in his throat dried but commandeered the festival. He jabbed the air with his fists, flexed his muscles, and later said that Wozniak was a tragic figure. When he slipped on stage between acts he asked for a big welcome for a grrrrade, grrrrade band, for a grrrrade artist, a grrrrade rock ‘n’ roller and hailed these three grrrrade days of grrrrade, grrrrade rock ‘n’ roll.

  All the virtues Wozniak had combined in the Apple Computer were absent. This was sledgehammer action and there was no hint of obscurity, no sense of subtlety and little discrimination. Perhaps the festival sprang from some desire to amuse and entertain; perhaps it was nothing more than a spectacularly conspicuous expression of vanity. It was certainly a freeze frame of celebrity-riddled America. In a white press tent two hundred reporters, photographers, and television cameramen waited for Wozniak. There were journalists from the television networks, cable television stations, dozens of radio stations, daily newspapers, weekly magazines, rock ‘n’ roll journals, and the computer trade press. They represented a mad welter of well-known names, and while they waited for a press conference they poked at trays of food, phoned reports to friends and editors, and swatted at the wasps that hovered around soft-drink cans, garbage pails, and trays of half-eaten food.

  They waited for Wozniak to descend from a house that he had rented on a hill overlooking the festival grounds and used as a base for excursions in a long, black limousine. The journalists all waited for a line, a quote, a picture, or a close-up. They ruffled through steno pads, tightened tripods and monopods, and fiddled with cassette and microcassette recorders. A stream of leads led to a hedge of microphones and tape recorders, and when Wozniak arrived, ducking under a canvas flap, the tent ballooned to life. A bowl of Nikons, Canons, Pentaxes snapped and clicked. There was pushing and jostling and elbowing. The curved wall of cameras slid forward. A table collapsed and there were loud shrieks. The motor drives whirred, slapping image after image onto roll after roll of film. There were shouts and whistles. “Keep it down. . . . For chrissake, shuddup. . . . Quiet. . . . Quiet. . . . Woz! . . . Woz! . . .”—and all the while there was pushing and elbowing to get better pictures and angles. Sitting behind a table between the rock promoter and the graduate of Erhard Seminars Training, Wozniak wore a baseball cap that was set at a cockeyed angle, a T-shirt, shorts, and socks and grinned like an admonished schoolboy. He was spattered with a sad, repetitive, empty loop of questions: “How much money you lost? . . . How many people are here? . . . Why d’you do this?”

  EPILOGUE

  More than a quarter of a century has passed since I wrote the previous page on an Apple III computer. In 1984, as the first edition of this book made its way to the press, I received several letters from the publisher—those being the days before email had become the universal telegraph system—expressing anxiety that Apple’s day in the sun might already have passed. The apprehension was understandable. The hullabaloo surrounding the introduction of the Macintosh—trumpeted with an Orwellian television commercial on Super Bowl Sunday 1984—had evaporated, and the notices had turned sour. IBM’s personal computer business was gaining strength. Compaq had reached $100 million in sales faster than any previous company and Microsoft’s operating system, DOS, was winning licensees by the month. There were plenty of reasons to think that Apple was teetering.

  Twenty-five years later, when people are as familiar with the names iPod, iPhone, or Macintosh as they are with Apple, it is hard, particularly for those reared on cell phones and social networks, to imagine a time when the company appeared to be just another technology firm that would be snuffed out or absorbed by a competitor. Since 1984, there have been plenty of technology companies that have faded to grey or gone to black, and it’s remarkably easy to come up with an alphabetical list of these casualties that runs from A to Z.

  The letter “A” alone includes Aldus, Amiga, Ashton-Tate, AST and Atari. As for the rest of the alphabet there’s always Borland, Cromemco, Digital Research, Everex, Farallon, Gavilan, Heathkit, Integrated Micro Solutions, Javelin Software, KayPro, Lotus Development, Mattel, Northstar Computers, Osborne Computer, Pertec, Quarterdeck, Radius, Software Publishing, Tandy, Univel, Vector Graphic, Victor, WordPerfect, XyWrite and Zenith Data Systems. The large technology companies that have weathered these decades—IBM and HP—have done so in areas far removed from personal computing. IBM, once the company that others in the personal computer industry feared, has even surrendered its franchise to the Chinese company Lenovo.

  The mortality rate makes Apple’s survival—let alone prosperity—even more remarkable. I’ve watched Apple, first as a journalist and later as an investor, for most of my adult life. Journalists suffer from the malady of not forgetting a topic that once interested them. I’m no different. But a couple of years after I finished writing this book I found myself, thanks to some twists of fate, working at Sequoia Capital, the private investment partnership whose founder, Don Valentine, had helped assemble some of the formative blocks on which Apple was built. Since then, as an investor in young private technology and growth companies in China, India, Israel and the U.S., I have developed a keener sense for the massive gulf that separates the few astonishing enterprises from the thousands that are lucky to scratch out an asterisk in the footnotes of history books.

  In 1984, if most consumers had been asked to predict which company—SONY or Apple—would play a greater role in their lives, I wager most would have voted for the former. SONY’s success rested on two powerful forces: the restless drive of its founder, Akio Morita, and the miniaturization of electronics and products consumers yearned for. The Japanese company, which had been formed in 1946, had built up a following as a designer and maker of imaginative and reliable consumer electronic products: transistor radios, televisions, tape recorders and, in the 1970s and 1980s, video recorders, video cameras and the Walkman, the first portable device to make music available anywhere at any time of day. Like the iPod a generation later, the Walkman bore the stamp of the company’s founder. It was created in a few months during 1979, built its following largely by word of mouth and, in the two decades prior to the advent of mp3 players, sold over 250 million units. Now, as everyone knows, the tables have been turned and some years ago a cruel joke circulated which spelled out the change in circumstances: “How do you spell SONY?” The answer: “A-P-P-L-E.”

  This raises the question of how Apple came to outrun SONY, but the more interesting topic is how the company came to rattle the bones of mighty industries, forcing music impresarios, movie producers, cable television owners, newspaper proprietors, printers, telephone operators, yellow page publishers and old-line retailers to quaver. None of this seemed possible in 1984, when Ronald Reagan was President; half of American households tuned into the three television networks; U.S. morning newspaper circulation peaked at 63 million; LPs and cassette tapes outsold CDs by a margin of ninety to one; the Motorola DynaTAC 8000x cell phone weighed two pounds, had thirty minutes of talk time and cost almost $4,000; Japan’s MITI was feared in the West; and the home of advanced manufacturing was Singapore.

  Three mighty currents have flowed in Apple’s favor, but these waters were also navigable by other crews. The first swept electronics deeper into every nook and cranny of daily life so that now there is almost no place on earth beyond the reach of a computer or the bewildering collection of phones and entertainment devices with which we are surrounded. The second has made it possible for companies born in the era of the personal computer to develop consumer products. It has been far easier for computer companies with refined software sensibilities to design consumer products than for those whose lineage was consumer electronics and whose expertise lay largely in hardware design and manufacturing prowess. It’s not a coincidence that some of the companies with the acutest envy towards Apple have names like Samsung, Panasonic, LG, Dell, Motorola and, of course, SONY. The third current was “cloud computing”—the idea that much of the computation, storage and security associated with popular software sits in hundreds of thousands of machines in factory-sized data centers. This is the computer architecture that, in the mid-1990s, supported services such as Amazon, Yahoo!, eBay, Hotmail and Expedia and later came to underpin Google and the Apple services that light up Macs, iPods and iPhones. Today, for the first time, consumers—not businesses or governments—enjoy the fastest, most reliable and most secure computer services.

  In 1984 more immediate and mundane challenges confronted Apple. Faced with the task of managing a fast-growing company in an increasingly competitive business, Apple’s Board of Directors confronted the most important duty that falls to any board: selecting a person to run the company. Mike Markkula, who had joined Jobs and Wozniak in 1976, had made no secret of the fact that he had little appetite for life as Apple’s long-term CEO. Thus the Board, which included Steve Jobs, had to decide what course to take. This decision—and three similar decisions over the ensuing thirteen years—shaped Apple’s future.

  Only in retrospect have I come to understand the immense risk associated with hiring an outsider—let alone a person from a different industry—to run a company whose path has been heavily influenced by the determination and ferocity of its founder or founders. It is not an accident that most of the great companies of yesterday and today have, during their heydays, been run or controlled by the people who gave them life. The message is the same irrespective of industry, era or country and the name can be Ford, Standard Oil, Chrysler, Kodak, Hewlett-Packard, Wal-Mart, FedEx, Intel, Microsoft, News Corp, Nike, Infosys, Disney, Oracle, IKEA, Amazon, Google, Baidu or Apple. The founder, acting with an owner’s instincts, will have the confidence, authority and skills to lead. Sometimes, when the founder’s instincts are wrong, this leads to ruin. But when they are right, nobody else comes close.

  When corporate boards start to have misgivings about the condition of a company or the ability of the founder and have no plausible internal candidate, they will almost always make the wrong move. They usually need to make a decision about a CEO when a company is barreling towards a fall, emotions are raw, testosterone levels are running high and, particularly in a company as visible as Apple, when every employee, analyst, smart-aleck and naysayer is ready to dispense advice. At Apple in 1983 the Board’s decision was complicated by the fact that there was no obvious successor within the company. Jobs was considered too young and immature, and for his part, he knew that he needed help if Apple was to achieve the $10 billion sales level he had already started to dream about. The oppressive weight of conventional wisdom tilted the quest towards a résumé dripping with impressive-sounding titles and credentials. But experience—particularly when it’s been acquired in a different industry—is of little use in a young, fast-growing company in a new business that has a different pulse and unfamiliar rhythm. Experience is the safe choice, but it is often the wrong one.

  After a lengthy search, Apple’s board announced that John Sculley would be the company’s new Chief Executive. Sculley was unknown in Silicon Valley, which was hardly surprising since he had spent his entire business career at PepsiCo where, in his final job, he had run its soft-drink business, Pepsi-Cola. Sculley’s arrival in Cupertino was greeted with the demeaning commentary that Apple (and Jobs) “needed adult supervision.” This is the very last thing that rare and wonderful founders need. These rare sorts of people may require help; they will certainly benefit from assistance; and there may be plenty of things that are new or foreign to them. But the appearance of a boss, particularly one with little experience of technology and the brutish rough-and-tumble of a company in its formative years, will almost certainly end in misery.

  At Apple, Sculley was greeted like an archangel and, for a time, could do no wrong. He and Jobs were quoted as saying that they could finish each other’s sentences. In hindsight it is fairly easy to say that it would be almost impossible for a man like Sculley, reared within the confines of an established East Coast company selling soft drinks and snacks, to flourish in a business where product life cycles are measured in quarters, if not months, and where kowtowing to convention marks the start of the death rattle. It is easier for a founder, particularly when surrounded by people with different experiences, to learn about management than for a manager from a large company to master the nuances and intricacies of an entirely new business—especially if that happens to be a technology company.

  Within less than two years, familiarity began to breed contempt—a situation complicated by the fact that while Sculley bore the CEO title, Jobs was the Chairman of the company. Disagreements occurred. Sniping and backbiting broke out and the dissension became so intense that in 1985 Sculley, disgruntled, displeased, exasperated and exhausted by Jobs, orchestrated the latter’s dismissal from the company. Sculley’s tenure at Apple lasted until 1993, and for part of that time the external reviews, at least as posted by Wall Street analysts, were favorable.

  In the decade Sculley spent at Apple, sales grew from less than $1 billion a year to more than $8 billion a year. On the surface, this looks like a wonderful record. But the reality was far different. Sculley benefited from a powerful force—the massive demand for personal computers. This sort of market growth conceals all types of shortcomings, and it is only when the rate of change slows or the economy contracts that the real cracks become visible.

  During Sculley’s time at Apple, the company was outgunned first by the brute force of IBM, then by the cunning maneuvering of the industry’s arms merchant, Microsoft, which made the operating system that it had licensed to IBM available to all comers. This led to a proliferation of what were labeled “IBM compatibles”—some made by startups like Compaq, others by established players like DEC and still more from cost-conscious Taiwanese companies such as Acer. These machines shared two traits: the hardware was built around microprocessors from Intel, and their operating systems were furnished by Microsoft. Apple, in the meantime, counted on chips from Motorola (and later IBM) and had to labor hard to convince programmers to write software for the Macintosh, whose market share dwindled as the years slipped by. Apple was fighting on two fronts with weak allies against the vast budget of Intel—in an industry where engineering and capital counted for a lot—and the legions of programmers who had discovered they could build a business atop Microsoft DOS and its successor operating system, Windows. Part of Sculley’s response was to gradually increase Apple’s prices in an effort to maintain profit margins—a ploy that propped up earnings for a while but eventually foundered.

  While pesky newcomers attacked, inventiveness withered inside Apple. The company that had led the industry with color on the Apple II, a graphical user interface with the Macintosh, desktop publishing and laser printing, integrated networks, and stereo sound stopped leading. As Sculley departed, amidst a flurry of recrimination prompted by his affection for the limelight and dalliance with the national stage, the cupboard was bare. The spark of imagination, or, more particularly, the ability to transform a promising idea into an appealing product, had been extinguished. Apple introduced no meaningful new products in the decade Sculley spent at the helm. The computers that did appear bore sterile names such as Performa, Centris and Quadra. Computers with more memory, larger screens and bigger disk drives do not count for lifetime achievement awards. The Newton, a small digital organizer championed by Sculley in his self-appointed role as Apple’s Chief Technology Officer, amounted to little more than an expensive doorstop. In an autobiography published in 1987, Sculley—in what now seems like a very accurate assessment of the gulf between his capabilities and the founder he displaced—savaged Jobs’ ideas of the future by writing, “Apple was supposed to become a wonderful consumer products company. This was a lunatic plan. High tech could not be designed and sold as a consumer product.”

 
