America had become even more fractured and fractious over the course of Obama’s presidency, yet he remained optimistic about social media’s potential to bridge the divide. Even a rising swell of foreign hacks and online security breaches did not dim the president’s hope that much could be overcome if tech and government were both at the table. “I’m absolutely confident that if we keep at this, if we keep working together in a spirit of collaboration, like all those innovators before us, our work will endure, like a great cathedral, for centuries to come,” he exhorted an admiring Stanford crowd during a cybersecurity summit the White House held on campus in early 2015. “And that cathedral will not just be about technology, it will be about the values that we’ve embedded in the architecture of this system. It will be about privacy, and it will be about community. And it will be about connection.” Mark Zuckerberg couldn’t have said it better himself.32
CHAPTER 24
Software Eats the World
For all the chatter about Brin and Page and Zuckerberg, Steve Jobs remained incontestably the most important person in Silicon Valley in the century’s first decade. A legend in his own time, Jobs had returned to Apple in the summer of 1997 as it clawed for its share of the dwindling fragments of a desktop market utterly dominated by Microsoft and the PC platform. Then he brought the company back from the dead. Adding a theatrical flourish to the whole resurrection, Jobs reached détente with his fiercest business rival, Bill Gates, who agreed to make a $150 million investment in Apple that saved the company from going under.1
In the decade that followed, Apple roared back into the center of the Silicon Valley story, with Jobs headlining one momentous product reveal after another—the bulbous and playful iMac, the sleek and intuitive iPod, and the market-upending iTunes, which harnessed the anarchic file-swapping energy of Napster to create a legitimate and immensely lucrative music platform. By the mid-2000s, the Apple team had shifted its focus to the biggest hardware challenge—and potentially biggest moneymaker—of them all. They were going to make a mobile phone. Cell phones were already a massive market, but Jobs was less interested in imitating what was already out there than he was in creating something quite different: an intuitive, elegantly designed handheld computer.
THE SUPERCOMPUTER IN YOUR POCKET
Silicon Valley technologists had been trying to build such a device since before the Apple II. It had been an arduous quest. In 1972, Xerox PARC’s Alan Kay had mocked up a prototype of a mobile companion for young children that he called the “Dynabook.” In 1987, an all-star roster of Silicon Valley insiders came together to launch Go Corp., developing software for a notebook-sized computer that used a stylus instead of a keyboard. Despite having Bill Campbell as CEO and John Doerr as a major investor, Go was too far ahead of its time. Apple made its own foray into stylus-and-notebook computing with the Newton MessagePad. But that device had an early death as well, felled by glitchy software and by the fact that it was John Sculley’s pet project. As soon as Jobs got back into the CEO suite, he axed it. “God gave us ten styluses,” Jobs’s biographer Walter Isaacson recounted him saying as he waved his fingers in the air. “Let’s not invent another.” The closest the Valley came to realizing the dream was the ill-fated adventure of General Magic.2
By the early 2000s, other companies had achieved tremendous success with cell phones that featured e-mail and some very rudimentary Web browsing. The BlackBerry, a mobile phone featuring a tiny keyboard, became an indispensable device for legions of businesspeople in the first years of the decade, turning swift thumb typing into a badge of workaholic honor. The Palm Treo featured e-mail, a calendar, and a color screen.
Then there were the mobile phone giants—Motorola, Nokia, Samsung—who with every year made their phones “smarter” by loading on features and Internet access and ever-smaller keyboards. Progress in microchip technology fueled the market, as ARM chips—power-efficient microprocessors built on the reduced-instruction-set designs that had been helping make computers faster and cheaper for a decade—were now capable of powering a device small enough to hold, capable of surfing the Web, and possessing enough battery life to be useful.3
However, the way these phones got made infuriated Jobs and fellow design purists. The wireless carriers held great leverage over phone design and packed devices with applications users didn’t want or need. Carriers fiercely resisted mobile devices that tried to deliver a richer Web browsing experience, protesting that phones that smart would hog too much network bandwidth. Unsurprisingly, Jobs had a very clear idea of what he wanted in a phone, and because he was Steve Jobs, he and his team were able to wrest control from the wireless providers to make it happen.
The Apple iPhone that Jobs unveiled to the world in January 2007 was a mobile phone unlike any before it: a sleek bar of metal and glass, no keyboard, virtually no buttons, no protruding antenna. It had a multi-touch screen that combined a phone, a music player, and a full Web browser in one device; GPS and voice recognition software would arrive in later models. The iPhone looked like a palm-sized version of the mysterious black monolith that so fascinated the apes in 2001: A Space Odyssey, and it garnered nearly as much chattering excitement.4
The order of the show that day was designed to impress, as other titans of the Valley gathered around Jobs on stage to endorse the effort. Eric Schmidt joked about an Apple-Google merger—“we can call it Applegoo”—and Jerry Yang enthused like a star-struck teenager: “I would love to have one of these too, what a great device!” The final touch came via an iPhone voicemail from Apple board member Al Gore, relaying his congratulations on the achievement.5
Those outside the convention hall weren’t as easily convinced. Microsoft CEO Steve Ballmer dismissed the phone out of hand. “There’s no chance that the iPhone is going to get any significant market share. No chance,” he told USA Today. The $500 retail price seemed ridiculous to Ballmer, as it did to others at the time. Plus, the first version of the phone featured only applications made by Apple. Steve Jobs’s attitude about third-party software was the same as it had always been: he didn’t want any of it gumming up the device’s beautiful simplicity.6
Fortunately for iPhone users and for Apple’s revenue stream, Jobs was eventually overruled. The App Store launched a year later. Apple remained firmly in control, approving every app before it could appear in the store and taking a whopping 30 percent cut of every sale. The tactic was wildly successful and wildly profitable. Developers flocked to build for the iPhone, choosing it over competing platforms. With a plethora of interesting applications, consumers got over their trepidation about the iPhone’s high price; this wasn’t just a beautiful piece of hardware, it was useful. Apple embraced its new role as Pied Piper for an entirely new mobile ecosystem. Its advertising tagline, “There’s an app for that,” became so popular that the company had it trademarked.7
The spread of the iPhone and its App Store sent ripples through the entire Internet world. Websites had to be rebuilt to look as good on mobile as they did on a desktop; social and search giants had to scramble to build mobile apps. Over at Google, leaders like Sun veteran Eric Schmidt and Netscape godfather John Doerr heard alarming echoes of the platform and browser wars of the 1990s, when Bill Gates made a killing on proprietary software and boxed nearly everyone else out of the business. Microsoft’s Ballmer was already mocking Google as a “one-trick pony” for its continued reliance on search for most of its revenue, and the executives over at the Plex knew they couldn’t miss out on the mobile moment. Google’s don’t-be-evil answer to the dilemma was to release an open-source OS for smartphones, called Android, giving it away for free to any handset maker who wanted to use it. The move was also a boon for Google’s business, providing a well-matched platform for mobile versions of its products. The Android platform spread like wildfire, becoming the standard OS on nearly every mobile phone that wasn’t an iPhone. By the end of 2016, Android phones made up over 80 percent of the global market, and over half of Google’s revenue came from mobile.8
The entry into the phone market was even more profitable for Apple. Ten years after its introduction, over one billion iPhones had been sold worldwide. It was the bestselling consumer product in human history. Having a geolocated, camera-equipped supercomputer in millions of pockets jump-started whole new business categories, such as ride-sharing (Uber and Lyft), local search (Yelp), and short-term rentals (Airbnb). It further spiked the growth of social media, launching born-mobile apps (Instagram, Snapchat) and turning existing networks into even more potent vehicles for advertising and sales. The switch to mobile made Facebook’s user base grow even faster. By 2018, three out of four Americans owned a smartphone.9
With so many addictive morsels right at people’s fingertips, the daily hours spent staring at tiny screens rose so sharply that a new and popular category of apps appeared, reminding users to put their phones down. By 2017, the mobile app business was larger than the film industry, and payments to app developers alone totaled $57 billion. Apple became the world’s most valuable company, raking in nearly $230 billion in sales. Although the secret to Apple’s stratospheric earnings was that it remained a hardware company—and very expensive hardware, at that—the iPhone’s greatest contribution was in unleashing software from its desktop anchor and placing it on a candy-bar-sized supercomputer. The iPhone was always on, always accessible, and, so very quickly, impossible to live without.10
THINK DIFFERENT
Steve Jobs had been diagnosed with pancreatic cancer four years before the iPhone’s release. Although he pronounced himself healthy after surgery in 2004, his increasingly gaunt appearance caused rumors to swirl in the years that followed. “Reports of my death are greatly exaggerated,” he’d quip, channeling Mark Twain, but by 2009 it was impossible to keep up the front. He took a medical leave from Apple to have a liver transplant, returning shortly after, only to take another leave by early 2011. This time it was for good. On October 5, he died. Jobs was fifty-six. “Ahead of his time to the very end,” eulogized the San Jose Mercury News.11
No other tech leader had been so iconic, so enduring, bridging the generations and becoming the face and the personality behind so many legendary Valley moments and high-tech products. Even the tales of Jobs being an arrogant jerk—moments balanced by the warmer and humbler man remembered by close confidants like Regis McKenna and Bill Campbell—were an important part of the Valley legend. His death prompted an extraordinary outpouring of grief, not only from those who knew him personally but from the millions of Apple users who felt that they knew him nearly as well. “Steve was a dreamer and a doer,” one wrote on a tribute wall on the company’s website. “I am grateful for the gift he was in his creative genius,” wrote another. At Apple retail stores throughout the world, people brought flowers and personal notes in tribute.12
At the private memorial service held on Apple’s Cupertino campus a few weeks after his death, new CEO Tim Cook played a recording for the assembled crowd of company employees, celebrities, and Valley power players. It was Jobs’s voice that boomed out through the speakers, reading the copy for the 1997 ad campaign—titled “Think Different”—that had started airing soon after his return to the company he’d founded. “Here’s to the crazy ones,” Jobs said. “The misfits. The rebels. The troublemakers . . . Because the people who are crazy enough to think they can change the world, are the ones who do.”13
* * *
Not everyone agreed about the saintliness of Steve Jobs. In a social-media-driven moment, the critiques began even before the mourners had filed out of the memorial service. Jobs was a jerk, a greedy capitalist, a terrible boss, cried tweets and blog posts. The back-and-forth about “Good Steve” versus “Bad Steve” was only partly about Jobs. It also was about the place and the industry that he had come to symbolize. By 2011, the largest tech companies had transformed the way people across the globe worked, played, and communicated. They had opened up access to information like never before. The answer to nearly any question was just a Google search away. Long-lost friends and family reunited thanks to Facebook. Smartphones made the dream of a “computer utility” a reality at last.
Yet the greatest beneficiaries of the new tech companies seemed to be the very rich people who led and invested in them. Silicon Valley’s titans had more money than God and an unimaginable amount of data on ordinary people. America was still climbing out of the market-shattering Great Recession, and sharp levels of income inequality had spurred populist movements on both Left and Right. While Jobs was being eulogized in Cupertino, the protesters of Occupy Wall Street had taken over New York City’s Zuccotti Park, railing against “the 1 percent.” Tech moguls were the 0.001 percent, and all their change-the-world promises seemed to have done nothing except encourage smartphone addiction.
Chasing fast money on apps and games that appealed to a narrow demographic of young, educated urbanites, the Valley seemed to be out of ideas. Even leaders within the industry saw a place that was falling short of its promise. Peter Thiel became one of the more outspoken critics. “What Happened to the Future?” asked a 2011 manifesto issued by Thiel’s VC firm, Founders Fund. “We wanted flying cars; instead we got 140 characters.”14
DAY ONE
Jeff Bezos also believed the Internet economy could do more. Visionary and relentless, Bezos already drew comparisons to Jobs for his intense management style and insistence on high standards. As Amazon grew large, his mantra remained largely unaltered since his early bookselling days: think long-term, put customer satisfaction first, and be willing to invent. To underscore Amazon’s continued fidelity to its founding mission, Bezos attached his original 1997 letter to shareholders to every one of the company’s annual reports. Its sign-off was one of the CEO’s favorite catchphrases: “It’s still Day 1!”15
In contrast to the technicolored playgrounds of the Valley, Amazon remained a realm of spec buildings and door desks, and leanness of operations was at the core of its business model. Like his friend and board member John Doerr, Bezos was a faithful follower of Japanese manufacturing principles, dogged in his pursuit of cutting muda, or waste, at all points in the production chain. Bezos didn’t issue Zen koans like Jobs; while giddily passionate about innovation’s promise, he remained a quant. Numbers, not emotion, guided him. “There is a right answer or a wrong answer,” he once wrote, “a better answer or a worse answer, and math tells us which is which.” He was careful about where he gave interviews, wasn’t much interested in corporate PR, and Amazon still didn’t advertise on television. The product, Bezos believed, spoke for itself.16
Amazon’s dot-bomb days had faded into distant memory, displaced by its new identity as an unstoppable retail behemoth. The company had upended the publishing industry and was pushing into new realms, delivering value and convenience for its customers while it chased brick-and-mortar stores out of business. A big part of its growth came from turning itself into a platform for third-party buying and selling, giving businesses small and large an opportunity to reach Amazon’s enormous audience. Now Amazon was branching out further into large-scale software platforms. The biggest of them all was Amazon Web Services, or AWS.
When he talked about AWS, Bezos sounded a lot like Steve Jobs talking about the Apple II. “The most radical and transformative of inventions are often those that empower others to unleash their creativity—to pursue their dreams,” he told shareholders in 2011. Amazon had launched the service without much hoopla in 2006, targeting a new set of customers: software developers in search of storage and sophisticated computer power. But AWS’s origins went back to the dark days of the dot-com bust, when analysts were getting sacked for having rated Amazon a “buy.” Part of the company’s path out of that mess was to turn itself into an e-commerce platform for other retailers to sell their wares, and it had to rebuild its technology infrastructure in order to do it.17
The result was a cleanly designed and resilient suite of software services, linked into a national network of humming data centers with enormous computing capacity. Bezos had moved to Seattle partly to be close to the book distribution centers of Washington and Oregon. These states’ vast rural hinterlands now became fertile fields of server farms powering data-intensive operations. The region had an abundance of cheap hydropower, courtesy of the New Deal dams that straddled its great river systems, making it one of the best places on the continent to consume the vast amounts of electricity that were necessary for something like AWS. On the East Coast, Amazon repurposed older data centers in the national-security corridor of Northern Virginia, close to the original backbone of the Internet. From coast to coast, plenty of computing power, and very little muda.18
The name given to this platform—“cloud computing”—was new, but the underlying concept was as old as the UNIVAC. The cloud was time-sharing, twenty-first-century style: a data center substituted for the mainframe, laptops subbed in for “dumb” teletype terminals, and the virtual machines ran Linux as well as everything else. Instead of the telephone wires that time-sharing and its inheritors had relied on, the network it ran on was the broadband Internet. The value proposition wasn’t all that different from what Ann Hardy had built on that SDS mini at Tymshare four decades earlier: an OS that allowed clients to access computing power on an as-needed basis, lowering cost and increasing efficiency.
Infrastructure technology had deep roots, but the market opportunities were new. Open-source software made it possible for tiny teams to develop and run new systems and apps. A raft of companies had sprung up to build mobile apps, enterprise tools, and video and music streaming services. These entrepreneurs and developers had skills, laptops, and fast broadband connections. They needed server space and computing power, and AWS provided it.
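What this meant in practice: where a 1990s startup had to buy and rack its own servers before writing a line of product code, a developer could now rent capacity with a few API calls and hand it back when done. A minimal sketch, in Python with Amazon’s boto3 SDK (a present-day library used here purely for illustration; the region and machine-image ID are hypothetical placeholders, and configured AWS credentials are assumed):

```python
# Renting computing power on an as-needed basis—the AWS value
# proposition described above. Assumes boto3 is installed and AWS
# credentials are configured; the AMI ID below is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # Northern Virginia

# Launch one small virtual server on demand...
response = ec2.run_instances(
    ImageId="ami-12345678",   # hypothetical machine image
    InstanceType="t2.micro",  # a small, inexpensive instance class
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Server {instance_id} is starting")

# ...do the data-intensive work, then give the hardware back,
# paying only for the time used.
ec2.terminate_instances(InstanceIds=[instance_id])
```

No capital outlay, no idle machines, no muda: the same pay-as-you-go logic Tymshare had sold over telephone lines, now delivered over broadband.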