A Great Awakening?
Everybody knows and feels it, although no one knows quite what to say about it. We know on some level how out of control things have gotten, but we can’t even begin to imagine living any other way. The reaction is less intellectual than visceral. Life feels off. People feel stressed, behind, out of sorts, disconnected, lost. It’s not just the whacked-out politics of the current presidential administration, not just the political polarization, not just job anxieties, not just the upheaval of industrialism giving way to the computer age. It’s both more than that, and less.
“I think if you zoom out, this is like 1946 in a sense that we’ve just invented this new, powerful, and very dangerous new technology,” says Harris. “We’ve developed a system of manipulating our own [social] system that is more powerful than the ability of even our own mind to track it.”26
On that score, it’s impossible not to be reminded of Mary Shelley, who published her famous Frankenstein in 1818 at a turbulent time—in many ways not unlike our own—that was causing romantics like her to declare a “crisis of feeling” as others before them had declared a crisis of faith. Frankenstein’s monster became one of the three most enduring fictional characters of its type. But unlike Dr. Jekyll or Dracula, her monster was never named. (People mistakenly call the monster Frankenstein, but Dr. Victor Frankenstein was in fact the monster’s creator.) The monster has no name because it is simply too unknowable, too unconstrained, and too wild to be tamed in that way.
Not unlike that nameless monster, Big Tech is defined by its intangibility. These companies traffic not in widgets that we can see and touch, but in the abstractions of bits and bytes. In the long history of economic development, this is unprecedented. The wheel. The Roman aqueducts. The printing press. The key inventions of the industrial age—the previous height of technological innovation—were all immediately perceptible to the senses, often to several of them at once. There was no mistaking an automobile, a light bulb, or a telephone. When a freight train roared by, there was no confusion about what it was, what direction it was going, whether it was carrying human passengers or livestock, and so on.
Big Tech, as vast and pervasive as it is, operates completely by stealth: silent and invisible, without shape, color, or smell. We can’t see it, we don’t understand it—and yet we oddly welcome it wherever it might choose to go. Funny how we pride ourselves on our suspicions of salesmen, but let down all our usual defenses when some glittering new piece of technology comes calling.27 In this, the relationship of Big Tech to its users resembles that of the retired cat burglar once played by Cary Grant to the rich and fluttery wives who must have been his victims. One imagines Grant as all charm and suaveness as he slips their diamond necklaces from their necks and nips their earrings off their ears. These ladies would be so delighted by the sheer joy of Grant’s presence in their lives that they wouldn’t notice he had relieved them of all their jewels.
Part of this is due to the unique aesthetics of the technology itself. If the enduring image representing the industrial revolution was that clanging, smoke-belching locomotive roaring over the countryside, the information age is represented by the sleek, slender iPhone, probably the loveliest mass market product ever made. But wait: It’s not only beautiful; it also does all this great stuff. Has anything ever felt quite so good in the hand, offered so much to the eye, and so enlivened the passions and quickened the mind? Canadian communications theorist and philosopher Marshall McLuhan once observed that every new wave of technology contains all the previous waves within it,28 and so the smartphone is the telephone, the camera, the movie, the phonograph, the radio, and so much more, all in one. And all of it thanks to computer chips that forever pack more and more power into less and less space—and now, with quantum computing, will operate on hyperdrive.
A smartphone’s powers are both unknowable and immense, in the way that magic, by definition, is. People might have a rough understanding of how a car engine works, and if they don’t, they can always pop the hood to see. But who has ever peeked under the cover of their iPhone to see what’s going on inside? Who even understands how this little pocket-sized computer sends and receives photos, or summons the Internet, or lets us stream a two-hour movie? For most people this is wondrous, even magical. In fact, the transmission of the electronic signals that carry email isn’t so different from the movement of sound waves through the air. Still, it’s amazing that no one needs to give any of this a thought—until there’s a glitch, of course.
Few things in life are so mysterious, and yet so utterly normal that we take them for granted. About the only thing that comes close is religious faith, which can likewise take hold of a person. Indeed, in a psychological and social sense, the current Big Tech fervor is oddly reminiscent of the First Great Awakening of the 1730s, which ignited the masses with transformative notions that were similarly hard to put a finger on. Just as the awakenings were the work of a few thundering divines, most famously Jonathan Edwards, George Whitefield, and John Wesley, who sent forth gripping sermons from their pulpits (platforms, we’d call them today) that soon had people quaking all up and down the Atlantic coast, the high priests of the tech revolution number just a handful of big players: Brin and Page, Zuckerberg, Bezos, Musk, et al. Just as the ministers invoked the fear of everlasting hell if people didn’t go along—and held out the promise of eternal life if they did—the exhortations of Big Tech, too, reach people at a level of consciousness located somewhere below reason—at “the bottom of the brain stem,” as Tristan Harris puts it—where we have the least power to resist.
* * *
HARRIS TRIED FOR a year to shift things from his own pulpit at Google. But by 2012, he was growing concerned about the way in which the engineers at the company paid little attention to the impact of their design choices—making the phone beep or buzz with each new email, for example. Huge amounts of money and time were spent on fine-tuning details, but in Harris’s view, very little was spent on asking the big question: Are we actually making people’s lives better?
After a revelatory moment at (where else?) the infamous Burning Man event in the Nevada desert, he put together a 144-slide presentation and sent it to ten other Googlers (the presentation later spread to five thousand more). Entitled “A Call to Minimize Distraction & Respect Users’ Attention,” the deck contained statements such as “Never before in history have the decisions of a handful of designers (mostly men, white, living in SF, aged 25–35) working at 3 companies—Google, Facebook, and Apple—had so much impact on how people around the world spend their attention.”29 The presentation made waves among engineers at the middle ranks, but according to Harris, the top brass (Page himself discussed the topic with Harris) had no interest in making the business model shifts his deck suggested. There was too much money at stake.
Harris managed to parlay the slide presentation into a position (specially created, of course) as Google’s “chief ethics officer.” But he couldn’t get traction for his ideas; there was a general sense that the company was simply delivering what users wanted; what could be so bad about that? “Nobody was trying to be evil,” Harris explains. “These were simply the techniques and the business model that were standard.” Nevertheless, he saw something that most people “on the inside” didn’t: that we had reached a tipping point in which the interests of the tech giants and the customers they supposedly serve were no longer aligned.
“There’s a reason why culture and politics are being turned inside out to become more ego driven,” he says. “There’s an entire army of engineers at all these firms working to get you to spend more time and money online. Their goals are not your goals.” In 2015, Harris left Google after deciding that it was “impossible to change the system from within.” He has since started a guerrilla movement through a nonprofit called Time Well Spent, to try to push tech companies to change their core values.30
“If you are alone on a Tuesday night, you’d want those thousands of engineers [at Google] who are working to keep you on the screen alone to be trying to help you not be lonely. They could be working on trying to alleviate that,” says Harris. That’s the key point. Developers have a choice. They can design for empathy and connection, or they can design to maximize eyeballs. “Just like the makers of cigarettes who were knowingly peddling an unhealthy and addictive product,” says Harris, “they had a choice.”
Many people in the Valley, and indeed many people who believe that capitalism is simply about selling people whatever they want, would say that it’s not the job of Google or Facebook or Apple to decide what’s moral. Their platforms simply reflect all the good and bad that is human. But plenty of people would disagree. Harris is one of them. One of the tasks of Harris’s own think tank, the Center for Humane Technology, is to develop alternative business models for digital technology that are both healthier for people and economically viable for the companies themselves. It would be a wise strategy for the companies to listen as they draw increasing scrutiny for everything from privacy to monopoly to the health effects of technology.
The Federal Trade Commission announced in late 2018 that it would follow the lead of European regulators and begin investigating the use of those “loot boxes” that my own son Alex found so addictive, with the aim of determining whether gaming companies, which make up an industry forecast to be worth $50 billion by 2022, are knowingly using gambling techniques to hook kids. “Loot boxes are now endemic in the video game industry and are present in everything from casual smartphone games to the newest high-budget video game releases,” said New Hampshire senator Maggie Hassan, who called for the investigation.31 Since then, there has been a spate of other legislation designed to shift how technology companies market to children and how content and advertising can be presented to them. Much of this has been pressed by activists like Harris, as well as others such as James P. Steyer, the CEO and founder of Common Sense Media, who was a major force behind California’s new privacy legislation as well as various propositions to protect children online. In the coming years, the industry can expect only more pressure from parents, activists, and regulators to take responsibility for how its products are affecting our brains—and the brains of our children. The question is how it will respond.
Toward Humane Technology?
Technology firms are already struggling to get out ahead of it all, with tweaks to their products and services geared toward children. Of all the Big Tech firms, Apple has been perhaps the most receptive to criticism over the addictive properties of its products, in part because its core model doesn’t depend on monetizing personal data via targeted advertising in the same way that Google and Facebook do. (Though it certainly does depend on attention: The company’s App Store ten-year-anniversary press release lauded the success of games like Angry Birds and Candy Crush, which have hooked millions of people.)32 As The New York Times reported in 2018, Apple devices do host apps that track users’ location within the Apple orbit, but only around 200 or so, as compared with Android’s 1,200.33 And Apple has made some significant changes, prompted by pressure not just from activists like Harris, but more recently from investors—including the large hedge fund Jana Partners and the California State Teachers’ Retirement System, which controls about $2 billion of Apple shares—who in 2018 sent a letter urging the company to develop new software tools that would help parents control and limit the impact of device use on their kids’ mental health.34 The company has responded by creating a new set of controls that allows users (or their parents) to track how they are using apps, and to cut the number of notifications they receive.35
* * *
AS FOR GOOGLE or Facebook fundamentally changing their attention-hijacking practices, it will be an uphill battle. Google, which has been more receptive to feedback than Facebook (though that’s not saying a lot), has changed some algorithms on YouTube, for example, to try to combat the problem of filter bubbles. And it has also, as mentioned earlier, considered moving children’s YouTube content onto a separate platform. But it’s difficult to see them successfully shifting their entire business model to revolve less around data collection and the monetization of attention, and like any legacy company, they are reluctant to change what is already so profitable. It’s likely that only a threat of regulation would prompt them to do that, and indeed, that’s slowly but surely happening. In a properly functioning market, start-ups might move into this fray and disrupt the paradigm with new business models that maximize utility rather than time spent online. Some have tried, but the monopoly Google and Facebook hold in their respective areas makes it very hard for innovators to gain traction.36
As Guillaume Chaslot, the former Googler who tried (unsuccessfully) to shift the nature of algorithms at YouTube to combat filter bubbles, put it to me, “There just aren’t any incentives at the big companies to change business models. You need start-ups to do this. But they don’t have scale to compete, and they can’t get the funding to grow,” since nobody will invest in competing technology because the network effects harnessed by the largest players seem too powerful to disrupt.37 How these networks and their disruptive effects work, and how they are spreading not just through consumer technology but through every industry, is the topic of the next chapter.
CHAPTER 7
The Network Effect
Emails are the gift that keeps on giving. Facebook and Google have tried for years to brand themselves as champions of freedom, democratizers of information, and connectors of the world. But when you look at their internal email trails, you often see a different story. So it was in the winter of 2018, when British lawmakers released a trove of Facebook emails dating from 2012 to 2015 that provided a window into the duplicity of the company’s top brass.
It should come as no surprise that any big company would be single-mindedly focused on growth. That’s what capitalism—at least the kind we have in the United States and most parts of Europe at this moment in time—is all about. But what’s less expected is the extent to which the tech giants have been allowed to employ anticompetitive practices to sustain and even accelerate that growth, in ways that surely would have triggered regulatory backlash had they occurred in other industries.
First is the way in which Facebook has used its size and scale to quash competitors. As the network grew, Facebook became a company with monopoly power. Like a railroad or a utility, it ran a platform that people needed access to if they wanted to reach a certain audience. It could therefore demand almost anything it wanted from those who needed that network—and its data—to develop their own businesses. And, by the same token, Facebook could deny anyone access to those massive amounts of user data (which is the only reason other businesses are interested in being on Facebook in the first place), for any reason.
As the 250 pages of emails and documents released by British lawmakers revealed, companies that were not considered competitive with Facebook, including Airbnb, Lyft, and Netflix, got preferred access to data, as did the Royal Bank of Canada and a number of other nontech businesses. But those companies that Facebook viewed as competition, like Vine (a Twitter-owned video app), were denied access or even shut out of the company’s network altogether. Indeed, after Twitter released Vine in 2013, Facebook shut off Twitter’s access to Facebook friends data at Zuckerberg’s behest.1
Meanwhile, the emails revealed that Zuckerberg discussed charging app developers for access to Facebook user data, while also forcing them to share their own user data with Facebook’s network; email debates show that the company even considered restricting developer access to certain kinds of data unless the developers bought advertising on Facebook. “It’s not good for us unless people also share back to Facebook and that content increases the value of our network,” wrote Zuckerberg. “So, ultimately, I think the purpose of platform…is to increase sharing back into Facebook.” In another email, his COO Sheryl Sandberg advocated the same idea. “I think the observation that we are trying to maximize sharing on Facebook, not just sharing in the world, is a critical one,” said Sandberg, in a telling departure from her and Zuckerberg’s public refrain about “making the world more open and connected.”2
When it came to growing the network, it seems that nothing was off limits. And it’s no wonder, because as we will learn, the network is where the value lives. Facebook needed to grow it, at all costs. That’s why executives agreed to risk potentially bad PR so that its Android app could log users’ phone calls. It was an invasion of privacy on a new level—but it also created more data that could be mined, which increased Facebook’s ability to grow.3
The Operating System for People’s Lives
In 2011, the FTC launched an investigation into Google (this was around the same time that a variety of state agencies as well as European and Asian regulators began looking into the company’s competitive practices), centered on the claim that Google had monopoly power in various markets and would use it to crush competitors if it could. The case was prompted in part by complaints brought by Yelp, the popular search service specializing in deep, hyper-local information about individual communities (for example, which daycare service is best according to a group of local users in Portland, or where to get the finest Thai food in Boston).