Variable rewards are likewise ubiquitous in the world of Big Tech. A smartphone itself is a slot machine that you carry around in your pocket,6 and even though most of the “pellets” it feeds us are actually boring beyond belief, we keep pressing the lever, in the hopes of those random rewards: a gossipy email, a cute snapshot, an exciting bit of news. Most of the apps on our phones are slot machines, too, designed to hijack our attention, to redesign our wants, to alter our very natures. Fogg’s motivations were altruistic, as he sought to encourage individuals to do good things like exercise more and smoke less. His work in computer-human interaction, which essentially translated philosophical and psychological ideas about persuasion into algorithmic form, “wasn’t about developing the dark side,” says Fogg, who has become quite defensive about his role in the propagation of addictive technologies (particularly after receiving death threats following some of the negative press coverage). “It was about highlighting and facilitating persuasive technology for socially useful purposes. I don’t know if we were successful….” His voice trails off.
That, of course, depends on how you define successful. Certainly, Fogg’s lab became one of the most popular and influential at Stanford, churning out the founders of Instagram and a host of other start-ups. But the word really spread with the rise in popularity of Internet platforms like Facebook, which suddenly gave app developers the chance to employ the techniques of captology to influence millions. Fogg himself had built a start-up in 2003 and 2004, geared toward combating isolation by helping people—particularly older adults—build strong relationships with friends online. (AARP was a partner.) Then Facebook came calling.
“They said, ‘We’re launching a platform and we want you to create an app and work with us on this.’ And I saw that they’d put all the pieces together. Unlike Myspace, they were starting from a position of credibility (because of Zuckerberg’s Harvard connection). They’d allow people to contribute their talents and skills and have these things be part of Facebook and that would increase users and have a snowball effect of more people coming to the platform.
“It was a testing ground for new apps!” says Fogg, excitedly. “I remember getting into my little Acura Legend, and driving away and calling my mom and saying ‘Mom, you’ve never heard of this thing called Facebook, but they just won the game. You should get on it, because this is where you’ll be eventually.’ ”7
Fogg’s own start-up failed to reach critical mass, though he did have an interesting meeting with Google’s Larry Page about it. “Terry Winograd was my thesis adviser and also had been his. So I reached out, and he got Page to come over, and he looked at what we were doing and said, ‘You’ve got to get this out there….You’ve got to get it out now.’ I was concerned about making it perfect. But I think he was right.”8
Later on, as an academic, Fogg taught a course called “Mass Interpersonal Persuasion.” Facebook had 25 million users by that time (today it has a whopping 2.38 billion), and his class gave students the chance to develop apps that would be tested and marketed on the platform. For Facebook, of course, it was all part of the hunt for eyeballs. Just as any Facebook user would be prompted to share their email and phone contacts with the platform (“Facebook is better with friends!” went the pitch), so would users of any app—which meant the app’s developer would eventually be able to extract information not only on those users, but also on any friends they “invited” to use the app.9 This is exactly how the now-infamous data leak involving the British firm Cambridge Analytica happened. The academic Aleksandr Kogan created a survey app that was deployed on Facebook, and he used it to collect information not only on the 250,000 people who actually took the survey, but also on the 87 million more users they knew. Cambridge Analytica then used that information to deploy ads that may have helped tip the 2016 U.S. presidential election in favor of Donald Trump. That’s the network effect in action, at its most nefarious.
Fogg had naïvely hoped that the efforts of his students would be skewed in noble directions as they interacted with Facebook, but some had more immediate concerns, like making gobs of money. That 2007 “Facebook class,” as it was known, launched seventy-five students into careers in Big Tech. “It was really hard getting approval to teach that class,” he remembers. “Parents were like, ‘We’re sending you to Stanford to study Facebook?’ But we told the administration, ‘This is really important; we need to study this.’ ”
Fittingly enough, Fogg and his colleagues used Facebook to publicize the class. “There were about 100 students, which was a lot, but for the final presentation, 550 people showed up—many of them top investors, engineers, and innovators from Silicon Valley. It was standing room only. I was exhausted afterward. It took me, like, a month to recover.”10
Some of the students came up with apps for health and wellness—there was one, for example, geared toward people who wanted to train to hike the Oregon Trail. Others developed dating apps. While Fogg was more interested in the benign stuff, the truth was that intermittent rewards could be deployed to persuade anyone to do just about anything—to click on a link, to stay on a page, to stay there longer, to buy things, to encourage others to buy things, and so on and so on. Over the course of ten weeks, his students engaged with a total of 16 million people on Facebook.11
One of Fogg’s students was Tristan Harris, a young technologist raised by a single mother who worked as an advocate for injured workers in the Bay Area.12 Harris was soft-spoken, contemplative, and abnormally bright, even by Stanford standards; as a kid, he dreamed of becoming a magician—or a psychologist. But he started to think differently when he got to Stanford and became entranced by computer science, particularly the potential for computer intelligence to improve the human variety. In Fogg’s class, he studied B. F. Skinner’s famous techniques of operant conditioning—the basis of clicker training for dogs—and learned how the same techniques of intermittent variable rewards can inspire behavioral changes in people.
At Stanford, Harris began to learn how the design of an online experience, or even of a website, could have a powerful effect on emotion. The initial design of LinkedIn, for example, publicly displayed each user’s personal connections. Nobody wanted to seem like the loser who didn’t have enough friends. So users began scrambling to invite more and more people, driving up the size of the network and the value of the offering in turn—and, of course, for free.13 Social pressure, as he soon learned, was a highly effective technique for “hijacking”14 our attention—the term he now uses, quite accurately, for the process by which addictive technology works. With their infinite streams of continuously self-refreshing content—whether it’s your Twitter feed or real-time updates to your virtual FIFA ranking—mobile games and apps are designed to make us believe there’s a chance we could be missing something important. And that is what keeps us checking our phones constantly: 2,617 times a day, according to one study.15 We might think we are in control, but in fact, we are being manipulated by the attention merchants.
As we’ve already learned, this has a terrible effect on our mental health, increasing stress and anxiety levels and even risk of illness. Gaming apps in particular—like FIFA Mobile, the slot machine that hooked Alex—are increasingly categorized as addictive by mental health professionals. In 2018, the World Health Organization added “gaming disorder” to a new draft version of its International Classification of Diseases, on the back of a growing body of research showing that large numbers of online gamers—of whom there are as many as 2.6 billion worldwide, including at least one in two-thirds of American households—aren’t in control of their behavior. “I have patients who come in suffering from an addiction to Candy Crush Saga, and they’re substantially similar to people who come in with a cocaine disorder,” said Dr. Petros Levounis, the chair of the psychiatry department at Rutgers University, in a June 2018 New York Times article on game addiction. “Their lives are ruined, their interpersonal relationships suffer, their physical condition suffers.”16
Unsurprisingly, children, who spend more time with social media, games, and apps than adults do, are particularly vulnerable. Megahit games like Fortnite, for example, which include as many as two hundred persuasive technology design tricks, have given rise to support groups for families dealing with game-addicted children—sons in particular. Unfortunately, it’s a losing battle. No matter how hard a parent might try to teach self-control, restraint, and responsibility, these lessons are no match for the dopamine hit kids get from playing the game. No wonder the designer of Fortnite admitted to The Wall Street Journal that his goal was to create a game that would engage kids for “hundreds of hours if not years.” Epic, the company behind Fortnite, has already made $2 billion from its sales of virtual goods.17
Then there’s the craving for social approval. As most of us will recall from our teenage years, this phenomenon is hardly new; what’s new is the way Instagram and Snapchat and other platforms have elevated this need to the level of full-fledged addiction. Consider the average teenager, who spends 7.5 hours a day on screens and phones.18 Is it any wonder they are more isolated, less social, and more prone to depression than previous generations?19 As scary as this is, it’s even scarier that these conditions can actually be monetized by the platforms that create them. In 2017, Facebook documents20 leaked to The Australian showed that executives had actually boasted to advertisers that by monitoring posts, interactions, and photos in real time, they could track when teens felt “insecure,” “worthless,” “stressed,” “useless,” and like “a failure,” and could micro-target ads down to those vulnerable “moments when young people need a confidence boost.” Think about that for a minute. It’s an endless, wanton commodification of our attention, with little or no concern for the repercussions for individuals.
This is how today’s devices create desires we didn’t even know we had, at least not to this degree, making us feel anxiously incomplete without them, almost as if we were missing a limb. When I once asked a pal of Alex’s to put his phone away for Alex’s birthday party, the boy got so upset he nearly hit me. It could have been worse, as with one teen, treated by psychologist Richard Freed, the author of Wired Child, who became so violent after being deprived of her device that her parents had to have her strapped to a gurney and sent to a psychiatric ward. The girl’s parents said her downward spiral had begun with a phone obsession, followed by isolation, falling grades, depression, violence, and finally the threat of suicide. Sadly, this is something that Freed says he is seeing more and more in his practice.
Parents don’t understand, he says, “that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology. This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like powers to seduce young users.” Psychologists and social anthropologists are being hired in droves by the largest companies (and many smaller ones, too), to help the technologists translate the latest persuasive research and techniques into ever more tricked-out products designed to capture ever more children’s attention.
In early 2019, a number of consumers’ and children’s advocacy groups filed a request for the U.S. Federal Trade Commission to investigate Facebook for alleged deceptive practices following revelations that the company had knowingly tried to dupe children into spending large amounts of money in online games. (According to the unsealed class-action documents, Facebook employees actually referred to the kids as “whales,” a casino term for high rollers.)21 Around the same time, brands such as Nestlé and Disney stopped buying ads on Google-owned YouTube after pedophiles swamped the comment sections on children’s videos with obscene postings. The corporate behavior in question is, of course, wildly different in the two cases. But the connective tissue is that children have been endangered by a business model based on the monetization of user content and data.
This was, again, all part of the plan. The race to capture consumer attention is the focal point of capitalism today. Of all the states of mind that companies and brands seek to induce, addiction is by far the most desirable. It’s not enough to get people to like a product, even to love a product. You want to make them crave it so much they can’t live without it, just as corporations did with tobacco, followed by alcohol and television (sex, of course, is up there, too, but unfortunately for marketers, sex can usually be had for free). In fact, many of today’s biggest companies are finding ways to evoke nicotine-type cravings for whatever they are selling. And many of the methods that allow Big Tech to do just that came straight out of the Persuasive Technology Lab.
“The Devil Lives in Our Phones”
Tristan Harris was a talented student, and after he left Stanford, he launched a couple of successful small companies, including one that designed the type of pop-up ads that were by then all over the Web. In 2011, he joined Google, which had acquired one of his companies, to work on designing boxes of text that would entice people to click for further information. But as he spent more time at the company, he noticed that his fellow Googlers seemed a little “off.” They were easily distracted, twitchy, overwrought, burned-out—even as they professed to be joyfully engaged in their work. It dawned on him that they had many of the characteristics of drug addicts in search of their next fix. Harris tried to initiate mindfulness programs, in an attempt to combat the effects of computers on his coworkers’ attention. They weren’t particularly well attended, but that should have been no surprise—after all, there was work to do, and the Googlers were too busy and distracted to waste their precious time and attention on mindfulness.
“A wealth of information means a dearth of something else,” as Nobel Prize–winning economist Herbert Simon once put it. “What information consumes is rather obvious: It consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”
The cognitive capture wrought by Big Tech is so all-encompassing, so distracting, it can be hard to see clearly. Certainly it’s hard to think clearly, given that technology forces everyone to move at a pace that is virtually inhuman. That’s one of the things that bothers Harris the most—the fact that the population as a whole is losing the ability to focus and solve problems, particularly the complex kind that we have before us today (how to fix capitalism, how to combat climate change, how to end political polarization). Forget about having the concentration to tackle those—most of us can’t stay on top of our own daily email in-box or social media correspondence. Things have gotten so bad that even when we aren’t using our phones, the mere knowledge of their presence can distract us; research shows we actually perform our work better the farther away our cellphones are (on the desk is better than in the pocket, but not as good as in another room).22
This research speaks to an even deeper issue: Technology is hampering the ability of an entire generation to concentrate enough to truly learn. According to data from the National Survey of Children’s Health, around 3 percent of the population had attention deficit hyperactivity disorder in the 1990s. Today that’s up to around 11 percent, an alarming rise that many doctors link to the rise of digital media.23 At Columbia University, my alma mater, professors are fretting about how incoming freshmen can’t focus long enough to learn the basics of the core curriculum. They simply don’t have the attention span to handle 200 to 300 pages of reading per week. As Lisa Hollibaugh, the dean of academic affairs at Columbia, has noted, professors are now “constantly thinking about how we’re teaching when attention spans have changed since 50 years ago.”24
It’s quite the catch-22: If we can’t focus on long-form reading and absorb the information needed to forge complex ideas and thoughts, then we certainly can’t solve the big-picture problems of the day—which include how to manage our interactions with technology in a way that doesn’t result in myriad nefarious side effects.
It’s not that the makers of these distracting technologies aren’t aware of these problems; they are, in fact, quite aware—when it comes to their own lives, at least. It’s quite telling that many technologists take regular digital detox breaks. Fogg was on one himself, in Hawaii, when I spoke to him by phone; he told me, “I try to stay off Wi-Fi and keep my gadgets away from me while I sleep—it drives me crazy how many people come out here and stay on their phones the whole time!” Technologists also go to great lengths to keep their own children as far away from the digital world as possible. Waldorf schools, known for their unconventional teaching methods—which include eschewing the use of electronic media and devices in the classroom—are quite popular in the Valley, as are nannies with strict instructions to police phone usage while parents are at work, busily writing algorithms and building or marketing the very devices, apps, and platforms that keep people hooked. As one parent—a former Facebook employee—put it to a reporter at The New York Times, “I am convinced that the devil lives in our phones.”25
Utter hypocrisy? Or true repentance? Maybe a little of both. But it’s true that a growing number of technologists are finally waking up to the sheer force of the destruction they’ve unwittingly unleashed—and working to atone for the sins of their past. Witness techies-turned-activists like Tim Berners-Lee, who created the World Wide Web and is now trying to wrest it out of Big Tech’s all-too-powerful hands. Or James Williams, a Googler-turned-philosopher who left the Valley for Oxford to research the ethics of persuasive technology. Or Jaron Lanier, the pioneer of virtual reality, whose recent book, Ten Arguments for Deleting Your Social Media Accounts Right Now, argues that social media is creating a culture of victims and diminishing diversity of thought in a way that will undermine not only our economy and democracy, but free thought itself.