The word “targeted” is another euphemism. It evokes notions of precision, efficiency, and competence. Who would guess that targeting conceals a new political equation in which Google’s concentrations of computational power brush aside users’ decision rights as easily as King Kong might shoo away an ant, all accomplished offstage where no one can see?
These euphemisms operate in exactly the same way as those found on the earliest maps of the North American continent, in which whole regions were labeled with terms such as “heathens,” “infidels,” “idolaters,” “primitives,” “vassals,” and “rebels.” On the strength of those euphemisms, native peoples—their places and claims—were deleted from the invaders’ moral and legal equations, legitimating the acts of taking and breaking that paved the way for church and monarchy.
The intentional work of hiding naked facts in rhetoric, omission, complexity, exclusivity, scale, abusive contracts, design, and euphemism is another factor that helps explain why during Google’s breakthrough to profitability, few noticed the foundational mechanisms of its success and their larger significance. In this picture, commercial surveillance is not merely an unfortunate accident or occasional lapse. It is neither a necessary development of information capitalism nor a necessary product of digital technology or the internet. It is a specifically constructed human choice, an unprecedented market form, an original solution to emergency, and the underlying mechanism through which a new asset class is created on the cheap and converted to revenue. Surveillance is the path to profit that overrides “we the people,” taking our decision rights without permission and even when we say “no.” The discovery of behavioral surplus marks a critical turning point not only in Google’s biography but also in the history of capitalism.
In the years following its IPO in 2004, Google’s spectacular financial breakthrough first astonished and then magnetized the online world. Silicon Valley investors had doubled down on risk for years, in search of that elusive business model that would make it all worthwhile. When Google’s financial results went public, the hunt for mythic treasure was officially over.83
The new logic of accumulation spread first to Facebook, which launched the same year that Google went public. CEO Mark Zuckerberg had rejected the strategy of charging users a fee for service as the telephone companies had done in an earlier century. “Our mission is to connect every person in the world. You don’t do that by having a service people pay for,” he insisted.84 In May 2007 he introduced the Facebook platform, opening up the social network to everyone, not just people with a college e-mail address. Six months later, in November, he launched his big advertising product, Beacon, which would automatically share transactions from partner websites with all of a user’s “friends.” These posts would appear even if the user was not currently logged into Facebook, without the user’s knowledge or an opt-in function. The howls of protest—from users but also from some of Facebook’s partners such as Coca-Cola—forced Zuckerberg to back down swiftly. By December, Beacon became an opt-in program. The twenty-three-year-old CEO understood the potential of surveillance capitalism, but he had not yet mastered Google’s facility in obscuring its operations and intent.
The pressing question in Facebook’s headquarters—“How do we turn all those Facebook users into money?”—still required an answer.85 In March 2008, just three months after having to kill his first attempt at emulating Google’s logic of accumulation, Zuckerberg hired Google executive Sheryl Sandberg to be Facebook’s chief operating officer. The onetime chief of staff to US Treasury Secretary Larry Summers, Sandberg had joined Google in 2001, ultimately rising to be its vice president of global online sales and operations. At Google she led the development of surveillance capitalism through the expansion of AdWords and other aspects of online sales operations.86 One investor who had observed the company’s growth during that period concluded, “Sheryl created AdWords.”87
In signing on with Facebook, the talented Sandberg became the “Typhoid Mary” of surveillance capitalism as she led Facebook’s transformation from a social networking site to an advertising behemoth. Sandberg understood that Facebook’s social graph represented an awe-inspiring source of behavioral surplus: the extractor’s equivalent of a nineteenth-century prospector stumbling into a valley that sheltered the largest diamond mine and the deepest gold mine ever to be discovered. “We have better information than anyone else. We know gender, age, location, and it’s real data as opposed to the stuff other people infer,” Sandberg said. Facebook would learn to track, scrape, store, and analyze UPI (user profile information) to fabricate its own targeting algorithms, and like Google it would not restrict extraction operations to what people voluntarily shared with the company. Sandberg understood that through the artful manipulation of Facebook’s culture of intimacy and sharing, it would be possible to use behavioral surplus not only to satisfy demand but also to create demand. For starters, that meant inserting advertisers into the fabric of Facebook’s online culture, where they could “invite” users into a “conversation.”88
VIII. Summarizing the Logic and Operations of Surveillance Capitalism
With Google in the lead, surveillance capitalism rapidly became the default model of information capitalism on the web and, as we shall see in coming chapters, gradually drew competitors from every sector. This new market form declares that serving the genuine needs of people is less lucrative, and therefore less important, than selling predictions of their behavior. Google discovered that we are less valuable than others’ bets on our future behavior. This changed everything.
Behavioral surplus defines Google’s earnings success. In 2016, 89 percent of the revenues of its parent company, Alphabet, derived from Google’s targeted advertising programs.89 The scale of raw-material flows is reflected in Google’s domination of the internet, processing over 40,000 search queries every second on average: more than 3.5 billion searches per day and 1.2 trillion searches per year worldwide in 2017.90
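A rough back-of-envelope check, assuming the 40,000-queries-per-second average holds around the clock, shows that the three figures are consistent:

$$40{,}000 \times 86{,}400 \approx 3.46 \times 10^{9} \ \text{searches per day}, \qquad 3.46 \times 10^{9} \times 365 \approx 1.26 \times 10^{12} \ \text{searches per year.}$$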
On the strength of its unprecedented inventions, Google’s $400 billion market value edged out ExxonMobil for the number-two spot in market capitalization in 2014, only sixteen years after its founding, making it the second-richest company in the world behind Apple.91 By 2016, Alphabet/Google occasionally wrested the number-one position from Apple and was ranked number two globally as of September 20, 2017.92
It is useful to stand back from this complexity to grasp the overall pattern and how the puzzle pieces fit together:
1. The logic: Google and other surveillance platforms are sometimes described as “two-sided” or “multi-sided” markets, but the mechanisms of surveillance capitalism suggest something different.93 Google had discovered a way to translate its nonmarket interactions with users into surplus raw material for the fabrication of products aimed at genuine market transactions with its real customers: advertisers.94 The translation of behavioral surplus from outside to inside the market finally enabled Google to convert investment into revenue. The corporation thus created out of thin air and at zero marginal cost an asset class of vital raw materials derived from users’ nonmarket online behavior. At first those raw materials were simply “found,” a by-product of users’ search actions. Later those assets were hunted aggressively and procured largely through surveillance. The corporation simultaneously created a new kind of marketplace in which its proprietary “prediction products” manufactured from these raw materials could be bought and sold.
The summary of these developments is that the behavioral surplus upon which Google’s fortune rests can be considered as surveillance assets. These assets are critical raw materials in the pursuit of surveillance revenues and their translation into surveillance capital. The entire logic of this capital accumulation is most accurately understood as surveillance capitalism, which is the foundational framework for a surveillance-based economic order: a surveillance economy. The big pattern here is one of subordination and hierarchy, in which earlier reciprocities between the firm and its users are subordinated to the derivative project of our behavioral surplus captured for others’ aims. We are no longer the subjects of value realization. Nor are we, as some have insisted, the “product” of Google’s sales. Instead, we are the objects from which raw materials are extracted and expropriated for Google’s prediction factories. Predictions about our behavior are Google’s products, and they are sold to its actual customers but not to us. We are the means to others’ ends.
Industrial capitalism transformed nature’s raw materials into commodities, and surveillance capitalism lays its claims to the stuff of human nature for a new commodity invention. Now it is human nature that is scraped, torn, and taken for another century’s market project. It is obscene to suppose that this harm can be reduced to the obvious fact that users receive no fee for the raw material they supply. That critique is a feat of misdirection that would use a pricing mechanism to institutionalize and therefore legitimate the extraction of human behavior for manufacturing and sale. It ignores the key point that the essence of the exploitation here is the rendering of our lives as behavioral data for the sake of others’ improved control of us. The remarkable questions here concern the facts that our lives are rendered as behavioral data in the first place; that ignorance is a condition of this ubiquitous rendition; that decision rights vanish before one even knows that there is a decision to make; that there are consequences to this diminishment of rights that we can neither see nor foretell; that there is no exit, no voice, and no loyalty, only helplessness, resignation, and psychic numbing; and that encryption is the only positive action left to discuss when we sit around the dinner table and casually ponder how to hide from the forces that hide from us.
2. The means of production: Google’s internet-age manufacturing process is a critical component of the unprecedented. Its specific technologies and techniques, which I summarize as “machine intelligence,” are constantly evolving, and it is easy to be intimidated by their complexity. The same term may mean one thing today and something very different in one year or in five years. For example, Google has been described as developing and deploying “artificial intelligence” since at least 2003, but the term itself is a moving target, as capabilities have evolved from primitive programs that can play tic-tac-toe to systems that can operate whole fleets of driverless cars.
Google’s machine intelligence capabilities feed on behavioral surplus, and the more surplus they consume, the more accurate the prediction products that result. Wired magazine’s founding editor, Kevin Kelly, once suggested that although it seems like Google is committed to developing its artificial intelligence capabilities to improve Search, it’s more likely that Google develops Search as a means of continuously training its evolving AI capabilities.95 This is the essence of the machine intelligence project. As the ultimate tapeworm, the machine’s intelligence depends upon how much data it eats. In this important respect the new means of production differs fundamentally from the industrial model, in which there is a tension between quantity and quality. Machine intelligence is the synthesis of this tension, for it reaches its full potential for quality only as it approximates totality.
As more companies chase Google-style surveillance profits, a significant fraction of global genius in data science and related fields is dedicated to the fabrication of prediction products that increase click-through rates for targeted advertising. For example, Chinese researchers employed by Microsoft Bing’s research unit in Beijing published breakthrough findings in 2017. “Accurately estimating the click-through rate (CTR) of ads has a vital impact on the revenue of search businesses; even a 0.1% accuracy improvement in our production would yield hundreds of millions of dollars in additional earnings,” they begin. They go on to demonstrate a new application of advanced neural networks that promises 0.9 percent improvement on one measure of identification and “significant click yield gains in online traffic.”96 Similarly, a team of Google researchers introduced a new deep-neural network model, all for the sake of capturing “predictive feature interactions” and delivering “state-of-the-art performance” to improve click-through rates.97 Thousands of contributions like these, some incremental and some dramatic, equate to an expensive, sophisticated, opaque, and exclusive twenty-first-century “means of production.”
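To make the shape of such a “prediction product” concrete, here is a minimal sketch in Python. It is not the Bing or Google system described above, which rely on deep neural networks over vastly richer behavioral data; it is a toy logistic-regression CTR model trained on synthetic click data, with all feature names and numbers invented for illustration.

```python
# A toy click-through-rate (CTR) predictor: logistic regression over
# hypothetical user/ad features. Illustrative sketch only -- not the
# production models discussed in the text.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "behavioral surplus": each row is one ad impression with
# hypothetical features (age bucket, hour of day, past clicks,
# ad/interest match score). All data here is fabricated.
n = 10_000
X = np.column_stack([
    rng.integers(0, 6, n),    # age bucket
    rng.integers(0, 24, n),   # hour of day
    rng.poisson(2, n),        # user's past click count
    rng.random(n),            # ad/interest match score
])

# Synthetic ground truth: click probability rises with match score and past clicks.
p_click = 1 / (1 + np.exp(-(-3.0 + 2.5 * X[:, 3] + 0.3 * X[:, 2])))
y = rng.random(n) < p_click

model = LogisticRegression(max_iter=1000).fit(X, y)

# The "prediction product": a probability of clicking on a new impression,
# which an ad auction would use to rank and price bids.
new_impression = np.array([[3, 20, 4, 0.8]])
print(f"predicted CTR: {model.predict_proba(new_impression)[0, 1]:.3f}")
```

The structure is the same in real systems, only at enormous scale: behavioral features go in, a probability of a future action comes out, and that probability, not the user’s data itself, is what is sold into the advertising auction.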
3. The products: Machine intelligence processes behavioral surplus into prediction products designed to forecast what we will feel, think, and do: now, soon, and later. These methodologies are among Google’s most closely guarded secrets. The nature of its products explains why Google repeatedly claims that it does not sell personal data. What? Never! Google executives like to claim their privacy purity because they do not sell their raw material. Instead, the company sells the predictions that only it can fabricate from its world-historic private hoard of behavioral surplus.
Prediction products reduce risks for customers, advising them where and when to place their bets. The quality and competitiveness of the product are a function of its approximation to certainty: the more predictive the product, the lower the risks for buyers and the greater the volume of sales. Google has learned to be a data-based fortune-teller that replaces intuition with science at scale in order to tell and sell our fortunes for profit to its customers, but not to us. Early on, Google’s prediction products were largely aimed at sales of targeted advertising, but as we shall see, advertising was the beginning of the surveillance project, not the end.
4. The marketplace: Prediction products are sold into a new kind of market that trades exclusively in future behavior. Surveillance capitalism’s profits derive primarily from these behavioral futures markets. Although advertisers were the dominant players in the early history of this new kind of marketplace, there is no reason why such markets are limited to this group. The new prediction systems are only incidentally about ads, in the same way that Ford’s new system of mass production was only incidentally about automobiles. In both cases the systems can be applied to many other domains. The already visible trend, as we shall see in the coming chapters, is that any actor with an interest in purchasing probabilistic information about our behavior and/or influencing future behavior can pay to play in markets where the behavioral fortunes of individuals, groups, bodies, and things are told and sold (see Figure 2).
Figure 2: The Discovery of Behavioral Surplus
CHAPTER FOUR
THE MOAT AROUND THE CASTLE
The hour of birth their only time in college,
They were content with their precocious knowledge,
To know their station and be right forever.
—W. H. AUDEN
SONNETS FROM CHINA, I
I. Human Natural Resources
Google’s former CEO Eric Schmidt credits Hal Varian’s early examination of the firm’s ad auctions with providing the eureka moment that clarified the true nature of Google’s business: “All of a sudden, we realized we were in the auction business.”1 Larry Page is credited with a very different and far more profound answer to the question “What is Google?” Douglas Edwards recounts a 2001 session with the founders that probed their answers to that precise query. It was Page who ruminated, “If we did have a category, it would be personal information.… The places you’ve seen. Communications.… Sensors are really cheap.… Storage is cheap. Cameras are cheap. People will generate enormous amounts of data.… Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”2
Page’s vision perfectly reflects the history of capitalism, marked by taking things that live outside the market sphere and declaring their new life as market commodities. In historian Karl Polanyi’s 1944 grand narrative of the “great transformation” to a self-regulating market economy, he described the origins of this translation process in three astonishing and crucial mental inventions that he called “commodity fictions.” The first was that human life could be subordinated to market dynamics and reborn as “labor” to be bought and sold. The second was that nature could be translated into the market and reborn as “land” or “real estate.” The third was that exchange could be reborn as “money.”3 Nearly eighty years earlier, Karl Marx had described the taking of lands and natural resources as the original “big bang” that ignited modern capital formation, calling it “primitive accumulation.”4
The philosopher Hannah Arendt complicated both Polanyi’s and Marx’s notion. She observed that primitive accumulation wasn’t just a one-time primal explosion that gave birth to capitalism. Rather, it is a recurring phase in a repeating cycle as more aspects of the social and natural world are subordinated to the market dynamic. Marx’s “original sin of simple robbery,” she wrote, “had eventually to be repeated lest the motor of capital accumulation suddenly die down.”5
In our time of pro-market ideology and practice, this cycle has become so pervasive that we eventually fail to notice its audacity or contest its claims. For example, you can now “purchase” human blood and organs, someone to have your baby or stand in line for you or hold a public parking space, a person to comfort you in your grief, and the right to kill an endangered animal. The list grows longer each day.6
Social theorist David Harvey builds on Arendt’s insight with his notion of “accumulation by dispossession”: “What accumulation by dispossession does is to release a set of assets… at very low (and in some instances zero) cost. Overaccumulated capital can seize hold of such assets and immediately turn them to profitable use.” He adds that entrepreneurs who are determined to “join the system” and enjoy “the benefits of capital accumulation” are often the ones who drive this process of dispossession into new, undefended territories.7