The Age of Surveillance Capitalism

by Shoshana Zuboff


  Another key metric called the “quality score” helped determine the price of an ad and its specific position on the page, in addition to advertisers’ own auction bids. The quality score was determined in part by click-through rates and in part by the firm’s analyses of behavioral surplus. “The clickthrough rate needed to be a predictive thing,” one top executive insisted, and that would require “all the information we had about the query right then.”57 It would take enormous computing power and leading-edge algorithmic programs to produce powerful predictions of user behavior that became the criteria for estimating the relevance of an ad. Ads that scored high would sell at a lower price than those that scored poorly. Google’s customers, its advertisers, complained that the quality score was a black box, and Google was determined to keep it so. Nonetheless, when customers followed its disciplines and produced high-scoring ads, their click-through rates soared.
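The pricing dynamic described above can be illustrated with a small sketch. This follows the common textbook formulation of a quality-weighted "generalized second price" auction, in which an ad's rank is its bid multiplied by its quality score and the winner pays just enough to beat the runner-up; Google's actual quality-score formula is proprietary, and all names and numbers here are hypothetical.

```python
def rank_ads(ads):
    """Order ads by ad rank = bid * quality score (hypothetical inputs)."""
    return sorted(ads, key=lambda ad: ad["bid"] * ad["quality"], reverse=True)

def price_per_click(winner, runner_up):
    """Winner pays just enough to match the runner-up's ad rank.

    A higher quality score lowers the price the winner pays, which is
    the effect described in the text: high-scoring ads sell cheaper.
    """
    return (runner_up["bid"] * runner_up["quality"]) / winner["quality"]

ads = [
    {"name": "A", "bid": 4.00, "quality": 0.9},  # lower bid, high predicted click-through
    {"name": "B", "bid": 6.00, "quality": 0.4},  # higher bid, low predicted click-through
]
ranked = rank_ads(ads)
print(ranked[0]["name"])                                # A wins: rank 3.6 beats 2.4
print(round(price_per_click(ranked[0], ranked[1]), 2))  # A pays 2.67, well below its 4.00 bid
```

Note how ad A outbids nothing yet wins the auction, and pays less per click than its bid: in this scheme, accurate behavioral prediction is what converts into both placement and price advantage.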

  AdWords quickly became so successful that it inspired significant expansion of the surveillance logic. Advertisers demanded more clicks.58 The answer was to extend the model beyond Google’s search pages and convert the entire internet into a canvas for Google’s targeted ads. This required turning Google’s newfound skills at “data extraction and analysis,” as Hal Varian put it, toward the content of any web page or user action by employing Google’s rapidly expanding semantic analysis and artificial intelligence capabilities to efficiently “squeeze” meaning from them. Only then could Google accurately assess the content of a page and how users interact with that content. This “content-targeted advertising” based on Google’s patented methods was eventually named AdSense. By 2004, AdSense had achieved a run rate of a million dollars per day, and by 2010, it produced annual revenues of more than $10 billion.

  So here was an unprecedented and lucrative brew: behavioral surplus, data science, material infrastructure, computational power, algorithmic systems, and automated platforms. This convergence produced unprecedented “relevance” and billions of auctions. Click-through rates skyrocketed. Work on AdWords and AdSense became just as important as work on Search.

  With click-through rates as the measure of relevance accomplished, behavioral surplus was institutionalized as the cornerstone of a new kind of commerce that depended upon online surveillance at scale. Insiders referred to Google’s new science of behavioral prediction as the “physics of clicks.”59 Mastery of this new domain required a specialized breed of click physicists who would secure Google’s preeminence within the nascent priesthood of behavioral prediction. The firm’s substantial revenue flows summoned the greatest minds of our age from fields such as artificial intelligence, statistics, machine learning, data science, and predictive analytics to converge on the prediction of human behavior as measured by click-through rates: computer-mediated fortune-telling and selling. The firm would recruit an authority on information economics, and consultant to Google since 2001, as the patriarch of this auspicious group and the still-young science: Hal Varian was the chosen shepherd of this flock.

  Page and Brin had been reluctant to embrace advertising, but as the evidence mounted that ads could save the company from crisis, their attitudes shifted.60 Saving the company also meant saving themselves from being just another couple of very smart guys who couldn’t figure out how to make real money, insignificant players in the intensely material and competitive culture of Silicon Valley. Page was haunted by the example of the brilliant but impoverished scientist Nikola Tesla, who died without ever benefiting financially from his inventions. “You need to do more than just invent things,” Page reflected.61 Brin had his own take: “Honestly, when we were still in the dot-com boom days, I felt like a schmuck. I had an internet startup—so did everybody else. It was unprofitable, like everybody else’s.”62 Exceptional threats to their financial and social status appear to have awakened a survival instinct in Page and Brin that required exceptional adaptive measures.63 The Google founders’ response to the fear that stalked their community effectively declared a “state of exception” in which it was judged necessary to suspend the values and principles that had guided Google’s founding and early practices.

  Later, Sequoia’s Moritz recalled the crisis conditions that provoked the firm’s “ingenious” self-reinvention, when crisis opened a fork in the road and drew the company in a wholly new direction. He stressed the specificity of Google’s inventions, their origins in emergency, and the 180-degree turn from serving users to surveilling them. Most of all, he credited the discovery of behavioral surplus as the game-changing asset that turned Google into a fortune-telling giant, pinpointing Google’s breakthrough transformation of the Overture model, when the young company first applied its analytics of behavioral surplus to predict the likelihood of a click:

  The first 12 months of Google were not a cakewalk, because the company didn’t start off in the business that it eventually tapped. At first it went in a different direction, which was selling its technology—selling licenses for its search engines to larger internet properties and to corporations.… Cash was going out of the window at a feral rate during the first six, seven months. And then, very ingeniously, Larry… and Sergey… and others fastened on a model that they had seen this other company, Overture, develop, which was ranked advertisements. They saw how it could be improved and enhanced and made it their own, and that transformed the business.64

  Moritz’s reflections suggest that without the discovery of behavioral surplus and the turn toward surveillance operations, Google’s “feral” rate of spending was not sustainable and the firm’s survival was imperiled. We will never know what Google might have made of itself without the state of exception fueled by the emergency of impatient money that shaped those crucial years of development. What other pathways to sustainable revenue might have been explored or invented? What alternative futures might have been summoned to keep faith with the founders’ principles and with their users’ rights to self-determination? Instead, Google loosed a new incarnation of capitalism upon the world, a Pandora’s box whose contents we are only beginning to understand.

  VI. A Human Invention

  Key to our conversation is this fact: surveillance capitalism was invented by a specific group of human beings in a specific time and place. It is not an inherent result of digital technology, nor is it a necessary expression of information capitalism. It was intentionally constructed at a moment in history, in much the same way that the engineers and tinkerers at the Ford Motor Company invented mass production in the Detroit of 1913.

  Henry Ford set out to prove that he could maximize profits by driving up volumes, radically decreasing costs, and widening demand. It was an unproven commercial equation for which no economic theory or body of practice existed. Fragments of the formula had surfaced before—in meatpacking plants, flour-milling operations, sewing machine and bicycle factories, armories, canneries, and breweries. There was a growing body of practical knowledge about the interchangeability of parts and absolute standardization, precision machines, and continuous flow production. But no one had achieved the grand symphony that Ford heard in his imagination.

  As historian David Hounshell tells it, there was a time, April 1, 1913, and a place, Detroit, when the first moving assembly line seemed to be “just another step in the years of development at Ford yet somehow suddenly dropped out of the sky. Even before the end of the day, some of the engineers sensed that they had made a fundamental breakthrough.”65 Within a year, productivity increases across the plant ranged from 50 percent to as much as ten times the output of the old fixed-assembly methods.66 The Model T that sold for $825 in 1908 was priced at a record low for a four-cylinder automobile in 1924, just $260.67

  Much as with Ford, some elements of the economic surveillance logic in the online environment had been operational for years, familiar only to a rarefied group of early computer experts. For example, the software mechanism known as the “cookie”—bits of code that allow information to be passed between a server and a client computer—was developed in 1994 at Netscape, the first commercial web browser company.68 Similarly, “web bugs”—tiny (often invisible) graphics embedded in web pages and e-mail and designed to monitor user activity and collect personal information—were well-known to experts in the late 1990s.69
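The cookie mechanism itself is simple, which is part of why it spread so quickly. The minimal sketch below uses Python's standard library to show the header exchange; the cookie name and value are hypothetical.

```python
from http.cookies import SimpleCookie

# Server side: attach a Set-Cookie header to the HTTP response.
server_cookie = SimpleCookie()
server_cookie["session_id"] = "abc123"
set_cookie_header = server_cookie["session_id"].OutputString()
print("Set-Cookie:", set_cookie_header)  # Set-Cookie: session_id=abc123

# Client side: the browser stores the pair and echoes it back on every
# later request to the same server. That echo is what lets a server link
# separate requests to one user -- the monitoring capability at issue.
client_cookie = SimpleCookie()
client_cookie.load(set_cookie_header)
print("Cookie:", client_cookie["session_id"].OutputString())
```

A few bytes passed back and forth are enough to turn a stateless protocol into a persistent record of an individual's visits, which is why the experts discussed next saw the privacy stakes so early.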

  These experts were deeply concerned about the privacy implications of such monitoring mechanisms, and at least in the case of cookies, there were institutional efforts to design internet policies that would prohibit their invasive capabilities to monitor and profile users.70 By 1996, the function of cookies had become a contested public policy issue. Federal Trade Commission workshops in 1996 and 1997 discussed proposals that would assign control of all personal information to users by default with a simple automated protocol. Advertisers bitterly contested this scheme, collaborating instead to avert government regulation by forming a “self-regulating” association known as the Network Advertising Initiative. Still, in June 2000 the Clinton administration banned cookies from all federal websites, and by April 2001, three bills before Congress included provisions to regulate cookies.71

  Google brought new life to these practices. As had occurred at Ford a century earlier, the company’s engineers and scientists were the first to conduct the entire commercial surveillance symphony, integrating a wide range of mechanisms from cookies to proprietary analytics and algorithmic software capabilities in a sweeping new logic that enshrined surveillance and the unilateral expropriation of behavioral data as the basis for a new market form. The impact of this invention was just as dramatic as Ford’s. In 2001, as Google’s new systems to exploit its discovery of behavioral surplus were being tested, net revenues jumped to $86 million (more than a 400 percent increase over 2000), and the company turned its first profit. By 2002, the cash began to flow and has never stopped, definitive evidence that behavioral surplus combined with Google’s proprietary analytics were sending arrows to their marks. Revenues leapt to $347 million in 2002, then $1.5 billion in 2003, and $3.5 billion in 2004, the year the company went public.72 The discovery of behavioral surplus had produced a stunning 3,590 percent increase in revenue in less than four years.

  VII. The Secrets of Extraction

  It is important to note the vital differences for capitalism in these two moments of originality at Ford and Google. Ford’s inventions revolutionized production. Google’s inventions revolutionized extraction and established surveillance capitalism’s first economic imperative: the extraction imperative. The extraction imperative meant that raw-material supplies must be procured at an ever-expanding scale. Industrial capitalism had demanded economies of scale in production in order to achieve high throughput combined with low unit cost. In contrast, surveillance capitalism demands economies of scale in the extraction of behavioral surplus.

  Mass production was aimed at new sources of demand in the early twentieth century’s first mass consumers. Ford was clear on this point: “Mass production begins in the perception of a public need.”73 Supply and demand were linked effects of the new “conditions of existence” that defined the lives of my great-grandparents Sophie and Max and other travelers in the first modernity. Ford’s invention deepened the reciprocities between capitalism and these populations.

  In contrast, Google’s inventions destroyed the reciprocities of its original social contract with users. The role of the behavioral value reinvestment cycle that had once aligned Google with its users changed dramatically. Instead of deepening the unity of supply and demand with its populations, Google chose to reinvent its business around the burgeoning demand of advertisers eager to squeeze and scrape online behavior by any available means in the competition for market advantage. In the new operation, users were no longer ends in themselves but rather became the means to others’ ends.

  Reinvestment in user services became the method for attracting behavioral surplus, and users became the unwitting suppliers of raw material for a larger cycle of revenue generation. The scale of surplus expropriation that was possible at Google would soon eliminate all serious competitors to its core search business as the windfall earnings from leveraging behavioral surplus were used to continuously draw more users into its net, thus establishing its de facto monopoly in Search. On the strength of Google’s inventions, discoveries, and strategies, it became the mother ship and ideal type of a new economic logic based on fortune-telling and selling—an ancient and eternally lucrative craft that has fed on humanity’s confrontation with uncertainty from the beginning of the human story.

  It was one thing to proselytize achievements in production, as Henry Ford had done, but quite another to boast about the continuous intensification of hidden processes aimed at the extraction of behavioral data and personal information. The last thing that Google wanted was to reveal the secrets of how it had rewritten its own rules and, in the process, enslaved itself to the extraction imperative. Behavioral surplus was necessary for revenue, and secrecy would be necessary for the sustained accumulation of behavioral surplus.

  This is how secrecy came to be institutionalized in the policies and practices that govern every aspect of Google’s behavior onstage and offstage. Once Google’s leadership understood the commercial power of behavioral surplus, Schmidt instituted what he called the “hiding strategy.”74 Google employees were told not to speak about what the patent had referred to as its “novel methods, apparatus, message formats and/or data structures” or confirm any rumors about flowing cash. Hiding was not a post hoc strategy; it was baked into the cake that would become surveillance capitalism.

  Former Google executive Douglas Edwards writes compellingly about this predicament and the culture of secrecy it shaped. According to his account, Page and Brin were “hawks,” insisting on aggressive data capture and retention: “Larry opposed any path that would reveal our technological secrets or stir the privacy pot and endanger our ability to gather data.” Page wanted to avoid arousing users’ curiosity by minimizing their exposure to any clues about the reach of the firm’s data operations. He questioned the prudence of the electronic scroll in the reception lobby that displays a continuous stream of search queries, and he “tried to kill” the annual Google Zeitgeist conference that summarizes the year’s trends in search terms.75

  Journalist John Battelle, who chronicled Google during the 2002–2004 period, described the company’s “aloofness,” “limited information sharing,” and “alienating and unnecessary secrecy and isolation.”76 Another early company biographer notes, “What made this information easier to keep is that almost none of the experts tracking the business of the internet believed that Google’s secret was even possible.”77 As Schmidt told the New York Times, “You need to win, but you are better off winning softly.”78 The scientific and material complexity that supported the capture and analysis of behavioral surplus also enabled the hiding strategy, an invisibility cloak over the whole operation. “Managing search at our scale is a very serious barrier to entry,” Schmidt warned would-be competitors.79

  To be sure, there are always sound business reasons for hiding the location of your gold mine. In Google’s case, the hiding strategy accrued to its competitive advantage, but there were other reasons for concealment and obfuscation. What might the response have been back then if the public were told that Google’s magic derived from its exclusive capabilities in unilateral surveillance of online behavior and its methods specifically designed to override individual decision rights? Google policies had to enforce secrecy in order to protect operations that were designed to be undetectable because they took things from users without asking and employed those unilaterally claimed resources to work in the service of others’ purposes.

  That Google had the power to choose secrecy is itself testament to the success of its own claims. This power is a crucial illustration of the difference between “decision rights” and “privacy.” Decision rights confer the power to choose whether to keep something secret or to share it. One can choose the degree of privacy or transparency for each situation. US Supreme Court Justice William O. Douglas articulated this view of privacy in 1967: “Privacy involves the choice of the individual to disclose or to reveal what he believes, what he thinks, what he possesses.…”80

  Surveillance capitalism lays claim to these decision rights. The typical complaint is that privacy is eroded, but that is misleading. In the larger societal pattern, privacy is not eroded but redistributed, as decision rights over privacy are claimed for surveillance capital. Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism. Google discovered this necessary element of the new logic of accumulation: it must assert the rights to take the information upon which its success depends.

  The corporation’s ability to hide this rights grab depends on language as much as it does on technical methods or corporate policies of secrecy. George Orwell once observed that euphemisms are used in politics, war, and business as instruments that “make lies sound truthful and murder respectable.”81 Google has been careful to camouflage the significance of its behavioral surplus operations in industry jargon. Two popular terms—“digital exhaust” and “digital breadcrumbs”—connote worthless waste: leftovers lying around for the taking.82 Why allow exhaust to drift in the atmosphere when it can be recycled into useful data? Who would think to call such recycling an act of exploitation, expropriation, or plunder? Who would dare to redefine “digital exhaust” as booty or contraband, or imagine that Google had learned how to purposefully construct that so-called “exhaust” with its methods, apparatus, and data structures?
