The Age of Surveillance Capitalism


by Shoshana Zuboff


  For example, in the aftermath of the November 2015 terror attacks in Paris, President Obama, US legislators, and public officials around the world exhorted the tech companies, especially Google, Facebook, and Twitter, to identify and remove terrorist content. The companies were reportedly reluctant to be, or at least to be perceived as, “tools of government.”10 Journalists noted that public officials developed “workarounds” aimed at achieving access to instrumentarian power without imposing new burdens on the companies’ public standing. For example, a government agency could assert that offending online content violates the internet company’s terms of service, thus initiating the quick removal of offending material “without the paper trail that would go with a court order.” Similarly, Google expanded its “trusted flagger” program, through which officials and others could identify problematic content for immediate action.11

  The companies responded with their own initiatives. Eric Schmidt suggested new instruments, including a “spell check for hate,” to target and eliminate terrorist accounts, remove content before it spreads, and accelerate the dissemination of counter-messages.12 Top Obama administration officials endorsed that prospect on a pilgrimage to Silicon Valley in January 2016 for a “terror summit” with tech leaders. The agenda included discussions on how to disrupt terror activities on the internet, amplify alternative content, disrupt paths to radicalization, and enable security agencies to prevent attacks.13 A White House briefing memo encouraged the companies to develop a “radicalism algorithm” that would digest social media and other sources of surplus to produce something comparable to a credit score, but aimed at evaluating the “radicalness” of online content.14

  The turn to instrumentarian power as the solution to uncertainty is not restricted to the US government. Terrorism triggers similar responses in Germany, France, the UK, and around the world. After the 2016 attack on a Berlin Christmas market, German officials announced plans to require suspected extremists to wear electronic tags for perpetual tracking.15 In 2017 surveillance capitalists, including Facebook, Microsoft, Google, and Twitter, established the Global Internet Forum to Counter Terrorism. The objective was to tighten the net of instrumentarian power through “collaboration on engineering solutions to the problem of sharing content classification techniques,” “counterspeech initiatives,” and a shared database of “unique digital fingerprints” for violent terrorist imagery to accelerate the identification of terrorist accounts.16 The 2017 joint report of five countries—Australia, Canada, New Zealand, the United Kingdom, and the United States—included four key commitments, the very first of which was engagement with the internet companies to address online terrorism activities and to support the industry forum led by Google and Facebook.17 That year, the European Council announced its expectation that “industry” would live up to its responsibility “to develop new technology and tools to improve the automatic detection and removal of content that incites to terrorist acts.”18 Meeting in Hamburg in 2017, the G20 countries vowed to work with the internet companies, insisting on the need for better instruments to filter, detect, and remove content, and “encouraging” the industry to invest in the technology and human capital able to detect and eliminate terrorist activity online.19

  There are other emerging configurations of instrumentarian and state power. For example, then US Director of National Intelligence James Clapper told Congress in 2016 that the intelligence services might use the “internet of things” for “identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to networks or user credentials.”20 Indeed, a research report from Harvard’s Berkman Klein Center for Internet & Society concluded that surveillance capitalism’s wave of “smart” appliances and products, networked sensors, and the “internet of things” would open up “numerous avenues for government actors to demand access to real-time and recorded communications.”21

  That “smart” and “connected” signal new channels for commercial and government surveillance is neither conjecture nor limited to federal intelligence agencies. In a 2015 murder case, police used data from a “smart” utility meter, an iPhone 6s Plus, and audio files captured by an Amazon Echo device to identify a suspect.22 In 2014 data from a Fitbit wristband were used in a personal injury case, and in 2017 police used data from a pacemaker to charge a man with arson and insurance fraud.23

  In the US, local law enforcement has joined the queue of institutions seeking access to instrumentarian power. Surveillance-as-a-service companies eagerly sell their wares to local police departments also determined to find a shortcut to certainty. One startup, Geofeedia, specializes in detailed location tracking of activists and protesters, such as Greenpeace members or union organizers, and the computation of individualized “threat scores” using data drawn from social media. Law-enforcement agencies have been among Geofeedia’s most prominent clients.24 When the Boston Police Department announced its interest in joining this roster in 2016, the city’s police commissioner described to the Boston Globe his belief in machine certainty as the antidote to social breakdown: “The attack… on the Ohio State University campus is just the latest illustration of why local law-enforcement authorities need every tool they can muster to stop terrorism and other violence before it starts.”25 An ACLU attorney countered that the government is using tech companies “to build massive dossiers on people” based on nothing more than their constitutionally protected speech.26 Another, more prominent surveillance-as-a-service company, Palantir, once touted by Bloomberg Businessweek as “the war on terror’s secret weapon,” was found to be in a secret collaboration with the New Orleans Police Department to test its “predictive policing” technology. Palantir’s software not only identified gang members but also “traced people’s ties to other gang members, outlined criminal histories, analyzed social media, and predicted the likelihood that individuals would commit violence or become a victim.”27

  IV. The China Syndrome

  It is now possible to imagine one logical conclusion of this trend toward the substitution of certainty for society as the Chinese government develops a comprehensive “social credit” system described by one China scholar as the “core” of China’s internet agenda. The aim is “to leverage the explosion of personal data… in order to improve citizens’ behavior.… Individuals and enterprises are to be scored on various aspects of their conduct—where you go, what you buy and who you know—and these scores will be integrated within a comprehensive database that not only links into government information, but also to data collected by private businesses.”28

  The system tracks “good” and “bad” behavior across a variety of financial and social activities, automatically assigning punishments and rewards to decisively shape behavior toward “building sincerity” in economic, social, and political life: “The aim is for every Chinese citizen to be trailed by a file compiling data from public and private sources… searchable by fingerprints and other biometric characteristics.”29

  Although China’s social credit vision is invariably described as “digital totalitarianism” and is often compared to the world of Orwell’s 1984, it is better understood as the apotheosis of instrumentarian power fed by public and private data sources and controlled by an authoritarian state. The accounts of its pilot programs describe powerful examples of surveillance capitalism’s economies of action and the intricate construction of superscale means of behavior modification. The aim is the automation of society through tuning, herding, and conditioning people to produce preselected behaviors judged as desirable by the state and thus able to “preempt instability,” as one strategic studies expert put it.30 In other words, the aim is to achieve guaranteed social rather than market outcomes using instrumentarian means of behavioral modification. The result is an emergent system that allows us to peer into one version of a future defined by a comprehensive fusion of instrumentarian and state power.

  China’s vision is intended as the solution to its own unique version of the curse of social dissolution. Writing in Foreign Policy, journalist Amy Hawkins explains that China’s pandemic of social distrust is the problem to which the social credit system is addressed as the cure: “To be Chinese today is to live in a society of distrust, where every opportunity is a potential con and every act of generosity a risk of exploitation.”31 A fascinating empirical study of social trust in contemporary China actually finds high levels of “in-group” trust but discovers that these are correlated with negative health outcomes. The conclusion is that many Chinese trust only the people who are well-known to them. All other relationships are regarded with suspicion and anxiety, with obvious consequences for social trust as well as well-being.32 This rampant distrust, typically assigned to the traumas of rapid modernization and the shift to a quasi-capitalist economy, is also the legacy of Chinese totalitarianism. The Chinese Communist Party dismantled traditional domains of affiliation, identity, and social meaning—family, religion, civil society, intellectual discourse, political freedom—recalling Arendt’s description of the “atomization” that destroys bonds of trust.33 As Hawkins writes, “But rather than promoting the organic return of traditional morality to reduce the gulf of distrust, the Chinese government has preferred to invest its energy in technological fixes… and it’s being welcomed by a public fed up of not knowing who to trust… in part because there’s no alternative.”34 The Chinese government intends to commandeer instrumentarian power to replace a broken society with certain outcomes.

  In 2015 the Chinese central bank announced a pilot project in which the top e-commerce companies would pioneer the data integration and software development for personal credit scoring. Among the largest of the pilots was Alibaba’s Ant Financial and its “personal credit scoring” operation, “Sesame Credit.” The Sesame Credit system produces a “holistic” rating of “character” with algorithmic learning that goes far beyond the timely payment of bills and loans. Algorithms evaluate and rank purchases (video games versus children’s books), education degrees, and the quantity and “quality” of friends. One reporter’s account of her Sesame Credit experience warns that the algorithm veers into “voodoo,” considering the credit scores of her social contacts, the car she drives, her job, school, and a host of unspecified behavioral variables that supposedly “correlate with good credit.” The shadow text remains out of reach, and users are left to guess how to improve their scores, including shedding friends with low scores and bulking up on high-scoring individuals who, some believe, can boost one’s own rank.35
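  The scoring logic itself is undisclosed, but the idea of a “holistic” composite can be illustrated with a deliberately simplified sketch. The Python below is not Ant Financial’s algorithm: the feature names, the weights, and the 350-950 range are assumptions chosen only to show how payment history, consumption signals, and the scores of one’s contacts might be blended into a single number.

```python
# A deliberately simplified composite score. All feature names, weights, and the
# 350-950 range are illustrative assumptions, not Ant Financial's undisclosed model.

from dataclasses import dataclass, field
from typing import List


@dataclass
class UserProfile:
    on_time_payment_rate: float              # 0.0-1.0: share of bills and loans paid on time
    purchase_signal: float                   # 0.0-1.0: e.g. children's books rank above video games
    education_signal: float                  # 0.0-1.0: derived from reported degrees
    friend_scores: List[int] = field(default_factory=list)  # composite scores of social contacts


def toy_character_score(user: UserProfile) -> int:
    """Blend heterogeneous signals into one 350-950 number using illustrative weights."""
    if user.friend_scores:
        # Rescale the average of contacts' scores from the 350-950 range down to 0.0-1.0.
        social_signal = (sum(user.friend_scores) / len(user.friend_scores) - 350) / 600
    else:
        social_signal = 0.5                  # neutral default when no contacts are scored

    weighted = (
        0.40 * user.on_time_payment_rate     # payment history
        + 0.25 * user.purchase_signal        # what you buy
        + 0.15 * user.education_signal       # reported credentials
        + 0.20 * social_signal               # who you know: friends' scores pull yours up or down
    )
    return round(350 + 600 * weighted)


if __name__ == "__main__":
    user = UserProfile(
        on_time_payment_rate=0.95,
        purchase_signal=0.70,
        education_signal=0.80,
        friend_scores=[780, 610, 430],       # one low-scoring contact drags the average down
    )
    print(toy_character_score(user))         # prints 806 with these illustrative inputs
```

  Even in this toy version, a single low-scoring friend pulls the social-graph component, and with it the final number, measurably downward, which is exactly the incentive users describe when they talk of shedding low-scoring contacts.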

  The company’s CEO boasts that the scoring system “will ensure that the bad people in society don’t have a place to go, while good people can move freely and without obstruction.” Those with high scores receive honors and rewards from Sesame Credit’s customers in its behavioral futures markets. They can rent a car without a deposit, receive favorable terms on loans and apartment rentals, receive fast-tracking for visa permits, enjoy being showcased on dating apps, and a host of other perks. However, one report warns that the privileges linked to a high personal credit score can suddenly tumble for reasons unrelated to consumption behavior, such as cheating on a university exam.36

  In 2017 the central bank retracted its support for the private-sector personal credit programs, perhaps because they were too successful, their concentrations of knowledge and power too great. Sesame Credit had acquired more than 400 million users in just two years, staking a claim to just about every aspect of those users’ lives.37 A journalist who wrote a book on Ant Financial anticipates that the government is preparing to assert control over the whole system: “The government doesn’t want this very important infrastructure of the people’s credit in one big company’s hands.” The Chinese government appears to understand that power accrues to the owners of the means of behavioral modification. It is the owners who set the behavioral parameters that define guaranteed outcomes. Thus, fusion advances.

  A sense of the kind of social world that might be produced by the fusion of instrumentarian and state power begins with the “judgment defaulter’s list,” described by the Economist as the heart of the social credit system and perhaps the best indicator of its larger ambitions. The list includes debtors and anyone who has ever defied a court order:

  People on the list can be prevented from buying aeroplane, bullet-train or first- or business-class rail tickets; selling, buying or building a house; or enrolling their children in expensive fee-paying schools. There are restrictions on offenders joining or being promoted in the party and army, and on receiving honours and titles. If the defaulter is a company, it may not issue shares or bonds, accept foreign investment or work on government projects.38

  According to a report in China Daily, debtors on the list had been automatically prevented from flying 6.15 million times since the blacklist was launched in 2013. Those in contempt of court had been denied high-speed train tickets 2.22 million times. Some 71,000 defaulters had missed out on executive positions at enterprises as a result of their debts. The Industrial and Commercial Bank of China said it had refused loans worth more than 6.97 billion yuan ($1.01 billion) to debtors on the list.39 No one is sent to a reeducation camp, but people on the list may not be allowed to purchase luxury goods. According to the director of the Institute of the Chinese Academy of International Trade and Economic Cooperation, “Given this inconvenience, 10 percent of people on the list started to pay back the money they owed spontaneously. This shows the system is starting to work.”40 Economies of action were performing to plan.

  For the 400 million users of Sesame Credit, the fusion of instrumentarian and state power bites hard. Those who might find themselves on the blacklist discover that the credit system is designed to thrust their scores into an inexorable downward spiral: “First your score drops. Then your friends hear you are on the blacklist and, fearful that their scores might be affected, quietly drop you as a contact. The algorithm notices, and your score plummets further.”41
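  The mechanics of that spiral can be sketched as a simple feedback loop. The rules, thresholds, and penalties in the Python below are invented for illustration, not drawn from any published Sesame Credit or blacklist specification; the sketch shows only how a score that penalizes a shrinking social graph keeps falling once contacts begin to leave.

```python
# A minimal sketch of the blacklist feedback loop. The gap threshold, the per-contact
# penalty, and the starting values are invented for illustration only.

def simulate_downward_spiral(score: int, contacts: list[int],
                             gap: int = 60, penalty: int = 25) -> int:
    """Contacts drop the tie, the algorithm notices, and the score falls again."""
    round_no = 0
    while True:
        round_no += 1
        # Contacts whose own score sits far enough above the blacklisted user's
        # quietly drop the tie to protect their standing (assumed rule).
        keeping = [c for c in contacts if c - score <= gap]
        dropped = len(contacts) - len(keeping)
        if dropped == 0:
            break                            # no one else leaves; the spiral bottoms out
        contacts = keeping
        # The algorithm registers the shrinking network and docks the score (assumed penalty).
        score -= penalty * dropped
        print(f"round {round_no}: {dropped} contacts drop the tie, score falls to {score}")
    return score


if __name__ == "__main__":
    # Start just after the blacklisting itself has already knocked the score down once.
    simulate_downward_spiral(score=600, contacts=[720, 690, 660, 630, 600, 570])
    # round 1: 2 contacts drop the tie, score falls to 550
    # round 2: 2 contacts drop the tie, score falls to 500
    # round 3: 2 contacts drop the tie, score falls to 450
```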

  The Chinese government’s vision may be impossibly ambitious: the big dream of total awareness and perfect certainty mediated by algorithms that filter a perpetual flood of data flows from private and public supplies, including online and offline experience culled from every domain and able to ricochet back into the individual lives of 1.5 billion people, automating social behavior as the algorithms reward, punish, and shape action right down to the latest bus ticket. So far the project is fragmented across many pilots, not only in the tech companies but also in cities and regions, so there is no real test of the scale that the government envisions. There are plenty of experts who believe that a single system of that scale and complexity will be difficult if not impossible to achieve.

  There are other good reasons to discount the social credit system’s relevance for our story. To state the obvious, China is not a democracy and its culture differs profoundly from Western culture. Syracuse University researcher Yang Wang observes that Chinese culture places less value on privacy than does Western culture and that most Chinese have accommodated to the certain knowledge of online government surveillance and censorship. The most common word for privacy, yinsi, didn’t even appear in popular Chinese dictionaries until the mid-1990s.42 Chinese citizens have accepted national ID cards with biometric chips, “birth permits,” and now social credit rankings because their society has been saturated with surveillance and profiling for decades. For example, the “dang’an” is a wide-ranging personal dossier compiled on hundreds of millions of urban residents from childhood and maintained throughout life. This “Mao-era system for recording the most intimate details of life” is updated by teachers, Communist Party officials, and employers. Citizens have no rights to see its contents, let alone contest them.

  The dossier is only one feature of long-institutionalized and pervasive administrative systems of behavioral control and surveillance in daily life that bestow honors on some and punishments on others. Social control programs have expanded with the growth of the internet. For example, the “Golden Shield” is an extensive online surveillance system. China’s cyber-censors can suspend internet or social media accounts if their users send messages containing sensitive terms such as “Tibetan independence” or “Tiananmen Square incident.”43

  As distinct as our politics and cultures may be or have been, the emerging evidence of the Chinese social credit initiatives broadcasts the logic of surveillance capitalism and the instrumentarian power that it produces. Sesame Credit doubles down on every aspect of surveillance capitalist operations, with hundreds of millions of people caught in the gears of an automated behavioral modification machine and its bubbling behavioral futures markets dispersing perks and honors like Pokémon fairy dust in return for guaranteed outcomes.

  Chinese users are rendered, classified, and queued up for prediction with every digital touch, and so are we. We are ranked on Uber, on eBay, on Facebook, and on many other web businesses, and those are only the rankings that we see. Chinese users are assigned a “character” score, whereas the US government urges the tech companies to train their algorithms for a “radicalism” score. Indeed, the work of the shadow text is to evaluate, categorize, and predict our behavior in millions of ways that we can neither know nor combat—these are our digital dossiers. When it comes to credit scoring, US and UK banks and financial services firms have floated business models based on the mining and analysis of social media data for credit scores. Facebook itself has hinted at its interest, even filing a patent.44 These efforts receded only because the Federal Trade Commission threatened regulatory intervention.45

 
