Facebook makes clear that the shadow text is not available to users, despite the promotion of its self-service download tools that promise to give users access to their personal data retained by the company. Indeed, the competitive dynamics of surveillance capitalism make the shadow text a crucial proprietary source of advantage. Any attempt to breach its content will be experienced as an existential threat; no surveillance capitalist will voluntarily provide data from the shadow text. Only law can compel this challenge to the pathological division of learning.
In the wake of the Cambridge Analytica scandal in March 2018, Facebook announced it would expand the range of personal data that it allows users to download, but even these data remain wholly contained within the first text, composed largely of the information that users themselves have provided, including information that they have deleted: friends, photos, video, ads that have been clicked, pokes, posts, location, and so on. These data do not include behavioral surplus, prediction products, and the fate of those predictions as they are used for behavioral modification, bought, and sold. When you download your “personal information,” you access the stage, not the backstage: the curtain, not the wizard.20
Facebook’s response to Dehaye illustrates another consequence of the extreme asymmetries of knowledge at play. The company insisted that access to the requested data required it to surmount “huge technical challenges.” As behavioral surplus flows converge in machine-learning–based manufacturing operations, the sheer volume of data inputs and methods of analysis move beyond human comprehension. Consider something as trivial as Instagram’s machines selecting what images to show you. Its computations are based on varied streams of behavioral surplus from a subject user, then more streams from the friends in that user’s network, then more from the activities of people who follow the same accounts as the subject user, then the data and social links from the user’s Facebook activity. When it finally applies a ranking logic to predict what images the user will want to see next, that analysis also includes data on the past behavior of the subject user. Instagram has machines doing this “learning” because humans cannot.21 In the case of more “consequential” analyses, the operations are likely to be equally or even more complex.
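The layered aggregation described above can be sketched in miniature. This is purely a hypothetical illustration: every function name, stream, and weight below is invented for the sake of the sketch, and Instagram’s actual ranking models are vastly more complex and opaque.

```python
# Hypothetical sketch of a ranking score assembled from several "streams"
# of behavioral data, as the passage above describes. All names and
# weights are invented for illustration only.

def rank_images(candidate_images, user_signals, friend_signals,
                co_follower_signals, facebook_signals, past_behavior):
    """Return candidate image ids ordered by a toy engagement score."""
    def score(image_id):
        # Each stream contributes a per-image affinity in [0, 1],
        # blended with an arbitrary weight.
        streams = [
            (0.40, user_signals.get(image_id, 0.0)),         # the user's own activity
            (0.25, friend_signals.get(image_id, 0.0)),       # the user's network
            (0.15, co_follower_signals.get(image_id, 0.0)),  # people following the same accounts
            (0.10, facebook_signals.get(image_id, 0.0)),     # linked Facebook activity
            (0.10, past_behavior.get(image_id, 0.0)),        # the user's prior behavior
        ]
        return sum(weight * affinity for weight, affinity in streams)

    return sorted(candidate_images, key=score, reverse=True)
```

Even this toy version makes the asymmetry concrete: the person being ranked sees only the final ordering, never the streams, weights, or scores that produced it.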
This recalls our discussion of Facebook’s “prediction engine” FBLearner Flow, where the machines are fed tens of thousands of data points derived from behavioral surplus, diminishing the very notion of the right to contest “automatic decision making.” If the algorithms are to be contestable in any meaningful way, it will require new countervailing authority and power, including machine resources and expertise to reach into the core disciplines of machine intelligence and construct new approaches that are available for inspection, debate, and combat. Indeed, one expert has already proposed the creation of a government agency—an “FDA for algorithms”—to oversee the development, distribution, sale, and use of complex algorithms, arguing that existing laws “will prove no match for the difficult regulatory puzzles algorithms pose.”22
Dehaye’s experience is but one illustration of the self-sustaining nature of a pathological division of learning and the insuperable burden placed on individuals moved to challenge its injustice. Dehaye is an activist, and his aim is not only to access data but also to document the arduousness and even absurdity of the undertaking. Given these realities, he suggests that the data-protection regulations are comparable to freedom-of-information laws. The procedures for requesting and receiving information under these laws are imperfect and onerous, typically undertaken by legal specialists, but nevertheless essential to democratic freedom.23 Although effective contest will require determined individuals, the individual alone cannot shoulder the burden of justice, any more than an individual worker in the first years of the twentieth century could bear the burden of fighting for fair wages and working conditions. Those twentieth-century challenges required collective action, and so do our own.24
In her discussion of “the life of the law,” anthropologist Laura Nader reminds us that law projects “possibilities of democratic empowerment” but that these are pulled forward into real life only when citizens actively contest injustice, using the law as a means to higher purpose. “The life of the law is the plaintiff,” Nader writes, a truth we saw brought to life in the action of the Spanish citizens who claimed the right to be forgotten. “By contesting their injustices by means of law, plaintiffs and their lawyers can still decide the place of law in making history.”25 These plaintiffs do not stand alone; they stand for citizens bonded together as the necessary means of confronting collective injustice.
This brings us back to the GDPR and the question of its impact. The only possible answer is that everything will depend upon how European societies interpret the new regulatory regime in legislation and in the courts. It will not be the wording of the regulations but rather the popular movements on the ground that shape these interpretations. A century ago, workers organized for collective action and ultimately tipped the scales of power, and today’s “users” will have to mobilize in new ways that reflect our own unique twenty-first-century “conditions of existence.” We need synthetic declarations that are institutionalized in new centers of democratic power, expertise, and contest that challenge today’s asymmetries of knowledge and power. This quality of collective action will be required if we are finally to replace lawlessness with laws that assert the right to sanctuary and the right to the future tense as essential for effective human life.
It is already possible to see a new awakening to empowering collective action, at least in the privacy domain. One example is None of Your Business (NOYB), a nonprofit organization led by privacy activist Max Schrems. After many years of legal contest, Schrems made history in 2015 when his challenge to Facebook’s data-collection and data-retention practices—which he asserted were in violation of EU privacy law—led the Court of Justice of the European Union to invalidate the Safe Harbor agreement that governed data transfers between the US and the EU. In 2018 Schrems launched NOYB as a vehicle for “professional privacy enforcement.” The idea is to push regulators to close the gap between written regulations and corporate privacy practices, leveraging the threat of significant fines to change a company’s actual procedures. NOYB wants to become “a stable European enforcement platform” that unites groups of users and assists them through the litigation process while building coalitions and advancing “targeted and strategic litigation to maximize the impact ‘on the right to privacy.’”26 However this undertaking progresses, the key point for us is the way in which it points to a social void that must be filled with creative new forms of collective action, if the life of the law is to move against surveillance capitalism.
Only time will tell if the GDPR will be a catalyst for a new phase of combat that wrangles and tames an illegitimate marketplace in behavioral futures, the data operations that feed it, and the instrumentarian society toward which they aim. In the absence of new synthetic declarations, we may be disappointed by the intransigence of the status quo. If past is prologue, then privacy, data protection, and antitrust laws will not be enough to interrupt surveillance capitalism. The reasons that we examined in answering the question “How did they get away with it?” suggest that the immense and intricate structures of surveillance capitalism and its imperatives will require a more direct challenge.
This is at least one conclusion from the past decade: despite far more stringent privacy and data-protection laws in the EU as compared to the US, as well as a forceful commitment to antitrust, Facebook and Google have continued to flourish in Europe. For example, between 2010 and 2017 the compounded annual growth rate for Facebook daily active users was 15 percent in Europe compared to 9 percent in the US and Canada.27 During that period the company’s revenue grew by a compounded annual growth rate of 50 percent in both regions.28 Between 2009 and the first quarter of 2018, Google’s share of the search market in Europe declined by about 2 percent while increasing in the US by about 9 percent. (Google’s European market share remained high, at 91.5 percent in 2018, compared to 88 percent in the US.) In the case of its Android mobile phones, however, Google’s market share increased by 69 percent in Europe compared to 44 percent in the US. Google’s Chrome browser increased its market share by 55 percent in Europe and 51 percent in the US.29
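The growth figures above use the standard compound-annual-growth-rate (CAGR) formula. A quick check, with illustrative endpoint values rather than figures from the book’s sources, shows how a 15 percent CAGR compounds over the seven years from 2010 to 2017:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: the constant yearly rate that
    takes start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# A 15% CAGR sustained over seven years multiplies the starting
# figure by roughly 2.66x.
growth_factor = 1.15 ** 7

# The formula inverts correctly: recovering the rate from endpoints
# (100 is an arbitrary illustrative starting value).
rate = cagr(100, 100 * growth_factor, 7)
```

In other words, a 15 percent annual rate means Facebook’s European daily active users roughly 2.7 times over the period, which is why such percentages compound into dominance despite Europe’s stricter legal regime.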
Those growth rates are not mere good fortune, as our list of “how they got away with it” suggests. In recognition of this fact, Europe’s Data Protection Supervisor Giovanni Buttarelli told the New York Times that the GDPR’s impact will be determined by regulators who “will be up against well-funded teams of lobbyists and lawyers.”30 Indeed, corporate lawyers were already honing their strategies for the preservation of business as usual and setting the stage for the contests ahead. For example, a white paper published by one prominent international law firm rallies corporations to the barricades of data processing, arguing that the legal concept of “legitimate interest” offers a promising opportunity to bypass new regulatory obstacles:
Legitimate interest may be the most accountable ground for processing in many contexts, as it requires an assessment and balancing of the risks and benefits of processing for organisations, individuals and society. The legitimate interests of the controller or a third party may also include other rights and freedoms. The balancing test will sometimes also include… freedom of expression, right to engage in economic activity, right to ensure protection of IP rights, etc. These rights must also be taken into account when balancing them against the individuals’ right to privacy.31
Surveillance capitalism’s economic imperatives were already on the move in late April 2018, in anticipation of the GDPR taking effect that May. Earlier in April, Facebook’s CEO had announced that the corporation would apply the GDPR “in spirit” across the globe. In practice, however, the company was making changes to ensure that the GDPR would not circumscribe the majority of its operations. Until then, 1.5 billion of its users, including those in Africa, Asia, Australia, and Latin America, were governed by terms of service issued by the company’s international headquarters in Ireland, meaning that these terms fell under the EU framework. It was in late April that Facebook quietly issued new terms of service, placing those 1.5 billion users under US privacy laws and thus eliminating their ability to file claims in Irish courts.32
III. Every Unicorn Has a Hunter
What life is left to us if taming fails? Without protection from surveillance capitalism and its instrumentarian power—their behavioral aims and societal goals—we are trapped in a condition of “no exit,” where the only walls are made of glass. The natural human yearning for refuge must be extinguished and the ancient institution of sanctuary deleted.
“No exit” is the necessary condition for Big Other to flourish, and its flourishing is the necessary condition for all that is meant to follow: the tides of behavioral surplus and their transformation into revenue, the certainty that will meet every market player with guaranteed outcomes, the bypass of trust in favor of the uncontract’s radical indifference, the paradise of effortless connection that exploits the needs of harried second-modernity individuals and transforms their lives into the means to others’ ends, the plundering of the self, the extinction of autonomous moral judgment for the sake of frictionless control, the actuation and modification that quietly drains the will to will, the forfeit of your voice in the first person in favor of others’ plans, the destruction of the social relations and politics of the old and slow and still-unfulfilled ideals of self-determining citizens bound to the legitimate authority of democratic governance.
Each of these exquisite unicorns has inspired the best that humanity has achieved, however imperfectly they have been fulfilled. But every unicorn has a hunter, and the ideals that have nurtured the liberal order are no exception. For the sake of this hunter, there can be no doors, no locks, no friction, no opposition between intimacy and distance, house and universe. There is no need for “topoanalysis” now because all spaces have collapsed into the one space that is Big Other. Seek not the petal-soft iridescent apex of the shell. There is no purpose to curling up in its dark spire. The shell is just another connected node, and your daydream is already finding an audience in the pulsating net of this clamorous glass life.
In the absence of synthetic declarations that secure the road to a human future, the intolerability of glass life turns us toward a societal arms race of counter-declarations in which we search for and embrace increasingly complex ways to hide in our own lives, seeking respite from lawless machines and their masters. We do this to satisfy our enduring need for sanctuary and as an act of resistance with which to reject the instrumentarian disciplines of the hive, its “extended chilling effects,” and Big Other’s relentless greed. In the context of government surveillance, the practices of “hiding” have been called “privacy protests” and are well-known for drawing the suspicion of law-enforcement agencies.33 Now, hiding is also invoked by Big Other and its market masters, whose reach is far and deep as they install themselves in our walls, our bodies, and on our streets, claiming our faces, our feelings, and our fears of exclusion.
I have suggested that too many of the best and brightest of a new generation devote their genius to the intensification of the click-stream. Equally poignant is the way in which a new generation of activists, artists, and inventors feels itself called to create the art and science of hiding.34 The intolerable conditions of glass life compel these young artists to dedicate their genius to the prospects of human invisibility, even as their creations demand that we aggressively seek and find our bearings. Their provocations already take many forms: signal-blocking phone cases, false fingerprint prosthetics that prevent your fingertips from being “used as a key to your life,” LED privacy visors to impede facial-recognition cameras, a quilted coat that blocks radio waves and tracking devices, a scent diffuser that releases a metallic fragrance when an unprotected website or network is detected on any of your devices, a “serendipitor app” to disrupt any surveillance “that relies on subjects maintaining predictable routines,” a clothing line called “Glamouflage” featuring shirts covered with representations of celebrity faces to confuse facial-recognition software, anti-neuroimaging surveillance headgear to obstruct digital invasion of brain waves, and an anti-surveillance coat that creates a shield to block invasive signals. Chicago artist Leo Selvaggio produces 3-D–printed resin prosthetic masks to confound facial recognition. He calls his effort “an organized artistic intervention.”35
Perhaps most poignant is the Backslash Tool Kit: “a series of functional devices designed for protests and riots of the future,” including a smart bandana for embedding hidden messages and public keys, independently networked wearable devices, personal black-box devices to register abuse of law enforcement, and fast deployment routers for off-grid communication.36 Backslash was created as part of a master’s thesis project at New York University, and it perfectly reflects the contest for the third modernity that this generation faces. The designer writes that for young, digitally native protesters, “connectivity is a basic human right.” Yet, he laments, “the future of technology in protests looks dark” because of overwhelming surveillance. His tool kit is intended to create “a space to explore and research the tense relationship between protests and technology and a space to cultivate dialogue about freedom of expression, riots and disruptive technology.” In a related development, students at the University of Washington have developed a prototype for “on-body transmissions with commodity devices.” The idea here is that readily available devices “can be used to transmit information to only wireless receivers that are in contact with the body,” thus creating the basis for secure and private communications independent of normal Wi-Fi transmissions, which can easily be detected.37
Take a casual stroll through the shop at the New Museum for Contemporary Art in Manhattan, and you pass a display of its bestseller: table-top mirrors whose reflecting surface is covered with the bright-orange message “Today’s Selfie Is Tomorrow’s Biometric Profile.” This “Think Privacy Selfie Mirror” is a project of the young Berlin-based artist Adam Harvey, whose work is aimed at the problem of surveillance and foiling the power of those who surveil. Harvey’s art begins with “reverse engineering… computer vision algorithms” in order to detect and exploit their vulnerabilities through camouflage and other forms of hiding. He is perhaps best known for his “Stealth Wear,” a series of wearable fashion pieces intended to overwhelm, confuse, and evade drone surveillance and, more broadly, facial-recognition software. Silver-plated fabrics reflect thermal radiation, “enabling the wearer to avert overhead thermal surveillance.” Harvey’s fashions are inspired by traditional Islamic dress, which expresses the idea that “garments can provide a separation between man and God.” Now he redirects that meaning to create garments that separate human experience from the powers that surveil.38 Another Harvey project created an aesthetic of makeup and hairstyling—blue feathers suspended from thick black bangs, dreadlocks that dangle below the nose, cheekbones covered in thick wedges of black and white paint, tresses that snake around the face and neck like octopus tentacles—all designed to thwart facial-recognition software and other forms of computer vision.
Harvey is one among a growing number of artists, often young artists, who direct their work to the themes of surveillance and resistance. Artist Benjamin Grosser’s Facebook and Twitter “demetricators” are software interfaces that present each site’s pages with their metrics deleted: “The numbers of ‘likes,’ ‘friends,’ followers, retweets… all disappear.” How is an interface that foregrounds our friend count changing our conceptions of friendship? he asks. “Remove the numbers and find out.” Grosser’s “Go Rando” project is a web browser extension that “obfuscates your feelings on Facebook” by randomly choosing an emoji each time you click “Like,” thus undermining the corporation’s surplus analyses as they compute personality and emotional profiles.39 Trevor Paglen’s richly orchestrated performance art combines music, photography, satellite imagery, and artificial intelligence to reveal Big Other’s omnipresent knowing and doing. “It’s trying to look inside the software that is running an AI… to look into the architectures of different computer vision systems and trying to learn what it is that they are seeing,” Paglen says. Chinese artist Ai Weiwei’s 2017 installation “Hansel & Gretel” created a powerful experience in which participants viscerally confront the surveillance implications of their own innocent picture taking, Instagramming, tweeting, texting, tagging, and posting.40