In November 2016 the experience of three people in the small Illinois town of Belleville was testimony to what we lose when we subordinate ourselves to the dystopian rule of the uncontract. Pat and Stanford Kipping owed their credit union $350 on their 1998 Buick. Once again, they could not make their monthly $95 payment. The Kippings’ credit union enlisted a local repo man, Jim Ford, to take away their car.
When Ford visited the Kippings’ Belleville home, he was disturbed to find an elderly couple who were forced to choose between buying medicine and making their car payments. Ford’s initial response was to waive his repossession fees. The Kippings gratefully thanked him, invited him in for tea, and shared their story. That’s when Ford decided to bridge the gap between uncertain reality and the stipulations of their contract. He did the human thing, calling the credit union and offering to pay the couple’s debt.
The credit union manager insisted that Ford had to follow the “process.” Ford continued to invoke the ancient social principles of the contract, seeking a way through the maze to something that felt like justice. Ultimately the manager agreed to “work with” the couple to see what could be done. It didn’t end there. Within twenty-four hours, an online fund-raising appeal produced enough to pay off the Kippings’ car, detail it, purchase a Thanksgiving turkey, and give the couple an additional gift of $1,000.
Most interesting is that when a local paper picked up the story, it quickly went viral across the web and traditional media. Millions of people read and responded to this drama, presumably because it stirred memories of something precious and necessary but now threatened with extinction. Jim Ford reminded us of the most cherished requirements of a civilized life: our shared assertion of rights to the future tense and its expression in the joining of wills in mutual commitment to dialogue, problem solving, and empathy. He was eloquent on this point: “Just be nice to people. It’s not that hard. The fact that this has gone so crazy is kind of sad. This should be a daily thing, a normal thing.”15
In the dystopia of the uncontract, this daily human thing is not normal. What if the Kippings’ credit union employed Spireon’s telematics and merely had to instruct the vehicular monitoring system to disable the car? There would be no loan manager engaging in a give-and-take with customers. The algorithm tasked to eliminate the messy, unpredictable, untrustworthy eruptions of human will would have seized the old Buick. There would have been no shared tea time with the Kippings and no one to listen to their story. There would have been no opportunity to find an alternate route through the maze, no opportunity to build trust, no occasion for collective action, no heartwarming holiday story of kindness, no glimmer of hope for a human future in which the best of our institutions is preserved and fortified, no shared challenge of uncertainty, and no shared freedom.
In the dystopia of the uncontract, surveillance capitalism’s drive toward certainty fills the space once occupied by all the human work of building and replenishing social trust, which is now reinterpreted as unnecessary friction in the march toward guaranteed outcomes. The deletion of uncertainty is celebrated as a victory over human nature: our cunning and our opportunism. All that’s left to matter are the rules that translate reasons into action, the objective measures of behavior, and the degree of conformance between the two. Social trust eventually withers, a kind of vestigial oddity like a third nipple or wisdom teeth: traces of an evolutionary past that no longer appear in operational form because their context and therefore their purpose have vanished.16
The uncontract and the for-profit circuits of behavior modification in which it executes its objectives construe society as an acrid wasteland in which mistrust is taken for granted. By positing our lives together as already failed, they justify coercive intervention for the sake of certainty. Against this background of the gradual normalization of the automated plan and its planners, the human response of one repo man bears simple witness to precisely what surveillance capitalism must extinguish.
Human replenishment from the failures and triumphs of choosing the future in the face of uncertainty gives way to the blankness of perpetual compliance. The word trust lingers, but its referent in human experience dissolves into reminiscence, an archaic footnote to a barely remembered dream of a dream that has long since faded for the sake of a new dictatorship of market reasons. As the dream dies, so too does our sense of astonishment and protest. We grow numb, and our numbness paves the way for more compliance. A pathological division of learning forged by unprecedented asymmetries of knowledge and power fixes us in a new inequality marked by the tuners and the tuned, the herders and herded, the raw material and its miners, the experimenters and their unwitting subjects, those who will the future and those who are shunted toward others’ guaranteed outcomes.
So let us establish our bearings. Uncertainty is not chaos but rather the necessary habitat of the present tense. We choose the fallibility of shared promises and problem solving over the certain tyranny imposed by a dominant power or plan because this is the price we pay for the freedom to will, which founds our right to the future tense. In the absence of this freedom, the future collapses into an infinite present of mere behavior, in which there can be no subjects and no projects: only objects.
In the future that surveillance capitalism prepares for us, my will and yours threaten the flow of surveillance revenues. Its aim is not to destroy us but simply to author us and to profit from that authorship. Such means have been imagined in the past, but only now are they feasible. Such means have been rejected in the past, but only now have they been allowed to take root. We are ensnared without awareness, shorn of meaningful alternatives for withdrawal, resistance, or protection.
The promise of the promise and the will to will run deeper than these deformities. They remind us of that place again where we humans heal the breach between the known and the unknowable, navigating the seas of uncertainty in our vessels of shared promises. In the real world of human endeavor, there is no perfect information and no perfect rationality. Life inclines us to take action and to make commitments even when the future is unknown. Anyone who has brought a child into the world or has otherwise given his or her heart in love knows this to be true.
Gods know the future, but we move forward, take risks, and bind ourselves to others despite the fact that we can’t know everything about the present, let alone the future. This is the essence of our freedom, expressed as the elemental right to the future tense. With the construction and ownership of the new means of behavioral modification, the fate of this right conforms to a pattern that we already have identified. It is not extinguished, but rather it is usurped: commandeered and accumulated by surveillance capital’s exclusive claims on our futures.
III. How Did They Get Away with It?
In the course of the last ten chapters I have argued that surveillance capitalism represents an unprecedented logic of accumulation defined by new economic imperatives whose mechanisms and effects cannot be grasped with existing models and assumptions. This is not to say that the old imperatives—a compulsion toward profit maximization along with the intensification of the means of production, growth, and competition—have vanished. However, these must now operate through the novel aims and mechanisms of surveillance capitalism. I briefly review the new imperatives here, both as a summary of the ground that we have covered and as a prelude to the question: How did they get away with it?
Surveillance capitalism’s new story begins with behavioral surplus discovered more or less ready-made in the online environment, when it was realized that the “data exhaust” clogging Google’s servers could be combined with its powerful analytic capabilities to produce predictions of user behavior. Those prediction products became the basis for a preternaturally lucrative sales process that ignited new markets in future behavior.
Google’s “machine intelligence” improved as the volume of data increased, producing better prediction products. This dynamic established the extraction imperative, which expresses the necessity of economies of scale in surplus accumulation and depends upon automated systems that relentlessly track, hunt, and induce more behavioral surplus. Google imposed the logic of conquest, defining human experience as free for the taking, available to be rendered as data and claimed as surveillance assets. The company learned to employ a range of rhetorical, political, and technological strategies to obfuscate these processes and their implications.
The need for scale drove a relentless search for new high-volume supplies of behavioral surplus, producing competitive dynamics aimed at cornering these supplies of raw material and seeking lawless undefended spaces in which to prosecute these unexpected and poorly understood acts of dispossession. All the while, surveillance capitalists stealthily but steadfastly habituated us to their claims. In the process, our access to necessary information and services became hostage to their operations, our means of social participation fused with their interests.
Lucrative prediction products depend upon behavioral surplus, and competition drove the supply challenges to a new level, expressed in the prediction imperative. More-powerful prediction products required economies of scope as well as scale, variation as well as volume. This variation occurs along two dimensions. The first is extension across a wide range of activities; the second is the depth of predictive detail within each activity.
In this new phase of competitive intensity, surveillance capitalists are forced from the virtual world into the real one. This migration necessitates new machine processes for the rendition of all aspects of human experience into behavioral data. Competition now occurs in the context of a rapidly evolving global architecture of ubiquitous computation and therefore ubiquitous supply opportunities, as prediction products are increasingly expected to approximate certainty and therefore to guarantee behavioral outcomes.
In a third phase of competitive intensity, surveillance capitalists discovered the necessity of economies of action based on new methods that go beyond tracking, capturing, analyzing, and predicting behavior in order to intervene in the state of play and actively shape behavior at the source. The result is that the means of production are subordinated to an elaborate new means of behavioral modification, which relies upon a variety of machine processes, techniques, and tactics (tuning, herding, conditioning) to shape individual, group, and population behavior in ways that continuously improve their approximation to guaranteed outcomes. Just as industrial capitalism was driven to the continuous intensification of the means of production, so surveillance capitalists are now locked in a cycle of continuous intensification of the means of behavioral modification.
Surveillance capitalists’ interests have shifted from using automated machine processes to know about your behavior to using machine processes to shape your behavior according to their interests. In other words, this decade-and-a-half trajectory has taken us from automating information flows about you to automating you. Given the conditions of increasing ubiquity, it has become difficult if not impossible to escape this audacious, implacable web.
In order to reestablish our bearings, I have asked for a rebirth of astonishment and outrage. Most of all, I have asked that we reject the Faustian pact of participation for dispossession that requires our submission to the means of behavioral modification built on the foundation of the Google declarations. I am also mindful, though, that when we ask How did they get away with it? there are many compelling reasons to consider, no one of which stands alone. Instead of simple cause and effect, the answers to our question constitute a broad landscape of history, contingency, quicksand, and coercion.
Our question is even more vexing in light of the fact that in the great majority of surveys designed to probe public attitudes toward the loss of privacy and other elements of surveillance capitalist practices, few of us favor the status quo. In forty-six of the forty-eight most prominent surveys administered between 2008 and 2017, substantial majorities support measures for enhanced privacy and user control over personal data. (Only two early surveys were somewhat less conclusive, because so many participants indicated that they did not understand how or what personal information was being gathered.) Indeed, by 2008 it was well established that the more knowledge one has about “internet privacy practices,” the more one is likely to be very concerned about privacy.17
Although the surveys vary in terms of their specific focus and questions, the general consistency of responses over the decade is noteworthy. For example, an important 2009 survey found that when Americans are informed of the ways that companies gather data for targeted online ads, 73–86 percent reject such advertising. Another substantial survey in 2015 found 91 percent of respondents disagreeing that the collection of personal information “without my knowing” is a fair trade-off for a price discount. Fifty-five percent disagreed that it was a fair exchange for improved services. In 2016 Pew Research reported only 9 percent of respondents as very confident in trusting social media sites with their data and 14 percent as very confident about trusting companies with personal data. More than 60 percent wanted to do more to protect their privacy and believed there should be more regulations to protect privacy.18
Surveillance capitalist firms have tended to dismiss these survey results, pointing instead to the spectacular growth of users and revenue. This discrepancy has confounded research and public policy. With so many people rejecting the practices of surveillance capitalism, even considering how little most of us actually know about these practices, how is it that this market form has been able to succeed? The reasons are plentiful:
1. Unprecedented: Most of us did not resist the early incursions of Google, Facebook, and other surveillance capitalist operations because it was impossible to recognize the ways in which they differed from anything that had gone before. The basic operational mechanisms and business practices were so new and strange, so utterly sui generis, that all we could see was a gaggle of “innovative” horseless carriages. Most significantly, anxiety and vigilance have been fixed on the known threats of surveillance and control associated with state power. Earlier incursions of behavior modification at scale were understood as an extension of the state, and we were not prepared for the onslaught from private firms.
2. Declaration as invasion: The lack of precedent left us disarmed and charmed. Meanwhile, Google learned the art of invasion by declaration, taking what it wanted and calling it its own. The corporation asserted its rights to bypass our awareness, to take our experience and transform it into data, to claim ownership of and decisions over the uses of those data, to produce strategies and tactics that keep us ignorant of its practices, and to insist on the conditions of lawlessness required for these operations. These declarations institutionalized surveillance capitalism as a market form.
3. Historical context: Surveillance capitalism found shelter in the neoliberal zeitgeist that equated government regulation of business with tyranny. This “paranoid style” favored self-management regimes that imposed few limits on corporate practices. In a parallel development, the “war on terror” shifted the government’s attention from privacy legislation to an urgent interest in the rapidly developing skills and technologies of Google and other rising surveillance capitalists. These “elective affinities” produced a trend toward surveillance exceptionalism, which further sheltered the new market form from scrutiny and nurtured its development.
4. Fortifications: Google aggressively protected its operations by demonstrating its utility in the electoral process, cultivating strong relationships with elected and appointed officials, maintaining a revolving door of staffers between Washington and Silicon Valley, spending lavishly on lobbying, and sustaining a steady “soft-power” campaign of cultural influence and capture.
5. The dispossession cycle: First at Google and later at Facebook and other firms, surveillance capitalist leaders mastered the rhythms and stages of dispossession. Audacious incursions are pursued until resistance is met, followed by a range of tactics from elaborate public relations gambits to legal combat, all designed to buy time for gradual habituation to once-outrageous facts. A third stage features public demonstrations of adaptability and even retreat, while in the final stage resources are redirected to achieve the same objectives camouflaged by new rhetoric and tactics.
6. Dependency: The free services of Google, Facebook, and others appealed to the latent needs of second-modernity individuals seeking resources for effective life in an increasingly hostile institutional environment. Once bitten, the apple was irresistible. As surveillance capitalism spread across the internet, the means of social participation became coextensive with the means of behavioral modification. The exploitation of second-modernity needs that enabled surveillance capitalism from the start eventually imbued nearly every channel of social participation. Most people find it difficult to withdraw from these utilities, and many wonder whether it is even possible.
7. Self-interest: New markets in future behavior give rise to networks of fellow travelers, partners, collaborators, and customers whose revenues depend on the prediction imperative. Institutional facts proliferate. The pizzeria owner on the Pokémon Go map, the merchant who saturates his shop with beacons, and the insurance companies vying for behavioral data unite in the race toward guaranteed outcomes and surveillance revenues.
8. Inclusion: Many people feel that if you are not on Facebook, you do not exist. People all over the world raced to participate in Pokémon Go. With so much energy, success, and capital flowing into the surveillance capitalist domain, standing outside of it, let alone against it, can feel like a lonely and risky prospect.
9. Identification: Surveillance capitalists aggressively present themselves as heroic entrepreneurs. Many people identify with and admire the financial success and popularity of the surveillance capitalists and regard them as role models.
10. Authority: Many also regard these corporations and their leaders as authorities on the future: geniuses who can see farther than the rest of us. It is easy to fall prey to the naturalistic fallacy, which suggests that because the companies are successful, they must also be right. As a result, many of us are respectful of these leaders’ expert status and are eager to participate in innovations that anticipate the future.