The US, the UK, and most of Europe entered the second decade of the twenty-first century facing economic and social inequalities more extreme than anything since the Gilded Age and comparable to some of the world’s poorest countries.43 Despite a decade of explosive digital growth that included the Apple miracle and the penetration of the internet into everyday life, dangerous social divisions suggested an even more stratified and antidemocratic future. “In the age of new consensus financial policy stabilization,” one US economist wrote, “the economy has witnessed the largest transfer of income to the top in history.”44 A sobering 2016 report from the International Monetary Fund warned of instability, concluding that the global trends toward neoliberalism “have not delivered as expected.” Instead, inequality had significantly diminished “the level and the durability of growth” while increasing volatility and creating permanent vulnerability to economic crisis.45
The quest for effective life had been driven to the breaking point under the aegis of market freedom. Two years after the North London riots, research in the UK showed that by 2013, poverty fueled by lack of education and unemployment already excluded nearly a third of the population from routine social participation.46 Another UK report concluded, “Workers on low and middle incomes are experiencing the biggest decline in their living standards since reliable records began in the mid-19th Century.”47 By 2015, austerity measures had eliminated 19 percent, or 18 billion pounds, from the budgets of local authorities, had forced an 8 percent cut in child protection spending, and had stripped 150,000 pensioners of access to vital services.48 By 2014 nearly half of the US population lived in functional poverty, with the highest wage in the bottom half of earners at about $34,000.49 A 2012 US Department of Agriculture survey showed that close to 49 million people lived in “food-insecure” households.50
In Capital in the Twenty-First Century, the French economist Thomas Piketty integrated years of income data to derive a general law of accumulation: the rate of return on capital tends to exceed the rate of economic growth. This tendency, summarized as r > g, is a dynamic that produces ever-more-extreme income divergence and with it a range of antidemocratic social consequences long predicted as harbingers of an eventual crisis of capitalism. In this context, Piketty cites the ways in which financial elites use their outsized earnings to fund a cycle of political capture that protects their interests from political challenge.51 Indeed, a 2015 New York Times report concluded that 158 US families and their corporations provided almost half ($176 million) of all the money that was raised by both political parties in support of presidential candidates in 2016, primarily in support of “Republican candidates who have pledged to pare regulations, cut taxes… and shrink entitlements.”52 Historians, investigative journalists, economists, and political scientists have analyzed the intricate facts of a turn toward oligarchy, shining a light on the systematic campaigns of public influence and political capture that helped drive and preserve an extreme free-market agenda at the expense of democracy.53
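The mechanism behind r > g can be made concrete with a minimal sketch (a standard textbook illustration under the simplifying assumption that capital returns are fully reinvested, not Piketty’s own derivation). If a stock of wealth W earns a return r that is plowed back into capital while national income Y grows at rate g, then

\[
W(t) = W_0\,e^{rt}, \qquad Y(t) = Y_0\,e^{gt}, \qquad \frac{W(t)}{Y(t)} = \frac{W_0}{Y_0}\,e^{(r-g)t}.
\]

Whenever r exceeds g, the wealth-to-income ratio compounds without limit, so returns to capital steadily outrun the earnings of labor, and inherited fortunes outgrow the economy around them.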
A précis of Piketty’s extensive research may be stated simply: capitalism should not be eaten raw. Capitalism, like sausage, is meant to be cooked by a democratic society and its institutions because raw capitalism is antisocial. As Piketty warns, “A market economy… if left to itself… contains powerful forces of divergence, which are potentially threatening to democratic societies and to the values of social justice on which they are based.”54 Many scholars have taken to describing these new conditions as neofeudalism, marked by the consolidation of elite wealth and power far beyond the control of ordinary people and the mechanisms of democratic consent.55 Piketty calls it a return to “patrimonial capitalism,” a reversion to a premodern society in which one’s life chances depend upon inherited wealth rather than meritocratic achievement.56
We now have the tools to grasp the collision in all of its destructive complexity: what is unbearable is that economic and social inequalities have reverted to the preindustrial “feudal” pattern but that we, the people, have not. We are not illiterate peasants, serfs, or slaves. Whether “middle class” or “marginalized,” we share the collective historical condition of individualized persons with complex social experiences and opinions. We are hundreds of millions or even billions of second-modernity people whom history has freed both from the once-immutable facts of a destiny told at birth and from the conditions of mass society. We know ourselves to be worthy of dignity and the opportunity to live an effective life. This is existential toothpaste that, once liberated, cannot be squeezed back into the tube. Like a detonation’s rippling sound waves of destruction, the reverberations of pain and anger that have come to define our era arise from this poisonous collision between inequality’s facts and inequality’s feelings.57
Back in 2011, those 270 interviews with Londoners who participated in the riots also reflected the scars of this collision. “They expressed it in different ways,” the report concludes, “but at heart what the rioters talked about was a pervasive sense of injustice. For some, this was economic—the lack of a job, money, or opportunity. For others it was more broadly social, not just the absence of material things, but how they felt they were treated compared with others.…” The “sense of being invisible” was “widespread.” As one woman explained, “The young these days need to be heard. It’s got to be justice for them.” And a young man reflected, “When no one cares about you you’re gonna eventually make them care, you’re gonna cause a disturbance.”58 Other analyses cite “the denial of dignity” expressed in the wordless anger of the North London rampage.59
When the Occupy movement erupted on another continent far from London’s beleaguered neighborhoods, it appeared to have little in common with the violent eruptions that August. The 99 percent that Occupy intended to represent is not marginalized; on the contrary, the very legitimacy of Occupy was its claim to supermajority status. Nevertheless, Occupy revealed a similar conflict between inequality’s facts and inequality’s feelings, expressed in a creatively individualized political culture that insisted on “direct democracy” and “horizontal leadership.”60 Some analysts concluded that it was this conflict that ultimately crippled the movement, with its “inner core” of leaders unwilling to compromise their highly individualized approach in favor of the strategies and tactics required for a durable mass movement.61 However, one thing is certain: there were no serfs in Zuccotti Park. On the contrary, as one close observer of the movement ruminated, “What is different is that from the start very large sections of we, the people, proved to be wiser than our rulers. We saw further and proved to have better judgment, thus reversing the traditional legitimacy of our elite governance that those in charge know better than the unwashed.”62
This is the existential contradiction of the second modernity that defines our conditions of existence: we want to exercise control over our own lives, but everywhere that control is thwarted. Individualization has sent each one of us on the prowl for the resources we need to ensure effective life, but at each turn we are forced to do battle with an economics and politics from whose vantage point we are but ciphers. We live in the knowledge that our lives have unique value, but we are treated as invisible. As the rewards of late-stage financial capitalism slip beyond our grasp, we are left to contemplate the future in a bewilderment that erupts into violence with increasing frequency. Our expectations of psychological self-determination are the grounds upon which our dreams unfold, so the losses we experience in the slow burn of rising inequality, exclusion, pervasive competition, and degrading stratification are not only economic. They slice us to the quick in dismay and bitterness because we know ourselves to be worthy of individual dignity and the right to a life on our own terms.
The deepest contradiction of our time, the social philosopher Zygmunt Bauman wrote, is “the yawning gap between the right of self-assertion and the capacity to control the social settings which render such self-assertion feasible. It is from that abysmal gap that the most poisonous effluvia contaminating the lives of contemporary individuals emanate.” Any new chapter in the centuries-old story of human emancipation, he insisted, must begin here. Can the instability of the second modernity give way to a new synthesis: a third modernity that transcends the collision, offering a genuine path to a flourishing and effective life for the many, not just the few? What role will information capitalism play?
V. A Third Modernity
Apple once launched itself into that “abysmal gap,” and for a time it seemed that the company’s fusion of capitalism and the digital might set a new course toward a third modernity. The promise of an advocacy-oriented digital capitalism during the first decade of our century galvanized second-modernity populations around the world. New companies such as Google and Facebook appeared to bring the promise of the inversion to life in new domains of critical importance, rescuing information and people from the old institutional confines, enabling us to find what and whom we wanted, when and how we wanted to search or connect.
The Apple inversion implied trustworthy relationships of advocacy and reciprocity embedded in an alignment of commercial operations with consumers’ genuine interests. It held out the promise of a new digital market form that might transcend the collision: an early intimation of a third-modernity capitalism summoned by the self-determining aspirations of individuals and indigenous to the digital milieu. The opportunity for “my life, my way, at a price I can afford” was the human promise that quickly lodged at the very heart of the commercial digital project, from iPhones to one-click ordering to massive open online courses to on-demand services to hundreds of thousands of web-based enterprises, apps, and devices.
There were missteps, shortfalls, and vulnerabilities, to be sure. The potential significance of Apple’s tacit new logic was never fully grasped, even by the company itself. Instead, the corporation produced a steady stream of contradictions that signaled business as usual. Apple was criticized for extractive pricing policies, offshoring jobs, exploiting its retail staff, abrogating responsibility for factory conditions, colluding to depress wages via illicit noncompete agreements in employee recruitment, institutionalized tax evasion, and a lack of environmental stewardship—just to name a few of the violations that seemed to negate the implicit social contract of its own unique logic.
When it comes to genuine economic mutation, there is always a tension between the new features of the form and its mother ship. A combination of old and new is reconfigured in an unprecedented pattern. Occasionally, the elements of a mutation find the right environment in which to be “selected” for propagation. This is when the new form stands a chance of becoming fully institutionalized and establishes its unique migratory path toward the future. But it’s even more likely that potential mutations meet their fate in “transition failure,” drawn back by the gravitational pull of established practices.63
Was the Apple inversion a powerful new economic mutation running the gauntlet of trial and error on its way to fulfilling the needs of a new age, or was it a case of transition failure? In our enthusiasm and growing dependency on technology, we tended to forget that the same forces of capital from which we had fled in the “real” world were rapidly claiming ownership of the wider digital sphere. This left us vulnerable and caught unawares when the early promise of information capitalism took a darker turn. We celebrated the promise of “help is on the way” while troubling questions broke through the haze with increasing regularity, each one followed by a predictable eruption of dismay and anger.
Why did Google’s Gmail, launched in 2004, scan private correspondence to generate advertising? As soon as the first Gmail users saw ads targeted to the content of their private correspondence, public reaction was swift. Many were repelled and outraged; others were confused. As Google chronicler Steven Levy put it, “By serving ads related to content, Google seemed almost to be reveling in the fact that users’ privacy was at the mercy of the policies and trustworthiness of the company that owned the servers. And since those ads made profits, Google was making it clear that it would exploit the situation.”64
In 2007 Facebook launched Beacon, touting it as “a new way to socially distribute information.” Beacon enabled Facebook advertisers to track users across the internet, disclosing users’ purchases to their personal networks without permission. Most people were outraged by the company’s audacity, both in tracking them online and in usurping their ability to control the disclosure of their own facts. Facebook founder Mark Zuckerberg shut the program down under duress, but by 2010 he declared that privacy was no longer a social norm and then congratulated himself for relaxing the company’s “privacy policies” to reflect this self-interested assertion of a new social condition.65 Zuckerberg had apparently never read user Jonathan Trenn’s rendering of his Beacon experience:
I purchased a diamond engagement ring set from overstock in preparation for a New Year’s surprise for my girlfriend.… Within hours, I received a shocking call from one of my best friends of surprise and “congratulations” for getting engaged.(!!!) Imagine my horror when I learned that overstock had published the details of my purchase (including a link to the item and its price) on my public Facebook newsfeed, as well as notifications to all of my friends. ALL OF MY FRIENDS, including my girlfriend, and all of her friends, etc.… ALL OF THIS WAS WITHOUT MY CONSENT OR KNOWLEDGE. I am totally distressed that my surprise was ruined, and what was meant to be something special and a lifetime memory for my girlfriend and I was destroyed by a totally underhanded and infuriating privacy invasion. I want to wring the neck of the folks at overstock and facebook who thought that this was a good idea. It sets a terrible precedent on the net, and I feel that it ruined a part of my life.66
Of the many violations of advocacy expectations, ubiquitous “terms-of-service agreements” were among the most pernicious.67 Legal experts call these “contracts of adhesion” because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not. Online “contracts” such as terms-of-service or terms-of-use agreements are also referred to as “click-wrap” because, as a great deal of research shows, most people get wrapped in these oppressive contract terms by simply clicking on the box that says “I agree” without ever reading the agreement.68 In many cases, simply browsing a website obligates you to its terms-of-service agreement even if you don’t know it. Scholars point out that these digital documents are excessively long and complex in part to discourage users from actually reading the terms, safe in the knowledge that most courts have upheld the legitimacy of click-wrap agreements despite the obvious lack of meaningful consent.69 US Supreme Court Chief Justice John Roberts admitted that he “doesn’t read the computer fine print.”70 Adding insult to injury, terms of service can be altered unilaterally by the firm at any time, without specific user knowledge or consent, and the terms typically implicate other companies (partners, suppliers, marketers, advertising intermediaries, etc.) without stating or accepting responsibility for their terms of service. These “contracts” impose an unwinnable infinite regress upon the user that law professor Nancy Kim describes as “sadistic.”
Legal scholar Margaret Radin observes the Alice-in-Wonderland quality of such “contracts.” Indeed, the sacred notions of “agreement” and “promise” so critical to the evolution of the institution of contract since Roman times have devolved to a “talismanic” signal “merely indicating that the firm deploying the boilerplate wants the recipient to be bound.”71 Radin calls this “private eminent domain,” a unilateral seizure of rights without consent. She regards such “contracts” as a moral and democratic “degradation” of the rule of law and the institution of contract, a perversion that restructures the rights of users granted through democratic processes, “substituting for them the system that the firm wishes to impose.… Recipients must enter a legal universe of the firm’s devising in order to engage in transactions with the firm.”72
The digital milieu has been essential to these degradations. Kim points out that paper documents once imposed natural restraints on contracting behavior simply by virtue of their cost to produce, distribute, and archive. Paper contracts require a physical signature, limiting the burden a firm is likely to impose on a customer by requiring her to read multiple pages of fine print. Digital terms, in contrast, are “weightless.” They can be expanded, reproduced, distributed, and archived at no additional cost. Once firms understood that the courts were disposed to validate their click-wrap and browse-wrap agreements, there was nothing to stop them from expanding the reach of these degraded contracts “to extract from consumers additional benefits unrelated to the transaction.”73 This coincided with the discovery of behavioral surplus that we examine in Chapter 3, as terms-of-service agreements were extended to include baroque and perverse “privacy policies,” establishing another infinite regress of these terms of expropriation. Even the former Federal Trade Commission chairperson Jon Leibowitz publicly stated, “We all agree that consumers don’t read privacy policies.”74 In 2008 two Carnegie Mellon professors calculated that a reasonable reading of all the privacy policies that one encounters in a year would require 76 full workdays at a national opportunity cost of $781 billion.75 The numbers are much higher today. Still, most users remain unaware of these “rapacious” terms that, as Kim puts it, allow firms “to acquire rights without bargaining and to stealthily establish and embed practices before users, and regulators, realize what has happened.”76
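The structure of such an opportunity-cost estimate is straightforward, and a hedged back-of-envelope sketch shows how figures of this magnitude arise (every number below is an illustrative placeholder, not an input from the Carnegie Mellon study):

\[
\text{national cost} \;\approx\; N \times h \times w,
\]

where N is the number of internet users, h the hours per user per year required to read the policies, and w the hourly value of that time. For example, 100 million users each spending 600 hours a year (roughly 75 eight-hour workdays) valued at $13 per hour yields about $780 billion, the order of magnitude the professors reported.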
At first, it had seemed that the new internet companies had simply failed to grasp the moral, social, and institutional requirements of their own economic logic. But with each corporate transgression, it became more difficult to ignore the possibility that the pattern of violations signaled a feature, not a bug. Although the Apple miracle contained the seeds of economic reformation, it was poorly understood: a mystery even to itself. Long before the death of its legendary founder, Steve Jobs, its frequent abuses of user expectations raised questions about how well the corporation understood the deep structure and historic potential of its own creations. The dramatic success of Apple’s iPod and iTunes instilled in internet users a sense of optimism toward the new digital capitalism, but Apple never took the reins in developing the consistent, comprehensive social and institutional processes that would have elevated the iPod’s promise to an explicit market form, as Henry Ford and Alfred Sloan had once done.