A final question is urgently posed: “how to get the humans in these systems to participate in the plan?” His answers do not lie in persuasion or education but in behavioral modification. He says we need “new predictive theories of human decision making” as well as “incentive mechanism design,” an idea that is comparable to Skinner’s “schedules of reinforcement.” Regarding how to get humans to follow the plan, Pentland offers the principle of “social influence” to explain the design mechanisms through which millions of human beings can be herded toward the guaranteed outcomes of safety, stability, and efficiency. He refers to his own studies, in which “the problems of industry and government” can largely be explained by the pattern of information transfer, especially how people influence and mimic one another.
This notion of social influence is a significant piece in Pentland’s puzzle that anticipates a great deal of what is to come. Pentland understands that Big Other is not only an architecture that monitors and controls things. Big Other’s instrumentation and data flows also make people mutually visible to one another, from the updates on your breakfast to the population flows in cities. Back in 2011, Pentland enthused, “Revolutionary new… infrastructures are providing us with a God’s eye view of ourselves.”36 The aim is a computer-mediated society where our mutual visibility becomes the habitat in which we attune to one another, producing social patterns based on imitation that can be manipulated for confluence, just as the logic of the machine hive suggests.
Regarding incentives, Pentland outlines a principle of “social efficiency,” which means that participation must provide value to the individual but also to the system as a whole.37 For the sake of this wholeness, it is believed, each of us will surrender to a totally measured life of instrumentarian order. Sounding ever so much like Eric Schmidt and Larry Page with their silky promises of Google’s all-knowing preemptive magic, Pentland believes that what we stand to lose is more than compensated for by the social rewards of efficient corporations and governments and the individual rewards that are simply magic, as he baldly appeals to second-modernity stress:
For society, the hope is that we can use this new in-depth understanding of individual behavior to increase the efficiency and responsiveness of industries and governments. For individuals, the attraction is the possibility of a world where everything is arranged for your convenience—your health checkup is magically scheduled just as you begin to get sick, the bus comes just as you get to the bus stop, and there is never a line of waiting people at city hall. As these new abilities become refined by the use of more sophisticated statistical models and sensor capabilities, we could well see the creation of a quantitative, predictive science of human organizations and human society.38
III. The Principles of an Instrumentarian Society
Pentland’s theory of instrumentarian society came to full flower in his 2014 book Social Physics, in which his tools and methods are integrated into an expansive vision of our futures in a data-driven instrumentarian society governed by computation. Pentland transforms Skinner’s fusty, odd utopia into something that sounds sophisticated, magical, and plausible, largely because it resonates with the waves of applied utopistics that wash over our lives each day. In completing Skinner, Pentland fashions more than an updated portrait of a behaviorist utopia. He outlines the principles of a full-blown instrumentarian society based on the pervasive outfitting and measurement of human behavior for the purposes of modification, control, and—in light of surveillance capitalism’s commercial dominance of the networked sphere—profit.
Pentland insists that “social phenomena are really just aggregations of billions of small transactions between individuals.…” This is a key point because it turns out that in order for social physics to replace the old ways of thinking, total knowledge of these billions of small things is required: “Big Data give us a chance to view society in all its complexity, through the millions of networks of person-to-person exchanges. If we had a ‘god’s eye,’ an all seeing view, then we could potentially arrive at a true understanding of how society works and take steps to fix our problems.”39
Pentland is sanguine on this point: total knowledge is within reach. As he states, “In just a few short years we are likely to have incredibly rich data available about the behavior of virtually all of humanity—on a continuous basis. The data mostly already exists.”40 The right to the future tense—and with it social trust, authority, and politics—is surrendered to Big Other and the transcendent computational systems that rule society under the watchful eye of a group that Pentland calls “we.” He never defines this “we,” which imposes an us-them relationship, introducing the exclusivity of the shadow text and its one-way mirror. It is an omission that haunts his text. Does it refer to the priesthood of data scientists like Pentland? The priesthood in collaboration with the owners of the means of behavior modification?
The theory aims to establish laws of social behavior comparable to laws of physics, and Pentland introduces two such laws that, as he says, determine the success of every “social organism.” The first is the quality of the “idea flow,” characterized by “exploration” to find new ideas and “engagement” to coordinate behavior around the best ideas. The second is “social learning,” in which people imitate one another until new ideas become population-wide habits. (Social learning is defined as a mathematical relationship derived from “how an entity’s state impacts other entities’ states and vice versa.”) Pentland notes that social learning is “rooted in statistical physics and machine learning.”41 The social hive is meant to reproduce the machine hive, and to this end Pentland advocates methods by which social learning “can be accelerated and shaped by social pressure.”42
The scientific aims of Pentland’s social physics depend upon a tightly integrated set of new social norms and individual adaptations, which I summarize here as five overarching principles that describe the social relations of an instrumentarian society. These principles echo Skinner’s social theory of a behaviorally controlled society, in which knowledge replaces freedom. In exploring each of the five, I compare Pentland’s statements to Skinner’s own formulations on these topics. As we shall see, Skinner’s once reviled thinking now defines this frontier of instrumentarian power.
1. Behavior for the Greater Good
Skinner had emphasized the need for an urgent shift to a collective perspective and values. “The intentional design of a culture and the control of human behavior it implies are essential if the human species is to continue to develop,” he wrote in Beyond Freedom & Dignity.43 The imperative to shift human behavior toward the greater good was already clear in Walden Two, where Frazier, its protagonist, asserts, “The fact is, we not only can control human behavior, we must.”44 Ultimately, this challenge was understood as an engineering problem. “And what are the techniques, the engineering practices, that will shape the behavior of the members of a group so that they will function smoothly for the benefit of all?” Frazier asks.45 Skinner advocated, via Frazier, that the virtue of a “planned society” is “to keep intelligence on the right track, for the good of society rather than of the intelligent individual.… It does this by making sure that the individual will not forget his personal stake in the welfare of society.”46
Pentland understands instrumentarian society as an historical turning point comparable to the invention of the printing press or the internet. It means that for the first time in human history, “We will have the data required to really know ourselves and understand how society evolves.”47 Pentland says that “continuous streams of data about human behavior” mean that everything from traffic, to energy use, to disease, to street crime will be accurately forecast, enabling a “world without war or financial crashes, in which infectious disease is quickly detected and stopped, in which energy, water, and other resources are no longer wasted, and in which governments are part of the solution rather than part of the problem.”48 This new “collective intelligence” operates to serve the greater good as we learn to act “in a coordinated manner” based on “social universals.”
“Great leaps in health care, transportation, energy, and safety are all possible,” Pentland writes, but he laments the obstacles to these achievements: “The main barriers are privacy concerns and the fact that we don’t yet have any consensus around the trade-offs between personal and social values.” Like Skinner, he is emphatic that these attachments to a bygone era of imperfect knowledge threaten to undermine the prospect of a perfectly engineered future society: “We cannot ignore the public goods that such a nervous system could provide.…”49 Pentland avoids the question “Whose greater good?” How is the greater good determined when surveillance capitalism owns the machines and the means of behavioral modification? “Goodness” arrives already oriented toward the interests of the owners of the means of behavioral modification and the clients whose guaranteed outcomes they seek to achieve. The greater good is someone’s, but it may not be ours.
2. Plans Replace Politics
Skinner yearned for the computational capabilities that would perfect behavioral prediction and control, enabling perfect knowledge to supplant politics as the means of collective decision making. In spite of his pre-digital limitations, Skinner had no difficulty in conceptualizing the necessary requirements for species salvation as a new “communal science.” As Frazier explains, “We know almost nothing about the special capacities of the group… the individual, no matter how extraordinary… can’t think thoughts big enough.”50
Smooth operations leave no room for unreasonable or unintentional outcomes, and Skinner viewed the creative and often messy conflicts of politics, especially democratic politics, as a source of friction that threatens the rational efficiency of the community as a single, high-functioning “superorganism.” He laments our inclination to try to change things with “political action,” and he endorses what he perceives as a widespread loss of faith in democracy. In Walden Two Frazier insists that “I don’t like the despotism of ignorance. I don’t like the despotism of neglect, of irresponsibility, the despotism of accident, even. And I don’t like the despotism of democracy!”51
Capitalism and socialism are equally tainted by their shared emphasis on economic growth, which breeds overconsumption and pollution. Skinner is intrigued by the Chinese system but rejects it on the grounds of the bloody revolution that any effort to convert Westerners would entail. “Fortunately,” Skinner concludes in the preface to Walden Two, “there is another possibility.” This option is Skinner’s version of a behaviorist society that provides a way in which “political action is to be avoided.” In Walden Two a “plan” replaces politics, overseen by a “noncompetitive” group of “Planners” who eschew power in favor of the dispassionate administration of the schedules of reinforcement aimed at the greater good.52 Planners exercise unique control over society but “only because that control is necessary for the proper functioning of the community.”53
Like Skinner, Pentland argues that computational truth must necessarily replace politics as the basis for instrumentarian governance. We recall Nadella’s enthusiasm over persons and relationships as “objects in the cloud,” when considering Pentland’s allegiance to the notion that certainty machines will displace earlier forms of governance. “Having a mathematical, predictive science of society that includes both individual differences and the relationships between individuals,” Pentland writes, “has the potential to dramatically change the way government officials, industry managers, and citizens think and act.…”54
Pentland worries that our political-economic constructs such as “market” and “class” hail from an old, slow world of the eighteenth and nineteenth centuries. The new, “light-speed hyperconnected world” leaves no time for the kind of rational deliberation and face-to-face negotiation and compromise that characterized the social milieu in which such political concepts originated: “We can no longer think of ourselves as only individuals reaching carefully considered decisions; we must include the dynamic social effects that influence our individual decisions and drive economic bubbles, political revolutions, and the internet economy.”55
The velocity of instrumentarian society leaves us no time to get our bearings, and that speed is repurposed here as a moral imperative demanding that we relinquish individual agency to the automated systems that can keep up the pace in order to quickly perceive and impose correct answers for the greater good. There is no room for politics in this instrumentarian society because politics means establishing and asserting our bearings. Individual moral and political bearings are a source of friction that wastes precious time and diverts behavior from confluence.
Instead of politics, markets, classes, and so on, Pentland reduces society to his laws of social physics: a reincarnation of Skinner’s “communal science.” Indeed, Pentland regards his work as the practical foundation of a new “computational theory of behavior” capable of producing a “causal theory of social structure… a mathematical explanation of why society reacts as it does and how these reactions may (or may not) solve human problems.…” These new mathematical analyses not only reveal the deep “mechanisms of social interactions” (Skinner’s “special capacities of the group”) but also combine with “our newly acquired massive amounts of behavior data” in order to reveal the patterns of causality that make it possible to “engineer better social systems,” all of it based on “unprecedented instrumentation.”56
Computation thus replaces the political life of the community as the basis for governance. The depth and breadth of instrumentation make it possible, Pentland says, to calculate idea flow, social network structure, the degree of social influence between people, and even “individual susceptibilities to new ideas.” Most important, instrumentation makes it possible for those with the God view to modify others’ behavior. The data provide a “reliable prediction of how changing any of these variables will change the performance of all the people in the network” and thus achieve the optimum performance of Skinner’s superorganism. This mathematics of idea flow is the basis for Pentland’s version of a “plan” that dictates the targets and objectives of behavior change. Human behavior must be herded and penned within the parameters of the plan, just as behavior at Nadella’s construction site was continuously and automatically molded to policy parameters. Pentland calls this “tuning the network.”
“Tuners” fill the role of Pentland’s “we.” He says, for example, that cities can be understood as “idea engines” and that “we can use the equations of social physics to begin to tune them to perform better.”57 Like Skinner’s planners, Pentland’s tuners oversee pesky anomalies that represent leakage from an old world of ignorance mistaken as freedom. Tuners tweak Big Other’s operations to preemptively steer such misguided behavior back into the fold of harmonious confluence and optimum performance for the greater good of whomever or whatever owns the machines that perform the math and pays the tuners to decipher and impose its parameters. Pentland provides an example from one of his own “living labs”:
This mathematically derived concept of idea flow allows us to “tune” social networks in order to make better decisions and achieve better results.… Within the eToro digital finance world, we have found that we can shape the flows of ideas between people by providing small incentives or nudges to individuals, thus causing isolated traders to engage more with others and those who were too interconnected to engage less.…58
3. Social Pressure for Harmony
In the community of Walden Two, reinforcement is precisely orchestrated to eliminate emotions that threaten cooperation. Only “productive and strengthening emotions—joy and love” are allowed. Feelings of sorrow and hate “and the high-voltage excitements of anger, fear, and rage” are considered “wasteful and dangerous” threats to “the needs of modern life.” Any form of distinction between persons undermines the harmony of the whole and its capacity to bend to collective purpose. Frazier acknowledges that you cannot coerce people into doing the right thing. The solution is far more subtle and sophisticated, based upon scientifically calibrated schedules of reinforcement: “Instead you have to set up certain behavioral processes which will lead the individual to design his own ‘good’ conduct.… We call that sort of thing ‘self-control.’ But don’t be misled, the control always rests in the last analysis in the hands of society.”59
Pentland’s idea is comparable: “The social physics approach to getting everyone to cooperate” is “social network incentives,” his version of “reinforcement.” With such incentives, he explains, “we focus on changing the connections between people rather than focusing on getting people individually to change their behavior.… We can leverage those exchanges to generate social pressure for change.”60 Social media is critical to establishing these tuning capabilities, Pentland believes, because this is the environment in which social pressure can best be controlled, directed, manipulated, and scaled.61
In Pentland’s view Facebook already exemplifies these dynamics. Its contagion experiments reveal active mastery of the ability to manipulate human empathy and attachment with tuning techniques such as priming and suggestion. Indeed, Pentland finds Facebook’s “contagion” experiments particularly enlightening, seeing all sorts of practical insights in their complexities. For example, in the corporation’s 61-million-person voting experiment, Pentland sees confirmation that social pressure can be effectively instrumentalized in social networks, especially among people with “strong ties”: “The knowledge that our face-to-face friends had already voted generated enough social pressure that it convinced people to vote.”62 With this knowledge and more like it, Pentland’s “we,” the tuners, will be able to activate the “right incentives.”