The Twittering Machine
Today, one of the dominant conspiracy theories of the Right is that left-wing intellectuals have been waging a slow, successful battle to overturn the canons of Western reason, logic and science: a process they describe as ‘cultural Marxism’. This theory first gained notoriety when it appeared in the manifesto of the neo-Nazi murderer Anders Behring Breivik. It has since gained ground in the alt-right, being repeated by the popular right-wing guru Jordan Peterson, best known for his theory that human gender relations are equivalent to the sexual habits of lobsters, as well as his curmudgeonly potpourri of self-help and Jungian mysticism. The sacked National Security Council officer Rich Higgins blamed ‘cultural Marxism’ for the opposition to Trump.44 It has also appeared in more mainstream quarters. The anti-Trump conservative Australian television news anchor Chris Uhlmann has decried the work of ‘neo-Marxists’ using ‘critical theory as a vehicle for … the deconstruction of the West’.45
This theory bears some alarming resemblances to the ‘post-truth’ accounts just assayed. The theorists of ‘post-truth’ share with their right-wing opponents a vocabulary, a counter-subversive zeal, a drive to externalize a complex problem, intellectual incuriosity and an authoritarian streak a mile wide. Their ‘postmodernism’ is a straw figure, the bogey-scapegoat of anglophone centrists losing an argument. Their ‘Enlightenment’ is, as Dan Hind once wrote of a similar frenzy of earnest rallying to reason, a kind of ‘folk Enlightenment’, a ‘bowdlerised and historically disembodied Enlightenment’ with eighteenth-century philosophers reduced to ciphers in contemporary battles.46 The native pomophobia of John Bullshitter was once leveraged as a kind of moral blackmail against the anti-war Left, who were blamed for an extreme cultural relativism that supposedly left the West defenceless. Now a similar rhetorical move aligns a disintegrating political consensus with, per Kakutani, ‘the rule of reason’.47 With admirable economy, it thus creates a starkly simple polarity between the reasonable upholders of the status quo and the beyond-reason hoi polloi. But, in appealing to an ‘age of reason’ that never existed, it seems to be far less interested in moral blackmail than in recovering what has been lost: the sure footing of the liberal state and its solid foundation in reason.
Conspiracy theories, though they have often come from threatened powers, today seem to be emerging from a more radical breakdown in meaning. They are the morbid symptoms, the excrescences, of a declining authority. When long-dominant ideologies break down, and when social interactions are increasingly governed by a confusing war of all against all, paranoia is a natural response. The rise of social industry platforms adds a new dimension to this. For they have created a machinery whose natural hero is the antisocial outsider, the hacker with no ties, the troll, the spammer. They have created a regime of competitive individualism in which perplexity and paranoia are a constant state of being. In that sense, the use of the platforms to create online communities galvanized by amateurish sleuthing is an attempt to reclaim meaning.
This was already apparent in the early ‘9/11 Truth’ communities. In surprise bestsellers by David Ray Griffin and Nafeez Mosaddeq Ahmed, as well as a plethora of tantalizing websites, the overwhelming thrust of the argument was that the official narrative made no sense.48 These authors obsessively pored over the unfolding news narratives of events for contradictions, holes, oddities. Of course, there often are holes in the news, not to mention official redactions. And the ‘9/11 Truth’ communities were often trying to exercise the kinds of critical reflection for which there is generally little opportunity. But they poked holes where there were none and interpreted those that did exist tendentiously. They were convinced that there was some hidden, forbidden knowledge somewhere, which only citizen journalists could uncover. This conviction that ‘they’ are hiding something from us was the shared ground of all the ‘Truth’ groups. Specific theories, such as that the Pentagon was hit by a missile strike, were secondary speculations.
Some of those used to being in power now feel embattled, and are beginning to collapse into the same logic. This is not unusual. As Emma Jane and Chris Fleming’s analysis of conspiracy theories shows, the debunkers tend to share ‘the epistemological orientations and rhetorical armoury’ of those they critique.49 The performative contradictions become absurd, as when the legal scholars Cass Sunstein and Adrian Vermeule recommended that the White House take stringent measures against conspiracy theories – such as covert ‘cognitive infiltration’ of online communities, so as to plant doubts and undermine these groups from within.
Rather than emulate the paranoid style, the displaced centre needs to look deeper, because the collapse in sense that they are just now encountering goes back a long way.
V.
Expertise, as the ebullient Brexiteer Michael Gove reminded us, has made us sick. The crisis of knowing is, in part, a deep-rooted crisis of political authority: a credibility crunch following the credit crunch.
The decline of print giants linked to established parties and ideologies, and the rise of the social industry platforms, have accelerated the crisis. But they have done so largely by sharpening tendencies that were already in play in the old media. The complaints about ‘fake news’ indicate that the embattled political establishment has not yet mastered the new media. But the problem goes even deeper than that and, in a strange way, the myth of a ‘post-truth’ society is a bungled attempt to diagnose the rot.
In the sciences, there is an ongoing ‘replication crisis’ afflicting medicine, economics, psychology and evolutionary biology. The crisis is that the results of many scientific studies cannot be replicated in subsequent tests. In a survey of 1,500 scientists in the journal Nature, 70 per cent of the respondents had tried and failed to replicate another scientist’s experiment.50 Half of them couldn’t even replicate their own findings.
According to the historian of ideas Philip Mirowski, one of the main causes of the problem is that science is becoming commodified.51 As it becomes an outsourced research engine for corporations, quality control collapses. A ‘parallel universe of think tanks and shadowy “experts”’ emerges outside of academia, while inside, the state demands policy-oriented research but is increasingly indifferent to quality controls. Corporations – especially big tech – have little interest in research that doesn’t pay off quickly with monetizable innovations and gizmos. Google has backed a proposal to incentivize scientists to think about the bottom line, wherein they place research ideas on something like a scientific stock market and the most promising ideas are snapped up by venture capitalists.
Among the worst examples of this degradation of scientific research by business is the world’s pharmaceutical industry and its effects on medicine.52 The industry is riddled with ostensibly scholarly papers ghostwritten by corporations, and with clinical trials built on unrepresentative samples and cherry-picked data: a ‘murderous disaster’ for patients, as Ben Goldacre aptly calls it.53 A peer-reviewed survey of scientists published in 2009 found that 14 per cent admitted to personal knowledge of a fellow scientist falsifying results; medical researchers were the worst offenders.54
This problem reverberates well beyond academia, because in the modern era the laboratory is the benchmark of legitimate knowledge. It is the historic model for authoritative truth claims, everyone implicitly trusting the boffin in the white coat. The industrial production of scientific deception would probably already be enough to make us sick of experts, even if we hadn’t been through the global financial crisis with its damning implications for the economics profession, the majority of politicians and the global institutions supporting the economic system. If, for example, people were willing to believe that the MMR vaccine gave children autism, against the scientific consensus, or that Aids was a US government conspiracy, this suggests that the authority of science was already diminished. In some cases, this authority was weakened by real abuses, such as the Tuskegee experiment in the US, wherein syphilis-infected black men were misled, used as guinea pigs for medical experiments and never treated for their illness. This may be among the reasons why fact-checking and hectoring about ‘the science’ is so ineffectual.
For a while, ‘big data’ was offered as the answer to the problem of knowledge. Data was hailed as ‘the new oil’, and the raw material for a ‘management revolution’. By turning company processes into readable electronic text, it would replace unscientific management techniques, hunches and intuitions with the brute force of facts. Google boss Eric Schmidt, exulting in the revolutionary potential of data, described it as ‘so powerful that nation-states will fight over how much data matters’.55 In an excitable piece for Wired, former editor Chris Anderson enthused that such a scale of data collection would soon render theory and even the scientific method obsolete: ‘with enough data, the numbers speak for themselves’.56
The bonus of big data is omniscience: ‘a full digital copy of our physical universe’, as scientists Carlo Ratti and Dirk Helbing put it.57 We will be able to see all of existence as a stream of electronic writing. And for a while, it was even possible to believe this, if one set aside just how much of the physical universe is unknown and potentially unknowable. After all, the scale of data production is vast. The scale at which messages were exchanged was already quite enormous in the era of the analogue telephone. In 1948, some 125 million telephone conversations took place in the US each day. But this was not captured and commodified information. The internet, as a writing machine, takes note of everything.58
Already by 2003, more data had been produced since the turn of the millennium than in the entirety of human history.59 By 2016, 90 per cent of all the data in the world had been created in the previous two years, at a rate of 2.5 quintillion bytes a day.60 An increasing share of this data is on the internet, rather than on television or in print. By 2017, users were sharing more than half a million photos on Snapchat, sending almost half a million tweets, making over half a million comments on Facebook and watching over four million YouTube videos every minute of every day. In the same year, Google was processing 3.5 billion searches per day.61
With this much data, surely things would start to work without any applied theory. A prize example of this, for a long time, was Google Flu Trends. In the mid-2000s, Google began to develop the tool by comparing searches on its own search engine with the likelihood of flu outbreaks. For a while, the results were eerily accurate. Google was able to predict the next outbreak up to ten days before the Centers for Disease Control and Prevention. Then, by 2013, it began to break down. Google’s estimates overstated the spread of illnesses by almost a factor of two. And when that happened, the hyperbole of Google’s promise became obvious.62
The numbers never speak for themselves. Every data set requires treatment, processing and interpretation. The volume of data is not a sufficient criterion for judging how useful it is.63 And the treatment of it always implies a theory, whether or not it is acknowledged. Google, unwilling to concede that its own work implied a theory, simply developed a model for extrapolating from correlations established by the sheer bulk of data. They never tried to work out what the causal relationship was between search terms and the outbreak of flu, because that was a theoretical problem. Ironically, because they were only interested in what worked, their method stopped working.
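The failure mode is easy to reproduce in miniature. The sketch below is an illustration only, not Google’s actual model: the weekly ‘incidence’ and ‘search volume’ series are invented, and the behaviour shift (a media scare that doubles flu-related searching) is assumed for the sake of the example. It fits a purely correlational predictor while searching happens to track illness, then watches it overshoot once the correlation breaks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training period (two years of weekly data): flu-related search volume
# happens to track actual flu incidence, plus noise. No causal model.
weeks = 104
t = np.arange(weeks)
flu = 50 + 40 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 3, weeks)
searches = 2.0 * flu + rng.normal(0, 5, weeks)

# Fit the correlation-only predictor: incidence ~ a * searches + b.
A = np.column_stack([searches, np.ones(weeks)])
(a, b), *_ = np.linalg.lstsq(A, flu, rcond=None)

# Test period: a media scare doubles flu-related searching, while actual
# incidence follows the same seasonal pattern as before.
t2 = np.arange(weeks, weeks + 26)
flu_true = 50 + 40 * np.sin(2 * np.pi * t2 / 52)
searches_scare = 4.0 * flu_true + rng.normal(0, 5, 26)  # behaviour shift

predicted = a * searches_scare + b
print(f"mean actual incidence:    {flu_true.mean():.1f}")
print(f"mean predicted incidence: {predicted.mean():.1f}")  # roughly double
```

The regression is ‘right’ for exactly as long as the correlation holds, and wrong by roughly the factor by which behaviour shifted – much as Flu Trends came to overstate illness by almost a factor of two.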
Big data is no substitute for the scientific method. Far from having the magical cure, the pioneers of data extraction and analysis have contributed to today’s degraded ecologies of information and research.
VI.
If our existing language could adequately account for the rapid degradation of information, we might know what a solution could look like. In shooting the messenger, however, ‘post-truth’ theorists are depriving themselves of some of the ways in which they could make sense of this situation.
Insofar as ‘postmodernism’ means anything, it refers to an attempt by a number of theorists to name something that seems to have changed. The ‘postmodern’ démarche, once faddishly declared across all fields of knowledge and culture, was more diagnostic label than manifesto. Some postmodern eristic came with emancipatory stylings, as though the collapse of totalizing claims and grand narratives would be innately liberating. For the ex-Marxists among the postmodernists, this was clearly an attempt at sublimating their historical defeat. Nonetheless, the identification of a postmodern era was an attempt to describe something that had happened to capitalism. That something – whether it went under the name of the post-industrial society, the knowledge economy or informational capitalism – was the growing importance of images and signs in everyday life.
The rise of information technologies, and of whole industries based around communications, signs and images, altered not only the economy but the structure of meaning. The growth of information economies fits well with the inherent and ever-increasing celerity of capitalism. Capitalism encounters time and space as obstacles in the way of making money.64 It would ideally like to realize its investments here and now. The development of information technologies enabling the instantaneous transmission of symbols and images around the world makes possible, as Marinetti’s ‘Futurist Manifesto’ anticipated, the death of time and space.65 These technologies have been of most use in the financial sector. But now big data, by way of ‘the cloud’, claims to extend similar advantages to traditional manufacturing firms, by enabling them to choreograph production processes all over the world.
Ironically, the growth of information economies is catastrophic for meaning. No doubt, we have lived through a massive expansion in the amount of information that we are exposed to. In 1986, the average American was exposed to forty newspapers’ worth of information each day. Two decades later, it was 174 newspapers. By 2008, the average American consumed about 36 gigabytes of information each day.66 And most of this information, insofar as it reaches us on social media, is designed to keep us typing and scrolling, producing more information. A headline tells us that a man was stabbed in front of his son on a train. A status argues that the poor and stupid should be sterilized. A viral video shows a politician dancing. A tweet claims that immigration makes us poorer. These snippets of information, appearing within microseconds of one another, have in common that they each set the wheels whirring, triggering mental and emotional work that often goes on throughout the day.
But we make a fundamental mistake if we assume that an increase in information corresponds to an increase in knowledge. When the engineer Claude Shannon declared that information is entropy, he was saying something that would become starkly relevant in the age of social media.67 As an engineer, Shannon was interested in information as a problem of transmission and storage. In his terms, a fair coin toss contains one ‘bit’ of information, whereas a card drawn at random from a deck of fifty-two contains about 5.7 bits. The more uncertainty, the more information. The same principle, applied to sentences, means that statements with less sense actually have more informational capacity. An increase in information could be proportionate to a reduction in meaning.
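To spell the arithmetic out: for a source with n equally likely outcomes, Shannon’s entropy reduces to log₂ n bits.

```latex
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i = \log_2 n
\quad \text{(for } n \text{ equally likely outcomes)}, \qquad
H_{\text{coin}} = \log_2 2 = 1 \text{ bit}, \quad
H_{\text{card}} = \log_2 52 \approx 5.7 \text{ bits}.
```

By this accounting, a string of gibberish is less predictable, and so carries more ‘information’, than a well-formed sentence.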
In the social industry, the incentive is to constantly produce more information: a perpetual motion machine, harnessed to passions of which the machine knows nothing. This production is not for the purpose of making meaning. It is for the purpose of producing effects on users that keep us hooked. It is for the purpose of making users the conduits of the machine’s power, keeping its effects in circulation. Faked celebrity deaths, trolling, porn clickbait, advertisements, flurries of food and animal pictures, thirst traps, the endless ticker tape of messages, mean less than they perform. The increase of information corresponds to a decrease in meaning.
Moreover, this production is taking place in a simulacrum much like that described by the theorist of postmodernity Jean Baudrillard.68 A simulacrum is not a representation of reality. It is reality, albeit reality generated from digital writing and simulated models. It is simulation woven into our lives, with effects every bit as real as stock-market values, or the belief in God. It is reality as a cybernetic production. Like video game images, or virtual reality, the simulacrum is uncannily too perfect, too real: hyperreal, even. We are now far more incorporated into the system of images and signs, from gaming to feeds; but this simulacrum has its roots in capitalist culture’s airbrushed advertising, seductive Hollywood dreams and slick gaming and infotainment industries.
With the coming of new virtual and augmented realities, the Twittering Machine may prove to have been a stage in the spread of the simulacrum – one with darkly dystopian potential. Jaron Lanier, effectively the inventor of virtual reality, argues that to make it work, you need to give the machine far more data about yourself than you do on the platforms. The result could be the most elaborate Skinner Box in history. What seems like a device for adventure and freedom could become ‘the creepiest behaviour-modification device’ invented thus far.69
VII.
‘Post-truth’ politics is what we have long been living under in various forms: a technocracy, in a word. The rule of ‘big data’, now the most plentiful raw material in the history of the world, is not a departure from this: it is nothing but the rule of brute facts. It is the rule of technique, not truth, that has recently been found wanting. Or, as Wilde called it in ‘The Decay of Lying’, the ‘monstrous worship of facts’.