The speed and reach of the computer revolution raised the question of how much further it could go. Hopes and fears of machines that might emulate human minds intensified. Controversy grew over whether artificial intelligence was a threat or a promise. Smart robots excited boundless expectations. In 1950, Alan Turing, the master cryptographer whom artificial intelligence researchers revere, wrote, ‘I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.’49 The conditions Turing predicted have not yet been met, and may be unrealistic. Human intelligence is probably fundamentally unmechanical: there is a ghost in the human machine. But even without replacing human thought, computers can affect and infect it. Do they corrode memory, or extend access to it? Do they erode knowledge when they multiply information? Do they expand networks or trap sociopaths? Do they subvert attention spans or enable multi-tasking? Do they encourage new arts or undermine old ones? Do they squeeze sympathies or broaden minds? If they do all these things, where does the balance lie? We have hardly begun to see how cyberspace can change the psyche.50
Humans may not be machines, but they are organisms, subject to the laws of evolution. Is that all they are? Genetics filled in a gap in Darwin’s description of evolution. It was already obvious to any rational and objective student that Darwin’s account of the origin of species was essentially right, but no one could say how the mutations that differentiate one lineage from another get passed across the generations. Gregor Mendel, raising peas in a monastery garden in Austria, supplied the explanation; T. H. Morgan, rearing fruit flies in a New York laboratory early in the twentieth century, confirmed and communicated it. Genes filled what it is tempting to call a missing link in the way evolution works: an explanation of how offspring can inherit parental traits. The discovery made evolution unchallengeable, except by ill-informed obscurantists. It also encouraged enthusiasts to expect too much of the theory, stretching it to cover kinds of change – intellectual and cultural – for which it was ill designed.
In the second half of the twentieth century, the decoding of DNA stimulated the trend, profoundly affecting human self-perceptions along the way. Erwin Schrödinger started the revolution, pondering the nature of genes in lectures delivered in Dublin in 1943 and published the following year. Schrödinger expected a sort of protein, whereas DNA turned out to be a kind of acid, but his speculations about what it might look like proved prophetic. He predicted that it would resemble a chain of basic units, connected like the elements of a code. The search was on for the ‘basic building blocks’ of life, not least in the laboratory in Cambridge, England, where Francis Crick worked. James Watson, who had read Schrödinger’s work as a biology student in Chicago, joined Crick there. He realized that it would be possible to discover the structure Schrödinger had predicted when he saw X-ray pictures of DNA. In a partner laboratory in London, Rosalind Franklin contributed vital criticisms of Crick’s and Watson’s unfolding ideas and helped build up the picture of how strands of DNA intertwined. The Cambridge team incurred criticism on moral grounds for dealing with Franklin unfairly, but there was no denying the validity of their findings. The results were exciting. The realization that particular genes were responsible for some diseases opened new pathways for therapy and prevention. Even more revolutionary was the possibility that many, perhaps all, kinds of behaviour could be regulated by changing the genetic code. The power of genes suggested new thinking about human nature: controlled by an inescapable code, determined by genetic patterning.
Character, in consequence, seemed computable. At the very least, genetic research seemed to confirm that more of our makeup is inherited than anyone traditionally supposed. Personality could be arrayed as a strand of molecules, and features swapped like cards in a game of Find the Lady. Cognitive scientists expedited materialist thinking of a similar kind by subjecting human brains to ever more searching analysis. Neurological research revealed an electrochemical process that accompanies thinking, in which synapses fire and proteins are released. It should be obvious that what such measurements show could be effects, or side effects, rather than causes or constituents of thought. But they made it possible, at least, to claim that everything traditionally classed as a function of mind might take place within the brain. It has become increasingly hard to find room for nonmaterial ingredients, such as mind and soul. ‘The soul has vanished’, Francis Crick announced.51
Meanwhile, experimenters modified the genetic codes of non-human species to obtain results that suit us: producing bigger food plants, for instance, or animals designed to be more useful – more docile, more palatable, or more easily packaged as human food. Work in these fields has been spectacularly successful, raising the spectre of a world re-crafted, as if by Frankenstein or Dr Moreau. Humans have warped evolution in the past: by inventing agriculture (see here) and shifting biota around the planet (see here). They now have the power to make their biggest intervention yet, selecting ‘unnaturally’ not according to what is best adapted to its environment, but according to what best matches agendas of human devising. We know, for example, that there is a market for ‘designer babies’. Sperm banks already cash in. Well-intentioned robo-obstetrics modifies babies to order in cases where genetically transmitted diseases can be prevented. It is most unusual for technologies, once devised, to remain unapplied. Some societies (and some individuals elsewhere) will engineer human beings along the lines that eugenics prescribed in former times (see here). Morally dubious visionaries are already talking about a world from which disease and deviancy have been excised.52
Genetics embraced a paradox: everyone’s nature is innate; yet it can be manipulated. So was Kant wrong when he uttered a dictum that had traditionally attracted a lot of emotional investment in the West: ‘there is in man a power of self-determination, independent of any bodily coercion’? Without such a conviction, individualism would be untenable. Determinism would make Christianity superannuated. Systems of laws based on individual responsibility would crumble. Of course, the world was already familiar with determinist ideas that tied character and chained potential to inescapably fatal inheritances. Craniology, for instance, assigned individuals to ‘criminal’ classes and ‘low’ races by measuring skulls and making inferences about brain size (see here). Nineteenth-century judgements about relative intelligence, in consequence, were unreliable. In 1905, however, searching for a way of identifying children with learning problems, Alfred Binet proposed a new method: simple, neutral tests designed not to establish what children know but to reveal how much they are capable of learning. Within a few years, the concept of IQ – age-related measurable ‘general intelligence’ – came to command universal confidence. The confidence was probably misplaced: intelligence tests in practice only predicted proficiency in a narrow range of skills. I can recall outstanding students of mine who were not particularly good at them. Yet IQ became a new source of tyranny. By the end of the First World War, policymakers had used it, inter alia, to justify eugenics, to exclude immigrants from the United States, and to select candidates for promotion in the US Army. It became developed countries’ standard method of social differentiation, singling out the beneficiaries of accelerated or privileged education. The tests could never be fully objective, nor the results reliable; yet, even in the second half of the century, when critics began to point out the problems, educational psychologists preferred to tinker with the idea rather than jettison it.
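A brief gloss may help with ‘age-related’: in the classic ratio form that soon followed Binet’s tests, the quotient simply compared a child’s tested mental age with chronological age,

IQ = (mental age ÷ chronological age) × 100

so that, roughly, a ten-year-old who performed like a typical twelve-year-old scored 120, while performance merely average for one’s age scored 100.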
The problem of IQ blended with one of the century’s most politically charged scientific controversies: the ‘nature versus nurture’ debate, which pitted right against left. In the latter camp were those who thought that social change can affect our moral qualities and collective achievements for the better. Their opponents appealed to evidence that character and capability are largely inherited and therefore unadjustable by social engineering. Partisans of social radicalism contended with conservatives who were reluctant to make things worse by ill-considered attempts at improvement. Although the IQ evidence was highly unconvincing, rival reports exacerbated debate in the late 1960s. Arthur Jensen at Berkeley claimed that eighty per cent of intelligence is inherited, and, incidentally, that blacks are genetically inferior to whites. Christopher Jencks and others at Harvard used similar IQ statistics to argue that heredity plays a minimal role. The contest raged unchanged, supported by the same sort of data, in the 1990s, when Richard J. Herrnstein and Charles Murray exploded a sociological bombshell. In The Bell Curve, they argued that a hereditary cognitive elite rules a doomed underclass (in which blacks are disproportionately represented). They predicted a future of cognitive class conflict.
Meanwhile, sociobiology, a ‘new synthesis’ devised by the ingenious Harvard entomologist Edward O. Wilson, inflamed the debate further. Wilson rapidly created a scientific constituency for the view that evolutionary necessities determine differences between societies, which can therefore be ranked accordingly, rather as we speak of orders of creation as relatively ‘higher’ or ‘lower’ on the evolutionary scale.53 Zoologists and ethologists often extrapolate to humans from whatever other species they study. Chimpanzees and other primates suit the purpose because they are closely related to humans in evolutionary terms. The remoter the kinship between species, however, the less availing the method. Konrad Lorenz, the most influential of Wilson’s predecessors, modelled his understanding of humans on his studies of gulls and geese. Before and during the Second World War he inspired a generation of research into the evolutionary background of violence. He found that in competition for food and sex the birds he worked with were determinedly and increasingly aggressive. He suspected that in humans, too, violent instincts would overpower contrary tendencies. Enthusiasm for Nazism tainted Lorenz. Academic critics disputed the data he selected. Yet he won a Nobel Prize, and exercised enormous influence, especially when his major work became widely available in English in the 1960s.
Whereas Lorenz invoked gulls and geese, ants and bees were Wilson’s exemplars. Humans differ from insects, according to Wilson, mainly in being individually competitive, whereas ants and bees are more deeply social: they function for collective advantage. He often insisted that biological and environmental constraints did not detract from human freedom, but his books seemed bound in iron, with little spinal flexibility, and his papers close-printed without space for freedom between the lines. He imagined a visitor from another planet cataloguing humans along with all the other species on Earth and shrinking ‘the humanities and social sciences to specialized branches of biology’.54
The human–ant comparison led Wilson to think that ‘flexibility’, as he called it, or variation between human cultures, results from individual differences in behaviour ‘magnified at the group level’ as interactions multiply. His suggestion seemed promising: the cultural diversity that intercommunicating groups exhibit is related to their size and numbers, and the range of the exchanges that take place between them. Wilson erred, however, in supposing that genetic transmission causes cultural change. He was responding to the latest data of his day. By the time he wrote his most influential text, Sociobiology, in 1975, researchers had already discovered or confidently postulated genes for introversion, neurosis, athleticism, psychosis, and numerous other human variables. Wilson inferred a further theoretical possibility, although there was and is no direct evidence for it: that evolution ‘strongly selected’ genes for social flexibility, too.55
In the decades that followed Wilson’s intervention, most new empirical evidence supported two modifications of his view: first, genes influence behaviour only in unpredictably various combinations, and in subtle and complex ways, involving contingencies that elude easy pattern detection. Second, behaviour in turn influences genes. Acquired characteristics can be transmitted hereditarily. Mother rats’ neglect, for instance, causes an epigenetic modification in their offspring, who become jittery, irritable adults, whereas nurturing mothers’ infants develop calm characteristics in the same way. From debates about sociobiology, two fundamental convictions have survived in most people’s minds: that individuals make themselves, and that society is worth improving. Nevertheless, suspicion abides that genes perpetuate differences between individuals and societies and make equality an undeliverable ideal. The effect has been to inhibit reform and encourage the prevailing conservatism of the early twenty-first century.56
For a while, the work of Noam Chomsky seemed to support the fight-back for the retrieval of certainty. He was radical in politics and linguistics alike. From the mid-1950s onward, Chomsky argued persistently that language was more than an effect of culture: it was a deep-rooted property of the human mind. His starting point was the speed and ease with which children learn to speak. ‘Children’, he noticed, ‘learn language from positive evidence only (corrections not being required or relevant), and … without relevant experience in a wide array of complex cases.’57 Their ability to combine words in ways they have never heard impressed Chomsky. Differences among languages, he thought, seem superficial compared with the ‘deep structures’ that all share: parts of speech and the grammar and syntax that regulate the way terms relate to each other. Chomsky explained these remarkable observations by postulating that language and brain are linked: the structures of language are innately embedded in the way we think; so it is easy to learn to speak; one can genuinely say that ‘it comes naturally’. The suggestion was revolutionary when Chomsky made it in 1957, because the prevailing orthodoxies at the time suggested otherwise. We reviewed them in chapter 9: Freud’s psychiatry, Sartre’s philosophy, and Piaget’s educational nostrums all suppose that upbringing is inscribed on a tabula rasa. Behaviourism endorsed a similar notion – the doctrine, fashionable until Chomsky exploded it, that we learn to act, speak, and think as we do because of conditioning: we respond to stimuli, in the form of social approval or disapproval. The language faculty Chomsky identified was, at least according to his early musings, beyond the reach of evolution. He jibbed at calling it an instinct and declined to offer an evolutionary account of it. If the way he formulated his thinking was right, neither experience nor heredity, nor both in combination, makes us the whole of what we are. Part of our nature is hard-wired into our brains. Chomsky went on to propose that other kinds of learning may be like language in these respects: ‘that the same is true in other areas where humans are capable of acquiring rich and highly articulated systems of knowledge under the triggering and shaping effects of experience, and it may well be that similar ideas are relevant for the investigation of how we acquire scientific knowledge … because of our mental constitution’.58
Chomsky rejected the notion that humans devised language to make up for our dearth of evolved skills – the argument that ‘the richness and specificity of instinct of animals … accounts for their remarkable achievements in some domains and lack of ability in others … whereas humans, lacking such … instinctual structure, are free to think, speak and discover’. Rather, he thought, the language prowess on which we tend to congratulate ourselves as a species, and which some people even claim as a uniquely human achievement, may simply resemble the peculiar skills of other species. Cheetahs are specialists in speed, for instance, cows in ruminating, and humans in symbolic communication.59
Dogmatism Versus Pluralism
I love uncertainty. Caution, scepticism, self-doubt, tentativeness: these are the toeholds we grope for on the ascent to truth. It is when people are sure of themselves that I get worried. False certainty is far worse than uncertainty. The latter, however, breeds the former.
In twentieth-century social and political thought, new dogmatisms complemented the new determinisms of science. Change may be good. It is always dangerous. In reaction against uncertainty, electorates succumb to noisy little men and glib solutions. Religions transmute into dogmatisms and fundamentalisms. The herd turns on agents of supposed change, especially – typically – on immigrants and on international institutions. Cruel, costly wars start out of fear of depleted resources. These are all extreme, generally violent, always risky forms of change, embraced for conservative reasons, in order to cleave to familiar ways of life. Even the revolutions of recent times are often depressingly nostalgic, seeking a golden and usually mythical age of equality or morality or harmony or peace or greatness or ecological balance. The most effective revolutionaries of the twentieth century called for a return to primitive communism or anarchism, or to the medieval glories of Islam, or to apostolic virtue, or to the apple-cheeked innocence of an era before industrialization.
Religion had a surprising role. For much of the twentieth century, secular prophets foretold its death. Material prosperity, they argued, would satiate the needy with alternatives to God. Education would wean the ignorant from thinking about Him. Scientific explanations of the cosmos would make God redundant. But after the failure of politics and the disillusionments of science, religion remained, ripe for revival, for anyone who wanted the universe to be coherent and comfortable to live in. By the end of the century, atheism was no longer the world’s most conspicuous trend. Fundamentalisms in Islam and Christianity, taken together, constituted the biggest movement in the world and potentially the most dangerous. No one should find this surprising: fundamentalism, like scientism and brash political ideologies, was part of the twentieth-century reaction against uncertainty – one of the false certainties people preferred.
Fundamentalism began, as we have seen, in Protestant seminaries in Chicago and Princeton, in revulsion from German academic fashions in critical reading of the Bible. Like other books, the Bible reflects the times in which the books it comprises were written and assembled. The agendas of the authors (or, if you prefer to call them so, the human mediators of divine authorship) and editors warp the text. Yet fundamentalists read it as if the message were uncluttered with historical context and human error, winkling out interpretations they mistake for unchallengeable truths. The faith is founded on the text. No critical exegesis can deconstruct it. No scientific evidence can gainsay it. Any supposedly holy scripture can and usually does attract literal-minded dogmatism. The name of fundamentalism is transferable: though it started in biblical circles, it is now associated with a similar doctrine, traditional in Islam, about the Qur’an.