One might think that the repeated rebuffs Gould received in his attempts to reshape classical Darwinism would have dampened his enthusiasm. Not a bit of it. And in any case, the fourth area where he, Lewontin, and others have differed from their neo-Darwinist colleagues has had a somewhat different history. Between 1981 and 1991, Gould and Lewontin published three books that challenged the way ‘the doctrine of DNA,’ as Lewontin put it, had been used, again to quote Lewontin, to ‘justify inequalities within and between societies and to claim that those inequalities can never be changed.’ In The Mismeasure of Man (1981), Gould looked at the history of the controversy over IQ, what it means, and how it is related to class and race.61 In 1984 Lewontin and two others, Steven Rose and Leon J. Kamin, published Not in Our Genes: Biology, Ideology and Human Nature, in which they rooted much of modern biology in the bourgeois political mentality of the nineteenth century, arguing that the quantification of such things as IQ is crude and that attempts to describe mental illness purely as a biochemical disorder avoid certain politically inconvenient facts.62 Lewontin took this further in 1991 in The Doctrine of DNA, where he argued that DNA fits perfectly into the prevailing ideology; that the link between cause and effect is simple, mainly one-to-one; that for the present DNA research holds out no prospect of a cure for the major illnesses that affect mankind – cancer, heart disease, and stroke, for example – and that the whole edifice is designed more to reward scientists than to help science, or patients. Most subversive of all, he writes, ‘It has been clear since the first discoveries in molecular biology that “genetic engineering,” the creation to order of genetically altered organisms, has an immense possibility for producing private profit…. No prominent molecular biologist of my acquaintance is without a financial stake in the biotechnology business.’63 He believes that human nature, as described by evolutionary biologists such as E. O. Wilson, is a ‘made-up story,’ designed to fit the theories the theorists already hold.
Given the approach of Gould and Lewontin in particular, it comes as no surprise to find them fully embroiled in yet another (but very familiar) biological controversy, which erupted in 1994 with the publication of Richard J. Herrnstein and Charles Murray’s The Bell Curve: Intelligence and Class Structure in American Life.64
The Bell Curve was ten years in the making, and its main argument was twofold. In places it reads like Michael Young’s Rise of the Meritocracy, though Herrnstein and Murray are no satirists; they are in deadly earnest. In the twentieth century, they say, as more and more colleges have opened up to the general population, as IQ tests have improved and been shown to be better predictors of job performance than other indicators (such as college grades, interviews, or biographical data), and as the social environment has become more uniform for most of the population, a ‘cognitive elite’ has begun to emerge in society. Three phenomena result from this sorting process and mean that it will accelerate in the future: the cognitive elite is getting richer, at a time when everybody else is having to struggle to stay even; the elite is increasingly segregated physically from everyone else, especially at work and in the neighbourhoods they inhabit; and the cognitive elite is increasingly likely to intermarry.65 Herrnstein and Murray also analysed afresh the results of the National Longitudinal Survey of Youth (NLSY), a database of some twelve thousand young Americans born in the late 1950s and early 1960s. This enabled them to say, for example, that low intelligence is a stronger precursor of poverty than coming from a low socioeconomic background; that students who drop out of school come almost entirely from the bottom quartile of the IQ distribution (i.e., the lowest 25 percent); and that low-IQ people are more likely to divorce early in married life and to have illegitimate children. They also found that low-IQ parents are more likely to be on welfare and to have low-birthweight children, and that low-IQ men are more likely to be in prison. Then there was the racial issue. Herrnstein and Murray spend a lot of time prefacing their remarks by saying that a high IQ does not necessarily make someone admirable or the kind of person to be cherished, and they concede that the racial differences in IQ are diminishing. But, after controlling for education and poverty, they still find that people of Asian stock in America outperform ‘whites,’ who in turn outperform blacks, on tests of IQ.66 They also find that recent immigrants to America have lower IQ scores than native-born Americans. And finally, they voice their concern that the IQ level of America is declining. This is due partly, they say, to a dysgenic trend – people of lower IQ are having more children – but that is not the only reason. In practice, the American schooling system has been ‘dumbed down’ to meet the needs of average and below-average students, which means that the performance of the average students has not, contrary to popular opinion, been adversely affected. It is the brighter students who have been most affected, their SAT (Scholastic Aptitude Test) scores dropping by 41 percent between 1972 and 1993. They also blame parents, who no longer seem to want their children to work hard; television, which has replaced newsprint as a source of information; and the telephone, which has replaced letter writing as a form of self-expression.67 Further, they express their view that affirmative-action programs have not helped disadvantaged people, and indeed have made their situation worse.
But it is the emergence of the cognitive elite, this ‘invisible migration,’ the ‘secession of the successful,’ and the blending of the interests of the affluent with those of the cognitive elite that Herrnstein and Murray see as the most important, and most pessimistic, of their findings. This elite, they say, will fear the ‘underclass’ that is emerging, and will in effect control it with ‘kindness’ (which is basically what Murray’s rival, J. K. Galbraith, had said in The Culture of Contentment). They will provide welfare for the underclass so long as it is out of sight and out of mind. They hint, though, that such measures are likely to fail: ‘racism will re-emerge in a new and more virulent form.’68
Herrnstein and Murray are traditionalists. They would like to see a return to old-fashioned families, small communities, and the familiar forms of education, where pupils are taught history, literature, arts, ethics, and the sciences in such a way as to be able to weigh, analyse, and evaluate arguments according to exacting standards.69 For them, the IQ test not only works – it marks a watershed in human society. Allied to the politics of democracy and the homogenising successes of modern capitalism, the IQ test aids what R. A. Fisher called runaway evolution, promoting the rapid layering of society, divided according to IQ – which, of course, is mainly inherited. We are indeed witnessing the rise of the meritocracy.
The Bell Curve provoked a major controversy on both sides of the Atlantic. This was no surprise: throughout the century, white people (those on the ‘right’ side of the divide being described) had repeatedly concluded that whole segments of the population were dumb, and The Bell Curve was only the latest instance. What sort of reaction did its authors expect? Many people countered the claims of Herrnstein and Murray, and at least six other books appeared in 1995 and 1996 to examine (and in many cases refute) the arguments of The Bell Curve. Stephen Jay Gould’s The Mismeasure of Man was reissued in 1996 with an extra chapter giving his response to The Bell Curve. His main point was that this was a debate that needed technical expertise. Too many of the reviewers who had joined the debate (and the book provoked nearly two hundred reviews or associated articles) did not feel themselves competent to judge the statistics, for example. Gould did, and dismissed them. In particular, he attacked Herrnstein and Murray’s habit of reporting the form of a statistical association but not its strength. When this was examined, he said, the links they had found always explained less than 20 percent of the variance, ‘usually less than 10 percent and often less than 5 percent. What this means in English is that you cannot predict what a given person will do from his IQ score.’70 This was the conclusion Christopher Jencks had arrived at, thirty years before.
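For readers without the statistics Gould had in mind, his point turns on a simple identity: the proportion of variance a linear association explains is the square of the correlation coefficient. The figures below are purely illustrative (they are not taken from The Bell Curve or from Gould); they simply show how modest the underlying correlations must be if they explain 20, 10, or 5 percent of the variance.

```latex
% Illustrative arithmetic only: these correlations are hypothetical, chosen to
% bracket the "less than 20 / 10 / 5 percent of variance" range Gould cites.
\[
  \text{proportion of variance explained} = r^{2}
\]
\[
  r = 0.45 \;\Rightarrow\; r^{2} \approx 0.20, \qquad
  r = 0.32 \;\Rightarrow\; r^{2} \approx 0.10, \qquad
  r = 0.22 \;\Rightarrow\; r^{2} \approx 0.05
\]
```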
By the time The Bell Curve rumpus erupted, the infrastructure was in place for a biological project capable of generating controversy on an even bigger scale. This was the scramble to map the human genome, to draw up a plan to describe exactly all the nucleotides that constitute man’s inheritance and that, in time, will offer at least the possibility of interfering in our genetic makeup.
Interest in this idea grew throughout the 1980s. Indeed, it could be said that the Human Genome Project (HGP), as it came to be called, had been simmering since Victor McKusick, a physician at Johns Hopkins, began compiling ‘Mendelian Inheritance in Man,’ a comprehensive catalogue of known genetic diseases, first published in 1966.71 But then, as research progressed, first one scientist and then another began to see the sense of mapping the entire genome. On 7 March 1986, in Science, Renato Dulbecco, Nobel Prize-winning president of the Salk Institute, startled his colleagues by asserting that the war on cancer would be over sooner if geneticists were to sequence the human genome.72 Various U.S. government departments, including the Department of Energy and the National Institutes of Health, became interested at this point, as did scientists in Italy, the United Kingdom, Russia, Japan, and France (in roughly that order; Germany lagged behind, owing to the controversial role biology had played in Nazi times). A major conference, organised by the Howard Hughes Medical Institute, was held in Washington in July 1986 to bring together the various interested parties, and this had several effects. In February 1988 the U.S. National Research Council issued its report, Mapping and Sequencing the Human Genome, which recommended a concerted research program with a budget of $200 million a year.73 James Watson, appropriately enough, was appointed associate director of the NIH later that year, with special responsibility for human genome research. And in April 1988 HUGO, the Human Genome Organisation, was founded: a consortium of international scientists intended to spread the load of research and to keep duplication to a minimum, the aim being to complete the mapping as early as possible in the twenty-first century. The experience of the Human Genome Project has not been especially happy. In April 1992 James Watson resigned his position over an application by certain NIH scientists to patent their sequences. Watson, like many others, felt that the human genome should belong to everyone.74
The genome project came on stream in 1988–89, precisely the time when communism was collapsing in eastern Europe and the Berlin Wall was being dismantled. A new era was beginning politically, and in the intellectual field too, for HUGO was not the only major innovation introduced in 1988. That year also saw the birth of the Internet.
Whereas James Watson took a leading role in the genome project, his former colleague and co-discoverer of the double helix, Francis Crick, took a similar position in what is perhaps the hottest topic in biology as we enter the twenty-first century: consciousness studies. In 1994 Crick published The Astonishing Hypothesis, which advocated a research assault on this final mystery.75 Consciousness studies naturally overlap with neurology, where there have been many advances in identifying different structures of the brain, such as the language centres, and where MRI (magnetic resonance imaging) can show which areas are being used when people are merely thinking about the meaning of words. But the study of consciousness itself is still as much a matter for philosophers as for biologists. As John Maddox put it in his 1998 book, What Remains to Be Discovered, ‘No amount of introspection can enable a person to discover just which set of neurons in which part of his or her head is executing some thought-process. Such information seems to be hidden from the human user.’76
It should be said that some people think there is nothing to explain as regards consciousness. They believe it is an ‘emergent property’ that arises automatically when you put a ‘bag of neurons’ together. Others think this view absurd. A good explanation of emergent properties is given by John Searle, Mills Professor of Philosophy at the University of California, Berkeley, using the example of the liquidity of water: the behaviour of H2O molecules explains liquidity, but the individual molecules are not liquid. At the moment, the problem with consciousness is that our understanding is so rudimentary that we don’t even know how to talk about it – even after the ‘Decade of the Brain,’ which the U.S. Congress designated as beginning on 1 January 1990.77 The decade brought many innovations and meetings that underlined the new fashion for consciousness studies. The first international symposium on the science of consciousness, for example, was held at the University of Arizona in Tucson in April 1994, attended by no fewer than a thousand delegates.78 In the same year the first issue of the Journal of Consciousness Studies was published, with a bibliography of more than 1,000 recent articles. At the same time a whole raft of books about consciousness appeared, of which the most important were: Neural Darwinism: The Theory of Neuronal Group Selection, by Gerald Edelman (1987); The Remembered Present: A Biological Theory of Consciousness, by Edelman (1989); The Emperor’s New Mind, by Roger Penrose (1989); The Problem of Consciousness, by Colin McGinn (1991); Consciousness Explained, by Daniel Dennett (1991); The Rediscovery of the Mind, by John Searle (1992); Bright Air, Brilliant Fire, by Edelman (1992); The Astonishing Hypothesis, by Francis Crick (1994); Shadows of the Mind: A Search for the Missing Science of Consciousness, by Roger Penrose (1994); and The Conscious Mind: In Search of a Fundamental Theory, by David Chalmers (1996). Other journals on consciousness were also started, and there were two international symposia on the subject at Jesus College, Cambridge, published as Nature’s Imagination (1994) and Consciousness and Human Identity (1998), both edited by John Cornwell.
Thus consciousness has been very much the flavour of the decade, and it is fair to say that those involved in the subject fall into four camps. There are those, like the British philosopher Colin McGinn, who argue that consciousness is resistant to explanation in principle and for all time.79 Philosophers we have met before, such as Thomas Nagel and Hilary Putnam, also add that at present (and maybe for all time) science cannot account for qualia, the first-person phenomenal experience that we understand as consciousness. Then there are two types of reductionist. Those like Daniel Dennett, who claim not only that consciousness can be explained by science but that the construction of an artificially intelligent machine that will be conscious is not far off, may be called the ‘hard’ reductionists.80 The ‘soft’ reductionists, typified by John Searle, believe that consciousness does depend on the physical properties of the brain but think we are nowhere near working out how these processes operate, and they dismiss the very idea that machines will ever be conscious.81 Finally, there are those, like Roger Penrose, who believe that a new kind of dualism is needed, that in effect a whole new set of physical laws may apply inside the brain, accounting for consciousness.82 Penrose’s particular contribution is the suggestion that quantum effects operate within tiny structures in the nerve cells of the brain, known as microtubules, to produce, in some as yet unspecified way, the phenomena we recognise as consciousness.83 Penrose actually thinks that we live in three worlds – the physical, the mental, and the mathematical: ‘The physical world grounds the mental world, which in turn grounds the mathematical world and the mathematical world is the ground of the physical world and so on around the circle.’84 Many people who find this tantalising nonetheless don’t feel Penrose has proved anything. His speculation is enticing and original, but it is still speculation.
Instead, it is the two forms of reductionism that in the present climate attract most interest. For people like Dennett, human consciousness and identity arise from the narrative of their lives, and this can be related to specific brain states. For example, there is growing evidence that the ability to ‘apply intentional predicates to other people is a human universal’ and is associated with a specific area of the brain (the orbitofrontal cortex); in certain states of autism, this ability is defective. There is also evidence that the blood supply to the orbitofrontal cortex increases when people ‘process’ intentional verbs as opposed to non-intentional ones, and that damage to this area of the brain can lead to a failure to introspect.85 Suggestive as this is, it is also the case that the microanatomy of the brain varies quite considerably from individual to individual, and that a particular phenomenal experience is represented at several different points in the brain, which clearly require integration. Any ‘deep’ patterns relating experience to brain activity have yet to be discovered, and seem to be a long way off, though this is still the most likely way forward.
A related approach – perhaps to be expected, given other developments in recent years – is to look at the brain and consciousness in a Darwinian light. In what sense is consciousness adaptive? This approach has produced two views. One is that the brain was in effect ‘jerry-built’ in evolution to accomplish very many and very different tasks. On this view, it comprises at base three organs: a reptilian core (the seat of our basic drives); a palaeomammalian layer, which produces such things as affection for offspring; and a neomammalian brain, the seat of reasoning, language, and other ‘higher functions.’86 The second view is that throughout evolution (and throughout our bodies) there have been emergent properties: there is always a biochemical explanation underlying a physiological phenomenon – the sodium/potassium flux across a membrane, for example, just is the nerve’s action potential.87 In this sense, then, consciousness is nothing new in principle, even if, at the moment, we don’t fully understand it.
Studies of nerve action throughout the animal kingdom have also shown that nerves work by either firing or not firing; intensity is represented by the rate of firing – the more intense the stimulation, the faster the turning on and off of any particular nerve. This of course is very similar to the way computers work, in ‘bits’ of information, where everything is represented by a configuration of either 0s or 1s. The arrival of the concept of parallel processing in computing led the philosopher Daniel Dennett to consider whether an analogous process might happen in the brain between different evolutionary levels, giving rise to consciousness. Again, such reasoning, though tantalising, has not gone much further than preliminary exploration. At the moment, no one seems able to think of the next step.
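To make the analogy concrete, here is a minimal sketch, not drawn from any of the sources discussed above, of the rate-coding idea: a model ‘nerve’ either fires or does not in each small time step, and a graded stimulus intensity is carried only by how often it fires, much as streams of 0s and 1s carry information in a computer. The function names and numbers are invented purely for illustration.

```python
import random

# Illustrative sketch only: a rate-coded "nerve" fires (1) or does not (0)
# in each time step; stimulus intensity is conveyed solely by firing rate.
# All names and parameters here are hypothetical, chosen for clarity.

def spike_train(intensity, steps=1000, max_rate=0.9):
    """Return a list of 0s and 1s; firing probability scales with intensity (0..1)."""
    p = min(max(intensity, 0.0), 1.0) * max_rate
    return [1 if random.random() < p else 0 for _ in range(steps)]

def decoded_intensity(train, max_rate=0.9):
    """Estimate the original intensity from the firing rate alone."""
    return (sum(train) / len(train)) / max_rate

if __name__ == "__main__":
    for stimulus in (0.1, 0.5, 0.9):
        train = spike_train(stimulus)
        print(f"stimulus {stimulus:.1f} -> estimated {decoded_intensity(train):.2f}")
```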