Science in the Soul

by Richard Dawkins

Sir Peter Medawar, that swashbuckling Nobel Prize-winner whom I’ve already quoted, said something similar about ‘trade’:

  It is said that in ancient China the mandarins allowed their fingernails – or anyhow one of them – to grow so extremely long as manifestly to unfit them for any manual activity, thus making it perfectly clear to all that they were creatures too refined and elevated ever to engage in such employments. It is a gesture that cannot but appeal to the English, who surpass all other nations in snobbishness; our fastidious distaste for the applied sciences and for trade has played a large part in bringing England to the position in the world which she occupies today.

  So, if I have difficulties with quantum theory, it is not for want of trying and certainly not a source of pride. As an evolutionist, I endorse Steven Pinker’s view, that Darwinian natural selection has designed our brains to understand the slow dynamics of large objects on the African savannahs. Perhaps somebody should devise a computer game in which bats and balls behave according to a screened illusion of quantum dynamics. Children brought up on such a game might find modern physics no more impenetrable than we find the concept of stalking a wildebeest.

  Personal uncertainty about the uncertainty principle reminds me of another hallmark that will be alleged for twentieth-century science. This is the century, it will be claimed, in which the deterministic confidence of the previous one was shattered. Partly by quantum theory. Partly by chaos (in the trendy, not the ordinary language, meaning). And partly by relativism (cultural relativism, not the sensible, Einsteinian meaning).

  Quantum uncertainty and chaos theory have had deplorable effects upon popular culture, much to the annoyance of genuine aficionados. Both are regularly exploited by obscurantists, ranging from professional quacks to daffy New Agers. In America, the self-help ‘healing’ industry coins millions, and it has not been slow to cash in on quantum theory’s formidable talent to bewilder. This has been documented by the American physicist Victor Stenger. One well-heeled healer wrote a string of best-selling books on what he calls ‘Quantum Healing’. Another book in my possession has sections on quantum psychology, quantum responsibility, quantum morality, quantum aesthetics, quantum immortality and quantum theology.

  Chaos theory, a more recent invention, is equally fertile ground for those with a bent for abusing sense. It is unfortunately named, for ‘chaos’ implies randomness. Chaos in the technical sense is not random at all. It is completely determined, but it depends hugely, in strangely hard-to-predict ways, on tiny differences in initial conditions. Undoubtedly it is mathematically interesting. If it impinges on the real world, it would rule out ultimate prediction. If the weather is technically chaotic, weather forecasting in detail becomes impossible. Major events like hurricanes might be determined by tiny causes in the past – such as the now proverbial flap of a butterfly’s wing. This does not mean that you can flap the equivalent of a wing and hope to generate a hurricane. As the physicist Robert Park says, this is ‘a total misunderstanding of what chaos is about…while the flapping of a butterfly’s wings might conceivably trigger a hurricane, killing butterflies is unlikely to reduce the incidence of hurricanes’.
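
  The point is easy to demonstrate. The sketch below is my own illustration, not anything from the text: it runs the logistic map, a textbook example of deterministic chaos, on two starting values that differ by one part in a billion. Exactly the same rule is applied to both, yet within a few dozen steps they disagree completely.

```python
# Sensitive dependence on initial conditions: the logistic map
# x_{n+1} = r * x_n * (1 - x_n) is fully deterministic, yet at
# r = 4 two nearly identical starting points soon part company.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.300000000, 0.300000001   # differ by one part in a billion
for step in range(1, 51):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```

  Rerun the program and you get identical output: determinism survives. What fails is prediction, because no measurement of a real butterfly, or a real atmosphere, is accurate to infinitely many decimal places.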

  Quantum theory and chaos theory, each in its own peculiar ways, may call into question the predictability of the universe, in deep principle. This could be seen as a retreat from nineteenth-century confidence. But nobody really thought such fine details would ever be predicted in practice, anyway. The most confident determinist would always have admitted that, in practice, the sheer complexity of interacting causes would defeat accurate prediction of weather or turbulence. So chaos doesn’t make a lot of difference in practice. Conversely, quantum events are statistically smothered, and massively so, in most realms that impinge on us. So the possibility of prediction is, for practical purposes, restored.

  In the late twentieth century, prediction of future events in practice has never been more confident or more accurate. This is dramatic in the feats of space engineers. Previous centuries could predict the return of Halley’s Comet. Twentieth-century science can hurl a projectile along the right trajectory to intercept it, precisely computing and exploiting the gravitational slings of the solar system.*7 Quantum theory itself, whatever the indeterminacy at its heart, is spectacularly accurate in its experimental predictions. The late Richard Feynman assessed this accuracy as equivalent to knowing the distance between New York and Los Angeles to the width of one human hair. Here is no licence for anything-goes intellectual flappers, with their quantum theology and quantum you-name-it.
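
  The analogy is easy to check on the back of an envelope. The figures below are rough assumptions of my own – about 3,940 kilometres between the two cities, and a hair about a tenth of a millimetre wide – not numbers from the text.

```python
# Back-of-envelope check of Feynman's analogy:
# relative precision = hair width / (New York to Los Angeles).
# Both figures are rough assumptions, not from the text.

distance_m = 3.94e6    # New York to Los Angeles, ~3,940 km
hair_m = 1.0e-4        # width of a human hair, ~0.1 mm

print(f"relative precision ~ {hair_m / distance_m:.1e}")   # ~2.5e-11
```

  A few parts in a hundred billion: that is the scale of precision Feynman had in mind.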

  Cultural relativism is the most pernicious of these myths of twentieth-century retreat from Victorian certainty. A modish fad sees science as only one of many cultural myths, no more true or valid than the myths of any other culture. Many in the academic community have discovered a new form of anti-scientific rhetoric, sometimes called the ‘postmodern critique’ of science. The most thorough whistle-blowing on this kind of thing is Paul Gross and Norman Levitt’s splendid book Higher Superstition: the academic left and its quarrels with science. The American anthropologist Matt Cartmill sums up the basic credo:

  Anybody who claims to have objective knowledge about anything is trying to control and dominate the rest of us…There are no objective facts. All supposed ‘facts’ are contaminated with theories, and all theories are infested with moral and political doctrines…Therefore, when some guy in a lab coat tells you that such and such is an objective fact…he must have a political agenda up his starched white sleeve.

  There are even a few, but very vocal, fifth columnists within science itself who hold exactly these views, and use them to waste the time of the rest of us.

  Cartmill’s thesis is that there is an unexpected and pernicious alliance between the know-nothing fundamentalist religious right and the sophisticated academic left. A bizarre manifestation of the alliance is joint opposition to the theory of evolution. The opposition of the fundamentalists is obvious. That of the left is a compound of hostility to science in general, of ‘respect’ for tribal creation myths, and various political agendas. Both these strange bedfellows share a concern for ‘human dignity’ and take offence at treating humans as ‘animals’. Moreover, in Cartmill’s words,

  Both camps believe that the big truths about the world are moral truths. They view the universe in terms of good and evil, not truth and falsehood. The first question they ask about any supposed fact is whether it serves the cause of righteousness.

  And there is a feminist angle, which saddens me, for I am sympathetic to true feminism.

  Instead of exhorting young women to prepare for a variety of technical subjects by studying science, logic, and mathematics, Women’s Studies students are now being taught that logic is a tool of domination…the standard norms and methods of scientific inquiry are sexist because they are incompatible with ‘women’s ways of knowing’.*8 The authors of the prize-winning book with this title report that the majority of the women they interviewed fell into the category of ‘subjective knowers’, characterized by a ‘passionate rejection of science and scientists’. These ‘subjectivist’ women see the methods of logic, analysis and abstraction as ‘alien territory belonging to men’ and ‘value intuition as a safer and more fruitful approach to truth’.

  That was a quotation from the historian and philosopher of science Noretta Koertge, who is understandably worried about a subversion of feminism which could have a malign influence upon women’s education. Indeed, there is an ugly, hectoring streak in this kind of thinking. Barbara Ehrenreich and Janet McIntosh witnessed a woman psychologist speaking at an interdisciplinary conference. Various members of the audience attacked her use of the ‘oppressive, sexist, imperialist, and capitalist scientific method’. The psychologist tried to defend science by pointing to its great discoveries – for example, DNA. The retort came back: ‘You believe in DNA?’

  Fortunately, there are still many intelligent young women prepared to enter a scientific career, and I should like to pay tribute to their courage in the face of such bullying intimidation.*9

  I have come so far with scarcely a mention of Charles Darwin. His life spanned most of the nineteenth century, and he died with every right to be satisfied that he had cured humanity of its greatest and grandest illusion. Darwin brought life itself within the pale of the explicable. No longer a baffling mystery demanding supernatural explanation, life, with the complexity and elegance that define it, grows and gradually emerges, by easily understood rules, from simple beginnings. Darwin’s legacy to the twentieth century was to demystify the greatest mystery of all.

  Would Darwin be pleased with our stewardship of that legacy, and with what we are now in a position to pass to the twenty-first century? I think he would feel an odd mixture of exhilaration and exasperation. Exhilaration at the detailed knowledge, the comprehensiveness of understanding, that science can now offer, and the polish with which his own theory is being brought to fulfilment. Exasperation at the ignorant suspicion of science, and the air-headed superstition, that still persist.

  Exasperation is too weak a word. Darwin might justifiably be saddened, given our huge advantages over him and his contemporaries, at how little we seem to have done to deploy our superior knowledge in our culture. Late twentieth-century civilization, Darwin would be dismayed to note, though imbued with and surrounded by the products and advantages of science, has yet to draw science into its sensibility. Is there even a sense in which we have slipped backwards since Darwin’s co-discoverer, Alfred Russel Wallace, wrote The Wonderful Century, a glowing scientific retrospective on his era?

  Perhaps there was undue complacency in late nineteenth-century science, about how much had been achieved and how little more advancement could be expected. William Thomson, the first Lord Kelvin, President of the Royal Society, pioneered the transatlantic cable – symbol of Victorian progress – and also the second law of thermodynamics – C. P. Snow’s litmus of scientific literacy. Kelvin is credited with the following three confident predictions: ‘Radio has no future.’ ‘Heavier than air flying machines are impossible.’ ‘X-rays will prove to be a hoax.’

  Kelvin also gave Darwin a lot of grief by ‘proving’, using all the prestige of the senior science of physics, that the sun was too young to have allowed time for evolution. Kelvin, in effect, said: ‘Physics argues against evolution, so your biology must be wrong.’ Darwin could have retorted: ‘Biology shows that evolution is a fact, so your physics must be wrong.’ Instead, he bowed to the prevailing assumption that physics automatically trumps biology, and fretted. Twentieth-century physics, of course, showed Kelvin wrong by powers of ten. But Darwin did not live to see his vindication,*10 and he never had the confidence to tell the senior physicist of his day where to get off.

  In my attacks on millenarian superstition, I must beware of Kelvinian overconfidence. Undoubtedly there is much that we still don’t know. Part of our legacy to the twenty-first century must be unanswered questions, and some of them are big ones. The science of any age must prepare to be superseded. It would be arrogant and rash to claim our present knowledge as all there is to know. Today’s commonplaces, such as mobile telephones, would have seemed to previous ages pure magic. And that should be our warning. Arthur C. Clarke, distinguished novelist and evangelist for the limitless power of science, has said: ‘Any sufficiently advanced technology is indistinguishable from magic.’ This is Clarke’s Third Law.

  Maybe, some day in the future, physicists will fully understand gravity, and build an anti-gravity machine. Levitating people may one day become as commonplace to our descendants as jet planes are to us. So, if someone claims to have witnessed a magic carpet zooming over the minarets, should we believe him, on the grounds that those of our ancestors who doubted the possibility of radio turned out to be wrong? No, of course not. But why not?

  Clarke’s Third Law doesn’t work in reverse. Given that ‘any sufficiently advanced technology is indistinguishable from magic’, it does not follow that ‘any magical claim that anybody may make at any time is indistinguishable from a technological advance that will come some time in the future’.

  Yes, there have been occasions when authoritative sceptics have come away with egg on their pontificating faces. But a far greater number of magical claims have been made and never vindicated. A few things that would surprise us today will come true in the future. But lots and lots of things will not come true in the future. History suggests that the very surprising things that do come true are in a minority. The trick is to sort them out from the rubbish – from claims that will forever remain in the realm of fiction and magic.

  It is right that, at the end of our century, we should show the humility that Kelvin, at the end of his, did not. But it is also right to acknowledge all that we have learned during the past hundred years. The digital century was the best I could come up with, as a single theme. But it covers only a fraction of what twentieth-century science will bequeath. We now know, as Darwin and Kelvin did not, how old the world is. About 4.6 billion years. We understand what Alfred Wegener was ridiculed for suggesting: that the shape of geography has not always been the same. South America not only looks as if it might jigsaw neatly under the bulge of Africa. It once did exactly that, until they split apart some 125 million years ago. Madagascar once touched Africa on one side and India on the other. That was before India set off across the widening ocean and crashed into Asia to raise the Himalayas. The map of the world’s continents has a time dimension, and we who are privileged to live in the plate tectonic age know exactly how it has changed, when, and why.

  We know roughly how old the universe is, and, indeed, that it has an age, which is the same as the age of time itself, and less than twenty billion years. Having begun as a singularity with huge mass and temperature and very small volume, the universe has been expanding ever since. The twenty-first century will probably settle the question whether the expansion is to go on for ever, or go into reverse. The matter in the cosmos is not homogeneous, but is gathered into some hundred billion galaxies, each averaging a hundred billion stars. We can read the composition of any star in some detail, by spreading its light in a glorified rainbow. Among the stars, our sun is generally unremarkable. It is unremarkable, too, in having planets in orbit, as we know from detecting tiny rhythmic shifts in the spectra of other stars.*11 There is no direct evidence that any other planets house life. If they do, such inhabited islands may be so scattered as to make it unlikely that one will ever encounter another.
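
  Those rhythmic shifts are Doppler shifts. An orbiting planet tugs its star to and fro, and the star’s spectral lines shift by the fraction Δλ/λ = v/c as the star approaches and recedes. A minimal sketch, using illustrative numbers of my own choosing – a wobble of about 12 metres per second, roughly what Jupiter imposes on the sun – shows how small the signal is.

```python
# Radial-velocity (Doppler) method: a stellar wobble of speed v
# shifts a spectral line of wavelength lam by lam * v / c.
# The numbers below are illustrative assumptions, not from the text.

C = 2.998e8    # speed of light, metres per second

def doppler_shift(lam_nm, v_ms):
    """Wavelength shift, in nm, of a line at lam_nm for a wobble v_ms."""
    return lam_nm * v_ms / C

v = 12.0       # ~Jupiter's tug on the sun, metres per second
line = 656.3   # the hydrogen-alpha line, nm
print(f"shift: {doppler_shift(line, v):.2e} nm")   # ~2.6e-05 nm
```

  A shift of a few hundred-thousandths of a nanometre, repeating with the planet’s orbital period, is what the spectrographs must pick out.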

  We know in some detail the principles governing the evolution of our own island of life. It is a fair bet that the most fundamental principle – Darwinian natural selection – underlies, in some form, other islands of life, if any there be. We know that our kind of life is built of cells, where a cell is either a bacterium or a colony of bacteria. The detailed mechanics of our kind of life depend upon the near-infinite variety of shapes assumed by a special class of molecules called proteins. We know that those all-important three-dimensional shapes are exactly specified by a one-dimensional code, the genetic code, carried by DNA molecules which are replicated through geological time. We understand why there are so many different species, although we don’t know how many. We cannot predict in detail how evolution will go in the future, but we can predict the general patterns that are to be expected.
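
  That one-dimensional code is literal enough to execute. The toy fragment below is my own illustration, with only a handful of the sixty-four codons filled in: it reads a DNA string three letters at a time and maps each triplet to an amino acid, which is in essence what the cell’s translation machinery does. The folding of the resulting chain into its all-important three-dimensional shape is the part no lookup table captures.

```python
# Toy illustration of the genetic code: a one-dimensional string of
# DNA letters specifies a chain of amino acids, one codon (three
# letters) at a time. Only a few of the 64 codons are shown here.

CODON_TABLE = {
    "ATG": "Met", "TGG": "Trp", "TTT": "Phe", "GGC": "Gly",
    "GAA": "Glu", "AAA": "Lys", "TAA": "STOP",
}

def translate(dna):
    """Read codons left to right until a stop codon or the end."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return "-".join(protein)

print(translate("ATGTTTGGCGAATAA"))   # Met-Phe-Gly-Glu
```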

  Among the unsolved problems we shall bequeath to our successors, physicists such as Steven Weinberg will point to their ‘dreams of a final theory’, otherwise known as the grand unified theory (GUT) or theory of everything (TOE). Theorists differ about whether it will ever be attained. Those who think it will would probably date this scientific epiphany somewhere in the twenty-first century. Physicists famously resort to religious language when discussing such deep matters. Some of them really mean it. The others are at risk of being taken literally, when really they intend no more than I do when I say ‘God knows’ to mean that I don’t.

  Biologists will reach their grail of writing down the human genome, early in the next century. They will then discover that it is not so final as some once hoped. The human embryo project – working out how the genes interact with their environments, including each other, to build a body – may take at least as long to complete. But it too will probably be finished during the twenty-first century, and artificial wombs built, if these should be thought desirable.

  I am less confident about what is for me, as for most biologists, the outstanding scientific problem that remains: the question of how the human brain works, especially the nature of subjective consciousness. The last decade of this century has seen a flurry of big guns take aim at it, including Francis Crick no less, and Daniel Dennett, Steven Pinker and Sir Roger Penrose. It is a big, profound problem, worthy of minds like these. Obviously I have no solution. If I had, I’d deserve a Nobel Prize. It isn’t even clear what kind of a problem it is, and therefore what kind of a brilliant idea would constitute a solution. Some people think the problem of consciousness an illusion: there’s nobody home, and no problem to be solved. But before Darwin solved the riddle of life’s provenance, in the last century, I don’t think anybody had clearly posed what sort of a problem it was. It was only after Darwin had solved it that most people realized what it had been in the first place. I do not know whether consciousness will prove to be a big problem, solved by a genius, or will fritter unsatisfactorily away into a series of small problems and non-problems.

  I am by no means confident that the twenty-first century will solve the human mind. But if it does, there may be an additional by-product. Our successors may then be in a position to understand the paradox of twentieth-century science. On the one hand, our century arguably added as much new knowledge to the human store as all previous centuries put together; while on the other hand the twentieth century ended with approximately the same level of supernatural credulity as the nineteenth, and rather more outright hostility to science. With hope, if not with confidence, I look forward to the twenty-first century and what it may teach us.

 
