
Science in the Soul


by Richard Dawkins


  Of the dictionary meanings of sensibility, I intend ‘discernment, awareness’ and ‘the capacity for responding to aesthetic stimuli’. One might have hoped that, by century’s end, science would have been incorporated into our culture, and our aesthetic sense have risen to meet the poetry of science. Without reviving the midcentury pessimism of C. P. Snow, I reluctantly find that, with only two years to run, these hopes are not realized. Science provokes more hostility than ever, sometimes with good reason, often from people who know nothing about it and use their hostility as an excuse not to learn. Depressingly many people still fall for the discredited cliché that scientific explanation corrodes poetic sensibility. Astrology books outsell astronomy. Television beats a path to the door of second-rate conjurors masquerading as psychics and clairvoyants. Cult leaders mine the millennium and find rich seams of gullibility: Heaven’s Gate, Waco, poison gas in the Tokyo underground. The biggest difference from the last millennium is that folk Christianity has been joined by folk science fiction.

  It should have been so different. The previous millennium, there was some excuse. In 1066, if only with hindsight, Halley’s Comet could forebode Hastings, sealing Harold’s fate and Duke William’s victory. Comet Hale-Bopp in 1997 should have been different. Why do we feel gratitude when a newspaper astrologer reassures his readers that Hale-Bopp was not directly responsible for Princess Diana’s death? And what is going on when thirty-nine people, driven by a theology compounded of Star Trek and the Book of Revelation, commit collective suicide, neatly dressed and with overnight bags packed by their sides, because they all believed that Hale-Bopp was accompanied by a spaceship come to ‘raise them to a new plane of existence’? Incidentally, the same Heaven’s Gate commune had ordered an astronomical telescope to look at Hale-Bopp. They sent it back when it came, because it was obviously defective: it failed to show the accompanying spaceship.

  Hijacking by pseudoscience and bad science fiction is a threat to our legitimate sense of wonder. Hostility from academics sophisticated in fashionable disciplines is another, and I shall return to this. Populist ‘dumbing down’ is a third. The ‘public understanding of science’ movement, provoked in America by Sputnik and driven in Britain by alarm over a decline in science applicants at universities, is going demotic. A spate of ‘Science Fortnights’ and the like betrays a desperate anxiety among scientists to be loved. Whacky ‘personalities’, with funny hats and larky voices, perform explosions and funky tricks to show that science is fun, fun, fun.

  I recently attended a briefing session urging scientists to put on ‘events’ in shopping malls, designed to lure people into the joys of science. We were advised to do nothing that might conceivably be a ‘turn-off’. Always make your science ‘relevant’ to ordinary people – to what goes on in their own kitchen or bathroom. If possible, choose experimental materials that your audience can eat at the end. At the last event organized by the speaker himself, the scientific feat that really grabbed attention was the urinal which automatically flushed as soon as you stepped away. The very word science is best avoided, because ‘ordinary people’ find it threatening.*2

  When I protest, I am rebuked for my ‘elitism’. A terrible word, but maybe not such a terrible thing? There’s a great difference between an exclusive snobbery, which no one should condone, and a striving to help people raise their game and swell the elite. A calculated dumbing down is the worst, condescending and patronizing. When I said this in a recent lecture in the United States, a questioner at the end, no doubt with a warm glow in his white male heart, had the remarkable cheek to suggest that dumbing down might be especially necessary to bring ‘minorities and women’ to science.

  I worry that to promote science as all larky and easy is to store up trouble for the future. Recruiting advertisements for the army don’t promise a picnic, for the same reason. Real science can be hard but, like classical literature or playing the violin, worth the struggle. If children are lured into science, or any other worthwhile occupation, by the promise of easy frolics, what happens when they finally confront the reality? ‘Fun’ sends the wrong signals and might attract recruits for the wrong reasons.

  Literary studies are at risk of becoming similarly undermined. Idle students are seduced into a debased ‘Cultural Studies’, where they will spend their time ‘deconstructing’ soap operas, tabloid princesses and tellytubbies. Science, like proper literary studies, can be hard and challenging but science is – again like proper literary studies – wonderful. Science is also useful; but useful is not all it is. Science can pay its way but, like great art, it shouldn’t have to. And we shouldn’t need whacky personalities and explosions to persuade us of the value of a life spent finding out why we have life in the first place.

  Perhaps I’m being too negative, but there are times when a pendulum has swung too far and needs a push in the other direction. Certainly, practical demonstrations can make ideas vivid and preserve them in the mind. From Michael Faraday’s Royal Institution Christmas Lectures to Richard Gregory’s Bristol Exploratory, children have been excited by hands-on experience of true science. I was myself honoured to give the Christmas Lectures, in their modern televised form, with plenty of hands-on demonstrations. Faraday never dumbed down. I am attacking only the kind of populist whoring that defiles the wonder of science.

  Annually in London there is a large dinner, at which prizes for the year’s best science books are presented. One prize is for children’s science books, and it recently went to a book about insects and other so-called ‘ugly bugs’. Such language is not best calculated to arouse the poetic sense of wonder, but let that pass. Harder to forgive were the antics of the chair of the judges, a well-known television personality (who had credentials to present real science, before she sold out to ‘paranormal’ television). Squeaking with game-show levity, she incited the audience to join her in repeated choruses of audible grimaces at the contemplation of the horrible ‘ugly bugs’. ‘Eeeuurrrgh! Yuck! Yeeyuck! Eeeeeuurrrgh!’ That kind of vulgarity demeans the wonder of science, and risks ‘turning off’ the very people best qualified to appreciate it and inspire others: real poets and true scholars of literature.

  The true poetry of science, especially twentieth-century science, led the late Carl Sagan to ask the following acute question.

  How is it that hardly any major religion has looked at science and concluded, ‘This is better than we thought! The Universe is much bigger than our prophets said, grander, more subtle, more elegant’? Instead they say, ‘No, no, no! My god is a little god, and I want him to stay that way.’ A religion, old or new, that stressed the magnificence of the Universe as revealed by modern science might be able to draw forth reserves of reverence and awe hardly tapped by the conventional faiths.

  Given a hundred clones of Carl Sagan, we might have some hope for the next century. Meanwhile, in its closing years, the twentieth must be rated a disappointment as far as public understanding of science is concerned, while being a spectacular and unprecedented success with respect to scientific achievements themselves.*3

  What if we let our sensibility play over the whole of twentieth-century science? Is it possible to pick out a theme, a scientific leitmotif? My best candidate comes nowhere near doing justice to the richness on offer. The twentieth is the Digital Century. Digital discontinuity pervades the engineering of our time, but there is a sense in which it spills over into the biology and perhaps even the physics of our century.

  The opposite of digital is analogue. When the Spanish Armada was expected, a signalling system was devised to spread the news across southern England. Bonfires were set on a chain of hilltops. When any coastal observer spotted the Armada he was to light his fire. It would be seen by neighbouring observers, their fires would be lit, and a wave of beacons would spread the news at great speed far across the coastal counties.

  How could we adapt the bonfire telegraph to convey more information? Not just ‘The Spanish are here’ but, say, the size of their fleet? Here’s one way. Make your bonfire’s size proportional to the size of the fleet. This is an analogue code. Clearly, inaccuracies would be cumulative. So, by the time the message reached the other side of the kingdom, the information about fleet size would have degraded to nothing. This is a general problem with analogue codes.

  But now here’s a simple digital code. Never mind the size of the fire, just build any serviceable blaze and place a large screen around it. Lift the screen and lower it again, to send the next hill a discrete flash. Repeat the flash a particular number of times, then lower the screen for a period of darkness. Repeat. The number of flashes per burst should be made proportional to the size of the fleet.

  This digital code has huge virtues over the previous analogue code. If a hilltop observer sees eight flashes, eight flashes is what he passes along to the next hill in the chain. The message has a good chance of spreading from Plymouth to Dover without serious degradation. The superior power of digital codes has been clearly understood only in the twentieth century.
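  The contrast is easy to make concrete. Here is a minimal sketch, with wholly invented figures for the fleet, the length of the chain and the observers’ errors: each analogue relay misjudges the fire it copies by a few per cent, so the errors compound, while each digital relay simply repeats the count of flashes it saw.

```python
import random

HILLS = 50          # hypothetical length of the beacon chain
FLEET_SIZE = 130    # illustrative number of ships, not a historical figure

def analogue_relay(signal, hills, noise=0.05):
    """Each observer rebuilds a fire proportional to the one he saw,
    misjudging its size by up to 5 per cent; the errors compound."""
    for _ in range(hills):
        signal *= random.uniform(1 - noise, 1 + noise)
    return signal

def digital_relay(flashes, hills, slip_prob=0.01):
    """Each observer counts discrete flashes and repeats the count;
    a rare miscount shifts it by one whole flash, but nothing drifts."""
    for _ in range(hills):
        if random.random() < slip_prob:
            flashes += random.choice([-1, 1])
    return flashes

random.seed(1588)
print("analogue estimate of fleet size:", round(analogue_relay(FLEET_SIZE, HILLS)))
print("digital estimate of fleet size: ", digital_relay(FLEET_SIZE, HILLS))
```

  Lengthen the chain, or make the observers cruder, and the analogue estimate wanders arbitrarily far from the truth; the digital count either arrives intact or is out by a whole flash or two.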

  Nerve cells are like Armada beacons. They ‘fire’. What travels along a nerve fibre is not electric current. It’s more like a trail of gunpowder laid along the ground. Ignite one end with a spark, and the fire fizzes along to the other end.

  We’ve long known that nerve fibres don’t use purely analogue codes. Theoretical calculations show that they couldn’t. Instead, they do something more like my flashing Armada beacons. Nerve impulses are trains of voltage spikes, repeated as in a machine gun. The difference between a strong message and a weak is not conveyed by the height of the spikes – that would be an analogue code and the message would be distorted out of existence. It is conveyed by the pattern of spikes, especially the firing rate of the machine gun. When you see yellow or hear middle C, when you smell turpentine or touch satin, when you feel hot or cold, the differences are being rendered, somewhere in your nervous system, by different rates of machine-gun pulses. The brain, if we could listen in, would sound like Passchendaele. In our meaning, it is digital. In a fuller sense it is still partly analogue: rate of firing is a continuously varying quantity. Fully digital codes, like Morse, or computer codes, where pulse patterns form a discrete alphabet, are even more reliable.
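  A rate code of this kind is simple enough to sketch. In the toy example below (my numbers, not the nervous system’s), a stimulus strength between 0 and 1 is encoded as the number of spikes fired in a half-second window, and decoded at the far end by counting them; the height of each spike never enters into it.

```python
import random

def encode(stimulus, window_ms=500, max_rate_hz=200):
    """Rate code: a stronger stimulus means more spikes per window.
    Spike timing is noisy, but only the count carries the message."""
    rate_hz = stimulus * max_rate_hz                 # stimulus lies in [0, 1]
    return sum(random.random() < rate_hz / 1000.0    # chance of a spike per millisecond
               for _ in range(window_ms))

def decode(n_spikes, window_ms=500, max_rate_hz=200):
    """Recover the stimulus from the firing rate alone."""
    return (n_spikes / (window_ms / 1000.0)) / max_rate_hz

random.seed(0)
for s in (0.1, 0.5, 0.9):
    spikes = encode(s)
    print(f"stimulus {s:.1f} -> {spikes:3d} spikes -> decoded {decode(spikes):.2f}")
```

  The pulses themselves are all-or-none, which is the digital part; the rate at which they arrive is a continuously varying quantity, which is why the code is only partly digital.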

  If nerves carry information about the world as it is now, genes are a coded description of the distant past. This insight follows from the ‘selfish gene’ view of evolution.

  Living organisms are beautifully built to survive and reproduce in their environments. Or that is what Darwinians say. But actually it isn’t quite right. They are beautifully built for survival in their ancestors’ environments. It is because their ancestors survived – long enough to pass on their DNA – that our modern animals are well built. For they inherit the very same successful DNA. The genes that survive down the generations add up, in effect, to a description of what it took to survive back then. And that is tantamount to saying that modern DNA is a coded description of the environments in which ancestors survived. A survival manual is handed down the generations. A Genetic Book of the Dead.*4

  Like the longest chain of beacon fires, the generations are uncountably many. No surprise, then, that genes are digital. Theoretically the ancient book of DNA could have been analogue. But, for the same reason as for our analogue Armada beacons, any ancient book copied and recopied in analogue language would degrade to meaninglessness in very few scribe generations. Fortunately, human writing is digital, at least in the sense we care about here. And the same is true of the DNA books of ancestral wisdom that we carry around inside us. Genes are digital, and in the full sense not shared by nerves.

  Digital genetics was discovered in the nineteenth century, but Gregor Mendel was ahead of his time and ignored. The only serious error in Darwin’s world-view derived from the conventional wisdom of his age, that inheritance was ‘blending’ – analogue genetics. It was dimly realized in Darwin’s time that analogue genetics was incompatible with his whole theory of natural selection. Even less clearly realized, it was also incompatible with obvious facts of inheritance.*5 The solution had to wait for the twentieth century, especially the neo-Darwinian synthesis of Ronald Fisher and others in the 1930s. The essential difference between classical Darwinism (which we now understand could not have worked) and neo-Darwinism (which does) is that digital genetics has replaced analogue.
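  The incompatibility is easy to demonstrate. Under blending inheritance the variation on which selection feeds is destroyed: if each offspring is simply the average of its two parents, the variance in the population is halved every generation. Under particulate, Mendelian inheritance the alleles are shuffled but never diluted, and the variance persists. The following toy model (with made-up numbers, not a serious population-genetic simulation) shows the difference after ten generations:

```python
import random
from statistics import pvariance

POP, GENERATIONS = 1000, 10

def blending_generation(traits):
    """'Analogue' inheritance: each child is the average of two random parents."""
    return [(random.choice(traits) + random.choice(traits)) / 2
            for _ in range(POP)]

def particulate_generation(genotypes):
    """'Digital' inheritance: each child draws one discrete allele
    from each of two random parents; nothing is blended away."""
    return [(random.choice(random.choice(genotypes)),
             random.choice(random.choice(genotypes)))
            for _ in range(POP)]

random.seed(42)
blended = [random.choice([0.0, 1.0]) for _ in range(POP)]
mendelian = [(random.choice([0, 1]), random.choice([0, 1])) for _ in range(POP)]

for _ in range(GENERATIONS):
    blended = blending_generation(blended)
    mendelian = particulate_generation(mendelian)

print("trait variance after blending inheritance: ", round(pvariance(blended), 4))
print("trait variance after Mendelian inheritance:", round(pvariance([a + b for a, b in mendelian]), 4))
```

  After ten generations the blended population is almost uniform, with nothing left for selection to work on, while the Mendelian population is as variable as it started.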

  But when it comes to digital genetics, Fisher and his colleagues of the synthesis didn’t know the half of it. Watson and Crick opened floodgates to what has been, by any standards, a spectacular intellectual revolution – even if Peter Medawar was going too far when he wrote, in his 1968 review of Watson’s The Double Helix, ‘It is simply not worth arguing with anyone so obtuse as not to realise that this complex of discoveries is the greatest achievement of science in the twentieth century.’ My misgiving about this engagingly calculated piece of arrogance is that I’d have a hard time defending it against a rival claim for, say, quantum theory or relativity.

  Watson and Crick’s was a digital revolution and it has gone exponential since 1953. You can read a gene today, write it out precisely on a piece of paper, put it in a library, then at any time in the future reconstitute that exact gene and put it back into an animal or plant. When the human genome project is completed, probably around 2003,*6 it will be possible to write the entire human genome on a couple of standard CDs, with enough space over for a large textbook of explanation. Send the boxed set of two CDs out into deep space and the human race can go extinct, happy in the knowledge that there is now at least a faint chance for an alien civilization to reconstitute a living human being. In one respect (though not in another), my speculation is at least more plausible than the plot of Jurassic Park. And both speculations rest upon the digital accuracy of DNA.
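  The ‘couple of CDs’ is not a rhetorical flourish; the arithmetic is easy to check. The human genome runs to roughly three billion base pairs, and since DNA has a four-letter alphabet each base needs only two bits. A minimal sketch, assuming a standard 700-megabyte CD:

```python
BASES = 3.1e9         # approximate length of the human genome, in base pairs
BITS_PER_BASE = 2     # four letters (A, C, G, T) fit in two bits each
CD_CAPACITY_MB = 700  # a standard data CD

genome_mb = BASES * BITS_PER_BASE / 8 / 1e6
print(f"raw human genome: about {genome_mb:.0f} MB, "
      f"or {genome_mb / CD_CAPACITY_MB:.1f} CDs")
```

  Even uncompressed, the genome overflows a single disc only slightly, leaving most of the second CD free for the ‘large textbook of explanation’.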

  Of course, digital theory has been most fully worked out not by neurobiologists or geneticists, but by electronics engineers. The digital telephones, televisions, music reproducers and microwave beams of the late twentieth century are incomparably faster and more accurate than their analogue forerunners, and this is critically because they are digital. Digital computers are the crowning achievement of this electronic age, and they are heavily implicated in telephone switching, satellite communications and data transmission of all kinds, including that phenomenon of the present decade, the World Wide Web. The late Christopher Evans summed up the speed of the twentieth-century digital revolution with a striking analogy to the car industry.

  Today’s car differs from those of the immediate post-war years on a number of counts…But suppose for a moment that the automobile industry had developed at the same rate as computers and over the same period: how much cheaper and more efficient would the current models be? If you have not already heard the analogy the answer is shattering. Today you would be able to buy a Rolls-Royce for £1.35, it would do three million miles to the gallon, and it would deliver enough power to drive the Queen Elizabeth II. And if you were interested in miniaturization, you could place half a dozen of them on a pinhead.

  It is computers that make us notice that the twentieth century is the digital century; lead us to spot the digital in genetics, neurobiology and – though here I lack the confidence of knowledge – physics.

  For it could be argued that quantum theory – the part of physics most distinctive of the twentieth century – is fundamentally digital. The Scottish chemist Graham Cairns-Smith tells how he was first exposed to this apparent graininess:

  I suppose I was about eight when my father told me that nobody knew what electricity was. I went to school the next day, I remember, and made this information generally available to my friends. It did not create the kind of sensation I had been banking on, although it caught the attention of one whose father worked at the local power station. His father actually made electricity so obviously he would know what it was. My friend promised to ask and report back. Well, eventually he did and I cannot say I was much impressed with the result. ‘Wee sandy stuff’ he said, rubbing his thumb and forefinger together to emphasise just how tiny the grains were. He seemed unable to elaborate further.

  The experimental predictions of quantum theory are upheld to the tenth place of decimals. Any theory with such a spectacular grasp on reality commands our respect. But whether we conclude that the universe itself is grainy – or that discontinuity is forced upon an underlying deep continuity only when we try to measure it – I do not know; and physicists will sense that the matter is too deep for me.

  It should not be necessary to add that this gives me no satisfaction. But sadly there are literary and journalistic circles in which ignorance or incomprehension of science is boasted with pride and even glee. I have made the point often enough to sound plaintive. So let me quote, instead, one of the most justly respected commentators on today’s culture, Melvyn Bragg:

  There are still those who are affected enough to say they know nothing about the sciences as if this somehow makes them superior. What it makes them is rather silly, and it puts them at the fag end of that tired old British tradition of intellectual snobbery which considers all knowledge, especially science, as ‘trade’.

 
