by C. P. Snow
He is a citizen of the world and has devoted much of his influence and energy to help educate young scientists who come from provenances as underprivileged as his own. To this end he established the International Centre for Theoretical Physics, an institute which has its home in Trieste, thanks to the goodwill of the Italian authorities, and Salam has vigorously commuted between the Adriatic and his home and professorship at Imperial College in London. The Centre has had many successes in furthering the progress of scientists of all races. Few men have done more good than Salam for the talented poor.
It happens that Salam is a devout Moslem, believing passionately in the highest axiom of Islam, the essential brotherhood of man. It is good for us to be reminded that men like Salam can translate this axiom into action.
Incidentally, Salam is probably the only committed religious believer, in the doctrinal sense, among all the great theoreticians. Many have had deep religious feelings, as Einstein had, but couldn’t accept any creed. Most were reverent in the face of nature, had their own personal morality, sometimes a piety towards the religion in which they had been brought up but in which they had ceased to find meaning. But Salam is, in the full sense, a religious man.
10: A Different Harvest
THE story of particle physics continues unabated at the present day. In spite of the 1945 gloom, the ‘beautiful subject’ has gone on producing surprises and consequences. But there have been other developments, outside the mainstream of the nuclear explorations, which deserve treatment on their own, not only for their intrinsic interest, but because they may prove to affect human lives more than any of the military applications of nuclear energy. Although most of these had roots dating back before the war, it is only in the last three decades that they have come to affect our everyday lives. All that can be done here is to make a number of perfunctory notes.
In the 1939–45 war a high proportion of physicists (in Britain something like 80 per cent) detached themselves from their own researches and were diverted to radar. Many of them adapted themselves with ease. If you could do one kind of physics, you could do another. They learned about the possibilities of electronics. The same transmutation happened in America, and to an extent in Germany – though, for reasons which are still not completely understood, the German use of scientists was nothing like so thorough as their use in the English-speaking world. In Britain, this concentration on electronics was good war-making. For years, it seemed the only salvation.
When the war ended, it was obvious to many good scientists that the same process could be used the other way round. If you had learned about electronics, you could take it straight into pure science. That was how the British launched into a new domain, which we now know as radio astronomy. In wartime, established scientists like Martin Ryle and Bernard Lovell, and others slightly junior – J S Hey, Antony Hewish, and a good many more – had studied the detection of radio waves. (They had all made important contributions to the development of radar weapons.) It was natural to turn that kind of technique and thinking to radio waves from the cosmos. A flood of discoveries followed, right up to the present day. The techniques of radio astronomy were picked up all over the world.
The meaning of some of the discoveries was argued about with considerable passion, as was anything to do with cosmology. Those controversies will go rumbling on. But it became apparent that, just as the microscopic universe of sub-atomic particles was proving weirder than anyone had imagined, so was the macrocosmic universe of stars and galaxies.
Some of the thoughts about the microcosm brought illumination to the macrocosm, and the reverse. The annihilation of matter, the identity of matter and energy, the existence of anti-matter, had all had a conceptual pre-figuring in the equations of Einstein, Dirac and other theoreticians of the immediate past. Now, in some interpretations of the astronomical data, one can see them happen. The only way to explain the phenomenal energy of the quasars is by matter turning into energy. And Einstein had predicted that in compact objects gravity could become so strong that nothing could escape: astronomers now have evidence that such black holes really do exist.
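Both relations can be written in a line. Einstein’s equivalence of mass and energy, and the Schwarzschild radius within which gravity lets nothing, not even light, escape, are

$$E = mc^2, \qquad r_s = \frac{2GM}{c^2}.$$

For the sun, $r_s$ comes to about three kilometres; squeeze a solar mass inside that radius and a black hole is the result.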
These findings are going to give a sense of wonder for a long time to come. To some of the speculations, there may never be an answer. Pessimistic scientists have been known to say that not only is the universe weirder than we can now understand, it may be weirder than we shall ever understand. That would mean that there are kinds of comprehension which we can’t transcend. That, however, remains very much a minority view.
In our immediate period, say 1950–80, physicists have also made sensational inroads into biological problems. Crystallography had always been off the mainstream of modern physics. It deals, not with the structure of nuclei and atoms, but with the geography of atoms – the position of atoms in solid matter and, more recently and with far more difficulty, in liquids also. Crystallography is not only an elegant study, but one with multifarious uses. However, the nuclear scientists didn’t consider that it touched the core of physics. Rutherford didn’t permit it to enter the Cavendish. It might be slightly more respectable than spectroscopy, Kapitsa remarked, but both were like putting things into boxes, or perhaps a form of stamp-collecting.
Nevertheless, W L Bragg (later Sir Lawrence), who everyone agreed was a scientist of the highest class, had devoted his scientific life to the subject. So had another man of great gifts, J D Bernal. Although chemists and geologists had been looking at the exterior forms of crystals for centuries, Bragg and Bernal could bring a twentieth-century technique to bear on the fundamental atomic structure of crystals. The key was X-rays. X-rays are radiation, like light, but with a much shorter wavelength. X-ray wavelengths – at around a ten-thousand-millionth of a metre – are very similar to the spacing between atoms in a crystal. When X-rays shine on a crystal, they penetrate it. But some are reflected back from the different layers and rows of atoms, and the reflected pattern gives clues to the atomic structure within the crystal. The patterns are not easy to read. Reading them requires experienced judgement, or the complex computer programmes that have only become available in the past few years. But in principle all the information is there, cryptically, in the pattern of reflected X-rays.
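The rule that Bragg had worked out is disarmingly simple. X-rays reflected from successive planes of atoms, a spacing $d$ apart, reinforce one another only when the extra distance travelled is a whole number of wavelengths:

$$n\lambda = 2d\sin\theta.$$

Measure the angles $\theta$ at which the reflections come off strongly, and the spacings $d$ – the geography of the atoms – can be worked back from the pattern.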
As early as the 1930s, Bragg and Bernal and their colleagues were considering ways of applying X-ray crystallographic techniques to some of the crucial problems of biology, among them the structure of the genetic material, the molecules within the living cell that determine its structure, and pass on information so that its descendants are similarly constructed. At that time, the full range of crystallographic techniques was not ready for the purpose, but the intellectual foresight was.
By the 1950s, the techniques were ripe. Experimental results on DNA (deoxyribonucleic acid) – now known to be the genetic material – were accumulating. The scientists who interpreted those results, who showed that the DNA molecule consists of two strands twisted round each other – the double helix – were Francis Crick and James Watson. Watson’s The Double Helix is a brilliant book and has permanent value as showing that scientists are human, or, if you like, only too human. It was generously welcomed, and by Bragg himself with supreme magnanimity. Bragg, like Einstein and Bohr, was one of the saints of science. In cool retrospect, though, the book would be more acceptable if it showed more recognition of the cumulative nature of science. To repeat what has been said already, science is an edifice. To put in a brick, a scientist has to climb on the shoulders of other men, often greater men. Individuals, except for the odd anomaly who occurs once in a hundred years, don’t count all that much. Both Bragg and Bernal, who knew a lot about the history of science, would have accepted that without reserve. But if one is writing the history of a specific discovery at a specific time, as with DNA, it would be a distortion to leave those two out.
That said, it was a very great discovery, and will have, when the lesson has sunk in, perhaps in a generation or two, profound – and to many disturbing – human consequences. Human vanity will not be quite the same, nor some of the more ill-founded human hopes.
Francis Crick was a physicist by training, and had spent the war as a somewhat discontented member of one of the radar teams. Once released, he cast about for something worth doing. As it were not by second nature but by first, he had a deep sense of what was important and what might be soluble. Those two things are, of course, not the same, but a great scientist nosing his way into unexplored territory needs them both. Rutherford had that combination to the highest degree. A number of marvellously accomplished scientists haven’t had it at all. As an example, there is the sad life of Einstein’s closest friend, and the only one he turned to for criticism, Paul Ehrenfest. Ehrenfest was a brilliant theoretician, but all his contributions were to the more abstruse branches of physics. Crick does have the necessary combination, and it was the greatest of his gifts, although for his particular gamble he needed also a comprehensive intelligence and fighting spirit.
He didn’t know much crystallography, and never became a supreme practitioner as Bragg and Bernal were. He learned enough for his purposes. He didn’t know much biology but decided that he didn’t need to know much. Unravelling the secrets of DNA was a problem where it did more harm than good to be cluttered up with preconceptions. What he did know was that once the material structure of DNA was established by X-ray crystallography, then one ought to be able to make sense out of it.
Then Watson came along; he had another of the valuable gifts, an eye for a winner. Each had part of the story. The rest one can learn, or infer, from Watson’s book. There are some obvious points. Rosalind Franklin didn’t get a fair deal – the Nobel prize was shared between Crick, Watson and Maurice Wilkins, who had performed the vital X-ray experiments. Bernal used to say that the prize should have been split two ways, Crick and Franklin as one pair, Watson and Wilkins as the other. This would have had the advantage that Crick, good with women, would have been protective of Rosalind Franklin, who wasn’t easy to look after.
Another point is very clear – if Crick and Watson hadn’t got there, and published, it wouldn’t have been long, possibly only weeks, certainly not more than months, before others did. Linus Pauling in America was very nearly there, and one fresh look might have clinched it. It was a case even more striking than the Special Theory of Relativity. Several minds were converging on the same solution – and understanding DNA didn’t require as much conceptual apparatus as the Special Theory.
Yet there was no injustice about anything that accrued to Crick. His later work on the genetic code – the way that information is actually stored in the DNA molecule – was a feat of extreme intellectual skill and major significance. Here again there was a convergence: in the United States, Joshua Lederberg was reaching the same conclusions independently. Now that the instructions for life are understood, genetic engineering enables man to make new kinds of living cells and to produce huge quantities of useful drugs. ‘Biotechnology’ is becoming a major new industry. Philosophically, the ability to alter the basis of life at will may have even more effect. The meaning of this work hasn’t sunk into popular consciousness, even among intellectual persons, with anything like the rapidity of Darwin’s On the Origin of Species. In the long run it may do as much or more to alter men’s view of themselves.
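The arithmetic of the code is worth a line. DNA carries four kinds of base, and they are read in groups of three, so the number of possible three-letter words is

$$4^3 = 64$$

– comfortably more than the twenty amino acids that have to be specified. The surplus goes on redundancy, and on the punctuation marks that say ‘stop’.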
Those deeper effects, though, will have to wait until the twenty-first century. What will not have to wait until the twenty-first century, but is hitting the industrialized world here and now, is the recent domestication of electronics. For many years, it had been realized that there were great numbers of operations which men had to perform, mechanical, laborious, repetitive, which ought to be given over to machines. This was true of mental operations as well as physical. Routine calculations could be done faster and more efficiently by a mechanical process than by a human mind. Bold thinkers speculated that there could be mechanical memories, more comprehensive than human ones.
There was nothing wrong with the idea. The crippling difficulty was that no one could devise a machine for any such purpose which would work anything like fast enough. Charles Babbage, a fine Victorian mathematician with an inventive flair, actually thought out, and partly constructed, a machine we would now call a computer. In principle he was completely right. But his machine worked by mechanical energy, and that was too slow by many orders of magnitude.
Brilliant ideas have often had to wait for new techniques, but the reverse is also true: new techniques have often led to brilliant ideas. It was not until electrons, and electronic currents, began to be understood that there was any chance of a workable Babbage machine. It took a long time, technological ingenuity, concentration upon gadgets and, finally, the pressure of war. As a young man, Rutherford transmitted radio waves over a mile in Cambridge. But he immediately gave that up as a plaything, too remote from the heart of physics. It took an inventor like Marconi to persist, and make radio a commercial proposition. Improved electronic valves were an industrial development. Electronic circuits were to physicists a complicated study, but not profound – the kind of topic they shied from. To see their fundamental significance needed not only inventive ingenuity, but mathematical insight.
At last that happened, and very fast, during the 1939–45 war. Mainly for the purposes of cryptographic decoding, a secret incomparably better kept than the nuclear bomb, computers, primitive by today’s standards but just as functional, were being built. Some of the acutest minds in the Western world were at work: not world-comprehensive minds like Einstein’s, but minds with a peculiarly rare specific gift. There were a number. In America there was the one-time infant prodigy, John von Neumann, born in Budapest; in England the hero was Alan Turing, whose intellect did more practical service to the country than could be credited to most household names of that war. Turing was the nearest English approach to the great von Neumann.
From that time on it was beginning to be realized that computers were going to take over a good many aspects of workaday existence. In fact, there has been too much mystification about them. They can perform many tasks which human intelligence can’t: but they are of course useless without human intelligence. After all, they can always be unplugged. In memory storage, they can be given masses of facts which no human memory can retain, reproduce them when given the necessary instructions, do with precision what they are told. And yet, even there, they can’t have the fluidity and range of a decent human memory, for which, in many commonplace tasks as well as all creative ones, there is no substitute.
It is silly to be frightened by computers. Nevertheless, the social impact is bound to be cumulative. That is already evident all over the industrialized world, in North America, Europe, the USSR, Japan, and increasingly in parts of the Far East. Incidentally, Japan is worth particular notice. The Japanese scientists, technologists, technocrats, have shown skills and originality in all this electronic apotheosis which quite outclass the West’s. That ought to surprise no one who has given the most perfunctory attention to Japanese visual art or literature or pure science. For hundreds of years the culture has been wildly original, something oddly different from any other among the sons of men. It was an instance of Western blindness not to discover that simple fact.
All over the industrialized world, then, computers – using the term as shorthand for all forms of automatic guidance and control – are spreading. Something else is spreading too. That is the realization that nearly all the goods that this industrialized world is now producing – which means an enormous proportion of all the goods produced on our planet – could, with such technological knowledge and a little organization, be made by not more than 40 per cent of the present labour force. Even with the present organization, industrial production, which includes modern agriculture, requires nothing like the number of workers who are now employed. If one considers nothing but functional needs, the advanced societies of the world are already masking a high level of unemployment.
That will increase, and dramatically increase, on account of the newest, quietest, and most irreversible of technological revolutions. This is the extension of electronic control right down to the domestic scale. Computers were constructed as soon as complex electronic circuits were feasible. It has been discovered that what are in effect miniature computers can be constructed, without electronic valves at all, and without any of the labyrinthine paraphernalia. There was an element of chance in this discovery, but it came through researches into the curious properties of what are known as semi-conductors, in which electrons can travel, but not with the freedom with which they travel in metals. The element silicon is a semi-conductor. Slivers of silicon can be made into perfectly effective mini-computers which can be carried like a map in a pocket diary. And recently, and even more bizarre, it has been found that a substance previously unknown to fame, gallium arsenide, is even better fitted for the job. Silicon chips will soon be replaced by this hitherto obscure compound, which has the peculiar ability to emit light like a miniature laser. There must be other comparable semi-conductors. Rapid searches are presumably being made through textbooks of inorganic chemistry.
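(The light-emitting trick can itself be put in a line. A semi-conductor emits photons whose energy matches its band gap; for gallium arsenide the gap is about 1.4 electron-volts, so the wavelength of the emitted light is

$$\lambda = \frac{hc}{E_g} \approx \frac{1240\ \text{eV nm}}{1.4\ \text{eV}} \approx 870\ \text{nm},$$

just beyond the visible, in the near infra-red. Silicon, whose gap is of the ‘indirect’ kind, cannot emit light efficiently at all – one more reason for the interest in the newcomer.)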
None of this sounds specially catastrophic, by the side of nuclear explosions, or even the first blundering waves of the Industrial Revolution. Yet, as was hinted in the first section of this account, it casts a shadow before it. It is likely to affect human life – and immediately in our industrialized world – more than any of those events. For a simple reason. At present, as has just been mentioned, the industrialized world can produce all it now produces with a fraction of the work-force. With these mini-processors now to hand, that fraction could, and in some societies certainly will, almost at once be reduced, not just slightly, but by – what? Half? Three-quarters? If production alone is the measure, more than that.