The combination of private-sector funding plus a more benevolent and informed science-friendly electorate could, if neither cyber-hedonism nor politically related science scandals prevail, create a chance for scientists truly to exercise their imagination. Indeed, scientists can now crack on as never before: automation has freed them from routine lab drudgery. All manner of databases are now only a key tap away, whilst whole new branches of science are opening up in silico – in the silicon memory banks of computers, as opposed to in a lab dish (in vitro) or, rarest of all these days, in vivo (in a whole animal). In the future, instead of performing a real experiment you could ‘mine’ data – tease out a line of enquiry from the colossal monolith of facts ‘out there’. Or you might choose to experiment, but on a computer model, whereby you could test out a drug on a cyber-heart, or observe other types of biochemical and physiological reactions that, in the old days, would have necessitated setting up a messy and capricious real-life trial.
Like all other aspects of our lives, then, the Information Age has changed the world of scientific research – it is easier to work within, gives answers quickly and reliably, and does not demand great physical dexterity or lab-savvy green fingers; scientists now work in an environment that saves so much time that even the most daunting information-crunching tasks suddenly become possible. Meanwhile, the fuelling of technology with the latest scientific findings will become ever more voracious – delivering the objects of desire, all the gifts of ‘wealth and comfort’, that have formed the material of the preceding chapters. In the universities, as well as behind the walls of the leviathan pharmaceutical and other high-tech industries, automation will take away the ingenuity, the luck, the wrestling with a capricious physical world that has characterized research so far, in favour of a frog-spawn approach – all strategies, be they more or less likely, can be tried out regardless by machines. The lack of any insight on the part of the robot researchers is more than offset by their sheer speed at churning exhaustively through all possibilities, coming up with the right answer in a fraction of the time that it would have taken their more reflective but fallible human counterparts.
Now let's think about what exactly a full-time scientist, freed up from more routine occupations, will pursue as a big new problem. The theoretical physicist Freeman Dyson has suggested that innovative science can be classified into two categories: first, to explain old things in new ways, ‘concept-driven revolution’, such as the ideas of Einstein, Darwin or Freud; secondly, to discover new things that have to be explained, ‘tool-driven revolution’, for example, Galileo's use of the telescope, or the exploitation of crystallography in molecular biology. How far will the 21st-century scientist progress with either type of revolution?
As a prelude, it is most likely that the traditional disciplines of science will not be recognizable as such, but will have merged. For example, computer science as a subject originally arose from diverse areas including mathematics, engineering and physics, whilst the modern study of the brain, neuroscience, is a product of many different disciplines including computer science, physiology, anatomy, biochemistry and psychology. Yet neuroscience itself may well end up lumped with electrical engineering and molecular biology to give rise to a new discipline, ‘neurotechnology’, in which future generations of scientists do not just study the brain, but probe, display and manipulate it with an immediacy and precision that will have far-flung ramifications for how we live.
Meanwhile, computer power will soon be such that there could be an exhaustive database relating to all aspects of brain operations – to the whole multidisciplinary subject of neuroscience, which has previously been segregated according to particular expertise in, say, pharmacology or physiology or genetics. With this great store of multidisciplinary information available, even a non-specialist will be able to rove across all subjects relating to the brain, pursuing a pet theory and having it automatically cross-referenced, validated against all known experiments and used to make further predictions. Just as the geneticists are already engaged in this type of silicon exploration with bioinformatics, so we could develop an analogous process of neuroinformatics. In short, 21st-century science, its biology at least, will probably be characterized by a swing away from the reductionist tendency, the analysis that was the hallmark of much research in the 20th century, in favour of synthesis: describing inter-relations between previously disparate areas, and indeed developing umbrella concepts that transcend those particular disciplines.
Cross-referencing in this way could give new insights and inspire the next ‘concept-driven revolution’. Some of the greatest science has arisen from interactions at the frontier of classical cultures. For example, Linus Pauling, in the first half of the 20th century, imported the principles of quantum mechanics into chemistry and revolutionized our understanding of the chemical bond. Another cataclysmic insight came to Sir Frank Macfarlane Burnet in 1959 when he realized that the principles of Darwinian evolution could be applied to a completely different domain, the immune system. In the future, such leaps across disciplines may become increasingly frequent as we have more time to survey ever more subjects from the hilltop vista of the internet and the world wide web; with increasing ease we will be able to harness IT to seek similarities in patterns and deep organization, independent of the scale or speed or jargon of any one particular subject.
So, the merging and cross-referencing of different branches of science will provide a fascinating opportunity to identify fundamental concepts, analogous to, say, the survival of the fittest. But such ideas can come only if scientists ask the right kind of questions first. In the future, then, what will need to be explained or discovered? In 1923 the biologist J. B. S. Haldane delivered a paper on the future of science to a Cambridge society, The Heretics. He called his vision – which was to inspire Aldous Huxley's Brave New World – Daedalus, after the craftsman of Greek mythology, father of Icarus, who unlike his son did not fly too near the sun on waxen wings. Reeling still from the horrors of mechanized warfare, Haldane explored the future of science, which he defined as ‘the free activity of man's divine faculties of reason and imagination’.
Many of the predictions in Daedalus turned out to be chillingly accurate, and they articulate fears not unlike those we have been discussing here, nearly a century later. For Haldane, technology did not necessarily bring in its train some easy nirvana, and even if it did, then that in itself would pose a problem – indeed the type of difficulties arising from the unbridled comfort that we have been discussing. Just as we may have to square up to a new cyber-passivity, so Haldane entertained the bleak prospect that humans might end up as a ‘mere parasite of machinery’, and predicted that physics would abolish not just the constraints of day and night but also other spatiotemporal checks to our lives – a vision now realized by the web and the internet.
Although Haldane envisaged that chemistry would continue to alter life as it had done before it was even a science, with explosives, dyes and drugs, it was in the application of biology that he foresaw truly big transformations. Looking hard at the nascent eugenics movement of the time, he saw as near-certainties a ‘eugenics official’ and ‘marriage by numbers’. And although such phenomena have only come to pass indirectly, with virtual dating and the possibility of designer children, other predictions were startlingly accurate. As well as foreseeing the abolition of many infectious diseases, Haldane predicted the development of a ‘nitrogen-fixing’ plant that in a sense anticipated GM food. He even described a possible eco-accident, of the type all too familiar nowadays: the sea turned purple by contamination from that plant. Haldane could also have written, though from the viewpoint of the 1920s, a credible version of my Chapter 5, concerning future views of life. He actually prophesied the development of IVF and the complete dissociation of sex and reproduction, with his concept of ‘ectogenesis’. Even Prozac was on his horizon: ‘… to control our passions by some more direct method than fasting or flagellation.’ And he scored a direct hit in anticipating HRT: ‘This change seems to
be due to a sudden failure of a definite chemical substance produced by the ovary. When we can isolate and synthesize this body we shall be able to prolong a woman's youth, and allow her to age as gradually as the average man.’
Haldane not only had a wide-ranging grasp of the science of his day but also an impressive instinct for what was to follow. I suggest we therefore turn to him to give us a framework for the most important issue of all in our current discussion. What of fundamental science, the science that will spawn the technologies of the next century: what are the remaining big questions here?
Haldane listed them in Daedalus as problems ‘first of space and time’ – in our terminology the big bang; ‘then of matter as such’ – for us the persistent quirkiness of quantum theory and the dream of nanoscience; ‘then of [man's] own body and those of other living beings’ – surely the synthesis of different branches of biomedical science, along with the greatest question of how a brain can generate the subjective experience of consciousness; ‘and finally the subjugation of the dark and evil elements in [mankind's] own soul’ – at a stretch, the big question of how we shall use this knowledge to work out accountability, the degree of biological determinism, the ultimate riddle of free will in the Neurotechnological Age. These same questions also feature in John Horgan's idea of the ‘most ambitious goals’ of science.
Imagine, then, that you are a scientist researching one of these questions a few decades from now. You like using your imagination in this way because it is the only conduit left for such mental indulgence, and because when you are coming up with original ideas it gives you a feeling – unusual now in the mid 21st century – of personal fulfilment. It is the most effective way for you to buck the current trend and feel like a complete yet independent person. You spend most of your time talking to your screen; when necessary you are able to assemble 3D models readily enough. And types of equipment that were once stratospherically expensive, such as a particle accelerator, or merely costly, like special types of microscope for the manipulation of atoms, or even the older technology of humble magnification of some 100,000 times to observe the components inside the cells that make up biological tissue, are all now yours at a voice command.
Most of the time the simulations are breathtakingly faithful to the real thing. Highly sophisticated programs covering all contingencies for any experimental situation enable you to get ‘virtual data’ almost as fast as you can think up and program in the virtual experiment. More and more of the general public are therefore not only reading science books but doing science, performing virtual experiments. In fact, nowadays even you professionals only occasionally access the remote, real apparatus, and then chiefly to verify the accuracy of the cyber-studies when a result is particularly surprising; so far the virtual data has not proved significantly discrepant from its counterpart, garnered so much more painstakingly in the real world.
However, science remains the one activity where you still have to be careful to distinguish reality from virtual reality. All your simulations are built on models of interactions, on models of the forces acting between particles, and on a preconceived vision of what exists. You know that this is no substitute for hands-on ‘real’ science, which constantly compares its hypotheses with reality. Here there is a barrier to progress, for that examination – and the likelihood of unexpected discovery in the real world – will become prohibitively impractical. Some theories about elementary particles might never be tested exhaustively, for to do so would require such colossal accelerators that they would have to span the universe and consume the outputs of all economies everywhere.
One investigation where observation is essential and simulation a guide, which encompasses Haldane's first big question concerning space and time, is that of the origin of the universe. There is no doubt in the minds of cosmologists such as yourself that our universe came into existence about fourteen billion years ago and that it has been expanding ever since. Initially, the entire visible universe was confined to a single point, a singularity. But even before you reach that far back the current concepts of space and time break down, and in your current research you are struggling to extend scientific techniques into this region. To achieve understanding of the very, very early universe, a universe so small that quantum events dominate, quantum theory must somehow overcome its current failure to account for the force of gravitation that would have been operating: you need to describe a ‘quantum gravity’. Like the rest of your colleagues, you really do not know how this will be done, although there are currently many suggestive ideas. Scientists, being optimists, believe that a theory of quantum gravity will be achieved, so that we will be able to understand the earliest moments of the universe. But will science be able to go beyond that instant of creation and discover how the universe originated? You do not know. Some doubt that we ever will, but others, you among them, are enthralled by the prospect of going back to before the beginning.
For example, you know that even cherished ideas that have been part of our civilization since classical times are likely to give way as science overturns the everyday in its quest for the truth. Until the end of the 20th century everyone thought of the end point of matter as literally that: a collection of points. At the beginning of the 21st century an alternative view emerged, that the ultimate entities are tiny, vibrating, one-dimensional lines, known as ‘strings’. A particle was seen as just a vibration of a string.
Are strings the end of the line, or can you expect the overthrow of even that concept before we reach ultimate reality? Most scientists think that there will be a final answer, a ‘theory of everything’, but no one really knows. These strings are thought to exist in ten, perhaps eleven, dimensions of spacetime – a bizarre-sounding framework built on a notion first introduced by the mathematician Hermann Minkowski in 1908: ‘Henceforth, space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.’
Spacetime, as its name suggests, is a notion that puts space and time on an equal footing so that we are freed from parcelling out our lives, the world and indeed the universe in the obvious commonsense way. As our understanding of it has grown we have discovered new ways to travel through it, and new barriers to travel too. Time travel has always been with us: after all, we all drift inexorably into the future. However, Einstein's theory of special relativity showed that this inexorable drift is more complex than we thought, for the faster we travel through space the slower our clocks run. We could retard our ageing were it possible to travel close to the speed of light, but it seems that we cannot reverse time and grow young, perhaps discarding our memories as we do so. Yet the Holy Grail of quantum gravity might offer this possibility.
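To make that claim about slowed clocks concrete, the standard special-relativity time-dilation relation – a textbook result, quoted here purely by way of illustration rather than drawn from Haldane or the narrative itself – can be written as:

\[
\Delta t_{\text{observer}} \;=\; \frac{\Delta\tau_{\text{traveller}}}{\sqrt{1 - v^{2}/c^{2}}}
\]

where \(\Delta\tau_{\text{traveller}}\) is the time elapsed on the traveller's own clock, \(\Delta t_{\text{observer}}\) the time elapsed for someone who stays at home, \(v\) the traveller's speed and \(c\) the speed of light. At roughly \(v = 0.87c\) the factor is about 2, so the traveller ages about half as fast as those left behind – the retarded ageing referred to above – but nothing in the formula allows elapsed time to run backwards.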
Neither you, a mid-21st-century scientist, nor your colleagues can give a definite answer; but the ranks of scientists thinking seriously about using the foam-like structure of spacetime as a tunnel to travel faster into the future, or even into the past, are swelling. Ever since the beginning of the century scientists have been studying these tunnels, wormholes, and are viewing them less and less as science fiction. Travel through such tunnels is still fantasy for the moment, at least for anything bigger than an electron, but it remains a conceivable option for the future. The very notion of time travel is plagued by paradoxes; but now our notions of classical logic are being modified as regards the passage of time and the very nature of causality. Logic, though of extreme importance in science, cannot be regarded as superior to experiment and observation.
Nonsensical leaps of imagination, such as time travel, as well as more immediate, hard-headed questions to which there must be some kind of answer – about the big bang and the expansion of the universe, for instance – all hinge on whether it will ever be possible to reconcile the two sets of physics: on the one hand the rules for subatomic particles, and on the other those for the macro, tangible world. If you could discover some overarching framework, a ‘theory of everything’ for understanding the interaction of matter and energy with space and time, and when and how each set of rules comes into play, then the extreme eventuality is that you would be able to unleash the seemingly bizarre quantum phenomena that defy time and space and causality into our macro, everyday world. This would be the breakdown of the biggest barrier of all, wrenching at the fabric of reality. You are currently, in the mid 21st century, living in a world where the real world scarcely seems relevant any longer, because the cyber-world has insulated you from it. But what if that reality itself were now changed to one in which space and time, as such, no longer existed?
Let's leave the physicist trying to conceive of an independence of space and time, in effect a future version of Haldane's first big question. Instead, imagine now that you are a chemist, and as such more fascinated by Haldane's second question, ‘of matter’. Ever since the end of the 20th century your predecessors have been trying to manipulate atoms with an unprecedented degree of control. This new branch of science, nanoscience, continues to excite scientists and society alike, perhaps because it qualifies as both a ‘concept-driven revolution’, in that it opens up the possibility of seeing old things in new ways, and a ‘tool-driven revolution’, in that it offers the opportunity of making new discoveries.
The idea of nanoscience was first introduced in 1959, when the physicist Richard Feynman, later a Nobel-Prize-winner, delivered his famous lecture ‘There's Plenty of Room at the Bottom’. Feynman argued that there was nothing in the laws of quantum physics that precluded the invention of machines the size of molecules. The basic idea was that eventually scientists could adopt a ‘bottom-up’ approach, placing atoms exactly where they wanted them. The term ‘nanotechnology’ was coined some fifteen years later, in 1974, by the Japanese scientist Norio Taniguchi, to describe machining in the range of 0.1 to 100 nanometres; hence nanoscience, if we adhere to the strictest definition, deals with materials or systems with at least one dimension less than 100 nanometres, i.e., less than one ten-millionth of a metre.
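To spell out the arithmetic behind that definition – a simple unit conversion added here for orientation, not part of Taniguchi's or Feynman's own wording:

\[
100\ \text{nm} \;=\; 100 \times 10^{-9}\ \text{m} \;=\; 10^{-7}\ \text{m}
\]

so the nanoscale runs from roughly the diameter of a single atom (a few tenths of a nanometre) up to something about the size of a virus.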