Finally, the most positive scenario of the satisfied worker, happily retraining and managing a complex career portfolio, is one where work and leisure merge; but even here doubts about motivation might creep in. If the majority can live life without work, then why not you? So you enjoy it – why? In a world that offers every physical comfort, what exactly are you, oddball that you are, now trying to achieve?
Adam Curtis, the TV producer who made the ground-breaking series The Century of the Self, maintains that it was only in the 20th century that we saw the creation of an all-consuming ‘self’. Freud's work on human needs and drives unwittingly led to the belief that the pursuit of satisfaction and happiness was man's ultimate goal. His nephew Edward Bernays then exploited Freud's theories to coax people into wanting something that they did not need: for example, women were persuaded to smoke by associating lighting a cigarette with lighting a tiny ‘torch of freedom’, hence any woman who smoked was a free-spirited individual.
By establishing the notion of the self, an ‘ego’, it was possible to suppress deep destructive desires – Freud's ‘id’. In less humanitarian times, this id had been manipulated directly with the ‘soma’ of the time – ‘bread and circuses’ in the case of ancient Rome – or channelled into tribal rituals or suppressed with harsh and swift punishment. But in a democratic and commercial modern society such courses of action were no longer options. Instead, by cultivating the concept of a self, Bernays and others were able to devise a means of seeming to give people what they think they want – an indulgence and celebration of the ego – and thereby simultaneously suppressing their dangerous mass instincts. Even left-wing policies, suggests Curtis, are moulded to cater for people's individual desires, as capitalism has learnt to do with products.
But now we may be about to revert to the old, pre-Freudian, pre-Bernays ways. We may be about to enter a future where the concept of the self is fragile, due to the shifting impermanence and arbitrariness of the surrounding cyber-world, a future where the boundaries between our bodies and ‘out there’ are blurred. This future might also be one in which IT-based products can trigger or suppress emotions by manipulating the brain landscape as directly as drugs, but remotely, unbidden and with far greater precision. If so, then for the first time ever the importance of self-fulfilment and status might fade. The sense of self will be obsolete, no longer fighting against the tide of relentless retraining and re-invention, no longer seeking a sense of purpose whilst unemployed, and no longer working to gain things that you no longer want. Instead you will surrender wholesale to the once-occasional passive reception of the senses, interacting with the ebb and flow of inputs and outputs from all other fellow-beings via your cyber-world. You – only there will no longer be any ‘you’ – are now simply a consumer of technology.
The technology of work in the future, if it exists at all, looks set to become increasingly bio-dominated, from the computers we use to the fuel that powers them. The degree to which we work will depend on how efficient that work is, and how sophisticated the technology. But more fundamentally still, by reflecting on what we will be doing in the future, we are exploring the values and agendas of forthcoming generations. All round, it seems likely that traditional boundaries in our lives will be breaking down – between home and work, work and leisure, work and retirement, one generation and the next, and even roles within a family unit. But up to now it has been those very boundaries and stages in our life-narrative that have defined us. The questions and possibilities that have arisen from thinking about the future of work and leisure boil down to the simple yet important question of the extent to which our successors will see themselves as individuals, how robust their sense of self will be, and thus the degree to which they may be manipulated.
If as part of the workforce we need to change all the time, to become reactive rather than consistent as a persona, then our clear-cut sense of who we are might start to unravel. One possibility is that, as a substitute for the traditional status of work, hobbies might become more important. But then we would need clear time for them to become like alternative jobs. Or we could develop a second-hand ego, by viewing fiction or indeed recorded facts. Then again, we could develop still further the kind of public ego, the collective identity rampant at football stadia, and lose the sense of self altogether by ‘going real’ en masse from time to time, or perhaps do the opposite – seek oblivion in the safer, more familiar cyber-world, or with drugs. In short, we will need to cope with a void, the loss of a sense of self previously derived from work. In order to understand how the self of the future might be assembled, and thus how easy it might be to control, let's start at the very beginning: with genes.
5
Reproduction: How will we view life?
The most important thing in life is life – both sustaining it in ourselves for as long as possible and having children to attain immortality by proxy. Now advances in biotechnology are opening up new avenues. The ‘posthuman’ individual, a genetically modified human being, may well become reality before this century is out. Gregory Stock of UCLA, adviser on biotechnology to Bill Clinton, predicts:
A thousand years hence, these future humans – whoever or whatever they may be – will look back on our era as a challenging, difficult, traumatic moment. They will likely see it as a strange and primitive time when people lived only seventy or eighty years, died of awful diseases, and conceived their children outside a laboratory by a random, unpredictable meeting of sperm and egg. But they will also see our era as the fertile, extraordinary epoch that laid the foundation of their lives and their society. The cornerstone will almost certainly be the reworking of human biology and reproduction.
But such promise also casts a shadow. As Francis Fukuyama, Professor of Political Economy at Johns Hopkins University, warns: ‘… the most significant threat posed by contemporary biotechnology is the possibility that it will alter human nature and thereby move us into a “posthuman” state of history.’
For a long time now genes have been big news. Still, as bio-high-tech increasingly seeps into everyday life, ethical dilemmas mushroom. As I am writing this, today's papers cover no fewer than three stories relating to genes. One is the story of Charlie Whitaker, a young sufferer of an anaemia that gives him a life expectancy of less than thirty years, who is being denied treatment that could save him: his illness could be cured by appropriate replacement cells that would then proliferate in his blood. These early-stage stem cells can come only from blood in the umbilical cord, and have to be as closely tissue-matched as possible. So Charlie's parents need to have another child – with the right genes. The technique (preimplantation genetic screening) is available to select an embryo resulting from IVF to ensure that Charlie's mother becomes pregnant with a brother or sister who will save his life. But the British Human Fertilisation and Embryology Authority are denying permission, fearing the prospect of giving the green light to ‘designer’ babies.
On turning the page of the newspaper, I read of the tragic case of a young woman recovering from an ovarian cancer that has left her infertile. However, prior to her chemotherapy treatment, she and her partner underwent IVF to ensure that they would still be able to have children. Now that the couple have split up, the still-fertile ex-boyfriend has withdrawn his permission for the embryos to be implanted; indeed, according to current legislation, they may have to be destroyed.
Finally, I see that scientists have discovered a gene linked to aggression; this could be important news for one Stephen Mobley, whose defence against his imminent execution in the USA is that he comes from a long line of rapists and criminals, and hence his genetic make-up leaves him helpless. So, as genetic profiling becomes more sophisticated and more common, will we all soon be able to blame our genes for any transgression? And if not, where and how will we set the defining criteria for drawing the line? It is hard to imagine a clear boundary, on one side of which there will be those who can turn over accountability to their biological inheritance, whilst on the other there will be the rest of us who, despite our genetic destiny, have to face the music.
As we look into the future, the manipulation of DNA will increasingly have implications for how we reproduce; but the new technologies will also touch our own little span of mortality, regarding our expectations of life, and have a dramatic influence on how we deal with disease and ageing. To most people, including molecular biologists themselves, the rate of progress has been startling. So fast have things moved that it seems we are already living in a sci-fi world where the previously most non-negotiable of tenets are no longer inviolate. Just how far along are we at this very moment?
At the centre of every cell of your body, except egg or sperm cells, lurk twenty-three pairs of large pieces (chromosomes) of deoxyribonucleic acid (DNA). A gene is any one particular segment of DNA that will eventually trigger the production of a particular protein. Amazingly, however, only a puny 3 per cent of DNA carries out this central job, and although some portions will be regulating other genes, the vast majority is ‘junk’.
Another surprise is the remarkably small number of genes that we humans possess. The lowly worm C. elegans is made up of only 900 cells and has some 20,000 genes whilst we, whose brains alone are composed of 100 billion neurons and ten times as many ‘glial’ cells, have at the very most merely four times as many genes. When the human genome was completely mapped in 2001 we turned out to have only between 30,000 and 80,000 genes. And yet even though we have this relatively small number of genes, most of them shared with other animals and even plants, there are already as many as 5,000 known genetic disorders. Traits – anything from breast cancer to depression – are not contained completely and independently within a single gene: for example, there are already some ten or so genes ‘for’ Alzheimer's disease. And even in the rarer cases where only a single aberrant gene is the cause of the problem it is hard to imagine that such a condition is actually trapped within the strands of DNA. Even Gregor Mendel's purported gene ‘for’ yellowness in peas was actually the expression of an additional enzyme to destroy the green pigment chlorophyll.
That said, surely the more we know about which range of genes is linked, however indirectly, to which illness, the better. After all, such information can but help with improved diagnosis and eventually with the design of new types of treatment. One way to discover new genes that are related to disease is by studying the genetics of populations. It is critical to be able to scrutinize large groups of humanity who have existed as relatively isolated communities, such as the citizens of Iceland, for whom the caprices of history and geography have ordained far less mixing of the overall gene pool than usual. On this smaller genetic stage there is far more chance that any variation in a particular gene can be more easily matched up with any particular disease. A second approach is to identify variations in genes due to mutations over centuries of evolution. These variations are known as ‘polymorphisms’; since they are usually due to a change in only one rung (nucleotide) of the ladder-like structure of DNA, the cognoscenti refer to them as single nucleotide polymorphisms (SNPs, pronounced ‘snips’).
Already this type of comparison between genetic deviation and illness has proved useful: gene mutations have given valuable insights into developing screens for individuals at particular risk of a range of conditions from breast cancer to heart disease. For example, a change (mutation) in one particular gene (P53) is linked to over fifty forms of cancer – in fact a colossal 90 per cent of all cervical cancers, 80 per cent of all colon cancers and 50 per cent of all brain cancers are associated with an aberration in this single gene. Indeed, so important is this gene in relation to cancer that in 1994 the journal Science voted it ‘Molecule of the Year’.
Easy to see, then, how genetic tests will become increasingly part of our lives. They will soon, routinely, be detecting whether a host of diseases are already present in your body, or likely to occur, whether you might be a carrier, whether you are at increased risk of a complex condition such as heart disease or cancer, and what specific reactions you might have to medicines, food or environmental factors. Along with establishing your paternity and genealogical record, your individual past, present and future will soon be expressed in cocktails of genes which interact with your environment to give, well, you.
Needless to say, such unprecedented power of diagnosis and awareness of the risks will radically change how future generations think and live. Eventually it will be normal for you, or any third party, to refer to a biochip read-out of your entire genetic profile, thereby allowing personal screening for side effects and possible differential responses to drugs. The downside, of course, is that there is a vast new potential for individual discrimination and inevitable issues over insurance premiums. Moreover, as different types of tests become rapid, easy and inexpensive, so debate will increase over whether they should be sold directly to the public. Whereas we are comfortable with home pregnancy tests, imagine a DIY test for Alzheimer's disease… And if pretty much everyone has to get used to living with risks that, with the benefit of early warning, they might be able to reduce perhaps, but not abolish completely, then that could make for a more edgy and self-conscious society. Some might argue that we know more now about the risks of motor vehicle accidents than we did forty years ago, and that this has led to a safer society with more choices. Then again, the risk we take when driving in a car is a short-term, acute one, and we take appropriate, direct measures for improving the safety of each journey, which is not quite comparable to the far more indirect and long-term link between risk factors and a disease.
This living with risk will be even more problematic if you discover that you have an aberrant gene ‘for’ some brain disorder or mental function, or if you wish to exploit normal gene function to boost the cerebral prowess of your progeny. The special difficulty here is that, when it comes to the brain, to those most elusive and fascinating mental traits, genetic provenance is particularly tenuous. Take, for example, the gene that would surely be the most coveted of all by prospective parents, that for IQ. Even if a single gene for IQ were to exist, it seems its contribution would not be a major one. Roger Gosden, in his book Designer Babies, has actually put some numbers on the concept. On a scale from 0, purely environmental causation, to 1, completely hereditary, IQ scores show a heritability of about 0.3. A very major single gene, were it to be discovered in relation to IQ, would probably account for at most about 5 per cent of the trait, so its presence would contribute some 1.5 per cent to the final IQ score – if you had an IQ of 100, this gene would then boost it to 101.5. Gosden advises parents of the future to invest any money they might have in their children's education rather than in dreams of genetic enhancement. On the other hand, most people would put the value of heritability for IQ much higher, probably about 0.6. In any event, education seems to add about five IQ points, which is slightly more than a single gene with a heritability of 0.6 would. However, such calculations have only limited value; the real issue is that the gene could influence how you interact with your environment, and the old line drawn between ‘nature’ and ‘nurture’ is becoming so blurred that it really is not very helpful.
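A rough back-of-envelope sketch may make these numbers concrete; note that treating the 5 per cent as the gene's share of the heritable component of IQ, rather than of the trait as a whole, is an assumption of this reading, not a figure stated explicitly above:

% Hypothetical worked arithmetic for the Gosden estimate, assuming the
% single gene accounts for 5 per cent of the heritable portion of IQ.
\[
\underbrace{0.3}_{\text{heritability}} \times \underbrace{0.05}_{\text{gene's share}} = 0.015,
\qquad 0.015 \times 100 \approx 1.5 \ \text{IQ points},
\]
\[
\text{whereas with heritability } 0.6:\qquad 0.6 \times 0.05 \times 100 = 3 \ \text{IQ points}.
\]

On this reading, even under the higher heritability figure a single major gene would be worth about three points, still below the five or so points attributed to education.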
In 1994 Richard Herrnstein and Charles Murray reached conclusions that were unpalatable to virtually everyone: they showed that much of the variance in intelligence was due to genes, and that it therefore had a racial bias. But the study should now be revisited. In particular, the finding that African-Americans had a significantly lower IQ than whites seems explicable, not, after all, by their genes but by environmental factors. The psychologist James Flynn has subsequently shown that IQ scores have been rising over the past generation in nearly all developed countries; the IQ gap between white American children and the children of non-Caucasians, and indeed all immigrant groups, could well be closing, as improved nutrition and education leave their mark, literally, on the brain.
In any case, there is the enduring question of what we mean by ‘intelligence’, and how such a vague concept can ever be realized by the mindless mechanics of physical neurons in your brain. One example of the yawning gap between behaviour and thoughts, on the one hand, and genes and proteins, on the other, cited by Francis Fukuyama, comes from the neuroscientist Joe Tsien. Tsien found that, by eliminating or adding a certain gene, he could respectively worsen or improve memory in mice. Could he have discovered the gene that codes for the protein ‘for’ memory?
Hardly. This gene is one of several controlling the availability of a protein that happens to be involved in a particular chemical communication system in the brain, one which allows large amounts of calcium into the neurons. But the big problem is that calcium is like grapeshot inside a cell: once it has gained entry it will find many different molecular targets, and result, directly or indirectly, in many different effects, some beneficial, some toxic. Memory will not be the only mental property to be affected. And whilst these changes may well be necessary for memory to take place, they are far from being sufficient. Many other factors will also determine whether, and how well, the brain retains a memory. Remember that the protein made by a particular gene does not have a one-to-one correspondence with a mental trait. In most cases, that protein (and the gene that expressed it) will be involved with many traits, and, conversely, any one trait will involve many proteins (and genes).
A similar situation arises with the highly controversial ‘gay gene’. At first glance there does indeed appear to be a strong hereditary component in sexual orientation. (For example, one study has shown that 52 per cent of identical twins with a gay twin are themselves gay; but for non-identical twins, who, of course, share only half their genes, that number falls to 22 per cent.) But how might we then explain the fact that people often switch their sexual preferences as they go through life? Even if there were a gay gene, there is no saying whether or not it is being switched on or off at different times of life by different environmental factors. Moreover, the influence of ‘environment’, since this includes the macro, complex events of the outside world, is impossible to deconstruct and trace to immediate molecular phenomena outside the walls of brain cells. Indeed, being gay could well have much in common with heart disease. There is a very small number of people with familial hypercholesterolaemia who are genetically determined to have heart disease, and do develop it, but there are far more sufferers whose heart disease is the result of an interaction between genes and environment. Meanwhile, a small number of other individuals have genes that predict a very low risk of heart disease but drink or smoke heavily, are obese, or do not exercise – and have heart disease. The relationship between genes and sexual preference could be similar: for most people it is a combination of genetic predisposition plus environment, including a component of choice.