Tamed


by Alice Roberts


  Genomics – the study of entire genomes, not just bits of DNA parcelled up in mitochondria, or individual genes laid out in our chromosomes – has revealed a rich and complex history that we had no idea about just ten years ago. Our ancestors met up with a range of other people – people different enough to be considered separate species – and joined them, interbred with them. As American palaeoanthropologist John Hawks wrote in his blog: ‘It is notable that we now have evidence for interbreeding among every kind of hominin we have DNA from, and some we don’t.’ Geneticist and author Adam Rutherford, with his eye for an exquisite turn of phrase, has described the dalliances that led to the inception of humanity as we know it as ‘one big, million-year clusterfuck’. Humans have always been – as Rutherford so neatly encapsulates it – both ‘horny and mobile’.

  As well as providing clues to the origin of Homo sapiens and to the original colonisation of Eurasia, and delivering those astonishing revelations about interbreeding with other human species, genomes also contain traces of later events in prehistory. Hidden deep in our DNA are memories of countless voyages and expeditions – of pioneers and explorers whose names are long forgotten. It’s a densely overwritten palimpsest, but geneticists are finally managing to draw out some of the detail from that archive.

  In Europe, there are genetic echoes of three major waves of immigration. The first wave represents the Palaeolithic colonisers – although the very first of this group to arrive, reaching Britain at the far western edge of Europe by 40,000 years ago, left little genetic trace. Their population would have crashed at the last glacial maximum. But after the ice sheets receded, survivors in southern, Mediterranean refugia recolonised the north. These hunter-gatherers, though still nomadic, became a little more settled as the climate improved – as we’ve learnt from Mesolithic sites like Star Carr in Yorkshire. These people were soon to be joined by the second major wave of incomers – who would bring a whole new way of life with them. Farmers, originally from Central Anatolia, expanded across Europe, in fits and starts, probably travelling by boat, reaching the Iberian Peninsula around 7,000 years ago, and settling in Scandinavia and Britain by 6,000 years ago – 2,000 years after that genetic trace of wheat at the bottom of the Solent. Genetic studies reveal that, rather than completely replacing local hunter-gatherer populations, the farmers joined forces with them. The Neolithic had arrived. In some places, foragers quickly switched from hunting and gathering to settling down and farming. In others, such as Iberia, people continued to hunt and gather alongside farming. The third wave of immigrants arrived, with their horses and their new language, in the early Bronze Age – around 5,000 years ago – as the Yamnaya population expanded and spilled over into Europe. If you have broadly European heritage, you may well still have the odd piece of DNA from these ancient horse-riders and herders in your genome – a little bit of Yamnaya tucked away – despite all those generations and dilutions of that DNA in between. Sadly, this doesn’t mean that you have a natural affinity with horses, or an innate ability to ride them – that’s still something that needs to be learnt!

  The horse-riding herders of the steppe pushed east as well, replacing hunter-gatherer populations in southern Siberia. And another west-to-east migration in Asia took place around 3,000 years ago. Winding back much earlier, genetic studies have also helped to answer questions about the colonisation of the Americas. During times of low sea level, north-east Asia was connected to North America by the land bridge of Beringia. Human colonisers crossed it, to gain a toehold in the Yukon, before the peak of the last Ice Age. But they were stuck there until the huge ice sheets that blanketed North America began to melt a little at the edges, around 17,000 years ago. Then they could start to roam further south, probably using boats to travel and settle along the Pacific coast – getting as far as Chile by 14,600 years ago, as shown by the site of Monte Verde. All this we know from archaeology, but there were also some important challenges to this story. The skulls of some early Americans seemed to show morphological links with Polynesian, Japanese and even European populations. It was suggested that there had been an early migration into the Americas, and that this population had later been replaced by colonisers from north-east Asia and Beringia. But when ancient DNA was extracted from those bones, it proved to be closest to that of living Native Americans, with Siberian and East Asian DNA the next closest match. The idea of a population replacement could finally be put to rest – the first colonisers had come over the Bering land bridge, from north-east Asia, and filled up the continents, from north to south. However, there are genetic traces of significant later migrations in the far north – eastwards expansions of circumpolar people, from north-east Asia to the icy northern reaches of North America and into Greenland. The first was a Palaeo-Eskimo migration around 4,000 to 5,000 years ago, followed much later by the Inuit expansion, around 1,000 years ago.

  In Africa, genomes of living people also bear witness to great population shifts – ancient expansions and migrations. Some 7,000 years ago, Sudanese pastoralists migrated to central and eastern Africa; 5,000 years ago, Ethiopian agro-pastoralists expanded into Kenya and Tanzania; and a major expansion got under way 4,000 years ago, when Bantu-speaking farmers spread from their homeland in Nigeria and Cameroon, southwards. They replaced foragers as they went, pushing them into marginal habitats, where we find the last remaining hunter-gatherers, such as the Bushmen of Namibia, eking out an existence today.

  Sunshine, mountain-tops and germs

  As humans spread around the world, and as the climate fluctuated, they faced new challenges. Our ancestors adapted in various ways. Some adaptations would have been physiological – adjustments occurring within a lifetime; others involved genetic alterations – the real stuff of evolution. Together, these changes allowed people to survive and thrive in testing environments. As humans made their way into northerly latitudes, they would have found themselves entering landscapes where the environment transformed itself around them – as the seasons turned. Summers brought long days, while winter days were short, with sunshine a rare commodity. And as far as your body is concerned, it really is a commodity. Sunny days not only lift the spirit, they provide us with a metabolic benefit – because when you’re out in the sun, your skin is busy making vitamin D. Or at least, your skin transforms a cholesterol-based compound into something that’s almost vitamin D; the liver and the kidneys then carry out the last steps, each adding a hydroxyl group – an oxygen bound to a hydrogen – to activate the vitamin.

  The importance of vitamin D to the body was elucidated in the early twentieth century, as researchers sought to understand and cure a condition which caused skeletal deformities in children: rickets. The industrialisation of Europe may have been a great technological step forward, and it may have ultimately led to all sorts of improvements in people’s lives, but there were plenty of casualties along the way. Crowded cities, working in factories, skies full of smog – all left their mark on the children of the Industrial Revolution. They didn’t grow properly, and their soft young bones set in awkward curves. Rickets continued to be both a troubling affliction and a mystery until 1918, when a British physician by the name of Mellanby found that he could give dogs rickets by keeping them indoors and feeding them porridge – and that he could reverse the effect by giving them cod-liver oil. The following year, a German researcher called Huldschinsky found that shining ultraviolet light on to children with rickets could cure them. Other studies found that all sorts of food treated with ultraviolet light – vegetable oil, eggs, milk and lettuce – could protect against this disease. Without knowing it, the researchers were converting the cholesterol and plant sterols in these foods to pre-vitamin D. When, at last, the chemical identity of the essential compound was uncovered, chemists could begin to synthesise vitamin D artificially: finally, there was a pill for rickets. The German chemist who made the breakthrough, Windaus, won a Nobel Prize for his efforts in 1928.

  Yet it still wasn’t clear how this substance worked its magic on bones. Over the ensuing decades of the twentieth century, research focused on following the compound on its journey through the body. It revealed that the vitamin acted like a hormone – once activated by the kidneys, it travelled in the bloodstream to the gut, carrying a message: ‘Get calcium’. But vitamin D is a busy little chemical, and by the 1980s it was becoming clear that, as well as its important job in calcium metabolism and building bones, it also plays a crucial role in the immune system. A lack of vitamin D means that you’re more likely to develop autoimmune diseases (where the armies of your immune system start to engage in friendly fire, or even mutiny), and deficiency has also been linked to diabetes, heart disease and specific types of cancer. A concentration of about 30 nanograms per millilitre of blood seems to be the minimum amount of vitamin D the body needs to function healthily. While you can get some in your diet, most of us get around 90 per cent of the vitamin D we need by making it, in our skin, in the presence of sunlight.

  Of course sunlight, particularly the ultraviolet rays within it, is also potentially damaging. Human skin contains several compounds which act as natural sunscreen, including the pigment melanin. If you’re exposed to more sunlight than usual, your skin will start to make more of it, and you tan. This isn’t only true for pale-skinned people – dark-skinned people tan too. The first modern humans entering Eurasia would probably have had dark skin – perfect for the climate they’d come from. In very sunny places, you need lots of melanin to avoid sunburn – so it’s easy to see why natural selection would favour darker skins in equatorial regions. At the same time, in the tropics, enough UV radiation will make it through that filter to allow skin to photosynthesise vitamin D. But in a place with less sunlight, it seems logical to assume that dark skin could filter out UV so effectively that it would become impossible to make enough of this important vitamin. The adverse effects of deficiency, from an impaired immune system to rickets, would then mean that a selective pressure was operating: anyone with slightly paler skin would have had an advantage in survival and reproduction – they’d be more likely to pass their genes on to the next generation. And so whenever a chance mutation arose that tinkered with melanin production and produced paler skin, it would tend to spread through the population. It seems that this is exactly what happened. As you go further north, skin colour gets progressively paler. Northern Europeans and northern Asians each went through this process of adapting to low levels of sunlight independently – via different mutations. It looks like a classic case of convergent evolution – a similar outcome achieved by different means.

  The ‘vitamin D hypothesis’, positing that pale skin evolved as an adaptation to the lack of sunlight in northerly latitudes, seems to make a lot of sense. The observation that dark-skinned people today, in the UK and in North America, tend to suffer from vitamin D deficiency more often than their pale-skinned counterparts appears to support the hypothesis. However, careful measurements of vitamin D levels in living people have thrown a spanner in the works. Studies tracking vitamin D levels and exposure to sunlight have produced unexpected results. These studies found that vitamin D levels increased (up to a point) as exposure to sunlight increased, just as predicted. Covering up with clothes was, understandably, associated with lower levels of vitamin D in the bloodstream. But thinly applied sunscreen – while protecting from sunburn – didn’t appear to reduce vitamin D production. And neither did having a darker skin colour. Rather surprisingly, there was no difference in the boost to vitamin D production measured in dark-skinned versus fair-skinned people exposed to the same amount of sunlight.

  This research suggests that people with dark skin can make vitamin D just as efficiently as people with pale skin. At first glance, these new findings look as though they might bring all our theories about the evolution of human skin colour tumbling down. But there are still some real observations that require explanation: the skin colour of indigenous people does get paler as you go further north, and people with dark skin do tend to suffer more from vitamin D deficiency in northern countries.

  The first observation leads to a question about how changes happen in evolution – and they don’t always happen because a certain mutation confers an advantage. Sometimes they happen when mutations that are nearly neutral in terms of selection pressure spread through a population, in what’s known as genetic drift. This is essentially a random process, owing a lot to chance. Perhaps what happened as our ancestors moved north was that a strong selection pressure for dark skin – as defence against sunburn and skin cancer – eased off. Then, mutations for paler skin could occur without being weeded out – and could end up spreading via genetic drift. And actually, there isn’t a steady gradient of paling skin from the equator to northern latitudes; fair pigmentation evolved – probably quite late – only in populations living in the far north of Europe and Asia. The rest of Europe and Asia is full of people whose skin colour isn’t linked to latitude. Another problem with the vitamin D hypothesis is that there aren’t many skeletons bearing evidence of rickets prior to the Industrial Revolution.
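  To get a feel for how a variant with no advantage at all can nonetheless spread, it can help to simulate drift directly. The sketch below – a minimal Wright–Fisher-style model in Python, with an entirely illustrative population size and starting frequency, not figures drawn from this book – resamples a neutral allele’s frequency at random each generation, until the variant is either lost or, occasionally, carried all the way to fixation by chance alone.

    import random

    def drift(pop_size=500, start_freq=0.01, max_generations=20000):
        """Track a selectively neutral allele until it is lost or fixed.

        Each generation, every one of the pop_size gene copies is drawn
        at random from the current pool -- no selection is applied.
        """
        freq = start_freq
        for gen in range(max_generations):
            carriers = sum(random.random() < freq for _ in range(pop_size))
            freq = carriers / pop_size
            if freq in (0.0, 1.0):
                return gen, freq  # absorbed: lost (0.0) or fixed (1.0)
        return max_generations, freq

    # Most runs lose the rare variant within a few generations; now and
    # then, chance alone carries it to fixation, no advantage required.
    for run in range(5):
        print(drift())

  Run it enough times and the rare neutral variant fixes in roughly its starting frequency of runs – here about one in a hundred – which is the essence of drift: rare, but entirely possible once the selection against the variant has eased off.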

  But what about people with dark skin today, in the UK and North America, and the problem of vitamin D deficiency? One of the studies in contemporary populations found a clue, by asking people to fill in detailed questionnaires about what they did when it was sunny. It turns out that pale-skinned people tended to rush out in the sun when it appeared, while dark-skinned people more often stayed inside. In a place with plenty of strong sun, that’s probably a good strategy – but in the less sunny north, with weaker sun and less of it, you really need to make the most of sunny days – particularly in winter. For early modern humans – Palaeolithic, nomadic hunter-gatherers – spending time outdoors (or, more realistically, outside their tents) on a daily basis, all year round, would have been an inevitability. So while dark skin may represent an adaptation to strong sun in equatorial regions, the converse – pale skin as a general adaptation to more northerly latitudes – doesn’t stand up to scrutiny. Under the skin, however, there are less obvious changes in vitamin D metabolism that may yet represent real adaptations to higher latitudes. Northern European genomes contain mutations that increase the levels of the vitamin D precursor in the body, while dark-skinned people possess other mutations that enhance the absorption and transport of vitamin D around the body. Yet again, a simple, pervasive hypothesis has given way to a much more complex picture, as epidemiological rigour and genomic data are brought to bear. Over recent years, the story of human adaptedness to different latitudes has become more interesting and much less straightforward; much less – you might say – black and white.

  Whilst changing latitude appears to be linked to some metabolic adaptations, high altitudes also present a particular challenge. A specific variant of a gene called EPAS1 has been connected with the ability of some humans to cope with low oxygen levels at high altitude. It’s linked to a reduced level of haemoglobin production, perfect for an oxygen-poor environment, along with denser networks of blood vessels. The EPAS1 variant shows clear signs of having been selected for in Tibetans – but its origin was mysterious. The pattern didn’t fit with it being either an existing variant that suddenly came into its own as people started to live at higher altitudes, or a new, fortuitous mutation. Where had it come from? It was absent in all the individuals who provided DNA samples for the ambitious, international ‘1000 Genomes’ project, which was completed in 2015 – apart from just two Chinese people. But – it was there in the Denisovan genome. So it looks like this EPAS1 variant in modern Tibetan genomes has been inherited from Denisovan ancestors – and then jealously preserved by positive selection. Like apples acquiring useful new adaptations by interbreeding with crabapples, our ancestors picked up local genetic knowledge.

  One of the most significant challenges of new or changing environments is the presence of novel pathogens. We’re constantly fighting a battle against microbes, and the history of this evolutionary arms race is embedded in our genomes. Some of the genetic variants that have entered modern human genomes quite clearly came from Neanderthal and Denisovan ancestors – presumably they conferred some protection against specific infections, at particular times and in particular places.

  A gene inherited from Neanderthals that’s involved in fighting off viral infections pops up in one in twenty Europeans, but appears in over half of the modern population of Papua, where it seems to have been strongly selected for. Other genes linked to the immune system also seem to have come in from Neanderthals, and to have been selected for more strongly in some populations than others. It’s in patterns like this that we see the crucial importance of contingency in evolution: a genetic variant that may confer some resistance to a pathogen will become important – and selected for – if populations are exposed to that pathogen. If not, the variant may well disappear, or at least drop to low frequency in the gene pool.

  There’s a whole bunch of closely related genes in our genomes that are all involved with the important task of helping the body to recognise foreign invaders, and to mount attacks against them. They’re also involved in self-recognition – they encode proteins which are stuck like flags on the surface of our own cells, so that the immune system doesn’t mistake them for alien pathogens. They’re called HLA genes, and it’s estimated that, in modern Eurasians, more than half of these genes have been inherited from Neanderthals or Denisovans.

  There is a downside to some of the genes we’ve inherited from archaic humans, however. While they may have proved useful at various times in the past, some alleles are linked to deleterious effects today. Certain variants of HLA genes may create a predisposition to developing autoimmune diseases. This is essentially a failure of the self-recognition role of these HLA genes: the flag looks odd – alarmingly foreign to the immune system – and it ends up launching an attack on its own body’s cells. One such variant, HLA-B*51, inherited from Neanderthals, is associated with a higher risk of developing an inflammatory condition called Behçet’s disease, which causes ulceration of the mouth and genitals, and inflammation of the eyes that can eventually lead to blindness. It’s rare in the UK, but affects about one in 250 people in Turkey. Behçet’s disease is also known as the ‘Silk Road disease’, but its origins appear to be much more ancient than the human trade in cloth. Millennia before the pathways we know as the Silk Road operated as trade routes, they were important for migration and colonisation. Perhaps modern humans were encountering and interbreeding with Neanderthals along these corridors through central Asia in deep antiquity.

 
