Pandora's Seed

by Spencer Wells


  The Yanomami fascinated Neel because they had only recently been contacted by the outside world, meaning that they were presumably living in a state of nature—in effect, the way our ancestors probably lived. Neel thought that by studying such populations scientists could gain an insight into the past selective pressures on the human genome, which (as we have seen in Chapter 1) changed significantly during the transition to agricultural life. One of the main environmental changes, Neel thought, had to do with the increase in readily available food in agricultural societies. Although we now know that Paleolithic hunter-gatherers rarely led lives of privation, and may have had better access to a wide variety of food than their agriculturalist neighbors, it is true that the diet of agriculturalists is fundamentally different from that of the typical hunter-gatherer. In general, agriculturalists have a much higher percentage of carbohydrates—sugars and starches—in their diet. In China, for instance, the large quantities of rice consumed yield a diet that is about 75 percent carbohydrate. High levels of physical activity, however, mean that obesity is less of a problem in largely rural China. In Chinese cities, though, obesity is on the increase, as is diabetes.

  Neel suggested that diabetes, rare among hunter-gatherers, is a physiological reaction to the sudden increase in easily available calories. What might have been highly adaptive to a hunter-gatherer—the ability to maintain physiological function under conditions of low caloric intake—could be maladaptive with a richer diet. He called this the “thrifty genotype,” and it has become a widely accepted hypothesis explaining the prevalence of diabetes, which should otherwise, due to the action of natural selection, be extremely rare.

  Diabetes comes in two forms. Type 1 usually manifests itself in childhood; it is the one that is treated using insulin shots. It is caused by a complex interaction between inherited susceptibility factors—DNA variants—and the environment in which the child is raised. Type 2 is more complicated and usually occurs in adulthood (although teenagers and even children are now showing up with this form as well). It is caused in part by genetic factors but also by significant environmental effects, particularly diet. Indeed, more than 80 percent of people with type 2 diabetes are overweight.

  The best examples of the negative effects of the thrifty genotype are found among Pacific Islanders and Native Americans. Samoans, for instance, settled their islands over 3,000 years ago, during the so-called Polynesian expansion from Southeast Asia into the Pacific. The ocean voyages involved in these extraordinary migrations would have placed people under intense physiological stress for weeks at a time, and it is possible that there was strong selection for people who were able to reduce the rate at which they burned calories. This thrifty genotype would have been a fantastic thing to have on the voyage, but it would have been more of a problem once the group arrived at their destination. However, as with other subsistence agriculturalists around the world, the high levels of physical activity involved in everyday life until quite recently would have prevented an obesity epidemic.

  With the arrival of modern civilization, however, the Samoans have stopped tending fields and fishing from paddled outrigger canoes, and now spend much of their time inactive. Particularly in American Samoa, their diet, once rich in fish and vegetables, has become more like that of the average mainland American, and is primarily composed of imported foods like Spam (which entered the Samoan diet via the United States military during the Second World War) and other highly processed delicacies. The result is that nearly two-thirds of urban Samoans are now clinically obese, and even their rural counterparts have obesity rates of around 50 percent. This might produce fantastic sumo wrestlers (one of the top wrestlers in the 1990s, Musashimaru Koyo, was Samoan), but it also gives this tiny island nation one of the highest incidences of diabetes in the world—25 percent in men and 15 percent in women. Sometimes being thrifty can kill you.

  Similarly, the Pima Indians of the southwestern United States and northern Mexico show a frighteningly strong correlation between lifestyle and diabetes. Known for having the highest incidence of diabetes in the world—around 40 percent—the Pima living in the United States have, like the American Samoans, succumbed to the lure of a modern lifestyle. Their cousins across the border in Mexico, however, with a much more traditional way of life and higher levels of physical activity, have a correspondingly lower (though still relatively high) incidence of diabetes, around 7 percent. Clearly, something about the American lifestyle is more than quintupling the Pima Indians’ diabetes risk. That something has a lot to do with the large number of calories in the typical American diet, many of them in the form of processed carbohydrates and fats, as well as the lower levels of activity in our sedentary culture.
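
  For readers who want to check that arithmetic, here is a minimal sketch (in Python) using only the prevalence figures quoted above; the variable names are mine, and the numbers are rounded as in the text:

    # Back-of-the-envelope check of the "more than quintupling" claim
    us_pima_prevalence = 0.40       # ~40% diabetes incidence, U.S. Pima
    mexican_pima_prevalence = 0.07  # ~7% diabetes incidence, Mexican Pima

    relative_risk = us_pima_prevalence / mexican_pima_prevalence
    print(f"Relative risk: {relative_risk:.1f}x")  # ~5.7x, i.e., more than quintupled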

  Overeating and low levels of physical activity, leading to obesity, and cigarette smoking, leading to hypertension and cancer, are the major causes of preventable death in the world today. As my own doctor says, “It’s really not that complicated—exercise and don’t smoke and you’ll be healthy.” But such a prescription is not so easy to follow, as we have a few things working against us in the form of evolutionary baggage. First, the desire to eat is a basic survival instinct, so it’s unnatural to try to reduce the amount of food we consume. Second, our hunter-gatherer ancestors would have found the idea of exercise for exercise’s sake ludicrous—their goal, as with most hunter-gatherers today, was to minimize the amount of energy they expended when going about their everyday activities, because every calorie they wasted had to be replaced by finding more food. Finally, chemical compounds like nicotine actually mimic substances that occur naturally in our bodies, which is why our nerve cells have receptors on their surfaces that are stimulated by them. It’s just bad luck that a molecule that evolved to protect the tobacco plant from pests turned out to be a stimulant to humans, and that the easiest way to get this compound into our systems—burning the leaves of the plant—also produces high levels of tar, which is the primary culprit behind smoking’s cancerous effects.

  What clearly seems to be happening is a profound shift in the causes of disease from threats from without to threats from within. More and more, we are causing our own deaths, rather than succumbing to some other force that is largely beyond our control. This can be seen in the increased focus of public health agencies on chronic diseases like diabetes. America’s Centers for Disease Control and Prevention, for instance, traces its origins to a wartime agency founded in 1942 to deal with malaria, then a widespread threat in the southern United States. Later it focused on vaccinations and emerging infectious diseases, but a significant amount of its work these days is on the far greater threats of noncommunicable diseases. Some of this work is epidemiological, but much of it involves communicating the risks of poor diet, lack of physical activity, and smoking. The informational pamphlet and, increasingly, Web-based communications have become the twenty-first-century equivalents of the smallpox vaccine. Unlike a vaccine, though, protection is granted only through the active participation of the patient population over their lifetimes—something that is clearly less certain than a predictable immune response to a vaccination.

  A recent report compiled by the CDC and the nonprofit institute RTI International blames obesity for much of the massive increase in health-care costs in the United States over the past decade. Overall, they calculate that obese people average $4,871 in medical bills each year, while people with a healthy weight cost $3,442. According to health economist Eric Finkelstein, the lead researcher on the report, “obesity is the single biggest reason for the increase in health care costs.” The increase in American obesity is not only a public health hazard—it’s an economic time bomb as well.
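
  The cost arithmetic is easy to verify. The sketch below uses the per-person figures quoted above, while the number of obese adults is a hypothetical figure for illustration only, not taken from the report:

    # Annual medical costs quoted in the text (CDC/RTI report)
    obese_cost = 4871           # average annual medical bills, obese person ($)
    healthy_weight_cost = 3442  # average annual medical bills, healthy weight ($)

    excess_per_person = obese_cost - healthy_weight_cost
    print(f"Excess cost per obese person: ${excess_per_person:,}/year")  # $1,429

    # Hypothetical scale-up, assuming 70 million obese American adults
    assumed_obese_adults = 70_000_000  # illustrative assumption, not from the report
    total_excess_billions = excess_per_person * assumed_obese_adults / 1e9
    print(f"Implied aggregate excess: ~${total_excess_billions:.0f} billion/year")  # ~$100B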

  At Dollywood, I got a view of the future—a possible future, where the human form has changed to such an extent that people waddle rather than walk and can’t sit in seats designed to accommodate even fairly large frames, where even minor physical activity produces breathlessness, and where the ingredients in the food sound more like a chemistry lesson than something you’d consider eating. But solving the obesity epidemic, with its complex interactions between genetics, income, education, and culture, requires more than simply prescribing a daily walk and fewer cheeseburgers and sugary sodas. It requires an understanding of the long-term history of human disease, and how friends—the plants we cultivate—ultimately became enemies.

  THREE WAVES

  In late February 2003, a Chinese doctor traveled to Hong Kong to attend his nephew’s wedding. He was sick at the time, but he put it down to a cold or perhaps the flu. He certainly didn’t suspect that he would inadvertently produce a global epidemic that would ultimately infect thousands of people around the world, lead to the deaths of hundreds, and cause a noticeable drop in world economic output, equivalent to around $10 billion. Tourism revenue—plane tickets, hotel stays, meals out—dropped by 9 percent worldwide, and to this day visitors to Singapore and some other Southeast Asian cities are examined by an infrared temperature scanner as they pass through security, like something out of a science fiction film. Too red a face indicates a fever, which will lead to further questioning by the medical authorities to ascertain whether the traveler might be carrying a dangerously infectious disease.

  The doctor had unknowingly been infected with a new virus, one now understood to be a member of the coronavirus family, whose members include viruses that cause the common cold. But our Chinese doctor’s illness was no case of the sniffles. People infected with the new virus spiked high fevers, eventually developed pneumonia, and, in around 10 percent of the cases, died by drowning in their own lung fluid. Severe acute respiratory syndrome, or SARS, had been born.

  The most amazing thing about SARS is the speed with which it spread. What probably began in late 2002, when a chicken or pig virus in southern China hopped from its animal host to a human—possibly a food worker—had spread by late March 2003 to countries as far away as Canada, Switzerland, and South Africa. All of the people initially infected with the virus had spent some time in Southeast Asia, but secondary infections became more common in April as the virus spread from the primary infectees to their friends, family, and hospital staff. This is because of the ease with which it was transmitted—essentially like a cold virus, through sneezing and contact with an infected person.

  In the grand scheme of things, SARS was not a major killer. Far more devastating were the fears and paranoia it engendered, which contributed to the failure of many Asian businesses and several airlines in 2003. The death rate, although three to four times higher than that of the influenza virus that caused the 1918–19 global epidemic, in which around twenty million people died, was not as high as those of many other diseases. Among these are Ebola, whose outbreaks have been limited thus far to remote locations in central Africa, Lassa fever, Marburg virus, and others with death rates approaching 100 percent—a far more sobering figure, although the isolated nature of their outbreaks and their difficulty of transmission make them somewhat less threatening.
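
  As a rough consistency check on those ratios, a minimal sketch: the SARS fatality rate of around 10 percent comes from the text above, while the 1918 flu rate is inferred from the stated “three to four times” ratio rather than quoted directly:

    # SARS case-fatality rate given earlier in the text (~10%)
    sars_cfr = 0.10

    # "Three to four times higher" implies a 1918-19 flu fatality rate of roughly:
    implied_flu_cfr_low = sars_cfr / 4   # 2.5%
    implied_flu_cfr_high = sars_cfr / 3  # ~3.3%
    print(f"Implied 1918 flu fatality rate: "
          f"{implied_flu_cfr_low:.1%} to {implied_flu_cfr_high:.1%}")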

  Not so for the much-discussed potential pandemic threat of so-called H5N1 avian flu, which has killed more than half of the people infected so far. Avian flu, as the name suggests, was originally transmitted to humans through close contact with birds. This type of contact occurs most often on farms and in markets, and its initial emergence in China and Southeast Asia is thought to reflect the especially close contact in this region between humans and their domesticated ducks and chickens, something not as common in many other parts of the world today, where factory farming has become more widespread. Likewise, the 2009 spread of a much less deadly form of swine flu—similar in its structure to the 1918 flu strain—has led some experts to suggest that we are on a collision course with a new plague of biblical proportions.

  It turns out, interestingly, that the “new threat” of deadly diseases spread through close contact with animals is not new at all. In his influential 1976 book Plagues and Peoples, Canadian-American historian William McNeill examined the impact of disease—particularly epidemic disease—on human history. Disease, McNeill argued, has long been a catalyst for significant historical events, and he marshaled groundbreaking evidence to explain the role of the Mongol Empire in Europe’s fourteenth-century Black Death epidemic and the importance of Eurasian diseases in allowing Spanish conquistadors to subdue the empires of the Americas. Jared Diamond’s Guns, Germs, and Steel is perhaps the best known of the many later books influenced by McNeill’s work.

  In Plagues and Peoples, McNeill traces the origin of many diseases common today back to changes in human society during the Neolithic period. Many of these changes we are familiar with from the last chapter, including the increasing number of people living in a relatively small space, allowing rapid transmission of diseases by infected individuals, and a large enough pool of uninfected people to permit the emergence of epidemics. Perhaps the most important factor, though, was the domestication of animals. As the human population increased in early farming communities, hunting was no longer a viable option—as with wild seed-bearing grasses, the supply of wild animals was limited by the natural carrying capacity of the land. This meant that many were soon hunted to near extinction. The necessity of creating a stable food supply led human populations in the Middle East to begin domesticating sheep, goats, pigs, and cattle from their wild progenitors by around 8000 B.C., and the Southeast Asian population to domesticate the chicken by around 6000 B.C. This created a reliable source of meat in the Neolithic diet, but the large numbers of people and animals cohabiting also created an environment that had never before existed in human history.

  For the first time, people and animals were living in the same communities. While Paleolithic hunters had certainly come into contact with their prey after a successful hunt, the number of wild animals contacted was a small fraction of those living in the newly domesticated Neolithic herds. Also, most of these animals were dead; this would have decreased the chances of transmitting many diseases, but perhaps facilitated the transfer of blood-borne infections. When we started living close to animals throughout our lives—particularly as children—the odds of diseases being transmitted increased significantly. Although some such infections had probably always existed to a lesser extent in both the animal and human populations, suddenly there was a brand-new opportunity to swap hosts. The microorganisms had a field day.

  McNeill wrote that of the diseases shared by humans and their domesticated animals, twenty-six are found in chickens, forty-two in pigs, forty-six in sheep and goats, and fifty in cattle. Most of the worst scourges of human health until the advent of vaccination in the eighteenth century were imports from our farm animals, including measles, tuberculosis, smallpox, and influenza. Bubonic plague was transmitted to us by fleas from rats living in human settlements. As far as we can tell from the archaeological record, none of these so-called zoonotic diseases (from the Greek zoon, for animal, and nosos, for disease) afflicted our Paleolithic ancestors—all seem to have arisen in the Neolithic with the spread of farming. McNeill suggests that many of the plagues described in the Bible may coincide with the explosion of zoonotic diseases during the emergence of the urban civilizations of the Neolithic, Bronze, and Iron Ages.

  What is clear is that a new source of human mortality had arrived on the scene. This does, however, raise the question of what people had been dying of before the development of agriculture. Were there really no diseases in the human population? Of course there were. It’s likely that macroparasites—things such as tapeworms that can be seen by the naked eye—were problems for our distant ancestors. Most of these infections generally would have produced little beyond feelings of malaise, though—not acute, debilitating symptoms like high fevers, organ failure, and death—in part because we had probably been evolving together with these parasites for such a long time. Over millions of years, an evolutionary process known as mutualism would have led the parasites to produce less acute physical symptoms in their hosts (us), since it does a parasite little good to kill its host and thus its source of food, and we would have adapted to their presence. In general, the longer an infection has been around, the less virulent it is, the symptoms it elicits in the host becoming less severe over many generations. New diseases that erupt suddenly into a previously unexposed population often have extreme outcomes, including death.

  If macroparasites couldn’t have produced a significant amount of mortality during the Paleolithic period, and most disease-causing microorganisms hadn’t yet had a chance to pass from animals to humans, what did our hunter-gatherer ancestors die of? According to British evolutionary biologist J.B.S. Haldane, traumatic injuries were the most likely cause of death throughout most of human history. Does this mean we spring from a race of klutzes, who tripped and fell their way through the Paleolithic? No: such injuries would have included wounds sustained during hunting and skirmishes with other groups, the traumas associated with childbirth (a significant source of mortality for both mother and child until quite recently), and accidental falls and drownings. All of these hazards, coupled with infections from the wounds, would have been the main cause of hunter-gatherer morbidity and mortality.

  So, we seem to have evidence for an interesting pattern—three waves of mortality as we move from Paleolithic times to the present. The first is trauma, primarily from the time of our hominid ancestors until the dawn of the Neolithic period. As people settled down and began to domesticate their animals rather than hunt them, infectious disease began to supersede trauma as a significant cause of mortality. This second wave, of infectious disease, continued to be the most significant cause of death until antibiotics were developed in the mid-twentieth century. The final wave has occurred since the mid-twentieth century, in developed countries, where vaccinations and widespread antibiotic use have reduced infectious diseases to a fraction of their former threat. Now that we have stemmed the joint threats of trauma and infection, chronic diseases are becoming a larger threat. Most people prior to the twentieth century would have died relatively young, before these maladies—particularly diabetes, hypertension, stroke, and cancer—had a chance to develop. With modern medicine we’ve traded the scourges of trauma and infection for a threat from within our own bodies.

 
