Why We Get Sick


by Randolph M. Nesse and George C. Williams


  Novel environments often interact with previously invisible genetic quirks to cause more variation in phenotypes, some of it outside the normal range. As described already in the chapter on genetics, these abnormalities arise only when a vulnerable genotype encounters an environmental novelty. Novel physical, chemical, biological, and social influences will cause problems for some people and not others or will have different effects on different individuals depending on their specific genetic makeup. We have already discussed some human examples; for instance, the genetic quirks that cause myopia impose problems in literate societies, but they caused no difficulties for our ancestors.

  Our ways of getting food changed the environment in ways that created new problems. Thousands of years ago some of our ancestors hunted wild goats or cattle. Hunters followed herds for hours in the hope of killing one of the animals for food, hide, and other resources. Sometimes they may have found, early in the morning, the same herd they had been following the day before. If animals can be followed for two days, why not three, or a week, or a month? How long would this go on before the hunters would start thinking of the herd as their own, driving off wolves or rival groups of hunters or other predators and chasing strays back into the group to maintain a large herd? This process gradually converted hunters into nomadic herdsmen.

  Other ancestors were more vegetarian and found that some plants could produce a lot more food if they were intentionally planted for later harvest. Plowing, weeding, fertilizing, and selecting variants with the highest yields soon became standard practice and resulted in steadily greater and more reliable food production. It has been supposed that local increases in population may have encouraged the invention of agriculture or its adoption from neighboring peoples. Whether this is true or not, agriculture permitted the maintenance of much denser and more sedentary populations than could be supported by hunter-gatherer economies. Increased population density then became a source of other problems, some of which will be discussed in this chapter, others in the next four.

  MODERN DIETARY INADEQUACIES

  Paradoxically, the increased food production made possible by herding and agriculture resulted in nutritional shortages. There are more calories and protein in a bushel of wheat than in a handful of wild berries, but there is more vitamin C in the berries. If wheat provides most of the calories and protein for a farming community, deficiencies of vitamins and other trace nutrients are much more likely to arise than they would be with the more diversified diets of hunter-gatherers. If the wheat or other agricultural produce is also used as feed for the domestic animals that provide meat or eggs or milk, the farmers’ meals are much improved, but shortages, especially of vitamin C, remain a threat.

  Iceland is a good example, with a vitamin C problem that lasted well into this century. Icelandic farmers raised mainly sheep, which grazed the wild grasses of the countryside. The more successful families might have had a dairy cow, but mutton provided a large part of the diet, and wool was the chief commercial export, sold mostly to Danish colonials. The money so earned allowed the farmers to import flour and such luxuries as coffee and sugar. Nothing in the list so far contains vitamin C, which was provided mainly by blueberries and other wild plant foods. Unfortunately, the supply of these commodities was strictly seasonal. During winter and spring, when diets were notably lacking in vitamin C, many a seemingly robust and healthy Icelandic farmer would start bleeding from the gums and feeling lethargic and depressed, the usual symptoms of scurvy. Some members of a family would sicken and others not, with the severity of scurvy varying greatly.

  For those who survived the winter sick with scurvy, folk wisdom came to the rescue. As soon as the marshes thawed, people could dig angelica roots, which are a fair source of vitamin C. The so-called “scurvy grass” might be sprouting at the same time and could be eaten as an alternative. The observation that such wild produce could cure scurvy antedated the use of citrus fruits for preventing the disease among long-distance sailors. Scurvy is a disease of civilization. Before people relied heavily on domestic plants and animals, they never had such abnormal diets as those of Icelandic farmers in the winter or sailors at sea for months at a time.

  Long before there were any ocean voyages such as those of the original limeys or those that took the first settlers to Iceland, people suffered from other dietary deficiencies resulting from agriculture. About fifteen hundred years ago, some native tribes of the south central United States abandoned their hunter-gatherer lifestyles and started growing corn and beans. The change is clearly recorded in their skeletal remains. Compared with earlier skeletons, those of the farmers are on average less robust, and they often show effects of nutritional deficiencies of the B vitamins and perhaps protein. Despite these deficiencies, such farmers may have been less likely to die of starvation than their ancestors. They may even have been more fertile, because cornmeal and beans can facilitate earlier weaning. Nonetheless, in important respects, they were not as healthy.

  These diseases of civilization thus existed fifteen hundred years ago in what would become Tennessee and Alabama, and long before that in earlier agricultural regions of other continents. The same sorts of nutritional deficiencies afflict the impoverished people of many third-world countries today. Our Stone Age ancestors no doubt faced frequent shortages of food, but if they were getting enough calories they were probably getting enough vitamins and other trace nutrients. Shortages of specific vitamins and minerals arose in just the past ten thousand years or so.

  We are now aware of the need for vitamins and minerals, and we get more of them from a modern diversified diet than many early agriculturalists did. Contrary to pharmaceutical sales pitches, few modern people need vitamin supplements. If we eat a diverse array of fruits and vegetables, some of them preferably uncooked, and especially if we also get abundant protein from grains, legumes, and animal products, we are getting all the vitamins, minerals, and other nutrients we need. The current danger for most of us is not the deprivation suffered by our ancestors but an excess of nutrition.

  MODERN NUTRITIONAL EXCESSES

  A wise man once observed that it makes little sense to worry about excessive eating in the festive week from Christmas to New Year’s Day. It makes much more sense to worry about what we eat between New Year’s Day and Christmas. Of course, it is possible to overeat in a week. We can even overeat at one sitting, but this was also a danger in the Stone Age, and we are equipped with instincts to avoid doing so. There comes a point at which we feel stuffed and no longer hungry, even for that honey-cured Christmas ham. This normally puts an end to the meal and keeps us, as it did our ancestors, from overburdening the machinery of digestion, detoxification, and assimilation. Modern overnourishment is mainly the result of steady long-term overeating.

  In the Stone Age it was adaptive to pick the sweetest fruit available. What happens when you take people with this adaptation and put them in a world full of marshmallows and chocolate eclairs? Many will choose these modern delicacies over an equally available peach, itself sweeter than any fruit available in the Stone Age. Marshmallows and chocolate eclairs exemplify the supernormal stimuli described by students of animal behavior. The classic example came from observations on geese. If an egg rolls out of a nest, a brooding goose will reach out and roll it back with her chin. Her adaptive programming is “If a conspicuously egglike object is nearby, I must roll it into the nest.” What happens if you put both an egg and a tennis ball near her nest? She prefers the tennis ball. To her it looks more egglike than an egg. There can be supernormal stimuli in any sensory mode, for instance, taste. Next time you find yourself reaching for a slice of apple pie instead of an apple, think of that goose who seems to think she should incubate a tennis ball.

  Our dietary problems arise from a mismatch between the tastes evolved for Stone Age conditions and their likely effects today. Fat, sugar, and salt were in short supply through nearly all of our evolutionary history. Almost everyone, most of the time, would have been better off with more of these substances, and it was consistently adaptive to want more and to try to get it. Today most of us can afford to eat more fat, sugar, and salt than is biologically adaptive, more than would ever have been available to our ancestors of a few thousand years ago. Figure 10-1 shows a plausible relationship between intake and benefit of these substances and proposes a contrast in the foraging capabilities of a Stone Age tribesman and of a high-salaried diner in a gourmet restaurant.

  An overwhelming amount of preventable disease in modern societies results from the devastating effects of a high-fat diet. Strokes and heart attacks, the greatest causes of early death in some social groups, result from arteries clogged with atherosclerotic lesions. Cancer rates are increased substantially by high-fat diets. Much diabetes results from the obesity caused by excess fat consumption. Forty percent of the calories in the average American diet come from fat, while the figure for the average hunter-gatherer is less than 20 percent. Some of our ancestors ate lots of meat, but the fat content of wild game is only about 15 percent. The single most effective thing most people can do to improve their health is to cut the fat content of their diets.
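  These percentages refer to shares of total calories, not of food weight: fat supplies about 9 calories per gram, while carbohydrate and protein supply roughly 4. The calculation is simple, as the following sketch shows; the gram figures here are hypothetical, chosen only to illustrate diets near the 40 percent and under-20 percent marks mentioned above, and the function name is our own.

```python
# Share of daily calories that come from fat.
# Energy densities: fat ~9 kcal/g; carbohydrate and protein ~4 kcal/g.

def percent_calories_from_fat(fat_g, carb_g, protein_g):
    fat_kcal = fat_g * 9
    total_kcal = fat_kcal + (carb_g + protein_g) * 4
    return 100 * fat_kcal / total_kcal

# Hypothetical modern diet: 90 g fat, 250 g carbohydrate, 80 g protein
modern = percent_calories_from_fat(90, 250, 80)    # about 38 percent

# Hypothetical forager diet: 40 g fat, 300 g carbohydrate, 120 g protein
forager = percent_calories_from_fat(40, 300, 120)  # about 18 percent
```

  Note that a diet can be low in fat by weight yet high in fat by calories, which is why nutritional comparisons like those above are made in caloric terms.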

  FIGURE 10-1.

  Our view of the dependence of health and fitness on resource availability, such as dietary fat intake per month. We propose that fat availability in the Stone Age would seldom exceed the levels indicated. Today an originally adaptive craving for fatty foods may lead to intakes far out on the negative slope to the right.

  One of us once met with three others early one morning to travel to a hearing on claims that agricultural uses of pesticides were endangering the health of nearby suburban residents. A stop at a diner for breakfast yielded a vivid memory. One of the eaters lamented the likelihood that the wheat and eggs in his pancakes were no doubt contaminated with unnatural pesticides and antibiotics that might give him cancer ten or twenty years later. Perhaps so, but these toxins were a minor danger to his future health compared to the grossly unnatural fat content of his sausage and buttery pancakes, and the enormous caloric value of the syrup in which everything was bathed. The cumulative effect of that kind of eating is surely more likely to cause future health problems than are the traces of exotic chemicals.

  Some people are more prone to this sort of overdosing than others. This is indicated by observable variation across the spectrum from underweight to overweight. Overweight people are more likely to suffer the cardiovascular problems associated with excess nutrition and to have higher rates of various cancers. This common impression is supported by recent studies. University of Michigan geneticist James Neel and his associates have noted that efforts to relieve the chronic malnutrition of the Pima Indians of Arizona inadvertently caused an epidemic of obesity and diabetes. He proposed that the affected individuals had what he called “thrifty genotypes,” a genetically based ability to get and store food energy with unusual efficiency. With what seem like normal diets many Pimas steadily increase their stores of body fat. This could well be adaptive in a world that threatens frequent famine. Those who have built up copious fat stores might survive a prolonged food shortage while their less efficient associates perish. Thrifty genotypes are not adaptive in a world in which food shortages never occur. The most famine-adapted individuals may just get fatter and fatter until medical problems or other difficulties intervene.

  Excess nourishment is not an easily corrected health hazard, and many common solutions may do more harm than good. Voluntary restrictions on food intake may be interpreted by the body’s regulatory machinery as a food shortage. The result may be a resetting of the basal metabolism so that calories are used even more efficiently and further fat reserves are amassed. Another consequence of food restriction is intensified hunger, with consequent eating binges. Studies of artificial sweeteners fail to show that they help people to lose weight, a finding that might have been expected. Sweetness in the mouth, throughout human evolution, has reliably predicted sugar in the stomach and shortly thereafter in the bloodstream. It is not surprising that the sweet taste quickly resets metabolic processes so as to curtail the conversion of fat and carbohydrate reserves into blood sugar. This would be adaptive only if, in fact, the stomach contents quickly compensate for the change. If the sugar signal is a lie, there could soon be deficient blood sugar and increased hunger, especially for quick-energy sources like candy. There has been little recognition of such effects of artificial sweeteners. A similar hazard may be anticipated for nonnutritive fat substitutes. There are now desserts that look and taste like ice cream but are not only low in sugar but free of fat. What kind of signals do these send to the metabolic regulatory mechanisms?

  Dental cavities are rare in preagricultural societies. If dental workers had been conscious of Stone Age fitness requirements, they would have realized long ago that the twentieth-century epidemic of dental caries must have been due to some environmental novelty, which we now know to be the frequent and prolonged exposure of the teeth to sugar. It nourishes bacteria on the teeth that generate acid, which in turn erodes the dental enamel. Here likewise there is prehistoric evidence for the harmful effects of dietary sugar. Skeletal remains more than a thousand years old from coastal areas in what is now Georgia (USA) show few dental cavities. They became common with the introduction of maize-based agriculture, and perhaps corn syrup, at about that time. They became still more common with the introduction of other forms of sugar by European settlers.

  Cavities are technically not a nutritional problem, but they are a dietary problem and very much a disease of civilization. The good news is that they are of steadily decreasing concern. They were a serious scourge for adolescents and young adults born in the United States before 1940. Advances in preventive dentistry, such as fluoride treatment, have helped to overcome the difficulty, but before these advances could be made it was crucial to realize that sugar is the culprit.

  Simple rules and illustrative devices such as Figure 10-1 are always based on conceptual simplifications and all-else-equal assumptions. A diet that is too high in calories and fat for one person may be ideal for another. Much depends on age, size, sex, reproductive processes, genetic factors, and especially activity levels. Early subsistence farmers maintained what might be considered, from an evolutionary perspective, a normal activity level. Except for professional athletes, dancers, cowboys, and a few other groups, most people in modern industrial societies have abnormally low energy expenditures. Workers sitting in swivel chairs or in drivers’ seats of cars or even pushing vacuum cleaners or electrically powered lawn mowers are being sedentary, and their leisure hours may be even more so.

  During almost all of human evolution, it was adaptive to conserve energy by being as lazy as circumstances permitted. Energy was a vitally needed resource and could not be wasted. Today this take-it-easy adaptation may lead us to watch tennis on television when we would be better off playing it. This can only aggravate the effects of excess nutrition. The average office worker would be much healthier if he or she spent the day digging clams or harvesting fruit in scattered tall trees. What would an ancestor of a few thousand years ago have thought of the expensive and complicated exercise machine in the office worker’s basement—especially if it were actually used?

  ADDICTIONS

  Historical and anthropological records show that opium and other psychotropic drugs have been available throughout human history, with almost every inhabited region supplying one or more substances with the potential for abuse. Most addicting substances are elaborated by plants as a way of discouraging insect pests and grazers. Many act on the nervous system, and a few just happen to induce pleasure in humans. Alcohol is present in very ripe fruit, and storage of fruit juices yields a beverage with an alcohol content of up to several percent.

  Substance abuse today is a greater problem than it was in preindustrial societies because of the technological innovations of the past few centuries or millennia. When every household had to make its own wine or other fermented beverage in small vessels and with primitive equipment, it was unlikely that anyone would have enough for heavy daily consumption. Urban civilizations, with their professional vintners and brewers, were more likely to provide the quantities of alcoholic beverages that would permit the wealthier classes to get all they wanted. Improved methods of storage and transportation, which allowed British tribesmen to get drunk on Roman wines, were another factor in the advance of alcoholism.

  Another contribution to this advance was the invention of distillation. The readily available beverages containing a few percent alcohol could then be distilled into ones with high alcohol concentration. It may be easier to succumb to alcoholism by drinking gin than by drinking wine or beer. More recent innovations facilitated the production of heroin from opium and crack from cocaine, concentrates that are more rapidly addictive than the natural substances. The invention of hypodermic syringes is part of the same story. Similarly, the mass production of cigarettes from newly developed tobaccos that caused relatively little throat irritation greatly increased the incidence of nicotine addiction. Despite the great antiquity of addictive possibilities, the modern scourge of substance abuse is largely a product of our abnormal environment.

 
