Here, then, is the puzzle. Why did the human species lose the precious knowledge of how to manufacture the ten amino acids and one essential fatty acid critical to building youthful smart brains? Why do we thrive better if our diet contains supplemental cholesterol? The human brain’s colossal size commands that its owner resupply this voracious organ with gargantuan quantities of these vital components. Natural Selection favored a Rabelaisian appetite over a dainty one.
Many other animals, including primates, have varying degrees of the same biochemical lapses, but the major difference between all the others and us is that they do not have an impatient brain tapping its toe waiting for its daily shipments. Any early human who botched his or her access to animal foodstuffs was greatly disadvantaged in the genetic contest to survive and leave offspring. This was especially true for those adventurers who left the bounties of equatorial Africa and inadvertently walked right into the teeth of ice ages that lasted fifty thousand years each. On average, children of these individuals would not have been as strong or as smart as those whose parents supplied them with generous portions of cholesterol, essential amino acids, and the one fatty acid we cannot make. The red spice sprinkling this mix of delicacies would be the metallic presence of iron.* Once Mother Nature had begun to shove this one reluctant prey primate toward becoming a fearsome hunter, She made sure that his retreat would be difficult. Whereas the other primates are primarily vegetarians, with a handful eating occasional mouthfuls of meat, Homo sapiens would not go exploring without some beef jerky in his pouch.†
One reason Homo sapiens became a ruthless hunter was that, as he became more successful in his endeavor, Natural Selection closed off the avenue for him to remain a pure vegetarian. Because we are a transitional creature, we can, if we make a concerted effort, overcome our biochemical lapses. But a vegetarian must spend more time gathering a larger quantity of plant food to equal the calories available in meat. Vegetarians must spend more time and energy chewing their meals. And they must allot more time to rest until the massive diversion of the body’s blood supply to the gut passes. Attempting to digest a large meal containing hard-to-absorb nutrients mixed with cellulose takes its toll on the brain, as anyone attending a one o’clock lecture after a heavy noontime lunch knows.*
* This waiting period also occurs in carnivores, but the total time spent resting while digesting is less.
Sexual selection was at work in the human species in the same way it was in all other species. Women chose men who were the best hunters. A man who could provide a woman with a diet containing the fattiest meat won her heart over someone bearing a sling filled with kumquats and papaya.
An animal that abandons critical enzymatic pathways to seek finished products in its immediate environment can be deemed to be using a clever strategy. It has been argued that an organism’s fitness can be measured by how many metabolic processes it can “out-source.” An organism that locates a rich lode of hard-to-manufacture molecules in its habitat has freed up surplus energy that can better be allocated to the business of survival and reproduction. For instance, a carnivore is a more “efficient” eating machine than an herbivore, because rather than expending energy maintaining all the digestive machinery necessary to break down plant food into readily usable constituents, it overcomes its enzymatic deficiencies by simply letting the herbivore fatten up. Then, in a few gulps, the carnivore quickly devours the finished product, bypassing the intermediate steps that the herbivore so painstakingly accomplished.
In this sense, it is a dog-eat-dog world, but in another, we are all interwoven in what the Disney classic The Lion King called the “Circle of Life.” A popular rendering of the science of ecology, the Circle of Life emphasizes the interconnectedness of all organisms. Biologists use the concept of symbiosis to describe how two different species coexist to assist each other. The unlikely plover bird that sits in a crocodile’s mouth cleaning the reptile’s jagged teeth busies itself, unconcerned that the crocodile could eat it with a snap of its jaws. The crocodile opens wide for its avian dentist because it has evolved a behavior pattern that allows the plover bird to perform a function beneficial to its survival. The bird symbiotically benefits, dining on tasty leftovers caught in the crevices between the croc’s teeth, and does not have to spend a morning searching for breakfast. The symbiotic relationship that exists between prey and predator is less obvious, but in the ecological overview, each aids the other. Lions that eat the old and the weak wildebeest serve the function of culling the herd and preserving fit genes, so that in the long run the herd prospers.
As hominid predators became bolder and more successful, they began to discard essential enzymatic pathways because they could eat what they needed to maintain their health. At some point in the recent past, Homo sapiens ceased being a highly successful predator living symbiotically within its ecological niche and instead became a parasite—a very large parasite, but a parasite nonetheless.
Parasites derive so many essential nutrients from their hosts that they cannot survive independent of them. Ten percent of all living species are classified as parasites. One familiar example is the tapeworm.
The key to performing as a successful parasite is to siphon just enough nutrients away from the parasite’s unlucky victim so that the host’s ability to continue to provision the pesky hanger-on remains unimpaired. A tapeworm luxuriating in the intestinal tract of a human can achieve lengths of over thirty feet. The person plagued by such a hitchhiker must eat to satisfy his or her needs and feed the stringy, unwelcome guest as well. The tapeworm has cleverly adapted in such a way that it does not have to lift a finger to find dinner, because it can depend on its host to bring it food. And the tapeworm is not such a glutton that it eats more than the host needs to maintain a minimum of health, so the host can continue to forage for the lazy tapeworm.
A parasite can be considered stupid when it makes such excessive demands on its host that it kills it. Then the parasite has sealed its fate, because not only does it lose its meal ticket but it, too, dies. An organism that kills its host ceases to be a parasite and is reclassified as a “pathogen.” It could be argued that Homo sapiens has degenerated from symbiotic prey to symbiotic predator to parasite, and has now transformed into a planet-devouring pathogen.
Think of the entire planet, with its blue oceans and pristine mountains, as a host. The roll call of species that humans have dispatched to the Land of the Extinct, when combined with deforestation, pollution, strip mining, overgrazing, and overfarming, has distinguished the bipedal primate as the planet’s most exasperating parasite—all in the space of 150,000 years. We have arrogated many of the earth’s resources simply to satisfy our craving for material comfort. While we have been congratulating ourselves on our species’ unrivaled domination, alarm bells are beginning to sound in all regions of the planet. From the perspective of other life-forms, we have transmogrified into the planet’s most virulent pathogen, and our frenzied degradation of our host, Earth, signals that we may be just another stupid parasite too feebleminded to realize that one should never bite the hand that feeds one.
Another extremely odd feature of the human digestive system pertaining to iron is vitamin C, otherwise known as ascorbic acid and present in abundance in fresh fruit. Linus Pauling, the quirky Nobel Prize laureate biochemist, alerted people to their bizarre vitamin C metabolism by lecturing on the subject with theatrical flair.* Pauling would ascend the stage wearing a rumpled suit whose inner coat pockets fairly bulged. He began his presentation by informing the audience that a healthy 180-pound pig produced a certain number of grams of ascorbic acid per week. With a flourish, he would reach into his pocket and produce a test tube brimming with a white powder that Pauling identified as swine ascorbic acid. He next claimed that a 180-pound goat produced a certain amount of vitamin C in the same time frame, and retrieved from his voluminous coat pocket another test tube, this one filled with goat ascorbic acid. After running through nine familiar 180-pound mammals and revealing to the audience how each one was capable of producing its required vitamin C entirely by itself, Pauling then produced one last test tube. Pausing for emphasis, he displayed the empty tube and proclaimed, in a stentorian voice, “And this, ladies and gentlemen, is all the ascorbic acid a healthy 180-pound human can produce in the same time period.”
Through his dramatic style and his prestige as a renowned chemist, Pauling initiated the vitamin-C craze. Anyone sitting in the audience would have to wonder why humans had lost such a vital adaptation as the biochemical pathway to make a substance so indispensable to the maintenance of health. Why did the human liver “forget” the secret formula for the ingredient without which gums, bones, blood clots, cell membranes, and myriad other vital structures cannot properly perform their functions? Perhaps the answer has to do once again with the quirks of human iron metabolism.
Earlier, I noted that the abundant iron contained in leafy vegetables is very poorly absorbed because a human’s digestive machinery has great difficulty freeing the vital atom from the clutch of compounds binding it. In the presence of vitamin C, however, the chelator reluctantly releases some of its iron atoms, allowing the human digestive tract to more readily absorb plant-based iron. If Popeye ate an orange along with his spinach, he would receive more of an iron jolt. Unable to manufacture sufficient ascorbic acid internally, the human body must locate a source in its diet.
Humans are apes. Other apes, such as chimpanzees and orangutans, principally eat a diet of fresh fruit. They never migrate to environments that lack plentiful sources of this staple of their diet. Scientists estimate that the primate ancestor that evolved into the anthropoid line lost the enzyme to make ascorbic acid twenty million years ago. For this ape or any of its descendants, the inability to internally manufacture vitamin C posed no problem because it was superabundantly present in their diets.
Fresh fruit, however, is not always available in temperate zones, especially during winter. One successful strategy to stave off iron-deficiency anemia secondary to ascorbic-acid scarcity would be to switch to a diet high in heme iron. Although one would still be susceptible to the other protean manifestations of vitamin-C deficiency, at least a lack of iron stores would not be among the most important ones.*
The lack of vitamin C can create unusual side effects. Aldous Huxley, in his book Heaven and Hell, posed the question of why so few people in modern culture see visions or experience the kind of religious ecstasy that was so commonly recorded back in the Middle Ages. The literature of that time is replete with reports of these phenomena, occurring on a regular basis to ordinary people. When people in contemporary society claim to “talk to God, hear voices, or see visions,” we often admit them to the psychiatric ward to cure them of what the rest of us would consider deranged behavior.
Huxley attributed the relative disappearance of trance states in contemporary society to improvements in diet, and he proposed this as a major factor in lessening the influence of the Church. Prior to these developments, Europeans endured long, cold winters during which there was an absence of fresh fruit. Scurvy, the medical name for vitamin C deficiency, was pandemic.† To make things worse, in the dying days of winter, the Church mandated that everyone fast for Lent. Body reserves of proteins and fats already depleted by poor winter diets would have created the conditions of borderline starvation. Nerve transmissions in the brain, verging on serious disruptions, would begin to falter.
Adding to the complications of near starvation was the religious practice of self-flagellation, which the Church tacitly encouraged. A whip applied to the thick skin of the back excoriated it, leaving long superficial lacerations, not deep enough to cause death but severe enough to cause bleeding and anemia. These wounds, superimposed on a body reeling from the effects of incipient starvation, would then begin to suppurate. A low-grade bacterial infection would incrementally add to the debility of one who had scurvy, anemia, and starvation. These conditions would coalesce just as Easter approached. The mental states of many people were so affected by these dietary deficits and systemic toxicities that whole cities experienced mass hallucinations. Clerics in an age of extreme religiosity assumed that they were bona fide revelations.
One wonders how many “visions” were due to cholesterol-starved, ascorbic-acid-deficient, linoleic-acid-deprived, amino-acid-depleted neurons trying to cope with the toxicity associated with a subclinical streptococcal and/or staphylococcal infection. The predictable result: the overt syndrome of delirium. Such are the hidden factors that may influence history.
Let us step back and reassess the peculiarities we have just discussed. We apparently lost the means to make eight essential amino acids (ten in youth), one vital fatty acid, and ascorbic acid long ago, when primates differentiated away from other mammals. As long as primates remained in an environment in which there were plentiful dietary sources for these nutritional components, and the primate brain had not yet embarked on its hyperinflation routine, all was well.
Our metabolism grew lazy about making these vital substances internally because they were readily available in the foods we were already eating. A major problem arose when we moved away from the lush vegetable sources containing these components and found ourselves encased in the ice that characterized most of the Pleistocene epoch. Having lost the ability to manufacture them, and deprived of easy pickings, we had all the more reason to search out and kill the animals that contained these substances. Like the Greek army under Agamemnon attacking Troy, we had burned many of our ships on the beach, so that retreat would not be an option. We had to intensify our ruthless ways or die.
You are what you eat. The introduction of fire made meat much easier to chew and digest. A dietary strategy that relied primarily on plants segued into one increasingly dependent on meat. Fire also greatly improved our ancestors’ choices among previously inedible vegetables. A human who attempts to eat raw rice, wheat, or potatoes will experience significant digestive uproar. Tamed by fire, these dietary staples became “staffs of life.”
Keeping in mind the many ancillary nutritional human needs discussed above, let us refocus the discussion on Gyna sapiens’ need for iron. As the key characteristics of her reproductive life history veered away from what had served the multitude of other sexually reproducing females so well for so long, she could only dimly surmise that she was trapped in a quandary. For her and her offspring to survive and thrive, she would have to evolve new adaptations that would balance the ones that caused her to lose iron persistently. Like two runners yoked together in a three-legged race, these adaptations had to advance in precise synchrony or the human species would likely be deleted from the taxonomic catalogues. Natural Selection had to furnish Gyna sapiens’ metabolism with a credible strategy.
In a seemingly self-defeating countermaneuver, Natural Selection encouraged adjustments in the enzymes in Gyna sapiens’ digestive tract that would make her goal very difficult to achieve. Despite her frequent forays for food, and despite the variety of comestibles she collected, she was unable to establish a stable iron source by herself. The toddler tugging persistently at the hem of whatever ancestral mothers wore, the infant at her breast, or the fetus in her womb did not make her quest any easier.
Mother Nature had played a cruel trick on human mothers. In every direction a woman looked from her position at the home base, there was an invisible sea brimming with iron atoms. Nuts, roots, fruits, shoots, and leaves abounded with iron. But, to paraphrase the lament of Coleridge’s Ancient Mariner, there was “Iron, iron everywhere, nor any filing to absorb.” The iron, so close, was present in its inaccessible form. The iron running around on the hoof was better, but how was she to secure it? Menses and the five other major factors that cause a female to lose iron made her search urgent. If meat was the most reliable source of easily absorbed iron, and she, burdened with small children, was unable to obtain it by herself, she would have to deploy another strategy. And then she hit upon the solution: There, lumbering around in the underbrush, was the key to her success and her species’ continued existence—Homo sapiens.*
During the long hominid evolution, Homo sapiens segued from a frightened vegetarian ancestor to a tentative scavenger to a skilled hunter to a fearsome predator. Our closest primate relatives—chimpanzees, gorillas, orangutans, and bonobos—thrive on nuts, fruits, shoots, tubers, and insects. So can we, if we have to. But deciding to hunt big, dangerous animals that would just as soon kill him wasn’t something Homo sapiens was likely to embrace enthusiastically just for the thrill. Considering the vegetable bounty that must have existed in the home of our species’ origin, it is doubtful that hunger was the primary motive. I nominate two other prime candidates—sex and money. Since money had not yet been invented, by elimination that leaves only sex.
One can witness in any romantic restaurant the political and economic ramifications of women’s gaining veto power over sex but losing iron. Despite the dietary-cholesterol awareness now widely disseminated in the popular press, and the recent shift toward vegetarianism, the ritual persists. More often than not, a young man can be observed wooing a young woman over dinner by lavishing her with food purchased with his hard-earned hunting money. The table often is lit by candlelight, just as it was by firelight at the dawn of this exchange. Whether he hopes to gain exclusive, permanent sexual access (marriage), temporary sexual access (an affair), or fleeting sexual access (a one-night stand), his ultimate goal is to persuade her to say Yes!