Biomimicry
As you can see, smart eating is more than just avoiding or minimizing encounters with nasty toxins; it’s finding the proper mix of nutrients and building blocks that the body needs. It seems that animals have a nose for what is good for them and actually crave it.
Cravings
In his famous “cafeteria” studies of a half-century ago, Johns Hopkins’s Curt P. Richter broke rat chow into its constituent parts and placed them in eleven separate dishes: proteins, oils, fats, sugars, salt, yeast, and so forth. Given unlimited quantities, rats mixed and matched, procuring a diet that, with fewer calories, allowed them to grow faster than rats fed the normal chow. In fact, the nutritionists were surprised by the choices—they finally had to admit that the rats had composed a better nutritional diet than the makers of the rat chow!
Scientists think that a craving for a complete diet may have also influenced how America’s great buffalo herds moved across the landscape. One theory holds that their traditional routes purposely included salt licks and other reliable sources of vital minerals. The continual roaming also may have helped the bison avoid grass tetany, a springtime malady that affects fenced livestock. Turned out to fresh green fields, horses and cows sometimes binge on immature grass, which is high in nitrogen and potassium, but low in available magnesium. If there are no sources of magnesium in the field, the springtime “feast” can leave livestock with the “grass staggers” or even kill them. If livestock have the opportunity to range, however, they will avoid grass tetany by balancing their nutrients. In the same way, white-tailed deer will assiduously search for a balanced diet, moving methodically through the woods and fields, cobbling together the nutrients they need. Bucks are even choosier than does, searching for plants that contain enough potassium, calcium, and magnesium to fuel their fantastic spurt of antler growth. “This nutrient-specific eating looks pretty smart to us,” says Bernadette Marriott, behavioral ecologist and deputy director of the Food and Nutrition Board at the National Academy of Sciences. “We could use some lessons.”
I found Marriott in a chic office building tucked near Canal Park in Georgetown, a plum location for the National Academy of Sciences. The security was oddly strict, but Marriott was welcoming. Petite, dark-haired, and dignified, she impressed me as someone who might have been shy at an earlier stage of life, but who has since turned that shyness into a quiet power: she says what is important but will not shout to be heard. From a window that spanned the width of her office, a corona of afternoon light framed her head and streaked into her soothingly darkened office. Behind her on a credenza were photos of Himalayan peaks and batiks of many-armed dancers. On her desk, a figurine of India's rhesus monkey—the species she had studied—held her business cards up to visitors.
Like many biomimics I talked to, Marriott was drawn to the zone between two disciplines. To satisfy an interest in biology and psychology, she decided to study why animals choose the foods they do and what effect this has on social evolution. “I studied rhesus monkeys [Macaca mulatta], which are phenomenally finicky eaters. They spend a lot of time preparing food to eat, stripping off the edges of a leaf, or eating just the midrib. I wondered: How do they learn and remember which foods are safe and nutritious? Are they keying in on color, shape, texture, or is it something else?
“When I watched and analyzed behavioral patterns, it turned out that shape—which I thought might be the key to the search image—was not statistically important. This led me into the chemistry lab to do a nutritional analysis of everything they ate. [Like Glander, she was delving into unexplored terrain here.] What I found astounded me. These monkeys managed to pick a diet that was nearly perfectly balanced. The only things lacking were certain minerals they needed.”
Looking back on it, knowing rhesus needed minerals, she says she shouldn’t have been surprised to see them eating soil. “As a Westerner, your first instinct is to say, don’t put that in your mouth. But we knew from their behavior that it wasn’t a mistake—it was something important.” The monkeys make a special trip to a particular cliff, where they scratch the soil with their fingers and then eat it. After many years of use, an actual cave is formed, big enough for a monkey to stand in. Usually, one site is used religiously by the entire troop. When researchers picked sites at random and tried starting caves for the monkeys, the rhesus came and investigated, but wound up back at their own site. They would actually form lines outside and wait their turn rather than go somewhere else and start digging.
Once she began asking, Marriott learned that many people in Africa eat dirt, as do people in this country. “It’s called geophagy, and in the United States, it’s very covert, taboo. Whenever I speak on this subject, people come up to me and tell me they have an aunt or a neighbor who eats dirt. It’s never them, of course,” she says with a slight wink.
Turns out there’s an industry in dirt eating. In the Italian markets of Philadelphia, you can buy commercially produced cakes of soil stamped with insignias of their origin. “Georgia dirt is supposed to be prime,” says Marriott, “but when I have tried asking merchants what it is, they simply say, ‘It’s good for you. It’ll help you have strong babies.’ They never say it’s dirt.”
When Marriott first watched the rhesus eating soil, she thought maybe they were after bugs or tubers, but an analysis of the soil showed nothing of the kind. Donald E. Vermeer of George Washington University in Washington, D.C., theorized that the dirt might chemically bind to and neutralize stomach acids, helping to quell stomach upset. Sure enough, his structural analysis showed the presence of kaolin, which is the active ingredient in Kaopectate. Timothy Johns, a biochemical botanist at the University of Toronto and author of With Bitter Herbs They Shall Eat It, believes that the benefits of dirt are more physical than chemical. He thinks clay particles in the soil physically bind to the secondary compounds in ingested plants, thus occupying them so they cannot be incorporated into the body. Johns bases his belief on the observation that Bolivian Indians coat their wild potatoes (which are full of toxins) with a slurry of soil before cooking.
Marriott departs from both the chemical binding hypothesis and the physical binding hypothesis. “I came away with the theory that geophagy is more a quest for something good than a way to rid the body of something bad. I think the rhesus eat soil as a broad-based mineral supplement. While clay or kaolin may provide a feeling of well-being—by settling the stomach—that’s perhaps a secondary benefit acting to reinforce the mineral-gathering behavior.”
To verify what she suspected, Marriott took soil samples back to her lab and analyzed them. Sure enough, soils from the troop’s traditional feeding places showed spikes in certain minerals like iron that the monkeys were missing in their diets. Might the monkeys be waiting in line for their one-a-day mineral pills? Marriott smiles and shrugs. “At least they don’t have to pay sixteen dollars a bottle.”
Are You What You Eat?
It’s easy to understand how safe eating would evolve, but how about smart eating? Are those animals that can locate a particularly rich form of fat, protein, or mineral being rewarded in some evolutionary way? Michael Crawford and David Marsh, authors of The Driving Force: Food, Evolution, and the Future, argue that evolution is indeed substrate-driven and that the key substrate is food. If you want a new and improved body to put you in a better survival position, they say, you have to first snag yourself the building blocks to make that change.
Morphologists tell us that certain body structures would be impossible to build without enough of the right kinds of food. To build a brain, for instance, you need miles of lipid (fatty) membrane to wrap around neurons, and lots of vascular tissue to feed those neurons. Both components are made of essential (long-chain) fatty acid derivatives, which are chemically manufactured in a herbivore’s body, starting with the fats in leaves and seeds. An easier way to amass large quantities of these “neural” fatty acids is to eat animals that have already manufactured them for you. Switching from a leaf-only diet to one with meat, therefore, might have given carnivores a larger supply of neural building blocks—the ticket to advanced structures like keen eyes and a bigger brain.
Second, in addition to being a structural material, food is also a batch of chemicals, which, by their nature, are reactive. When these substances enter the body, they bump into and interact with the bath of hormones, enzymes, genes, and neurotransmitters that govern and regulate cell life. Above a certain threshold concentration, food chemicals may begin to influence which enzymes start to work, or when genes will turn on or turn off.
This threshold mechanism gives food the ability to tweak powerful control knobs within the body. Imagine, for instance, that an adaptation is lying dormant in the genes, just waiting for a chemical surge to “turn it on.” There’s no telling what might emerge as a result of a good diet. Witness the spurt in human height, for instance, when nutritious foods became widely available in the Western world. In this case the nutrients affect the phenotype (the growing body) but not the genotype (the set of instructions encoded in DNA that is passed from generation to generation). Take the diet away from the next generation of phenotypes, and heights will shrink to prior averages.
But what if diet can affect certain aspects of our permanent genotype over the long haul? Crawford and Marsh think that it can, and they offer the following rationale. If you can eat an animal that makes an important nutrient, such as vitamin A, you no longer have to devote your biosynthetic pathways to making vitamin A. This frees your energy for other chores, like building a brain. It may also free up genetic space, the authors speculate. Say you have only so much room on your chromosomal “hard drive,” and it’s already filled with genetic instructions. By eating vitamin A manufactured by another animal, your instructions for synthesizing vitamin A become superfluous. If a mutation suddenly rewrote that gene sequence with another set of instructions—a new adaptation—you wouldn’t miss the vitamin A recipe, and you could therefore live to take advantage of, and pass on, the new adaptation. Evolution, stuck on its plateau, would suddenly spring to a new level.
If this theory is even a little bit true, you can see how important it is for an animal (and for us) to have the good sense to gather what is needed in terms of food. But where is the command center for our fine gastronomic compasses? Is our good taste hardwired into our bodies or is it learned? The researchers I talked to think it might be a little of both.
How Did Smart Eating Develop?
The first primates were exclusively insect eaters, Glander tells me. By eating insects that fed on plants, the primates were ingesting plant compounds by proxy. By the time the primates evolved into plant eaters themselves, they had already developed the physiological apparatus either to metabolize certain nasty plant chemicals or to excrete them. Because plant poisons vary from plant to plant, however, these “safe plants” would be a small subset of the whole. If a primate wanted to step out of this limited range and try others, it would need some way to determine what was good and what was vile. Luckily, a knack for smart eating develops in two ways: it’s partly hardwired into our senses by evolution, and partly acquired through learning over a lifetime.
Glander is one of many researchers who suspect that the main form of primate leaf discrimination is through the senses of taste and smell. When the lemurs tasted the trial leaves, they sniffed and sometimes took a leaf into their mouths and punctured it, allowing the volatile compounds to waft over their Jacobson’s organs—the interconnected passageway between the mouth and the nasal passages. Presumably, it is in these smell/taste receptors that chemical analysis occurs.
As mammals, we can sense bitter, acrid, astringent, sour, and pungent flavors—all of which serve a function in food selection, says Richard Wrangham of Harvard University. Consider sourness, for instance. Sourness is a measure of acidity, which acts as a natural preservative against bad microbes (the ones that cause rancidness). Somewhere deep within us, we recognize sourness as a badge of purity, assuring us of a food’s safety. That may be why we prefer a little sour flavoring in our sweet confections, rather than straight sugar.
Certain kinds of fermentation—when a fruit turns to alcohol, for instance—may also signal safety to an animal. Fermentation in fruit is assisted by bacteria that deactivate unpleasant compounds such as cyanide and strychnine. On the other hand, there is also bad fermentation—the action of different kinds of microbes whose metabolic waste is toxic, even deadly, to humans. To avoid them, we are hardwired with a strong aversion to rancid flavors.
Our hardwiring is not absolute, however, and our aversion to or craving for certain foods may sometimes become curiously strong. In her book Protecting Your Baby-to-Be, evolutionary psychologist Margie Profet suggests that pregnant women’s unusual taste swings may be adaptations designed to protect embryos during sensitive development cycles. If true, this could explain everything from morning sickness to the pregnant woman’s inexplicable love affair with pickles. Perhaps the real appeal of pickles is their sourness, says Profet, a badge of purity at a time when rancidness must be avoided. Later in the pregnancy, the woman may begin to crave what she’s missing nutritionally—a specific hunger being hardwired into her neurons right on the spot. Thomas Scott of the University of Delaware found that when a rat is deprived of salt, neurons that normally respond to the taste of sugar are commandeered and reprogrammed to become receptive to salt. In other words, salt becomes as pleasurable to the brain as sugar normally is. Cravings might also be heightened through our other senses. When we’re hungry, for instance, the brain throws open the olfactory receptors, making us more sensitive to the odors of food. (That’s why head colds and smoking suppress appetite—we can’t smell our food as well.)
Even this flexible hardwiring can’t fully explain the fine discrimination shown by animals, however. Regardless of how plant-smart their inborn sensors are, nothing could prepare an animal to automatically recognize every species in the jungle. Some things just have to be learned on the job.
With primates (and many other animals, such as elephants), the learning begins with Mom. Infants will peer and poke into their mother’s mouth to smell and taste what she is eating, and after a while, they build a chemical profile of what’s good. “It’s like downloading information from a computer,” says Glander.
Once they leave their mother, primates have to keep on making decisions about whether new foods they encounter are safe and worth collecting. Using themselves as guinea pigs is one option, but social primates have found a better way. Kenneth Glander calls it “sampling.” When howler monkeys move into a new habitat, one member of the troop will go to a tree, eat a few leaves, then wait a day. If the plant harbors a particularly strong toxin, the sampler’s system will try to break it down, usually making the monkey sick in the process. “I’ve seen this happen,” says Glander. “The other members of the troop are watching with great interest—if the animal gets sick, no other animal will go into that tree. There’s a cue being given—a social cue.” By the same token, if the sampler feels fine, it will reenter the tree in a few days, eat a little more, then wait again, building up to a large dose slowly. Finally, if the monkey remains healthy, the other members figure this is OK, and they adopt the new food.
Not all monkeys volunteer for sampling duty, however. Glander has noticed that monkeys in vulnerable stages of their lives—juveniles, subadults, and lactating or pregnant females—seem to bow out of sampling. If the risks are too great for some monkeys, why would any monkey volunteer? “I think the benefits may be genetic,” says Glander. Adult monkey fathers, for example, may be boosting the health of their offspring by testing foods for their pregnant or lactating mates. Adults that aren’t yet parents may also volunteer, pointing out wholesome foods for their siblings and nieces and nephews who share a portion of their genes. Despite these benefits, Glander says no monkey would want to risk being a full-time sampler. “The sampler role shifts from monkey to monkey, so as to spread the risk and not unduly jeopardize anyone. This risk-sharing is, in itself, a good reason for being social,” speculates Glander. Sampling, he believes, may have in fact contributed to the development of social behavior in primates.
Besides tipping the scales toward sociability, tricky food choices may also have challenged animals in ways that rewarded intelligence. Researchers hypothesize that sometime in the Middle Miocene (7 to 26 million years ago), monkeys developed the ability to tolerate higher levels of toxins than apes could, giving monkeys a wider choice of foods. Apes (our ancestors) were stuck with a more sensitive digestive system and were therefore forced to roam in search of higher-quality foods and new ways to prepare that food. Richard Wrangham believes this may have contributed to our ape ancestors finally leaving the jungle, walking upright onto the plains, and beginning to use tools and fire.
The droughty climate of the apes’ new plains habitat meant that foods were more seasonal in nature. To find reliable nutrition throughout the year, they had to problem-solve, employ tools, and perhaps cooperate more with their fellow primates. As it turns out, although monkeys won the evolutionary race to detoxify compounds, apes wound up with higher mental functions.
Female apes were faced with even more limitations and nutritional demands. Unlike males, who could squeak by on lower-quality foods or take excursions to far-flung corners of their habitat for pockets of early ripening fruit, females were often eating for two or lactating. They needed safe, nutrient-rich, protein-rich, calcium-rich foods, but they couldn’t travel far to find them. Faced with this dilemma, females may have been the first to experiment with new types of foods, such as flowers, young leaves, and tubers, and to experiment with hand-held tools. Michelle L. Sauther, an anthropologist at Washington University in St. Louis who has studied food choice in primates, writes, “[Ape] females may have broken free from some of the seasonal constraints on food availability by using tools to gather wild plants, insects, and small mammals. For example, females may have employed digging sticks for underground tubers and used techniques similar to those observed in wild chimpanzees, such as using stone hammers to crack open nuts and employing termite and ant wands [sticks thrust into hives and nests to harvest insects].”