The Secret History of Food
Dedication
For my mother, and her cooking.
And my father, and his eating.
Epigraph
History celebrates the battlefields whereon1 we meet our death, but scorns to speak of the plowed fields whereby we thrive. It knows the names of the kings’ bastards but cannot tell us the origin of wheat. This is the way of human folly.
—Jean-Henri Fabre
Of the many choices we make in our lives,2 what to eat is perhaps the most enduring and important. Whereas individual human beings can go through life without participating in political acts and without personal liberty and can survive without forming a family or having sex, none of us can go without food. It is the absolute biological necessity of food that makes it so central to cultural history and so inclusive of all peoples in all times.
—B. W. Higman
Contents
Cover
Title Page
Dedication
Epigraph
Chapter 1: A History of Swallowing
Chapter 2: Pie, Progress, and Plymouth Rock
Chapter 3: Breakfast of Champions
Chapter 4: Children of the Corn
Chapter 5: Honey Laundering
Chapter 6: The Vanilla of Society
Chapter 7: The Ghosts of Cockaigne Past
Chapter 8: The Choices of a New Generation
Chapter 9: Forbidden Berries (or Appetite for Distraction)
Chapter 10: Attack of the Killer Tomatoes
Acknowledgments
Notes
Index
About the Author
Copyright
About the Publisher
Chapter 1
A History of Swallowing
. . . the pursuit of more and better food1 has helped to direct—sometimes decisively, more often subtly—the movement of history itself.
—Reay Tannahill
“Tell me what you eat, and I shall tell you what you are.”2
These are the words written by Jean Anthelme Brillat-Savarin, one of history’s most enduring and influential food writers—a guy who not only had a cheese named after him (a triple-cream, semisoft cow’s milk cheese with a “luxurious mouthfeel reminiscent of tangy,3 sour, and mushroomy softened butter”) but also wrote a book, Physiologie du goût (“The Physiology of Taste”), that is still in print nearly two centuries after its 1825 debut4.
Granted, not all of Brillat’s meditations have aged as well as his cheese. For example, his belief that eating starches softens a man’s flesh and courage (“For proof one can cite the Indians,5 who live almost exclusively on rice and who are the prey of almost anyone who wishes to conquer them.”), his endorsement of sugar water as a healthy and refreshing tonic, or his frequent rants on obesity:
Suggest to a charming fat lady6 that she mount a horse, and she will consent with great pleasure, but on three conditions: first, she must have a steed which is at one and the same time handsome, lively, and gentle; second, she must have a riding habit which is new and tailored in the latest style; third, she must have to accompany her a groom who is agreeable and good-looking. It is rather rare to fill all three of these requirements, so she does not ride at all.
Or “I can remember only two really fat heroes.”*7,8
Others are open to debate, such as his opinion on serving a cheese course—and, simultaneously, on the material worth of disfigured cyclopean women: “A dinner which ends without cheese9 is like a beautiful woman with only one eye.”
Yet he was clearly ahead of his time on other points, being one of the first to extol the benefits of a low-carbohydrate diet; to draw a connection between the consumption of refined carbohydrates and obesity; and to urge parents “to forbid coffee to their children10 with great severity, if they do not wish to produce dried-up little monsters, stunted and old before they are twenty,” all notions that have since become common knowledge.
And he was also right in his intuition that what we eat defines us not just physically but psychologically, socially, symbolically, and spiritually—to a much greater degree, in fact, than he could have known in the 1800s.
Modern science, for example, has taught us that it’s not only what we eat that defines us but what our parents ate.
Numerous studies suggest that our adult preferences for salt are predicted by our mothers’ fluid loss11 during pregnancy—that heightened morning sickness and maternal vomiting (and thus lowered electrolyte levels) trigger an increased yearning for salt in utero that can last into adulthood, prompting a lifetime of overcompensating with salt. Similarly, exposure to our mother’s amniotic fluid and breast milk,12 both of which take on flavors from her diet, can instill food preferences pre- and postnatally, long before we take our first bite of food, given that the average fetus swallows somewhere between 500 milliliters13 and a full liter of amniotic fluid daily14 (the equivalent of about one and a half to three twelve-ounce soda cans). Researchers have detected flavors in amniotic fluid ranging from garlic to cumin and curry;15 meanwhile, breast milk has been shown to absorb an even broader range, from carrot, vanilla, and mint16 to alcohol, blue cheese, and cigarettes.*
In one study, infants whose mothers consumed carrot juice during pregnancy and lactation showed a greater preference for carrot-flavored cereal;17 in another, eight- and nine-year-olds whose mothers had eaten garlic while pregnant showed a greater preference for garlic-seasoned potatoes;18 and in another, adults who’d been fed vanilla-flavored formula as infants had a greater preference for vanilla-flavored ketchup.19 Children who were breastfed also tend to eat more fruits and vegetables and to be more adventurous eaters20 than those who were given formula, owing to their exposure to a greater variety of flavors early on.
Of course, the notion that breast milk can influence human behavior is nothing new. For thousands of years, long before Brillat-Savarin, it was considered common knowledge that things like personality traits and intellect were passed on through breastfeeding,21 so mothers who were either unable to produce their own milk or too wealthy to bother would carefully screen their wet nurses for things like breast shape, emotional stability, and manners. An ancient Sanskrit text,22 for example, instructed that a wet nurse should belong to one’s own caste;23 have healthy skin unmarked by any moles or stains; be free from such vices as gambling, day sleeping, and debauchery; be neither too old, too young, too thin, nor too corpulent; and have breasts that are neither too pendulant*24 nor drawn up and nipples that are neither upturned nor unprominent. And as recently as 1544, the English Boke of Chyldren cautioned mothers, “ye must be well advysed in takyng25 of a nource not of yll complexion and of worse maners, but suche as shall be sobre, honest and chaste, well fourmed, amyable and chearefull . . . no dronkarde, vycyous nor sluttysshe, for suche corrupteth the nature of the chylde.” Imagine those want ads.
Meanwhile, it was thought that drinking animal milk made you act like an animal.26
And it’s not just what our parents ate that defines us but how much they ate. A parent’s diet and caloric intake can affect whether the genes handed down to us are switched on or off, a process called epigenetic inheritance27 that can impact everything from metabolism and body weight to disease resistance. For example, fruit flies that were fed a high-sugar diet for just two days28 before mating parented offspring with an increased likelihood of developing obesity, while mice that were fed a high-fat diet for six weeks29 before breeding parented offspring with an increased likelihood of diabetes and a more than 20 percent increase in weight and body fat.
And these patterns seem to hold true for humans, too. A review of more than thirty studies from the Netherlands, the United States, France, India, Norway, Sweden, the United Kingdom, Germany, New Zealand, and Australia concluded that exposure to poor nutrition30 in the womb increased the risk of obesity, heart disease, and type 2 diabetes through adulthood, while another study found that the foods parents and grandparents had eaten between the ages of eight and twelve31 impacted their children’s and grandchildren’s risk of heart disease and diabetes.
Other inherited influences go back even further; you might take pride in your sophisticated palate for black truffles, sour ale, charred brussels sprouts, and single-origin coffee—but your preference for them is, to some degree, genetic. Much as with color blindness, roughly half of the population lacks the olfactory capacity to sense androstenone,32 the chemical responsible for a truffle’s coveted earthiness, while a smaller portion of the population is overly sensitive to it and finds it revolting.33 People with a particular variant of a gene called OR6A2,34 which codes for an olfactory receptor that detects aldehydes, tend to think cilantro (also known as coriander)35 tastes like soap or smells like “bug-infested bedclothes.”36 In fact, there’s evidence that the name coriander comes from the Greek koris (“bedbug”),37 and aldehydes similar or identical to those in cilantro also occur in soap38 and in certain bug excretions, including those of bedbugs.39
Similarly, our sensitivity to bitter foods is largely associated with a gene called TAS2R38,40 and you can measure yours at home by picking up some paper test strips saturated with a chemical called 6-n-propylthiouracil41 (PROP), which are widely available online. About half the population finds these strips moderately bitter42 (“tasters”), while a quarter finds them unpalatably bitter (“supertasters”), and another quarter describes them as having no taste at all (“nontasters”). Supertasters also tend to have a higher density of taste buds,43 and although this might sound like a coveted foodie superpower, supertasters are likely to be pickier eaters44 and avoid things like coffee, wine, spirits, dark chocolate, and various fruits and vegetables (e.g., grapefruit, broccoli, kale) because they find them too bitter.
In fact, it’s plausible that all of our ancestors were effectively “supertasters” at one point: it’s no coincidence that many things that are toxic in nature tend to be bitter or acidic,45 so the more our ancestors avoided them, the more likely they were to avoid sickness and death. Yet over time, those who consumed them out of desperation, palate fatigue, or bravado (and managed either to avoid those that were particularly deadly or to reduce their toxicity through cooking or processing) would have passed on their tolerance genetically. Meanwhile, many plants would have become less bitter and toxic over time46 as we developed agriculture and began selectively breeding crops for desirable traits. The wild ancestors of pumpkins,47 potatoes, and almonds, for example,48 were all bitter and toxic before human intervention, while ears of ancient corn were roughly the size of cigarettes, with miniature kernels hard enough to break teeth.
This is basically the same story behind milk. Initially, humans were able to digest milk only as infants. Biologically, there are a lot of good reasons for this: particularly for mothers who are malnourished, breastfeeding can make it harder to get pregnant again,49 so a mother who stops breastfeeding earlier has more chances to add to the gene pool. Plus, you want the previous child to be finished nursing by the time the next child is born, as otherwise they’d have to compete for limited resources, and producing enough milk for one kid is already difficult.
So continuing to breastfeed beyond infancy wasn’t good for the tribe. Like other mammals, people would naturally stop producing lactase, the enzyme needed to digest lactose,50 as they aged—and those who continued to drink milk, whether human or animal, would suffer gastrointestinal issues such as gas and bloating. But being gassy and bloated still beat starvation, so after thousands of years of evolution, some human populations developed lactase persistence, continuing to produce the enzyme into adulthood and thus remaining tolerant of lactose.51 (The discovery of yogurt and cheese making also helped,52 as both tend to reduce milk’s lactose content, making them easier to digest; yet even so, roughly two-thirds of the adult population has trouble digesting milk products53 to varying degrees, so lactose intolerance is still the norm rather than the exception.)
Our genetic tolerance for alcohol (which shapes things like how quickly we absorb it and how readily we become inebriated, dizzy, or flushed54) is a similar adaptation, likely arising from our ancestors’ consumption of fermented fruit millions of years ago;55 in much the same way, koalas developed a tolerance for eucalyptus leaves,56 which are highly toxic to most other animals.57
Yet far and away the biggest link between what we eat and who we are arose from the discovery of cooking—and we can trace a lot of the things we see in grocery stores (and modern society) back to our decision to start putting raw meats and vegetables into controlled fires (or waters heated by geothermal springs) somewhere between 2 million and 200,000 years ago,58 a pivotal milestone that transformed us just as much as it transformed our diet.
Cooking made it possible to eat a lot of foods that would otherwise have been toxic, inedible, or indigestible. Even modern staples such as wheat, corn, and potatoes aren’t very palatable (or digestible) without exposure to heat or fire, let alone scavenged prehistoric roots and plant stems. Cooking potatoes, for example, not only makes them infinitely more pleasant going down but makes their starch content more than 90 percent more digestible,59 while properly cooking lima beans or cassava60 not only increases digestibility but also deactivates an enzyme that could otherwise release potentially fatal amounts of cyanide. Cooking also increases the shelf life of foods, both in the short term by killing bacteria and other microbes and in the long term by removing moisture through smoking or drying, particularly in the case of meats. This, in turn, made foods more transportable, which also increased the odds of survival, as hanging around freshly killed animals tended to make one prey for other animals.
To be fair, cooking can also lessen the nutritional quality of food, depending on the food and the cooking method. As one look at the bright green liquid left behind after over-blanching your greens suggests, water-soluble nutrients tend to leach61 out during cooking; however, the result is still usually a net gain in nutrients and calories. Basically, you lose some nutrients to cooking, but those that remain are easier for the body to extract and utilize.62
Much of this has to do with the fact that cooking makes foods softer (and thus easier to digest and chew) by denaturing proteins, rupturing cells, and gelatinizing things like starches and collagen.63 Chewing might not seem particularly arduous, but that’s because we’re used to soft, cooked foods and don’t often bite into raw squash or potatoes; the fact is, before the advent of cooking, the simple act of chewing would have consumed much more of our ancestors’ time and energy. Two of our closest relatives, for example, chimpanzees and mountain gorillas, spend approximately 37 percent and 55 percent of their days,64 respectively, swallowing and chewing, versus an average of just 5 percent for modern humans; in a sixteen-hour day, this translates to a potential eight hours less chewing time.
We’re also saving a lot more energy per bite when we cook our food—about 14 percent less muscle use per chew65 for things like cooked yams, carrots, and beets versus those same foods in their raw states. Altogether, this might not seem like much, but it could very easily have meant the difference between living and dying thousands of years ago, when food sources were scarce and finding another bite to eat meant risking being eaten yourself.
So by cooking our food, we essentially outsourced some of our predigestive processes and made food not only more nutritionally available but also less costly to digest, two insurance policies against starvation that together became crucial to our survival. As a result, the early humans who came to prefer their food thoroughly cooked, versus raw or rare, would have had a better chance of surviving long enough to pass on their genes. We can still see the results of this in our modern diet; just as our sense of vision evolved to help us spot and identify safe and nutrient-rich food sources by giving us an edge in distinguishing certain colors in nature (e.g., the redness of ripe versus green fruit66 and of freshly killed meat that had yet to spoil and turn gray), we developed a Darwinian taste for the nutritional safety signals of fire, e.g., foods that are energy dense and highly portable, preserved for an unnaturally long shelf life, and tender yet crisp around the edges with, perhaps, some visual charring—essentially, the McDonald’s menu.*67 Not to suggest that French fries are an ideal food source today, but they would have been a superfood a million years ago, when finding food required a massive expenditure of calories and survival meant finding as much food as possible in order to avoid starvation, live long enough to have children, and perpetuate the species.
And all of this changed us. The more we ate soft, energy-efficient foods, the more our bodies adapted, resulting in smaller jaws and colons.68 In fact, one of our distant relatives who existed before the advent of cooking, Paranthropus boisei, is nicknamed “Nutcracker Man”69 because of his massive jaws and teeth—packing molars roughly four times the size of our own. Meanwhile, our modern stomachs are, proportionally, less than one-third70 the size of those of other primates, and our colons less than two-thirds, owing to the efficiency of our cooked diet.
This evolution is still continuing. As science writer Nicola Temple explains, human jaw sizes have continued to shrink even within the last century,71 owing largely to the rise of processed foods (e.g., McDonald’s). The adoption of forks and knives72 likely also contributed by offloading even more work from our jaws and teeth.
Some scholars, such as primatologist Richard Wrangham, believe cooking also gave us larger brains. Wrangham argues that the added nutrition unlocked by cooking fueled the expansion of the human brain, a metabolically expensive organ that consumes a disproportionate share of our energy, using roughly 20 percent of the body’s basal energy despite accounting for only about 2.5 percent of our body weight.73 (Basically, the energy that would have gone to larger jaws and digestive systems instead went to our heads.) As a result, explains paleoanthropologist and evolutionary biologist Peter S. Ungar, “Our brains weigh nearly five times what you’d expect74 for a mammal of our size . . . the difference between an apple and a pineapple.”