As Terry Burnham and Jay Phelan observe in their book Mean Genes, “We watch movies about rebels without a cause,79 not about people buying insurance.” This is because we’re the product of rebels, “descended from the humans who left their caves,80 who took risks and won.”
Even though our jaws have softened and we hardly have to chew our food anymore, let alone hunt or gather it, that taste for adventure—to push past our comfort level and endure the pain—is still inside us; it’s how we got to the top of the food chain.
Then again, maybe the explanation is simpler; maybe we just can’t resist the temptation of forbidden fruit—or, in this case, forbidden berries.
Chapter 10
Attack of the Killer Tomatoes
There are known knowns;1 there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.
—Donald Rumsfeld
It’s easy to look back in disbelief at history’s lost and dated food beliefs—the idea that cinnamon came from giant bird nests (and, if you mixed it with lamprey blood and inedible crust, made for a delectable pie); that honey fell from the sky (and, if you added it to breakfast cereal, became a gateway drug to chronic masturbation and, by proxy, baldness, habitual depression, morbid predispositions, fetid breath, and permanent darkness over one’s wretched soul); that corporations like McDonald’s and Starbucks actually care about consumers’ health or happiness.
Yet despite our litany of progress—our discriminating palates, our trove of food blogs, and our supernatural pantries—things have hardly changed. Our grandchildren, and certainly their grandchildren, will no doubt look back at us with the same wonder and bewilderment we feel when looking back at Kellogg’s Battle Creek Sanitarium or John Smith’s colonial attempts to fish with frying pans.
In 1893, a solid three hundred years after tomatoes were first cultivated in Europe,2 it took the US Supreme Court to decide whether3 the tomato was a fruit or a vegetable. At the time, imported vegetables were subject to a 10 percent tariff to protect American farmers, owing to the Tariff Act of 1883, but in 1887 a tomato importer named John Nix sued the collector of the port of New York to get his money back, arguing that tomatoes were fruits and therefore exempt. And this argument was contested for six years in escalating court battles4 before making its way to the nation’s highest court, where Supreme Court justices read from various dictionaries and heard testimony from expert witnesses before ultimately ruling that tomatoes were vegetables because they “are, like potatoes, carrots, parsnips,5 turnips, beets, cauliflower, cabbage, celery, and lettuce, usually served at dinner . . . and not, like fruits generally, as dessert.”*6,7
This happened not long after people finally stopped believing that tomatoes were poisonous, a notion that had persisted for hundreds of years, owing largely to their botanical relationship to mandrakes and deadly nightshade, which are in the same family and not only are poisonous but were also said to be used in “witches’ brew” and to summon werewolves. In fact, the tomato’s scientific name, Solanum lycopersicum, literally means “wolf’s peach,”8 combining the Greek lykos (“wolf”),9 which also gave us lycanthrope (“werewolf”), with persikon (“peach”)—while its old German name is Wolfspfirsich.10 And as recently as the 1860s, widespread rumors warned of tomato crops infested with poisonous worms11 capable of spitting their fatal venom several feet, causing terrible agony and instant death—but fortunately, the worms turned out to be harmless caterpillars. Another, more credible complaint was that the tomato’s innards were too acidic, causing them to leach toxic lead or copper12 from dishes and cookware. Meanwhile, others warned simply of their taste, calling them “sour trash” or “odious and repulsive smelling berries.”13
Potatoes, which come from the same family, suffered a similar reputation. In addition to their associations with witchcraft and devil worship, they were once thought to cause syphilis and leprosy,14 largely because of the way they looked, bearing a resemblance to the gnarled hands of lepers15 and, um, other afflicted body parts. Eighteenth-century Russians called them “the Devil’s apples”16 and burned them at the stake, while others warned that eating potatoes at night caused mothers to bear children with abnormally large heads17 or that pinning someone’s name to a potato cursed them to certain death. Meanwhile, wealthy people used them as decoration, growing potato plants in ornamental flower gardens18 and wearing potato flowers on their lapels19 or in their hair.
Ultimately, it wasn’t until widespread famine and crop failures forced people’s hands20 that Europeans begrudgingly offered the potato a place at their tables—even then, many resisted. Peasants in Austria “were threatened with forty lashes if they refused to embrace it,”21 while Prussia’s king Friedrich Wilhelm I threatened to cut off the ears and nose22 of dissidents who refused to plant them. In France, a scientist named Antoine-Augustin Parmentier took a softer approach;23 after struggling to convert skeptics by way of reason and science, he appealed to their sense of envy by serving potatoes to famous people and hiring armed guards to surround potato fields outside Paris, and voilà, now we have pommes frites.
Now, of course, tomatoes and potatoes are the most consumed vegetables in the United States by far, with per capita annual consumption weighing in at about thirty-one pounds of tomatoes and forty-nine pounds of potatoes24 in 2019, led largely by French fries and tomato sauce.25 In comparison, the consumption of onions, the third most popular vegetable,26 is only about nine pounds per capita.*27,28
And these weren’t the only ingredients people feared. As recently as the nineteenth century in England, there was a myth that raw fruits were poisonous, with “death by fruit” commonly listed as a cause of death29 on Victorian death certificates, a belief likely stemming from a 1569 ban on the sale of uncooked fruit30 to prevent the spread of the plague (which actually had merit, given that it was common practice at the time for butchers to throw leftover blood and entrails into rivers,31 where they’d often wash up on shores, and that this same polluted water was often used to wash fruits and vegetables).
And the fog has hardly cleared since.
People were afraid to eat Patagonian toothfish, an oily, cod-like fish traditionally thrown back by fishermen, until they were rebranded with a sexier name in 1994: Chilean sea bass.32 This despite the fact that they’re neither technically a bass33 nor, a lot of the time, Chilean; many come from waters off the coasts of Africa and Australia.34 Now, of course, they sell for $29.99 a pound at Whole Foods, owing both to their desirability and to the fact that this desirability has led to overfishing, with the global catch climbing from just 579 metric tons in 1979,35 when they were known mostly to Antarctic scientists,36 to a peak of more than 44,000 tons in 1995.37
The same thing happened with rock salmon (formerly “spiny dogfish”), blue cod (formerly “oilfish”),38 Torbay sole (formerly “witch”), and orange roughy (formerly “slimehead”). The uni (sea urchin) on your sushi platter used to be called “whore’s eggs”39 by fishermen, owing to the creatures’ unwelcome tendency to accumulate on and foul their equipment; before that, in ancient Greece, sea urchins were metaphors for women’s pubic hair. As David A. Fahrenthold writes in the Washington Post, “Today’s seafood is often yesterday’s trash40 fish and monsters.”
And the same state of confusion extends to almost everything else we put into our mouths, from pasta to multivitamins.
Certainly consumers are more familiar with spaghetti than yesterday’s trash fish, but that doesn’t necessarily make them savvy or any less gullible. In 1957, for example, the BBC aired a news segment on “spaghetti plantations”41 as an April Fools’ Day joke, showing footage of cooked spaghetti strands hanging from trees (to which they were affixed with tape) as farmers plucked them for harvest and placed them into baskets to dry—and people actually believed it. So much so, in fact, that the network was overrun with calls from viewers asking where they could buy their own spaghetti trees.42 Even members of the show’s production crew, who’d been kept in the dark, fell for it. Again, this was in 1957, twelve years after the development of the atom bomb.
In the 1980s, A&W tried to one-up the McDonald’s Quarter Pounder by releasing a third-pound hamburger43 that was also less expensive and rated higher in consumer taste tests. It failed, however, because Americans are bad at fractions and thought a third was smaller than a quarter.
And in 2016, lawmakers in West Virginia were sent to the hospital44 after drinking raw milk in a celebratory toast for striking down a ban on raw milk that had clearly been put there for a reason. (Or reasons, among them E. coli, listeria, salmonella, and Guillain-Barré45 syndrome, which can result in paralysis, kidney failure, stroke, and death.) To be fair, Scott Cadle, the Republican delegate who’d distributed the milk,46 denied that the incident had anything to do with the milk, telling reporters, “It didn’t have nothing to do47 with that milk” and “It ain’t because of the raw milk.”48 And it’s impossible to know for sure, as he flushed the remainder of the milk49 down the toilet before samples could be tested, which is, apparently, something he normally does with perfectly good milk.
Meanwhile, our top nutritionists still can’t decide whether or not eggs are good for us.
In 1980, the US Department of Agriculture’s “Dietary Guidelines for Americans” consisted of a twenty-page pamphlet stapled in the center,50 offering such sage advice as “Maintain ideal weight,” “Avoid too much fat, saturated fat, and cholesterol,” “Avoid too much sugar,” and “Avoid too much sodium.”
By 2005, its guidelines had become slightly more specific, recommending that Americans consume less than 300 milligrams per day of cholesterol. Its 2015 guidelines, however, weighing in at a massive 122 pages,51 removed that limitation, prompting the American Egg Board to boast, “The U.S. has joined many other countries52 and expert groups like the American Heart Association and the American College of Cardiology that do not have an upper limit for cholesterol intake in their dietary guidelines.”
Except that’s not really true, because the actual guidelines explain that the body “makes more than enough”53 cholesterol on its own and that “people do not need to obtain cholesterol through foods” before ultimately recommending that “individuals should eat as little dietary cholesterol as possible while consuming a healthy eating pattern.” But that doesn’t mean we should eat none, apparently, because their Healthy U.S.-Style Eating Pattern, outlined in Appendix 3, Table A3-1,54 recommends 2 to 3 cup-equivalents of dairy per day and 13 to 43 ounce-equivalents of meat, poultry, eggs, and seafood per week, depending on which of the twelve caloric subgroups you belong to, which you can find in Appendix 2, Table A2-1,55 by cross-referencing your age, sex, and physical activity level.* And their Healthy Mediterranean-Style Eating Pattern, outlined in Appendix 4, Table A4-1,56 recommends 2 to 2½ cup-equivalents of dairy per day and 13 to 50 ounce-equivalents of meat, poultry, eggs, and seafood per week. (Note, by the way, that cup- and ounce-equivalents don’t always correlate with actual cups and ounces. One large egg, for example, counts as 1 ounce-equivalent of eggs,57 yet, per the USDA’s own guidelines, an egg has to weigh a minimum of 2 ounces,58 averaged by the dozen, in order to be called large, so technically, 2 ounces of egg equals 1 ounce-equivalent of eggs. Similarly, 4 ounces of pork59 equals 4 ounce-equivalents, but 4 ounces of walnuts equals 8 ounce-equivalents.) And if you get out your decoder glasses and scrap paper and do the math, the maximum cholesterol intake suggested by these healthy eating patterns is—surprise—still about 300 milligrams,60 so nothing has changed other than the level of obfuscation.
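If you’d rather outsource the decoder glasses, the conversion examples above boil down to a few multipliers. Here is a minimal sketch in Python, using only the three factors cited from the guidelines; the table and helper function are illustrative inventions, not anything the USDA publishes.

```python
# Ounce-equivalents per actual ounce, implied by the guideline examples
# above: a 2-ounce large egg counts as 1 ounce-equivalent, 4 ounces of
# pork count as 4 ounce-equivalents, and 4 ounces of walnuts count as 8.
OUNCE_EQUIVALENTS_PER_OUNCE = {
    "eggs": 0.5,
    "pork": 1.0,
    "walnuts": 2.0,
}

def ounce_equivalents(food: str, actual_ounces: float) -> float:
    """Convert actual ounces of a food into USDA ounce-equivalents."""
    return actual_ounces * OUNCE_EQUIVALENTS_PER_OUNCE[food]

print(ounce_equivalents("eggs", 24))    # a dozen large eggs -> 12.0
print(ounce_equivalents("walnuts", 4))  # -> 8.0
```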
Mind you, this isn’t the USDA’s fault, as they’re simultaneously charged with protecting the economic interests of American farmers and meat and dairy producers and protecting the nutritional interests of Americans, a Sisyphean task. So on the one hand, they’re supposed to encourage us to buy more meat and dairy products—and on the other, to eat less of them. As a result, their messaging is often inescapably convoluted and schizophrenic, reading a lot like fortune cookies or Bill Clinton’s 1998 grand jury testimony: “It depends on what the meaning of ‘is’ is.”
In addition, the USDA has to split food oversight with the US Food and Drug Administration (FDA) according to a matrix of blurred and invisible lines. The USDA is responsible for overseeing nutritional guidance, pepperoni pizza, meat sauces with more than 3 percent62 red meat, open-faced sandwiches, and catfish, for example, while the FDA is responsible for nutritional labeling,63 mushroom pizza, meat-flavored sauce with less than 3 percent red meat, closed-face sandwiches, and fish other than catfish.64 The division of eggs65 is even more confusing; the USDA is responsible for the grading of shell eggs, egg-breaking and pasteurizing operations, and products that meet the USDA’s definition of “egg products,” such as dried, frozen, or liquid eggs, while the FDA is responsible for the labeling of shell eggs, egg-washing and -sorting operations, and egg products that do not meet the USDA’s definition of “egg products,” such as freeze-dried egg products, imitation egg products, cake mixes, French toast, egg sandwiches (if they’re closed and don’t also contain a certain quantity of meat), and ethnic egg delicacies like balut.
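To appreciate just how arbitrary the split is, here is a toy sketch in Python that hard-codes the examples named above; the function and its rules are a caricature for illustration, not an official classification scheme.

```python
def regulator(item: str, red_meat_pct: float = 0.0,
              open_faced: bool = False) -> str:
    """Guess which agency oversees a food, per the examples in the text."""
    if item == "catfish":
        return "USDA"  # catfish is the USDA's lone fish
    if item == "fish":
        return "FDA"   # every other fish belongs to the FDA
    if item == "sandwich":
        return "USDA" if open_faced else "FDA"        # open-faced -> USDA
    if item == "meat sauce":
        return "USDA" if red_meat_pct > 3 else "FDA"  # the 3 percent rule
    if item == "pepperoni pizza":
        return "USDA"  # meat topping
    if item == "mushroom pizza":
        return "FDA"
    raise ValueError(f"jurisdiction unclear (join the club): {item}")

print(regulator("sandwich", open_faced=True))   # USDA
print(regulator("meat sauce", red_meat_pct=2))  # FDA
```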
Plus, in addition to overseeing what amounts to roughly 78 percent of the US food supply,66 the FDA oversees more than 20,000 prescription drug products,67 6,500 separate categories of medical devices, 90,000 tobacco products, and consumer products ranging from perfume, pet food, and deodorant to temporary tattoos, tampons, and microwave ovens. Also laser pointers. So they simultaneously have people regulating the prevention of maggots in consumer foods and the use of medicinal maggots in wound therapy.68
As a result of this confusion, they don’t have even a fraction of the bandwidth, in terms of tools, funding, manpower, or daylight, to do what they’re asked to—so the majority of food facilities under their jurisdiction (56 percent) go more than five years without inspection,69 and a much larger share (about 99 percent) of the imported foods they’re responsible for goes completely uninspected.70 All this while, according to the CDC, “48 million people get sick,71 128,000 are hospitalized, and 3,000 die from foodborne diseases each year in the United States.”
So no agency is really in charge of nutrition, transparency, or labeling, and we’re basically on the honor system. The inmates are running the asylum.
And even if we set aside the political wavering over nutrition and the issues with bandwidth, jurisdiction, and food labeling that journalist Barbara Presley Noble once called “so opaque or confusing that only72 consumers with the hermeneutic abilities of a Talmudic scholar can peel back the encoded layers of meaning,” we just find more and more layers of discrepancy and ambiguity.
Most experts seem to agree that olive oil is healthy,*73 citing things like monounsaturated fats, antioxidants, and an ability to lower “bad” cholesterol,74 but that’s only if your olive oil is actually olive oil—and experts say there’s a good chance it isn’t. As Larry Olmsted writes, analysts estimate that between two-thirds and 90 percent of the olive oil75 sold in the United States isn’t what it’s claimed to be and that “virtually every investigation, whether by universities,76 journalists, law enforcement, or government agencies has found an industry rife with fakery.” So not much has changed since 1820, when Fredrick Accum warned that commercial olive oil77 was often rancid or tainted with lead. In 1959, an estimated ten thousand Moroccans suffered partial paralysis78 after consuming olive oil that merchants had mixed with surplus industrial lubricants intended for jet engines, and in 1981, more than twenty thousand people were poisoned and hundreds died after consuming Spanish “olive oil” that turned out to be machine oil.79
The bright side, if there is one, is that, although olive oil counterfeiting remains rampant, a lot of today’s counterfeiting has to do with faking its graded virginity;80 i.e., passing off virgin for extra virgin or diluting it with oils that are less expensive but still edible, like canola, sunflower, or soybean. But you should still consider yourself lucky if your organic extra-virgin olive oil is even made from olives, let alone those rated for human consumption and not for use as lamp oil.
Similarly, a lot of experts say that fish is healthy, owing to its omega-3 content, but just as there are a lot of fish in the sea (and rivers and lakes and commercial fish farms), there’s a lot of methylmercury, polychlorinated biphenyls,81 parasites, agricultural pesticides, microplastics, and toxic algal blooms. Then there’s the fact that high levels of omega-3 have also been linked to prostate cancer.82
So choosing healthy seafood isn’t as simple as eating oysters only in months with the letter r in them, an adage that goes back at least to the 1500s and the advice to eat only oysters “that growes upon great ships bottomes,83 or in places not muddy; in those moneths that haue the letter R. in their names,” which was mostly a precaution against eating raw seafood in the summer84 prior to the advent of refrigeration—or, as Anthony Bourdain advised in Kitchen Confidential, never ordering fish on a Monday85 unless you’re eating at Le Bernardin, because most seafood vendors don’t deliver on weekends.*86
As nutrition professor and James Beard Award–winning author Marion Nestle writes, “To make an intelligent choice of fish at a supermarket,87 you have to know more than you could possibly imagine about nutrition, fish toxicology, and the life cycle and ecology of fish—what kind of fish it is, what it eats, where it was caught, and whether it was farmed or wild.”
Yet even if we did know all this, it wouldn’t make much of a difference, because fish fraud is every bit as rampant.
In 2008, two high school girls in Manhattan88 collected fish samples from restaurants and grocery stores for a school science project, preserving them in alcohol and sending them to a university lab for genetic fingerprinting, and found that half of the local restaurants and 60 percent of the grocery stores were selling mislabeled fish—including at least one endangered species.