White Bread


by Aaron Bobrow-Strain


  3

  THE STAFF OF DEATH

  Dreams of Health and Discipline

  The whiter your bread, the quicker you’re dead.

  —Dr. P. L. Clark’s Home Health Radio, c. 1929

  GRAIN DAMAGE

  Professional cyclists spend a lot of time thinking about what to eat. Burning seven thousand calories in a race will do that. And, for many decades, cyclists’ obsessive solution to the food question focused on building muscle and storing up energy. This meant carbohydrates and protein piled on top of carbohydrates and protein. Giant carb-loaded dinners of pasta and bread were an essential pre-race ritual. In the mid-2000s, however, elite cyclists began to think about food differently—not just as the building block of muscle and energy, but as a kind of medicine.

  Pro racers and their amateur emulators began to seek out “anti-inflammatory foods,” like raspberries, ginger, and salmon, believed to speed recovery from injuries. And they began to avoid foods deemed “pro-inflammatory”—foods that purportedly irritated bodily tissues, caused aching joints, and sapped stamina. Shockingly, for a group of people accustomed to large amounts of carbohydrates, wheat, along with other foods containing the protein gluten, topped cyclists’ list of inflammatory agents.

  Christian Vande Velde, a popular rider from the Chicago suburbs, led the breakaway from gluten. A pro cyclist since the late 1990s, Vande Velde had helped Lance Armstrong to two of his Tour de France victories and racked up an impressive record of race wins of his own. But he was also plagued by injuries, and, by 2007, he was no longer a young rider. It would have been easy to write him off—until the wins started rolling in again and 2008 turned into Vande Velde’s best season ever. That year he claimed stage wins in the Tour de France, the Giro d’Italia, and the Paris-Nice spring classic. He took the overall winner’s jersey at the important Tour of Missouri, third place in the Tour of California, and third place in the U.S. Pro National Time Trial Championship.

  Many observers attributed Vande Velde’s success that year to his switch to a strong new team. Vande Velde, however, pointed to his diet. That year, following the advice of sports physiologist Allen Lim, Vande Velde had gone “gluten free.”1 Though the shift was painful—Vande Velde found giving up bread particularly difficult—it was worth the suffering. As he told reporters, on a gluten-free diet he slept better, felt mentally fresh, and performed at a higher level than ever before. “Physically I am a lot leaner. … I am less lethargic and my energy levels have been quite good. … I recover quicker and maybe have less inflammation in my back and hips.” Soon Vande Velde’s teammates copied his diet, and then other pro teams followed. In 2010, Lance Armstrong’s RadioShack crew rode gluten free all the way to its overall team victory in the Tour de France.2

  This story repeated itself in other elite endurance sports, but by the late 2000s, it wasn’t just professional athletes who were going gluten free. Celebrities, ranging from Hollywood darlings Gwyneth Paltrow and Zooey Deschanel to cable news tough guy Keith Olbermann, praised the benefits of avoiding gluten: the diet aided weight loss, unleashed untapped reservoirs of energy, stabilized insulin levels, and even promoted mental acuity. Already familiar with the low-carb Atkins diet, soccer moms everywhere and affluent residents of the East and West coasts followed close behind. According to one widely cited market report, 15–25 percent of American consumers wanted to purchase more gluten-free foods in 2009.3

  In just a few years, the market for gluten-free products had emerged as one of the grocery industry’s fastest-growing sectors. Books with titles like Going against the Grain, Grain Damage, The Grain-Free Diet, and Dangerous Grains filled bookstore shelves. Cable news programs warned that wheat might inflame joints, worsen autism, lead to cancer, or send insulin levels soaring. If you visited an alternative health care specialist with sore knees or digestive trouble, you’d have been almost certain to hear that giving up wheat might help.

  To me at forty, dogged by my own seemingly endless sports injuries, all this sounded really appealing. So, in the middle of researching and writing a book about bread and despite my obsessive love of baking, I went gluten free. This meant a lot more than just giving up bread, as I soon learned. Other grains—rye, barley, spelt, and sometimes oats—contained gluten. Worse still, in our industrial food system, gluten had found its way as an additive into thousands of foods, from powdered spices to ketchup. Really going gluten free required constant vigilance, endless research, difficult sacrifices, and ceaseless self-control. I only lasted two months—but in that time, I felt real results. I felt energetic and sharp, and during the first weeks of the diet, I experienced what can only be described as euphoria. Reading gluten-free websites and online testimonials, I discovered that my experience was not uncommon: many people reported feeling euphoric in the first weeks after going gluten free.

  My experience didn’t synch with the findings of mainstream science, however, or my gastroenterologist’s recommendations. According to most mainstream medical experts, gluten should rightfully concern only about 1 percent of the population—the percentage of people thought to be afflicted with celiac disease, a serious autoimmune disorder. For people with this disorder, consumption of even tiny amounts of gluten causes the villi of the small intestine to literally smother themselves in mucus. The result is damaging inflammation and impaired ability to absorb nutrients, leading to seriously elevated risk for a wide range of cancers, neurological diseases, and other autoimmune disorders.4 Except for that 1 percent, gluten was harmless, my doctor assured me (and mainstream medical research confirmed). So I got tested for celiac, and the results were negative. I was free and clear—but why did giving up gluten make me feel so good?

  The historian in me came up with what I thought was a good explanation: over the past hundred years, relentless fine-tuning of individual health through dietary discipline has become something of a national obsession. Whether through rigorous dieting, intense exercise, or almost religious attention to the latest missives of nutrition science, rituals of control over one’s body are a key marker of elite status and responsible personhood.5 And nothing made me feel in control of my body more than following the challenging strictures of gluten-free eating. I knew that my various aches and complaints probably stemmed from the daily grind of self-imposed pressures, work and family stress, and the accumulated trauma of decades of competitive sports more than they did from my diet, but I didn’t feel that I could control those parts of my life. I could control the way I ate, and it felt good. History plus psychology explained my results.

  But, as anyone who has ever typed a health-related search into Google knows, there are always experts ready to offer some other explanation. In this case, a whole army of alternative health care providers, physical therapists, diet gurus, and holistic healers, armed with everything from critiques of capitalist agribusiness to the latest insights of genomic medicine, rejected my quick self-psychoanalysis. Gluten didn’t just hurt celiacs, they warned. According to these alternative health experts, many if not most people had their health and stamina sapped by the stuff. Thus, gluten-free diet proponents cautioned their audiences to “think outside the celiac box”—to imagine a whole spectrum of gluten sensitivities and systemic effects that can’t be objectively identified by science yet, but can be perceived (and remedied) by individuals carefully attuned to their bodies.6 We have slowly discovered genetic markers and mechanisms for other low-grade autoimmune disorders, the argument went; who’s to say that low-grade gluten intolerance wouldn’t eventually be made verifiably “real” in the same way someday?7

  That kind of talk could easily be dismissed as pseudo-science, relegating the gluten-free craze to a long line of fashionable pseudo-ailments sported briefly by “the worried well.”8 Without tests accepted by mainstream science and relying only on a patient’s bodily intuition, self-diagnosis of nonceliac gluten intolerance was easy to gloss as psychosomatic. On the other side of the gluten divide, however, a collection of ever-more-mainstream voices had begun to cast gluten avoiders as canaries in the coal mine—people who were, for some reason, more attuned to something fundamentally askew with our health care and food systems. Gluten problems, from this perspective, spoke to a larger problem of health hazards lurking in modern food, concealed from us by Big Agribusiness and the failings of mainstream medicine. The staff of life may have sustained Western diets for thousands of years, but no longer. Modernity corrupted the staff of life.9

  Exactly what had changed varied from account to account, but all argued that, in some important way, our wheat, or the way we eat it, had become “unnatural.” Some writers blamed modern plant breeding. Others pointed to pesticides, endocrine disrupters, high-speed dough handling, industrial fermentation, genetic modification, or the unbridled use of wheat-based additives in foods that never before contained gluten. Some of the claims were clearly specious, but others contained a tantalizing basis in truth: driven by food processors’ need for grains adapted to the rigors of industrial processing, modern plant breeding has dramatically increased gluten levels in wheat. Powered by the relentless acceleration of corporate baking, high-speed dough handling and rapid-fire fermentation have changed the molecular makeup of bread.10 What wasn’t clear was whether those changes actually affected eaters.11

  As a student of food history, I knew that diet gurus often operate like this: introducing small grains of doubt into the comfortable confidence of mainstream science. These small grains of doubt are always just within the realm of the plausible, and they always gain traction by playing on already existing anxieties. They thrive by taking big, looming, seemingly impossible-to-control social forces and giving them a quick and easy individual dietary fix. Gluten-free proponents might be right about structural problems in the U.S. food system, I concluded, but their individualistic “Not in my body” solution was all wrong.12 And yet, as someone upset about the U.S. industrial food system, I found that arguments regarding agribusiness and the harmful effects of gluten resonated with me. Was gluten free a fad or a warning bell? Or both?

  In the end, I gave up on that question, just as I gave up on gluten-free eating. I realized that, as a consumer, I could weigh evidence on either side of the scientific debate all day without getting anywhere, but as a student of history, I could offer another way of looking at the problem: regardless of whether I believed that widespread gluten intolerance was “real,” I knew that we could learn a lot by thinking about the decidedly social dreams rolled up in debates about gluten and health. In going gluten free, I was participating in a very old and very American dream: a deep and abiding belief in the ability to fine-tune and maximize the moral and physical health of my body and my nation by eating the right food—an irrepressible confidence in the power of proper diet to cure almost all physical and social ills. This dream, in turn, has long reflected deeper concerns about the nature of industrial progress, society’s relation to nature, and, of course, anxiety about status. Not surprisingly, wheat and bread have played a major role in this history.

  Indeed, for almost as long as Western culture has existed, it has been accompanied by anxieties about wheat or its refining. Recall, for example, Plato’s discussion in The Republic about the impact of refined wheat on society’s moral health, and consider the seventeenth-century celebrity diet guru Thomas Tryon, who warned Britons that eating overly refined or poorly baked loaves upset Nature and Reason.13 During the past two hundred years, however, as bread production has grown increasingly industrial and increasingly distant from homes, suspicions about the staff of life’s effect on life itself have grown even more frequent.

  Usually these concerns focused on components of bread, such as vitamins or fiber. Typically they cleaved between critics and backers of refined white flour, but sometimes, as in the 2000s, they broadened to encompass any wheat. Disagreements about refined flour’s relation to constipation date back centuries, but battles over bread have spotlighted a wide range of other effects, from blood impurity, antibody formation, and chemical poisoning to nervous inflammation, corporeal enervation, tissue sweetening, and acidosis. The specifics change over time: Atkins turns into gluten free, gluten free turns into . . . ? But much remains the same, and this is where history can offer some perspective.

  In 1924, the industry magazine Baking Technology warned readers of rampant “amylophobia” sweeping the country.14 Literally the fear of starch, amylophobia, in the writer’s usage, encompassed an amorphous, spreading sense that modern bread—either all wheat bread or just its white, refined form—did something bad to bodies. In 1924, the baking industry was emerging shining white out of forty years of moral panic over bread cleanliness and contagion. Here, however, was a different kind of concern: not fixated on external contaminants but on the nature of bread itself and its effects on human metabolism. Something about modern bread—and critics differed about what that something was—appeared to be making the country fat, sick, lazy, and weak.

  This chapter suggests that, as strange and exaggerated as 1920s and 1930s “amylophobia” might seem, it has a lot to teach us about the political and psychic costs of our national fixation on achieving perfect health through dietary discipline. But before we can understand the wheat fears of the early twentieth century, we’ll need to appreciate the even deeper and older roots from which they arose. There’s nowhere better to begin than with Sylvester Graham, America’s first great white bread critic.

  THE CRUSADE AGAINST MORAL INFLAMMATION

  Today, if we remember him at all, we remember Sylvester Graham as the inventor of the graham cracker (which he wasn’t) or “the father of American vegetarianism” (which he may have been).15 Almost two hundred years ago, however, the man whose devoted followers later gave his name to the dull brown cracker used for s’mores inspired scandal, controversy, and riots. Thousands of people read his essays or squeezed into packed auditoriums to hear him speak during the 1830s and 1840s. Tens of thousands followed his teachings religiously, and far more decried them just as dogmatically. Medical journals ran tempestuous debates about his principles, while mob violence stalked his speaking engagements up and down the East Coast. Even after his death in 1851, controversy raged about what an autopsy revealed about the condition of his intestines.

  Before all that, though, Sylvester Graham was the seventeenth child of an elderly father and an insane mother, born sickly and not expected to thrive. As a child he showed signs of consumption. As a youth he suffered general physical debility, nervous exhaustion, and “sensitivity.” But, after an apparent breakdown in his late twenties, Graham retreated to Rhode Island, where he embraced a strict dietary regimen and miraculously recovered. In his triumph over infirmity, Graham found religion—not just the Protestant creed he had studied at Amherst College, but also a deep and abiding physiological faith, a fervently optimistic belief that disease was a choice and that anyone could achieve health as he had.

  Armed with this conviction, Graham entered public life in the late 1820s, against a background of social upheaval and flourishing fervor for reform. Like many health crusaders of the time, Graham embarked on his activist journey through the temperance movement, which served as a stepping-stone into a web of reform-minded networks including abolition, suffrage, transcendentalism, vegetarianism, and animal rights. Exposed to this wide range of commitments, Graham came to believe that diet held the key to them all.

  Under the influence of increasingly popular critics of early nineteenth-century “heroic medicine,” with its affection for bloodletting and mercury purgatives, Graham believed, not without cause, that health was best achieved by avoiding doctors. As part of a larger religious current sweeping Jacksonian America, Graham combined evangelical revival with scientific study of the body. Called “Christian physiology” by historians of religion, this was not faith healing, but rather a conviction that all disease arose from a failure to conform one’s bodily habits to the Laws of Nature, a scientific order designed by the Creator.16 Under the influence of the celebrated French physiologist François Broussais’s “gastroenterological theory,” Graham’s particular version of Christian physiology located humans’ primary connection to Nature and Creator in the alimentary tract.

  More specifically, Graham imagined the body as a network of fibers radiating out from the intestines, connecting and feeding every organ. Ingesting “stimulating” food and drink—particularly animal flesh, white bread, alcohol, caffeine, and spices—irritated and inflamed those fibers from the gut outward, producing overall ill health. On the other hand, because all health was connected to the gut, cooling the body’s fibers through bland, disciplined eating could cure any ill. Avoiding stimulating foods was, Graham proclaimed, “nothing less than the application of Christianity to the physical condition and wants of man … the means which God has ordained for the redemption of the body.”17 Even the worst cases of bodily derangement could be eased by an ascetic diet of whole wheat bread and water.

  For Graham, health and bodily inflammation were more than physiological. Central to Christian physiology was the conviction that careful study of scientific law would inevitably confirm biblical law and vice versa. God created Nature, therefore Nature—the workings of human physiology—must logically work according to the laws of God. And because particularly vital fibrous connections linked the intestines, genitals, and brain together in “morbid sympathy,” intestinal inflammation also held the key to the nation’s moral health. In this holistic view, the maintenance of individual health and moral virtue went together. Thus, Graham’s best-selling Lecture to Young Men on Chastity famously blamed masturbation for a long list of civic woes. But the compulsion to masturbate itself arose out of poor physical hygiene and diet. Even chaste youths resisting “the solitary vice” with all their might could not triumph against “involuntary nighttime emissions” unless they harmonized their bodies with Nature through austere eating.18

 
