In Memory of Bread


by Paul Graham


  When I look back at those two meals now, I think, At least I didn’t blow it. Not bad for a farewell tour. I ate the right foods in the right place with the right people—not mindlessly at a rest-stop along the road somewhere, or even in our own kitchen, by myself.

  —

  In between those two post-Thanksgiving meals, I had dragged myself to my doctor’s small but busy practice. Most residents of the North Country, a region that stretches from Syracuse and Albany through the Adirondacks and north to the Canadian border, would agree that we lack stellar medical care. We have other things—farms and rivers and fishing, mountains and hiking trails and skiing, good universities, craftspeople, artists, and the tightest community I’ve known anywhere—but I had long felt that this is not the place a person wants to be during a serious health-related event. The nearest major hospital is two hours away. I had come to look at the medical situation a little naïvely, viewing it as one of the trade-offs that inevitably come with putting down roots in such a place. This was easy to do because Bec and I were young—both of us were thirty-six—and healthy, in large part thanks to her work as a personal trainer.

  I left my doctor’s office with a prescription for Cipro in hand, confident that in a few days I’d be back to eating spicy dishes, vegetables, and meat. It had cleared up my symptoms before. This time, though, the drug seemed to knock the infection back without clearing the symptoms up completely.

  The longer the GI troubles dragged on, the more they affected everything, especially my love of cooking. I’ve always been an instinctual cook, more interested in preparing what shows up at the market—or, as I frequently put it, whatever the fridge wants me to cook—than in working off set recipes. For weeks I prepared simple, basic meals that I thought would be gentle: lots of pasta with olive oil, bland soup with saltines, and toasted bread with tea. In my mind, I was being good, making sacrifices; I wanted serrano chiles, roasted winter vegetables, chutneys over chicken and pork, coffee, whiskey, and wine. Our farm share piled up as I avoided using Brussels sprouts, cabbage, spinach, cranberries, and apple cider. Because everything I ate was so bland, I talked about a strange craving for a Reuben, a sandwich I normally never wanted.

  The next time I saw my doctor, he was convinced that I was suffering from an especially virulent infection or a parasite: giardia, maybe, or C. difficile. He ordered more tests and put me on Flagyl, a more powerful antibiotic, which I took for two days while trying to ignore signs that I was allergic to it.

  Finally Bec took me to the emergency room, where they hooked me up to monitors, gave me potassium and magnesium through a drip, and read the first of several scary blood panels. I was severely anemic, with a hemoglobin score of 7 (a healthy male’s is around 15). My hematocrit and iron scores were low. My red blood cell data were poor. On paper I looked like a castaway who had been living on bark and berries for a month. The ER doctor wondered out loud how I could become so quickly depleted of iron reserves; well-fed Americans, he said, usually have enough to ride out a bad spell. Around midnight, they released me with iron supplements and another antibiotic, Bactrim. I took the pills and ate the same gut-friendly foods, and continued to suffer and lose weight.

  By now, the damage to my small intestine was approaching a state of decline known as complete villous atrophy. The tiny hairlike projections lining the intestines (called villi) that absorb nutrients had been all but razed by the onslaught my immune system had mounted in the presence of gluten. I was digesting and absorbing almost nothing, and at night I was incontinent. The drop in weight sped up as I stubbornly clung to my routine, walking the dog despite my wife’s objections, teaching classes, hauling wood. Neurological symptoms appeared next: twitches and tremors, irritability, an inability to think through complex questions and problems. I walked out of classes and meetings wondering what had just transpired; at home, I would sit nearly on top of our woodstove and still suffer chills.

  I made one last trip to my regular doctor, who drew some blood, read results that were even worse than the last time, forced me to make eye contact, and told me he wanted to do a CT scan. He was worried about cancer. It took less than ten minutes for him to have a phone conversation with a doctor who represented the insurance company. This physician turned the CT scan down on the grounds that a case of the runs in a thirty-six-year-old with no previous medical history was not a good reason to blow a few thousand dollars on imaging. The best my doctor could do was make me an appointment with a gastroenterologist for the next week. Until then he wished me luck, and said that I should call back if my condition worsened.

  I eliminated many foods during that time, from coffee (which was painful) to spices, from cruciferous vegetables to acidic foods like tomato sauce. I eliminated alcohol and fats. But I never cut out bread, or pasta, or crackers. I told my doctor about my adjustments to my diet. I told friends and family. Nobody suggested that I stop consuming gluten. As awareness of gluten intolerance and celiac disease increases, some people who suffer from gastrointestinal symptoms are starting to suspect wheaten foods first, or at least they are including them on a list of usual suspects along with dairy, soy, legumes, and nuts. But I didn’t have any suspects at all.

  It turns out that I was repeating a mistake common throughout medical history, all stemming from what I have come to see as the culinary centrality, and the cultural symbolism, of wheat and bread in Western culture. There is evidence that for thousands of years, the loaf, the flatbread, and the bowl of porridge (or anything made of wheat, oats, barley, rye, and spelt) have been a huge pathological blind spot, and thus the perfect immunological (or, in the case of celiac disease, autoimmunological) Trojan horse.

  By some estimates, the discovery, domestication, and cultivation of wheat and barley is the most profound event ever to have happened to humans, largely because it ushered in the practice of farming, which eventually led to population growth and presented the chance to diversify and specialize human activities. Gradually, after hunter-gatherers traded roaming for settlement, they began to make cities, pottery, and literature. (Our Neolithic-era ancestors found some unfortunate pastimes too, like making war.) Few foods can measure up to grain calorically, and its “bankability,” or storage potential, allowed a single staple, for the first time, to support whole cities. In fact, the common agricultural practice since the start of farming has been for a society to select a few grains—like wheat and barley in the West, rice and millet in the East—and depend upon them, utterly, until something prompts a change.

  Not surprisingly, the earliest wheat- and barley-based farming civilizations believed that these powerful grains were a gift from the gods. Ceres, the ancient Roman goddess of agriculture, provides the Latinate root of our word “cereal.” Demeter (Greece) and Ninkasi (Sumeria) were worshipped for providing sustaining, sapid gifts to humankind, in particular bread and beer. Philosophers (Plato being the most famous example) argued about which kinds of grain and which preparations were best. And, until relatively recently, grain was rarely maligned. Throughout the Middle Ages and after, unscrupulous millers and bakers stretched flour and bread by adulterating them with ground bones, sawdust, and other inedible materials—but that was an ethical problem, not an epidemiological one. Widespread “panophobia” does not appear until the nineteenth century, though there were instances of it before then.*

  One of the more striking examples of bread’s privileged status can be found in the history of ergotism, an illness caused not by gluten, but by a toxic fungus that infests rye. Eaten primarily by the poor in northern Europe and Russia, rye was consumed more widely when famine struck and other staples were scarce. As with celiac disease, ergotism symptoms varied from person to person (and even by region). In Limoges in 857, Gauls who ate ergotic rye bread and porridge suffered from a burning sensation in their limbs, which then turned gangrenous and rotted off before they died. (The ancient Romans had known about ergot, but their knowledge seems to have been lost.) The disease was thought to be caused by malicious supernatural beings, not bad bread. When the healthy people who hadn’t ingested the mycotoxins burned a nonconformist or three without stopping the spread of the disease, they turned to the Church for help; St. Anthony’s Fire, as the disease came to be known, took its name from the monks of that order, who had some success with cures. Suspicion of demonic possession as the cause of sickness and weird behaviors persisted through the Dark Ages and into the Enlightenment, when spikes in witch trials correlated with years of poor climatic, harvest, and economic conditions—all of which would have led people to eat ergotic bread. A combination of scientific understanding and the increasing stability of food supplies—brought about by the spreading popularity of potatoes and corn, and better grain yields—gradually reduced the incidence of ergotism. By the twentieth century, ergot had been eradicated from the bread supply everywhere except Russia, where symptoms persisted as late as the 1930s.

  Wheat took longer than rye to be connected to illness, in part because gluten toxins, unlike mycotoxins, thankfully do not cause a person to bark like a dog, bang his head against a wall, or behave as if he has been dabbling in witchcraft. The first known description of celiac disease comes from the ancient Greek physician Aretaeus of Cappadocia, who is believed to have lived in the first century AD. Other physicians of his era may also have observed the sudden onset of gastrointestinal distress, lethargy, and malabsorption, but Aretaeus gets the credit for naming the disease after the Greek word for “belly” (koiliá). We know that he was most likely seeing celiac disease, rather than some other condition, in his patients because of the specific type of diarrhea they presented: fatty and foul-smelling, consistent with malabsorption. The leavings of such patients would have undergone a characteristic process of putrefaction and fermentation in the gut instead of digestion.

  Aretaeus also appears to have noted the disease’s cyclical nature: “[Celiac affection] is a very protracted and intractable illness; for even when it would seem to have ceased, it relapses without any obvious cause, and comes back even upon a slight mistake.” This is a pretty accurate description of what I experienced around Thanksgiving, when I coasted for half a day on rice and bananas, and then, depressed, pissed off, and hungry by late afternoon, treated myself to a slice of toasted white bread with butter, cinnamon, and raw sugar. Almost immediately, the symptoms returned with a ferocity that made me whimper. Though his observations were astute, Aretaeus missed the cause, blaming not wheat or barley but a reduction of “heat” in the intestines, which he believed could be caused by something as simple as a “copious drink of cold water.”

  Aretaeus’s writings appeared to have been lost until a nineteenth-century London pediatrician named Samuel Gee, who could read Greek, came across them while researching treatments for children with chronic (and similarly putrefied) diarrhea. Gee has been credited as the first to observe that diet was the most important factor in maintaining the children’s health. However, Gee could not pinpoint the exact dietary cause of malabsorption. He eliminated milk when he noticed that the children could not tolerate it (which is typical of those with compromised GI tracts, as casein proteins can be difficult to break down), and prescribed raw meat, but he allowed them slices of toasted bread. He paid close attention to a child who improved “upon [eating] a quart of the best Dutch mussels daily,” yet failed to figure out why the child’s health declined when mussel season ended. (Gee had hoped to observe the child again the next year during mussel season, and was dismayed to find that the boy “could not be prevailed upon” to repeat the diet.) Notably, physicians in many places, including the United States, were struggling with the same questions at about the same time. Another pediatrician, Sidney Haas, famously observed the positive effects of a banana diet, one of many specialized diets prescribed in the early twentieth century.

  The most commonly circulated—and apparently inaccurate—story about the discovery of wheat’s ability to be harmful takes place during the Hongerwinter (Hunger Winter), or Dutch Famine, in the Netherlands in the winter of 1944-45, when a German blockade cut off supplies to cities in the western part of the country. An unusually harsh winter and a devastated infrastructure of roads, bridges, rail lines, and docks worsened conditions. It is estimated that 4.5 million people were affected by the winter food shortages, and 22,000 perished, many of them in the isolated cities of the west. Those caught in the famine were forced to supplement their meager rations with ingredients their poor ancestors had eaten whenever the wheat crops had crashed centuries earlier: chestnuts, acorns, and dried beans and peas.

  Bread was especially hard to come by. Periodically the Allies managed to get supplies through, including flour from Sweden, which led to stories of bread being dropped from airplanes. It’s a compelling image—hungry people standing in the frozen fields, faces and arms uplifted to packages descending from parachutes in a blue sky smeared with oily exhaust. Even the name of the endeavor, Operation Manna, conjures “manna from heaven.” However, the people received flour, not baked bread, and it reached them in slower, more mundane ways.

  According to the popular account, whenever bread did reach a certain group of Dutch children and their families, a Dutch pediatrician, Dr. Willem-Karel Dicke, took notice. Dicke observed that the recurrence of GI symptoms and the mortality rate among his patients were tied to the ebb and flow of bread consumption—essentially, to the resumption and cessation of a forced elimination diet. When flour arrived and bread could be made, well-meaning parents offered it to their children instead of eating it themselves. The affected (intolerant) children got sick, and then improved when the rations were once again exhausted. However, other sources, including Dicke’s wife, claim that he had suspected bread and wheat as the cause of malabsorption in his pediatric patients since at least 1932, and that he had experimented with wheat-free diets between 1934 and 1936, nearly a decade before the famine. Rather than providing a breakthrough, the conditions of the Hongerwinter allowed Dicke to probe his hypothesis further. After the war, Dicke published his theory that wheat appeared to be the cause of the digestive maladies in these children. In the second half of the twentieth century, his hypothesis was confirmed by advances in medical technology—especially serology, imaging, and biopsy.

  —

  I didn’t have to wait long to find out what would happen after my insurance company denied me a look at what was happening in my gut. Later that same day, I developed a GI bleed. The sight of blood in the toilet was alarming, and I returned to the ER. The attending physician was almost chipper when I recounted my full story, especially the part about the rejected CT scan.

  “Well, now they’ll have to pay for it,” he told me, “because you came through the ER.”

  Bec and I anxiously awaited the CT results, which showed no signs of tumors. Instead, my entire abdomen was swollen with fluid. It was as if my gut had completely crashed, and what little I had eaten and the great quantity of water I’d drunk in an attempt to stay hydrated were just sitting stalled in the pipeline. Past this, though, nobody seemed to have any idea what was wrong with me.

  The doctor told me I wasn’t going home, and I didn’t resist. I let them wheel me up to a room, weigh me—I’d dropped twenty-five pounds—and hook me up to IVs. If I had not been so fogged, exhausted, and ill, I would have been exasperated at my inexplicable physical breakdown, but I’d reached the point that every sick person eventually reaches, where I only wanted answers. The gastroenterologist took one look at me the next morning and said he suspected several conditions, from celiac disease to colitis or Crohn’s. He needed to do a colonoscopy and an upper endoscopy to diagnose me (the first of many times I heard this joke, apparently an industry standard: “Don’t worry, we use different scopes!”). Before he was willing to do those tests, though, I needed two blood transfusions and time for my body to respond to them. They weren’t comfortable putting me under in my current state.

  With nothing more to do but let the fluids run into me and listen to music, I often found my thoughts turning to food during the three days I was hospitalized. I’d once joked to my wife that hospital offerings would finish me off before whatever malady had landed me there. Now death-by-bad-food seemed a legitimate possibility. Every day at seven in the morning, noon, and five, the Chuckwagon, as I called it, banged through the hallway and its driver, the Chucklady, dropped off a brown tray with plastic dishes that looked almost as uninviting as what they contained: Jell-O, a broth so funky I couldn’t tell whether it was vegetable, beef, chicken, or none or all of those, and weak coffee. I couldn’t bring myself to taste the broth, which was a good thing: I later learned that some commercial bouillon cubes and packets, including the cheaper, institutional-grade bouillon, contain gluten in the form of wheat starch or soy sauce. This was one of many examples of how gluten is ubiquitous, able to fly under the radar, so to speak, and sicken people in a stealthy way. I was in a hospital; I was a GI patient; the doctor suspected celiac disease; and still, in the broth, gluten might have lurked. I sipped the bitter coffee black and traded up for ginger ale. I poked at the Jell-O. Then I set them all aside.

  My second night there, our friends Sarah and Cory snuck in some homemade beef broth. It was the first time in almost a month that I tasted and enjoyed food, and wanted more. I had actually asked Sarah for it, a request that still bothers me a little now, because broth is not at all a simple gift to bring a sick person, not if you’re making it the right way. And Sarah had indeed done it the right way, using aromatics, root vegetables, and marrow bones from Cory’s mother’s cow. I did not ask how long it had taken her to make; I opened the flask and gulped it down, ignoring everyone’s requests to pace myself. The broth was deep and meaty, and Sarah had remembered to be generous with the salt. It remains one of the most satisfying meals I’ve ever had. The fact that I could see a Thermos of beef stock as a meal, that my body and my mind responded to it, was one of the first murky signs that I was going to be okay.

 
