In Memory of Bread
As the Canadian geographer Emily Eaton describes the reception of GM wheat in Canada, Roundup was only part of the issue. Rather, Canadian farmers, consumers, and environmentalists believed that it was important that the wheat and the bread they were eating remained the same as what their ancestors had eaten. Of course, the wheat wasn’t the same, exactly, but they saw a significant difference between the breeding their forefathers had practiced on the prairie and the laboratory activities of a bioengineering corporation. The early settlers of the prairie provinces of Manitoba and Saskatchewan had carved out identities as wheat farmers, which in turn became part of the Canadian cultural and patriotic identity, especially in the west. (The same could be said of Americans with ties to the “bread basket” of the Central Plains.) In much the same way, religious communities throughout Canada expressed dismay at the thought of blessing and breaking GM bread and wafers for the Eucharist.*2 It hardly mattered that the bread would look and taste the same. The idea that the Host was not an unaltered, God-given food but an engineered one threatened to desecrate Christians’ sense of living symbolism.
Proponents of bioengineering have suggested that, someday, it might be possible to genetically modify wheat to be celiac-safe. Much though I’d like to think I wouldn’t eat GM wheat, a few years without gluten, and many struggles and disappointments in the kitchen, have shown me that the idea of “purity” at the heart of the resistance to GMOs looks different when your choice is between GM bread and beer, if it were to become available, and ersatz bread and beer. I feel that I’m on tricky ground here, close to hypocrisy, because if I can avoid it, I won’t eat meat from industrial farms and feedlots that keep animals in cruel and filthy conditions, factory-farmed eggs where the hens never see the light of day, indiscriminately harvested fish, or produce from companies that exploit and dehumanize workers. If no harm were to come from celiac-safe wheat—a big “if,” given the trouble that famously came to family farmers in the Midwest after GM corn pollinated their fields, putting them out of business—there would be no foul.*3 The point is that celiac disease has shown me another side of purity, one that has less to do with the genetics of the cultivar than with the experience of eating the same food as the other people around the table: inclusion, comfort, and pleasurable eating. And for me, at least, this other kind of purity sometimes seems significant enough to tip the scales.
I don’t have to worry too much about solving that conundrum anytime soon. Skeptics believe that bioengineering celiac-safe wheat will be challenging, because while the 33-mer gliadin peptide has been identified as a smoking gun, researchers have admitted they do not know all of the genes in the plant responsible for creating autoimmune responses. The problematic sequences or fragments may not even be located in the same sites on the wheat genome, thus rendering them harder to identify. And it’s possible that if all of the immunogenic compounds in wheat were neutralized, the baking properties of the grain would be altered. That would be a cruel turn indeed: now I would have flour I could eat, but I wouldn’t be able to make a damned thing out of it.
—
On the other side of the celiac-causation debate is convincing scholarship suggesting that wheat breeding may have little or nothing to do with gluten intolerances. This camp finds statements about strategic breeding leading to increased protein content questionable for a simple reason: farmers were not able to systematically increase gluten content until a couple of hundred years ago, because there was no way to accurately measure the protein in wheat until then. That would not preclude coincidental increases in 33-mer gliadin peptides as a result of breeding for other qualities; nor would it account for the impact of the Green Revolution on hybridized wheat varieties. It’s notable, however, that thousands of years ago, farmers were selecting for lower protein content in grain, not higher, because protein is inversely proportional to starch. Early farmers strategically bred for bigger grains because they were easier to harvest and thresh, and bigger grains are bigger because they have more starch. Once bakers started leavening dough around 5,000 BC, they would have noted that some types of wheat made better raised bread than other types—emmer and other bread wheats were preferable to durum, for example—but the growers would only have been able to increase the protein content from one generation of wheat plants to the next intuitively, not empirically.
It’s not even desirable, from a miller’s perspective, to constantly increase protein content. All cooks, but especially bakers and pastry chefs, share at least one goal with scientists: to have a reliable replication of the same chemical reaction every time. While durum semolina flour ideally contains about 7 to 11 percent protein, and bread flour 11 to 14 percent, the protein in unmilled, unblended grain fluctuates wildly, sometimes reaching values as high as 28 percent. What causes that? It’s been argued that environmental factors have more influence than breeding practices—even more than pumping the soil full of fertilizers, a practice that became commonplace following World War II, when tons of nitrogen were repurposed from the munitions industry and found their way into the corn and wheat fields of the Central Plains. The most popular belief is that protein content is tied to climatic factors. The solution to the variation is to blend flours of different protein contents to assure year-to-year consistency: 7–11 percent or 11–14 percent, depending upon what the flour is for. This means that even if the protein content of unmilled grain itself has gone up, the amount of gluten an eater takes in would have remained relatively unchanged since the advent of reliable measurement. It also suggests that those who ate wheat and barley breads before the advent of protein measurement—and, as a result, the blending of flour for consistent ratios—might have seen their own gluten consumption vary wildly from year to year.
If the epidemiology of celiac disease does not always support wheat breeding and the quantity of gluten in the Western diet as causes, then what else could explain its rising prevalence? Recently, explorations of the human microbiome have become especially exciting. We are only just beginning to understand how the resident microecology, or assembly of bacteria in our guts, could play a key role in many human immunological functions, including our tolerance for gluten—though it seems that several factors can influence microbial diversity in a person’s gastrointestinal system. Breast-feeding, for instance, has been shown to increase resident lactobacilli and bifidobacteria in infants—an important observation in light of one study that found that a drop in lactobacilli precedes a celiac-disease diagnosis. Breast milk is rich in bacteria that compete with other bacteria such as native strains of E. coli, which have been shown in vitro to intensify gut inflammation, whereas strains of bifidobacteria appear to protect against it. An individual’s tolerance of many foods, not just gluten, might then be linked to low populations of “good” bacteria. Among the most compelling evidence is Sweden’s cultural turn away from breast-feeding from 1984 to 1996; the Swedes born in those years have turned out to be diagnosed with celiac disease at three times the national average.
Overall municipal hygiene, and the role of hygienic practices in preventing exposure to microbes from other sources, also seems to have an effect on gut health. In 2013, the New York Times reported on the Russian territory of Karelia, which shares a border with Finland. Genetically, the Karelians share many similarities with the Finns, though there are a few key exceptions, among them that the Finns are diagnosed with celiac disease at much higher rates. While celiac-disease cases in Finland have doubled in the last twenty years, the Russian Karelians appear not to present it at the same rates even though statistically 30 to 50 percent of them are likely to be carrying the genetic predisposition. According to the hygiene theory, the most important difference between the Finns and the Russian Karelians might be the infrastructural and economic challenges that effectively stalled the Karelian territory in the early twentieth century. The lack of development has affected many aspects of life, including overall hygiene; Russian Karelians encounter more airborne and waterborne microbes than the Finns. However, the study had some limitations, including not tracking dietary factors such as the type and quantity of bread (and other glutenous grains and foods) consumed by the Karelians.
But the identification of celiac disease nearly two thousand years ago seems to complicate the hygiene theory as much as it does the wheat-breeding theory, since it predates many of the inventions, childrearing practices, and environmental conditions that subscribers point to. Even relatively recent history is problematic to the hygiene theory: Samuel Gee, the nineteenth-century London physician who identified diet as the key factor, did not have an antibiotic at his disposal. The invention of penicillin was still more than half a century away. And London during the Industrial Revolution—or during any point up to the advent of indoor plumbing, sanitation, and disinfectants—was a dirty place, teeming with microbes that paradoxically both sickened and protected people. The animals in the food chain were not receiving subtherapeutic antibiotic treatments to help them gain weight, as so much industrially raised livestock does now (another potential cause of reduced gut flora in humans). Furthermore, Samuel Gee’s patients at the Hospital for Sick Children had been breast-fed, as had Willem-Karel Dicke’s pediatric patients decades later in the Netherlands. Despite all of these environmental differences, celiac disease existed in the ancient world, and potentially in greater numbers than anyone knew.
All of this research into the cleanliness of one’s environment and its impact on gut health got me wondering. I called up my mother one day. I knew she had kept the house I grew up in extremely clean, which meant I had likely come in contact with fewer microbes. But for how long did she breast-feed me? How did I take to it? At what age did she put me on solid food, give me my first piece of bread, or Barnum’s Animals Crackers? Was I three months old? That would be a good age to be introduced to gluten, according to the latest information, as it may help build a tolerance to immunogenic peptides, while six months would, by the same estimates, be too late. And what about antibiotics? When was the first time I needed them? I know my brother was always getting ear infections, but was I any hardier? I was hoping to hear about all kinds of assaults on my young microbiome—it would have at least suggested that one theory was more valid than another in my own case—but it turns out that I had a normal infancy. I liked breast milk for a while, and then I liked gluten even more. That only left the cleanliness of the house. Maybe my mother should have moved to Karelia for a few years, or taken me to play downwind from the city dump.
Put all of these factors together—the changes wheat has undergone over ten thousand years, especially in the last fifty or so; the changes to the human environment; and the changes an individual like me can kick-start in his own body without intending to—and it’s easy to see how there are multiple pathways to the disease. The most cautious studies seem to emphasize the ongoing mystery of causation while acknowledging that any one hypothesis, whether hygiene or wheat breeding, is reductive. Celiac disease, and even other gluten-related disorders, could be the result of a perfect storm of factors. It’s now common for researchers to talk about the “celiac iceberg” as a metaphor for the many considerations that rest below the apparent surface of the disease.
There is increasing urgency to uncover as much of the hidden part of the iceberg as possible. In the United States, there’s been a fourfold increase in diagnosis in the last fifty years, with some blame falling on the grain-heavy “Western” and Mediterranean diets. For every one person who is diagnosed, it’s estimated that there are five or six who have “silent” celiac disease and still carry all the risks of long-term exposure to gluten. Researchers are also seeing increased diagnoses in places where only a few decades ago they never expected to see celiac disease: Africa, Asia, and South America. In Asia in particular, the combination of increased fast-food consumption and the tendency to view wheat as a “preferred staple” as income levels rise and diets get Westernized—away from the traditional staples of rice and millet—means diagnosis rates will continue to increase.
—
There are rarely easy answers to such questions. And in the end, what difference would having a “return address” for my celiac diagnosis make? It might allow me to feel a little better intellectually, by filling in the holes in the story, but knowing the cause isn’t going to fix anything at the table.
Pharmaceutical companies are at work on that, however. To date, the most advanced research is in drugs that wouldn’t “cure” the disease so much as make it possible for a celiac to eat gluten once in a while, or to recover more quickly after getting glutened. The figures on the disease’s rising prevalence have made therapies look more profitable, and thus many drugs, expensive drugs, will show up on the market in the next several years.
The more I emerged from withdrawal that winter, and the more I adjusted to the boundaries of my new diet and, as a result, my life, the more seriously I promised myself that I would not take any of the drugs when they came on the market. In my opinion, it sounds as if most of them are being designed as crutches, not cures. I would want to wait a good long time to see what the unintended side effects are. Learning about the epidemiology of the disease has given me respect for the intricacies of the human body. Ten years from the point of introduction seems like a safe if arbitrary length of time to keep some celiac canaries in the mine shaft ahead of me.
But set a sandwich in front of me. Put a pint of stout beside it. Then, see what I do.
* * *
*1 I might also have been especially sensitive to that broad-spectrum antibiotic—which is not good. I was able to discontinue it, but what would I do if I needed one? I’m told an injection would bypass the gut and avoid causing an autoimmune reaction.
*2 The definition of purity in bread would appear to cut both ways, though. Many churches are loath to bless GF bread for communion rituals, and rely instead on “gluten-reduced” breads, which are not safe for all celiacs, though some can tolerate them.
*3 Another possibility: a modification like non-immunogenic properties might be “stacked” on top of other, more desirable traits, such as pest- or drought-resistance, so as to increase the appeal of the GM seed.
Like explorers in a strange territory, Bec and I purchased guides to help us acclimate. Every one of the cookbooks we acquired featured a cover photograph of a loaf of bread. There were pictures of other baked goods too, usually cookies and pies, sometimes a pizza. The bread, though, was the most hopeful-looking. It appeared to rival any loaf turned out by the finest bakery on its best day. Perfectly risen, browned, dusted with flour, and graced with an airy crumb like those I remembered eating only months ago, these rice or sorghum or potato (or all three) breads stood as proof to eaters who had recently been exiled from the land of wheat and gluten that all was not in fact lost.
I clung to those images at first. Then I learned that they were lies. What else could explain how the loaves we so earnestly attempted in our own kitchen turned into such colossal disappointments? Actually, quite a lot, from the age of our oven to the chlorine in our village water supply to the strains of yeast and loaf pans we were using. Baking with wheaten flour is already an exacting process, but baking a dough made of the ten to twelve ingredients a GF recipe calls for adds more variability.
The first GF recipe Bec ever attempted was for a cinnamon-raisin bread from a popular GF cookbook. At the time, it was one of the standard-bearers in the genre. The recipe called for a blend of several different flours, which we went out and bought to the tune of about twenty dollars, plus a bag of xanthan gum for structure. It utilized a big dose of canola oil in an attempt to hide the grit. The spices, raisins, and sugar promised to help, as well.
Bec found the preparation stressful. The batter—it would be an exaggeration to call it dough—didn’t want to come together in the bowl, as of course it wouldn’t, lacking gluten. The mixture fell apart in her hands as she tried to shape it. For an hour we kept checking on the pan as the batter struggled to rise like a mortally wounded animal. It gained about half an inch before she gave up and shoved it in the oven anyway.
As the mixture baked, though, the kitchen filled with that familiar smell: the nuttiness of caramelizing grains, spices, and a hint of yeast. This aroma has probably deceived eaters of faux-bread since the first desperate people decided to make loaves of peas and chestnuts. Like me, they must have lingered near the oven, breathing in a promise that disappeared the moment they sliced the loaf and tasted it.
Which was exactly what happened with this bread.
After it had cooled for an hour, we cut it. The crumb fell apart in chunks, like pieces of a dried-out plaster wall.
It was a bad sign, but I remained optimistic. Okay, I didn’t have to toast slices. I would eat the chunks as they fell off. I could try to brown them in a skillet. Hope was clearly the operative factor here, along with denial. That ended when I tasted the bread and got the sugar, the spices, a ton of grit, and little depth. I applauded Bec’s efforts anyway. She was trying a form of alchemy in the kitchen, attempting to turn rice into wheat.
Every recipe we tried, no matter where it came from, turned out somewhere between disappointment and a complete flop. The breads failed to rise, or they failed to hold together, the grains tasted weird, and those loaves I could slice still left behind an oily, gritty paste in my mouth. Some were edible only with a tablespoon of butter and a quarter-inch layer of jam. The best loaf came from my sister-in-law, who greeted me on a trip to Maryland with a surprisingly good white bread she had made in a bread machine. I wasn’t going to rush out and get a bread machine, though, at least not yet. And the low point came when Bec made an Irish soda bread from an online recipe. It was the most spectacular failure I’ve ever seen: sandy, sour, salty, and completely without structure—like a lump of baking powder, drywall, and water. We hovered over it like an accident scene for a few seconds, and then pitched it, still warm, into the trash.