IF IT’S TUESDAY, IT MUST BE WHEAT
I measured the length of the bread aisle at my local supermarket: sixty-eight feet.
That’s sixty-eight feet of white bread, whole wheat bread, multi-grain bread, seven-grain bread, rye bread, pumpernickel bread, sourdough bread, Italian bread, French bread, breadsticks, white bagels, raisin bagels, cheese bagels, garlic bagels, oat bread, flax bread, pita bread, dinner rolls, Kaiser rolls, poppy seed rolls, hamburger buns, and fourteen varieties of hot dog buns. That’s not even counting the bakery and the additional forty feet of shelves packed with a variety of “artisanal” wheat products.
And then there’s the snack aisle with forty-some brands of crackers and twenty-seven brands of pretzels. The baking aisle has bread crumbs and croutons. The dairy case has dozens of those tubes you crack open to bake rolls, Danish, and crescents.
Breakfast cereals fill a world unto themselves, usually enjoying a monopoly over an entire supermarket aisle, top to bottom shelves.
Much of another aisle is devoted to boxes and bags of pasta and noodles: spaghetti, lasagna, penne, elbows, shells, whole wheat pasta, green spinach pasta, orange tomato pasta, and egg noodles, ranging from tiny-grained couscous to three-inch-wide pasta sheets.
How about frozen foods? The freezer has hundreds of noodle, pasta, and wheat-containing side dishes to accompany the meat loaf and roast beef au jus.
In fact, apart from the detergent and soap aisle, there’s barely a shelf that doesn’t contain wheat products. Can you blame Americans if they’ve allowed wheat to dominate their diets? After all, it’s in practically everything from Twizzlers to Twinkies to twelve-grain bread.
Wheat as a crop has succeeded on an unprecedented scale, exceeded only by its cousin, corn, in acreage of farmland planted. It is among the most consumed foods on earth by a wide margin, constituting 20 percent of all human calories. While humans also consume plenty of corn in its widely varied forms, from corn on the cob to high-fructose corn syrup and maltodextrin, much of the corn crop is fed to livestock to fatten them up and marble the meat just before slaughter.
Wheat has been an undeniable financial success. How many other ways can a manufacturer transform a dime’s worth of raw material into $3.99 worth of glitzy, consumer-friendly product, topped off with endorsements from the American Heart Association? In most cases, the cost of marketing these products exceeds the cost of the ingredients themselves.
Foods made partly or entirely of wheat for breakfast, lunch, dinner, and snacks have become the rule. Indeed, such a regimen would make the USDA, the Whole Grains Council, the Whole Wheat Council, the Academy of Nutrition and Dietetics, the American Diabetes Association, and the American Heart Association happy, knowing that their message to eat more “healthy whole grains” has gained a wide and eager following.
So why has this seemingly benign plant that sustained generations of humans suddenly turned on us? For one thing, it is not the same grain our forebears ground into their daily bread. Wheat naturally evolved to only a modest degree over the centuries, but it has changed dramatically in the past sixty years under the influence of agricultural scientists. Wheat strains have been hybridized, crossbred, and chemically mutated to make the wheat plant resistant to environmental conditions, such as drought, or pathogens, such as fungi, as well as resistant to herbicides. But most of all, genetic changes have been introduced to increase yield per acre. The average yield on a modern North American farm is more than tenfold greater than farms of a century ago. Such enormous strides in yield have required drastic changes in genetic code, reducing the proud “amber waves of grain” of yesteryear to rigid, stocky, eighteen-inch-tall high-production “semi-dwarf” wheat of today. Such fundamental genetic changes, as you will see, have come at a price for the unwitting creatures who consume it.
Even in the few decades since your grandmother survived Prohibition and danced the Big Apple, wheat has undergone countless transformations. As the science of genetics has progressed over the past sixty years, permitting human intervention to unfold much more rapidly than nature’s slow, year-by-year breeding influence, the pace of change has increased exponentially. The genetic backbone of a high-tech poppy seed muffin has achieved its current condition by a process of evolutionary acceleration for agricultural advantage that makes us look like pre-humans trapped somewhere in the early Pleistocene.
FROM NATUFIAN PORRIDGE TO DONUT HOLES
“Give us this day our daily bread.”
It’s in the Bible. In Deuteronomy, Moses describes the Promised Land as “a land of wheat and barley and vineyards.” Bread is central to religious ritual. Jews celebrate Passover with unleavened matzo to commemorate the flight of the Israelites from Egypt. Christians consume wafers representing the body of Christ. Muslims regard unleavened naan as sacred, insisting it be stored upright and never thrown away in public. In the Bible, bread is a metaphor for bountiful harvest, times of plenty, freedom from starvation, even salvation.
Don’t we break bread with friends and family? Isn’t something new and wonderful “the best thing since sliced bread”? “Taking the bread out of someone’s mouth” is to deprive that person of a fundamental necessity. Bread is a nearly universal diet staple: chapati in India, tsoureki in Greece, pita in the Middle East, aebleskiver in Denmark, naan bya for breakfast in Burma, glazed donuts any old time in the United States.
The notion that a foodstuff so fundamental, so deeply ingrained in the human experience, can be bad for us is, well, unsettling and counter to long-held cultural views. But today’s bread bears little resemblance to the loaves that emerged from our forebears’ ovens. Just as a modern Napa Cabernet Sauvignon is a far cry from the crude ferment of fourth-century BC Georgian winemakers who buried wine urns in underground mounds, so has wheat changed. Bread and other foods made of wheat may have helped sustain humans for centuries (but at a chronic health price, as I shall discuss), but the wheat of our ancestors is not the same as modern commercial wheat that reaches your breakfast, lunch, and dinner table. From original strains of wild grass harvested by early humans, wheat has exploded to more than 25,000 varieties, virtually all of them the result of human intervention.
In the waning days of the Pleistocene, around 8500 BC, millennia before any Christian, Jew, or Muslim walked the earth, before the Egyptian, Greek, and Roman empires, the Natufians led a semi-nomadic life roaming the Fertile Crescent (now Syria, Jordan, Lebanon, Israel, and Iraq), supplementing hunting and gathering by harvesting indigenous plants. They harvested the ancestor of modern wheat, einkorn, from fields that flourished wildly in open plains. Meals of gazelle, boar, fowl, and ibex were rounded out with dishes of wild-growing grain and fruit. Relics like those excavated at the Tell Abu Hureyra settlement in what is now central Syria suggest skilled use of tools such as sickles and mortars to harvest and grind grains, as well as storage pits for stockpiling harvested food. Remains of harvested wheat have been found at archaeological digs in Tell Aswad, Jericho, Nahal Hemar, Nevali Çori, and other locales. Wheat was ground by hand, then eaten as porridge. The modern concept of bread leavened by yeast would not come along for several thousand years.
Natufians harvested wild einkorn wheat and stored seeds to sow in areas of their own choosing the following season. Einkorn wheat eventually became an essential component of the Natufian diet, reducing need for hunting and gathering. The shift from harvesting wild grain to cultivating it from one season to the next was a fundamental change that shaped subsequent human migratory behavior, as well as development of tools, language, and culture. It marked the beginning of agriculture, a lifestyle that required long-term commitment to permanent settlement, a turning point in the course of human civilization. Growing grains and other foods yielded a surplus of food that allowed for occupational specialization, government, and all the elaborate trappings of culture (while, in contrast, the absence of agriculture arrested development of other cultures in a lifestyle of nomadic hunting and gathering).
Over most of the ten thousand years that wheat has occupied a prominent place in the caves, huts, and adobes, and on the tables of humans, what started out as harvested einkorn, then emmer, followed by cultivated Triticum aestivum, changed gradually and only in fits and starts. The wheat of the seventeenth century was the wheat of the eighteenth century, which in turn was much the same as the wheat of the nineteenth century and the first half of the twentieth century. Riding your oxcart through the countryside during any of these centuries, you’d see fields of five-foot-tall “amber waves of grain” swaying in the breeze. Crude human wheat-breeding efforts yielded hit-and-miss, year-over-year incremental modifications, some successful, most not, and even a discerning eye would be hard-pressed to tell the wheat of early twentieth-century farming from its predecessors of centuries past.
During the nineteenth and early twentieth centuries, as in many preceding centuries, wheat therefore changed little. The Pillsbury’s Best XXXX flour my grandmother used to make her famous sour cream muffins in 1940 was little different from the flour of her great-grandmother sixty years earlier or, for that matter, from that of a distant relative two or three centuries before that. Grinding of wheat became mechanized in the twentieth century, yielding finer flour on a larger scale, but the basic composition of the flour remained much the same.
That all ended in the latter half of the twentieth century, when an upheaval in hybridization methods transformed this grain. What now passes for wheat has changed, not through the forces of drought or disease or a Darwinian scramble for survival, but through human intervention.
Wheat has undergone more drastic transformation than the Real Housewives of Beverly Hills, stretched, sewn, cut, and stitched back together to yield something entirely new, nearly unrecognizable when compared to the original and yet still called by the same name: wheat.
Modern commercial wheat production has been intent on delivering features such as increased yield, decreased operation costs, and large-scale production of a consistent commodity. All the while, virtually no questions have been asked about whether these features are compatible with human health. I submit that, somewhere along the way during wheat’s history, perhaps five thousand years ago but more likely sixty years ago, wheat changed in ways that yielded exaggerated adverse effects on human health.
The result: A loaf of bread, biscuit, or pancake of today is different from its counterpart of a thousand years ago, different even from what our grandmothers made. They might look the same, even taste much the same, but there are fundamental biochemical differences. Small changes in wheat protein structure, for instance, can spell the difference between a devastating immune response to wheat protein versus no immune response at all.
WHAT HAPPENED TO THE FIRST WHEAT-EATERS?
After not consuming the seeds of grasses for the first 99.6 percent of our time on this planet, we finally turned to them for sustenance ten thousand years ago. Desperation, caused by a shortage of wild game and plants due to a natural shift in climate, prompted Neolithic hunter-gatherers to view seeds of grasses as food. But we cannot save the grass clippings from mowing our lawns and sprinkle them on top of a salad with a little vinaigrette; likewise, early humans found out the hard way that, when ingested, the leaves, stalks, and husks of grasses are tasteless and inedible, wreaking gastrointestinal havoc such as nausea, vomiting, abdominal pain, and diarrhea, or passing through the gastrointestinal tract undigested. The grasses of the earth are indigestible to humans (unlike herbivorous ruminants, who possess adaptations that allow them to graze on grasses, such as multi-compartment stomachs and spiral colons that harbor unique microorganisms that break grasses down).
It must have taken considerable trial and error to figure out that the seeds of grass, removed from the husk, then dried, pulverized with stones, and heated in water, would yield something that could be eaten and provide carbohydrate nourishment. Over time, increased efficiencies in harvesting and grinding allowed grass seeds to play a more prominent role in the human diet.
So what became of those first humans who turned to the seeds of wheat grass to survive?
Anthropologists tell us that there was an explosion of tooth decay and tooth abscess; microorganisms of the mouth and colon changed; the maxillary bone and mandible of the skull shrank, resulting in crooked teeth; iron deficiency anemia became common; the frequency of knee arthritis doubled; and bone length and diameter decreased, resulting in a reduced height of five inches in males, three inches in females.1, 2, 3, 4
The explosion of tooth decay, in particular, is telling: Prior to the consumption of the seeds of grasses, tooth decay was uncommon, affecting only 1 to 3 percent of all teeth recovered. This is extraordinary, as non-grain-eating humans had no fluoridated water or toothpaste, no toothbrushes, no dental floss, no dentists, no dental insurance card, yet had perfectly straight, healthy teeth even to old age. (Yes, ancient humans lived to their fifties, sixties, and seventies, contrary to popular opinion.) When humans first turned to grains—einkorn wheat in the Fertile Crescent, millet in sub-Saharan Africa, and maize and teosinte in Central America—humans developed an explosion of tooth decay: 16 to 49 percent of teeth showed decay and abscess formation, as well as misalignment, even in young people.5
Living in a wild world, hunting and gathering food, humans needed a full set of intact teeth to survive, sometimes having to eat their food raw, which required prolonged, vigorous chewing. The dental experience with wheat and grains encapsulates much that is wrong with their consumption. The amylopectin A carbohydrate that provides carbohydrate calories may allow survival for another few days or weeks, but it is also responsible for the decline in dental health months to years later—trading near-term survival in exchange for long-term crippling changes in health at a time when mercury fillings and dentures were not an option. Over the centuries, human grain consumers learned they had to take extraordinary steps to preserve their teeth. Today, of course, we have a multibillion-dollar industry of dentists, orthodontists, toothpaste manufacturers, and so forth, all working largely to counter the decay and misalignment of teeth that began when humans first mistook the seeds of grasses for food.
WHEAT BEFORE GENETICISTS GOT HOLD OF IT
Wheat is uniquely adaptable to environmental conditions, growing in Jericho, 850 feet below sea level, as well as in Himalayan mountainous regions 10,000 feet above sea level. Its latitudinal range is also wide, ranging from as far north as Norway, 65° north latitude, to Argentina, 45° south latitude. Wheat occupies sixty million acres of farmland in the United States, an area equal to the state of Ohio. Worldwide, wheat is grown on an area ten times that figure, or twice the total acreage of Western Europe. After all, Domino’s has lots of pizzas to sell at $5.99.
The first wild, then cultivated, wheat was einkorn, the great-granddaddy of all subsequent wheat. Einkorn has the simplest genetic code of all wheat, containing only fourteen chromosomes. Circa 3300 BC, hardy, cold-tolerant einkorn wheat was a popular grain in Europe. This was the age of the Tyrolean Iceman, fondly known as Ötzi. Examination of the intestinal contents of this naturally mummified Late Neolithic hunter, killed by attackers and left to freeze in the mountain glaciers of the Italian Alps, revealed the partially digested remains of einkorn wheat consumed as unleavened flatbread, along with remains of plants, deer, and ibex meat.6
Shortly after human cultivation of the first einkorn plant, the emmer variety of wheat, the natural offspring of einkorn and an unrelated wild grass, Aegilops speltoides or goatgrass, made its appearance in the Middle East.7 Consistent with the peculiar promiscuity unique to grasses, goatgrass added its genetic code to that of einkorn, resulting in the more complex twenty-eight-chromosome emmer wheat. Grasses such as wheat have the ability to retain the sum of the genes of their forebears. Imagine that, when your parents mated to create you, rather than mixing chromosomes and coming up with forty-six chromosomes to create their offspring, they combined forty-six chromosomes from Mom with forty-six chromosomes from Dad, totaling ninety-two chromosomes in you. This, of course, doesn’t happen in higher species. Such additive accumulation of chromosomes in grasses is called polyploidy, and you and other mammals like hedgehogs and squirrels are incapable of it. But the grasses of the earth, including the various forms of wheat, are capable of such chromosomal multiplication.
Einkorn and its evolutionary successor emmer wheat remained popular for several thousand years, sufficient to earn their place as food staples and religious icons, despite their relatively poor yield and less desirable baking characteristics compared to modern wheat. (These denser, cruder flours would have yielded lousy ciabattas or bear claws.) Emmer wheat is probably what Moses referred to in his pronouncements, as well as the kussemeth mentioned in the Bible, and the variety that persisted up until the dawn of the Roman Empire.
Sumerians, credited with developing the first written language, left us tens of thousands of cuneiform tablets. Pictographic characters, dated to 3000 BC, describe recipes for breads and pastries, all made by taking mortar and pestle or hand-pushed grinding wheel to emmer wheat. Sand was often added to the mixture to hasten the laborious grinding process, leaving bread-eating Sumerians with sand-chipped teeth.
Emmer wheat flourished in ancient Egypt, its cycle of growth suited to the seasonal rise and fall of the Nile. Egyptians are credited with learning how to make bread “rise” by the addition of yeast. When the Jews fled Egypt, in their hurry they failed to take the leavening mixture with them, forcing them to consume unleavened bread made from emmer wheat.
Sometime in the millennia predating Biblical times, twenty-eight-chromosome emmer wheat (Triticum turgidum) mated naturally with another grass, Triticum tauschii, yielding primordial forty-two-chromosome Triticum aestivum, genetically closer to what we now call wheat. Because it contains the sum total of the chromosomal content of three distinct grasses, forty-two chromosomes in all, it is the most genetically complex. It is therefore the most genetically “pliable,” an attribute that would serve future genetics researchers well in the millennia to come.
Wheat Belly (Revised and Expanded Edition) Page 3