During World War II, however, synthetic enrichment campaigns, championed for reasons of national security, gave industrial bread armor plating and renewed appeal. Bread enrichment campaigns also trained Americans to crave added vitamin power in their food (chapter 4: dreams of strength and defense). After the war, industrial bread helped fuel the Manichean culture of the Red Scare and Cold War. Propelled by confident belief in the moral and physical superiority of industrial food, American bread went global in the early 1960s, a key ingredient in America’s postwar dominance of the world food system (chapter 5: dreams of peace and security).
On the opposite end of the political spectrum, industrial bread emerged as a focal point for counterculture ire in the late 1960s. Antiwar activists, ecologists, and back-to-the-landers held up “plastic” white bread as a lethal symbol of militaristic hubris and cultural conformity. In the 1970s, industrial “health bread” went from counterculture to mainstream on a wave of consumer-oriented body consciousness. The 1980s and 1990s saw an explosion of elite niche market breads—“yuppie chow”—juxtaposed against industrial white, which had completed its trajectory from modern marvel to white trash icon. And, as I write these words in the spring of 2011, industrially produced whole wheat bread has, for the first time in U.S. history, outsold its refined white counterpart. Organic, artisanal-style, multigrain, and even gluten-free breads lead the industry today (chapter 6: dreams of resistance and status).
Exploring these stories, we’ll see that dreams of good food play a unique role in the creation of social distinctions: they link individual consumption decisions to the health of the whole society in a way that seems natural and physiological, not socially produced. When someone else embraces our vision of good food, it isn’t viewed as a culturally specific affinity; it is seen as acceptance of a universal natural truth—who wouldn’t want to eat good food? On the other hand, when someone questions the universal goodness of our good food, it marks them as unfathomably different—what kind of person doesn’t want good food?!19
Seen in this light, the history of bread dreams is frequently a history of ambiguous achievements and evils committed for the most benevolent of reasons. Often the problem was not the food dream itself, but what the dream made invisible. In the 1830s, food guru Sylvester Graham—the man whose followers would give America both the humble graham cracker and the lofty belief that there is something morally virtuous about whole wheat—achieved celebrity status by blaming Irish and black New Yorkers for a cholera epidemic. The poor, he argued, brought disease upon themselves because they lacked the intelligence and self-discipline to embrace a “natural” diet. National health—both physical and moral—could be achieved only through a strict regime of healthy eating.
In retrospect, there was nothing wrong with Graham’s prescription for healthy eating. It was a bit ascetic for my taste—whole wheat bread, fruit, nuts, fresh water, and no spices, meat, sugar, caffeine, or alcohol—but it was probably a reasonable reaction against the country’s relentless diet of meat, boiled vegetables, white bread, and booze. The problem was that his vision of a better society through better eating made such a neat panacea that he missed the real reason poor New Yorkers died from cholera: grueling labor conditions, low wages, corrupt government, and profiteering by vendors of clean water.20
Substitute “obesity” for “cholera,” and I’m left wondering: Have we come that far since Sylvester Graham? This, in turn, raises the practical question at the heart of this book: Should we really try to change the world by changing what and how people eat?
My Tucson self says yes. My Berkeley self is more skeptical. And yet, I wrote this book in a third place: Walla Walla, Washington, where I now teach food politics to smart, eager students at Whitman College. The contemporary alternative food movement has been a force during most of my students’ lives, so many of them arrive at college already deeply committed to changing the food system. And Walla Walla itself embodies all the contradictions they will have to grapple with if they are to succeed at that.
Tucked into the dry southeastern corner of Washington State, the town is surrounded by miles of wheat. Grown by heavily indebted family farmers in vast, capital-intensive farms and then exported to Asia by large multinational grain traders at low prices subsidized by the U.S. government, Walla Walla wheat is a river of gold in the global, geostrategic food system.
Ironically, from my perspective at least, Walla Walla mostly grows soft winter wheat—the wrong wheat for artisanal bread baking. But the “foodie” in me has other options here. Walla Walla is home to a fast-growing trade in high-end wines, produced in more than one hundred local wineries. The wine boom has brought vast amounts of money and tourism to this remote place, remaking the high desert landscape with an Old West-cum-Mediterranean aesthetic. It has also brought a new food culture and the alternative food movement: three community-supported agriculture businesses, grass-fed-beef ranchers, artisanal cheese makers, schoolyard gardens, and restaurants serving local produce. In Walla Walla, I don’t just know “my” farmer and winemaker; they’re my neighbors.
Still, while the wine boom has, to my mind, brought “good food” to Walla Walla, it hasn’t brought good middle-class jobs—at least not as good as the stable, decent-paying jobs canning boiled vegetables and bottling mass-produced apple juice that have all disappeared in recent years. Ironically, good jobs making processed foods that many would disdainfully label “bad food” have been replaced by low-paid work serving eaters of “good food.”
From my vantage in Walla Walla, I still believe that changing the food system is imperative. But the history of battles over bread suggests that real change will happen only when well-meaning folks learn to think beyond “good food” and “bad food,” and the hierarchies of social difference that have long haunted these distinctions.
1
UNTOUCHED BY HUMAN HANDS
Dreams of Purity and Contagion
“I want to know where my bread comes from! I don’t want bread from some nameless basement bakery. I want my bread from a bakery that’s clean as my own kitchen. … ” Know where your bread is baked and how. Don’t take a chance with the bread you buy. You can’t afford to.
—Holsum bread advertisement, late 1920s
A HAIR IN THE MILK
There are people who believe that drinking raw milk can cure illness and restore the body to natural harmony. There are people who think that drinking raw milk is like playing Russian roulette with microbes. There are a few farm families that drink raw milk just because it’s what they have around, and a lot more folks who have never given raw milk a single thought because it’s so unusual. Then there are those for whom raw milk is both scary and seductive, wholesome yet menacing. That’s me.
A city kid, I grew up playing in vacant lots, not the back pasture. My idea of nature always involved a campground—I had no experience with the working nature of food production until I was in my twenties. The first time I saw milk come out of an actual cow, I was twenty-five and learning to do the milking myself while interning on a ranch in Arizona. “Red” was her name—the cow, that is. Red is not a particularly creative name for a cow, but my wife, Kate, and I came up with a lot more colorful monikers: the kind of names a cow gets called when it kicks over the milk pail, when it kicks over the feed pail, when it intentionally stomps your foot or butts your shoulder with its ornery old lady horns.
Red’s was the first clump of hair I ever saw floating in my milk. Before Red, I had never drunk milk with the scent of cow still lingering in it or wondered how much barnyard dust in the milk constituted “too much.” I thought Listeria was something you used mouthwash to get rid of, not the bacteria responsible for a deadly milk-borne sickness.
Since then I have drunk a lot of raw milk, most of it illegal, thanks to strict government regulations slanted toward large high-tech dairies. I don’t ascribe any particular natural virtue to milk’s unpasteurized state, but I’ve come to like the grassy taste and the sense that I’m getting my milk direct from a local farmer. Despite all that, though, I have never gotten over the slight flutter of unease I first felt drinking raw milk—the modern intuition that maybe there was something dangerous about getting milk from a cow instead of a factory.
This unease has haunted Americans since they first began to grasp the existence of an invisible world of small, possibly threatening organisms. Not without cause. In the late nineteenth and early twentieth centuries, city residents got their milk from fetid, overcrowded “swill dairies” or off unrefrigerated train cars traveling overnight from the surrounding countryside. Until mandated pasteurization, milk was a key vector for typhoid and other serious diseases.1
Throughout U.S. history, anxieties about tainted milk have been matched only by concerns about meat. Most notably, The Jungle, Upton Sinclair’s 1906 book about unsanitary conditions in Chicago’s stockyards, galvanized a nascent consumer protection movement. Muckraking journalists, campaigning scientists, and an army of civically engaged middle- and upper-class women horrified by unsafe food took to the streets, courts, and legislatures, demanding change. Sinclair had hoped to spark outrage over the inhuman conditions experienced by immigrant meatpackers. Instead, the country fixated on germs and the frightening immigrants who appeared to spread them into the nation’s food. “I aimed for the people’s heart,” Sinclair is said to have reflected, “and by accident, I hit them in the stomach.”2
Still, when it came to protecting stomachs, the Pure Foods Movement, as it came to be called, achieved substantial reforms. Pure Foods activists forced manufacturers to change the way they handled and distributed food, boycotted unsanitary establishments out of business, forced state and local officials to take food safety more seriously, and passed what still serves as the bedrock of all federal food safety regulation, the 1906 Pure Food and Drug Act. Unfortunately, their efforts were far from perfect and were steadily watered down over the next century.3
At the start of the new millennium, Pure Foods crusaders’ concerns still seem shockingly contemporary. Serious food-borne illnesses affected millions and sent hundreds of thousands to hospitals during the 1990s and early 2000s. E. coli in beef emerged as an almost ordinary source of tragedy, while sensational outbreaks of food-borne illness in bean sprouts, strawberries, cilantro, eggs, peanut butter, and spinach gripped the media. Food safety regulations, some with roots in 1906, appeared impotent in the face of a far-flung global food system dominated by powerful corporations. In many cases, regulators themselves seemed to have been “captured” by the very companies they supposedly oversaw. It felt like the 1900s all over again.
On the other hand, few Americans alive today can imagine a time when the specter of unclean bread was as scary as germ-clotted milk or tainted beef. And yet, during the late nineteenth and early twentieth centuries, the menace of contaminated bread was no less a topic of public outcry than dirty meat or milk. Pure Foods crusaders targeted the nation’s bakeries, government hearings convened around bread contamination, and Harvey W. Wiley, the country’s most prominent Pure Foods advocate, warned consumers of serious threats to America’s staple food.
Accurately or not, a simple loaf of bread from a small urban bakery seemed to many consumers a harbinger of death and disease. These fears ultimately changed the country’s bread. An urgent need to know that one’s bread was pure proved instrumental in convincing Americans to embrace industrially produced loaves. Early twentieth-century bread fears also confused food purity and social purity in a way that placed the blame for unsafe food on some of the food system’s greatest victims and distracted attention from more systemic sources of danger and vulnerability. As we think about food safety in our own time, the story of America’s bread panic suggests that visions of pure food can motivate desperately needed changes but also backfire in myriad ways.
THE MODEL PALACE OF AUTOMATIC BAKING
In 1910, the country’s greatest bread bakery opened on the corner of Vanderbilt and Pacific in Brooklyn’s Prospect Heights. Six stories tall with an alabaster white neoclassical facade, it was a shining temple to a new way of thinking about food “untouched by human hands.” Gleaming surfaces, massive machinery, and light-filled halls proclaimed a new creed: industrial food is pure food, and pure food is the foundation of social progress. During the first decades of the twentieth century, tens of thousands of New Yorkers flocked to the Ward Bakery on school field trips and weekly tours to witness the spectacle.
The Ward Bakery in Brooklyn, along with its twin in the Bronx, was the flagship of a revolution in the way the country’s single most important staple was produced and sold. In the early twentieth century, when average Americans got 30 percent of their daily calories from bread, more than any other single food, New Yorkers ate more bread than any other group in the country. New Yorkers also purchased more of their bread than the rest of the country, and they bought a lot of it from the Ward Baking Company. At the company’s height, Ward’s Brooklyn and Bronx factories supplied one in every five bakery loaves eaten in New York City. By the end of the 1920s, the company had extended that power across the entire country, coming astoundingly close to achieving monopoly control over every single sizable bread market in the nation.4
The Ward family achieved this dominance by pioneering key technological breakthroughs, running roughshod over union labor, laying waste to small competitors, and concocting financial machinations that would have dazzled Gordon Gekko. But the Ward Baking Company owed its uncanny ability to win over skeptical customers to a much larger sense of disquiet hanging over early twentieth-century America.
The Ward Bakery went up in Brooklyn at a moment when poor wheat harvests, commodity speculation, and the power of railroad monopolies had stressed bread supplies, causing occasional riots and widespread fear of famine. In the first decades of the twentieth century, “the bread question” was the question for many observers, and it wasn’t just the bread supply that mattered. The country was divided on how bread should be produced in the first place. As one national household advice columnist wrote in 1900, “No subject in the history of foods has been of such vital importance or aroused so much diversity of opinion as bread making.”5 These specific concerns, in turn, reflected a larger set of perturbations agitating the country.
From the 1870s to the 1920s, a singular convergence of forces buffeted the United States, upending all sense of stability and order. Unprecedented influxes of southern and eastern European immigrants, rapid urbanization, explosive technological change, and a series of grave economic downturns strained old institutions built around the dream of an Anglo-Saxon nation of self-sufficient rural communities. Thrust into an emerging system of global grain trading and financial speculation, rural America reeled.6
Urban infrastructure collapsed under demographic pressures. Corrupt politicians and their private sector cronies stepped in to provide basic services at high cost. Great trusts—vast corporations with monopoly power—came to dominate nearly every important sector of the emerging industrial economy. Work, once carried out on an intimate scale, suddenly felt controlled by distant, impersonal forces. For white, native-born Americans, everything felt undone. Old elites struggled to maintain authority. The poor felt themselves tossed around by the whims of shadowy bosses and threatened by an invasion of foreigners. And elites and the poor alike searched for ways to make sense of a world turned upside down.
As many groups faced with great upheavals have done throughout history, late nineteenth- and early twentieth-century white Americans scapegoated—placing the blame for large-scale social change on immigrants and minorities. The period saw some of the ugliest nativism and most violent exclusion of minorities in U.S. history. Racial eugenics and white supremacist theories of human evolution flourished, providing scientific authority for the country’s fear and harsh prescriptions for social improvement.
And yet, amidst all the exclusion and vitriol, other ways of responding to upheaval, championed by both working-class organized labor and an emerging population of middle-class professionals, produced some of the most important and inclusive social reforms in U.S. history: child labor laws, wage regulations, antitrust legislation, worker safety protections and, of course, food safety laws. This period saw the advent of social work and public health, expanded public education, nutrition programs, and investment in public green spaces—all explicitly aimed at relieving social tensions that might have incited more radical social change.7
Middle- and upper-class women, in particular, found a place for themselves in public life through campaigns and crusades aimed at bringing progress to the needy. Careful regulation, scientific expertise, and technological innovation, it seemed, could reknit the fractured nation and stave off class struggle. By the 1890s, this onrush of activism and landmark social reforms had coalesced under the banner of Progressivism.8
Two things are important about the Progressive Era for our purposes: first, the two impulses of the time—fear and optimism—did not cleave around opposed “good guys” and “bad guys”—they were two faces of the same coin. In reckoning with Progressive Era reforms, we must live with ambiguity. Like most great social movements, Progressivism challenged some elements of an unjust status quo, while reinforcing others. Second, we must appreciate the way concern over the purity of bodies—and, by extension, the purity of the food and drink consumed by those bodies—underpinned and unified nearly all of the diverse commitments that made up Progressivism. Temperance, the crusade against alcohol, was, in many ways, a key incubator for all Progressive reform efforts, but food mattered almost as much. A safe, pure, and efficient nation could not be built without safe, pure, and efficiently produced food. If people could only be enticed to consume pure foods (and stop poisoning their bodies and minds with alcohol), social problems would cease; the poor would find the physical and moral fortitude to lift themselves out of poverty; crime and violence would wane.