This urgent way of talking about food and security made for some strange bedfellows. For many, like Pollan, the diet-security connection gave teeth to appeals for liberal policy reform; for others, like the farmers’ market director I quoted several chapters ago, it justified xenophobia. The racialized specter of contaminated foreign food and threatening foreigners hovered silently over many dreams of finding safety in local self-sufficiency. Meanwhile, public health officials and defense department strategists took the language of diet and defense in a totally different direction: they joined forces to frame the country’s obesity epidemic as a national security threat—“the terror within,” as Surgeon General Richard Carmona phrased it just months after 9/11.5
What do we gain and lose when we think of eating as an act of individual and national defense, or when we connect dreams of “good food” to military campaigns? Where did we even get this idea that eating could be seen as a kind of combat?
Connections between civilian diet and military mobilization are ancient, of course. Armies march on their stomachs, and civilian populations have always been asked to sacrifice to make that possible. Yet, during the world wars of the twentieth century, the United States found itself insulated from conflict’s most fearsome camp followers: starvation, trampled fields, and interrupted production. Americans conserved and went without, to be sure, but the geographical accident of unscathed food production also enabled the emergence of a different kind of relation between eating and defense.
Deliberately cultivated by military planners, food industry representatives, and public health officials, this new ethic didn’t just ask civilians to reduce intake of certain foods so that troops might have more. It also demanded that civilians increase consumption of foods deemed essential to home front fitness—maximizing nutrition and energy intake in the name of defense.
The idea that patriotic civilian populations must conform to particular notions of scientifically determined healthy eating has its roots in the early twentieth century. After the Boer War, for example, Great Britain instituted the Committee on the Deterioration of the Race to study the nation’s diet as a way of improving physical readiness for future conflicts. During World War I, Americans were berated for carrying millions of pounds of excess fat that could be better used as rations or tallow. Well-known physiologist Francis Benedict questioned “whether a patriot should be permitted in times of stress to carry excess body-weight.” And after the war, a Carnegie Institution study reflected that imperfect nutrition, particularly on the part of the civilian poor, “was a hindrance and danger to the state.” By the eve of World War II, some officials had begun to speak of poor nutrition as a form of desertion.6 If, for eugenicists and followers of Physical Culture, poor diet was seen as a crime against society in ordinary times, during wars the stakes were even higher—it was treason.
This idea is still with us today in modified forms. From the U.S. Army declaring “battle on the bulge” to vitamin supplements marketed as weapons in a war against ill health, diet and combat have blurred together in our minds in ways early proponents couldn’t have imagined. Individual food choices, we are told, have far-ranging consequences for national readiness, whether that readiness is needed for an actually existing hot war, a future war, or simply to gird us in all-out social struggles against myriad perceived threats. Given the importance of bread in the U.S. diet, it shouldn’t surprise us that this consciousness—particularly the association between added vitamins and defense—was, in crucial ways, forged around bread. More specifically, it materialized out of educational campaigns surrounding the panicked introduction of synthetically enriched white bread on the eve of World War II.
By exploring the story of wartime bread enrichment, we’ll see that forceful national security rhetoric can, in fact, inspire sweeping improvements for all eaters—not just for a privileged elite. At the same time, the sense of urgency generated by linking food to national security inclined consumers and policy makers to accept stopgap measures and hurried compromises. These measures, perhaps necessary in the moment, ultimately narrowed Americans’ bread options and reinforced the power of giant baking companies. What appeared to be only short-term wartime expedience would set the stage for the 1950s and 1960s golden age of Wonder bread.
UNFIT TO FIGHT
In 1940, with U.S. entry into Europe’s war appearing ever more inevitable, Congress authorized the country’s first peacetime draft. As men across the country lined up outside neighborhood draft boards, however, it was quickly evident that the country had a problem. After a decade of lean economic times, men aged twenty-one to thirty-five were dangerously unfit to fight.
In Chicago’s tough Eleventh Ward—cradle of the city’s Democratic machine, home to the stockyards and hard-working Lithuanian, Polish, Italian, German, and Irish men—draft boards found seven out of ten conscripts physically unfit to serve. In New York State, 30 percent failed their medical exams. In West Waterloo, Iowa, 124 out of 224 farm boys didn’t pass muster. Nationwide, General Lewis B. Hershey reported gravely in 1941, draft board doctors and dentists had rejected five hundred thousand out of the first million men screened.7
The Depression had taken a brutal toll on the nation’s health. Easily preventable problems with teeth and eyes topped the list of reasons for rejection, and a silent enemy lurked in the ranks of the un- and underemployed. Experts calculated that malnutrition, directly or indirectly, caused at least a third of all rejections.8 While a spate of books written during the economic crisis of 2008 celebrated 1930s-era cooks for their thrifty use of authentic ingredients to make “real” American food, the reality of Depression-era diet was often more grim. Yes, the country was beginning to bake its own bread from scratch again, but it also suffered from a deep “hollow hunger.” This wasn’t outright starvation, but rather, as home economist Margaret Reid reported, a “hidden hunger … that threatens to lower the zest for living and to sap the productive capacity of workers and the stamina of the armed forces.” A government commission convened after the Selective Service debacle of 1941 found that 75 percent of low-income high school students suffered from vitamin B2 deficiencies and 65 percent of Works Progress Administration workers suffered from scurvy or near-scurvy. Another study revealed that 54 percent of a sample of low-income whites and blacks suffered from night blindness characteristic of vitamin A deficiency—a statistic that terrified war planners looking ahead to combat conditions. “Nearly all” low-income students tested in another study experienced at least one vitamin deficiency, and time-series research at a community health center in New York City revealed that malnutrition rates there had risen steadily through the 1930s, hitting 37 percent in 1938. Pellagra—the vitamin deficiency disease most closely, if incorrectly, associated with bread-eating habits—killed twenty thousand Americans and debilitated well over one hundred thousand between 1933 and 1938.9
In the face of this crisis, the run-up to World War II saw intense focus on nutrition research. This was a time of great innovation in dietary surveillance, experimental nutrition, population surveys, and chemical analysis. New techniques for rapid blood sampling were developed, and schoolchildren, prisoners, soldiers, and factory workers rolled up their sleeves to give planners the information needed to define a national standardized war diet. At the same time, government planners and nutritionists recognized that, no matter how efficient it was, this standard war diet could not be imposed by the state. It had to arise from the population’s souls and desires.
This presented a serious problem: during the 1930s and 1940s, Americans got more calories from industrial white bread than from any other food, and in case after case, they refused to accept major changes in that staple. Even when industry leaders like the Ward Baking Company threw their marketing weight behind whole wheat bread, as they occasionally did, sales did not rise for long.10 Efforts to promote alternatives to white bread as a form of patriotic wheat conservation had seen some success during World War I, but mostly they failed. And they certainly hadn’t lasted. As one magazine writer observed in 1941, “Last time we had the slogan ‘food will win the war’ but precious few of the lessons which might have been learned from the wartime self-denial of 1918 carried over into peacetime dietetics. We went on cramming our tummies with bread so white it was almost blue.” All this white bread, the author concluded, made the country fat, neurotic, and unprepared for battle.11
Thiamin topped everyone’s list of concerns. Dubbed “the morale vitamin” because of its perceived effect on mental stamina and physical resilience, thiamin (vitamin B1) was deemed essential to readiness early in the war effort. Nutrition studies published in popular science magazines painted a dark picture of a thiamin-deficient nation. Subjects deprived of the vitamin displayed inability to concentrate, uncertain memory, awkwardness, self-consciousness, progressive feelings of inferiority, irritability, depression, and anxiety. Pointing out the obvious, but with great authority, the editor of the Journal of the American Medical Association observed that these traits were among the “least desirable in a population facing invasion.” Unfortunately, industrial processing had mercilessly stripped the country’s single most important food of thiamin. Eating refined white bread, a popular science writer suggested, did Hitler’s work for him.12
Later, Cornell nutrition scientist Clive McCay would reflect back on the moment: bread’s role in war had been clear to anyone who looked abroad, he argued. The secret of Germany’s “husky soldiers” was its “excellent dark loaf”; the great resilience of Russia was its stubborn rye bread. France, on the other hand, a nation of puffy white bread eaters, had folded. What would become of the United States, where people simply would not eat whole wheat? Despite hopeful slogans like “America’s Bread Front Has Never Failed,” wartime food officials were worried.13 Something had to be done, but what?
A NUTRITIONAL WEAPON DELIVERY SYSTEM
By 1943, this question had been decisively answered—at least as far as bakers and policy makers were concerned. The country would repair its broken staff with synthetic enrichment, the universally mandated addition of thiamin, niacin, iron, and later riboflavin to flour and bread. For war planners, public health officials, and baking industry executives, synthetic enrichment was the only “realistic” way to improve the nation’s health in a hurry. Even prominent nutrition scientists long skeptical of white bread joined the consensus in the name of wartime expedience. Synthetic enrichment was, they conceded, the quickest way to rush vitamins to almost every American, almost every day—without needing to change the country’s tastes or upset its milling and baking industries.14
This doesn’t mean that synthetic enrichment offered the only or inevitable option for policy makers. As war loomed, the United States could have looked to a number of strategies for fixing its broken fighting staff. Other countries changed the extraction rate—the proportion of the whole wheat berry retained in flour after the milling process—of their bread. Britain, for example, had ordered millers to produce high-extraction flour for its “War Bread,” creating a tough loaf despised by consumers, but probably responsible for saving the island from crippling malnutrition.15 Canada went even further: not only did the government mandate high-extraction “Canadian Bread,” starting in 1941, it declared that the addition of synthetic vitamins to bread constituted criminal food adulteration.16 Homegrown options existed as well. Most famously, Clive McCay’s “Cornell Bread,” developed for the New York State Emergency Food Commission, counted on substantial backing from agribusiness lobbies and health food advocates. Drawing its nutritional boost from soy flour and milk solids rather than whole wheat, this bread could fuel home front fighters and satisfy their craving for soft, white loaves. Cornell Bread instantly won loyal adherents among both health food advocates and New York government officials. But while Cornell Bread has enjoyed repeated spates of popularity from the 1950s to the present, it consistently lost out to synthetic enrichment in high-level food policy debates—despite backing from powerful dairy and soybean lobbies.17
Self-styled nutritional realists countered that none of these foreign or domestic alternatives would work in the United States. Even government intervention couldn’t change consumers’ taste for pure industrial white bread, the realists argued. Look at Switzerland, they warned, deploying their favorite cautionary tale: in 1937, hoping to increase whole wheat bread consumption, Swiss officials imposed taxes making whole wheat bread 25 percent cheaper than white bread. It didn’t work. Swiss consumers simply paid more to eat white bread.18 Influential home economist Helen Mitchell summed up the realists’ attitude for the Journal of Home Economics: “Enrichment seems a desirable compromise between a theoretically better nutritional practice and a realistic one based on the psychology of food habits.”19 Thus, the urgency of war settled a long-running debate about how to improve industrial bread, sweeping aside more radical alternatives in the name of expediency.
Unlike mandated high-extraction loaves or the use of natural additives like milk solids, enrichment was cheap and easy. It required no significant reworking of production lines, no new equipment, no need to learn new baking techniques and, once sufficient supplies of vitamin powders could be assured, little additional expense. Bakers couldn’t believe their luck. One simple flick of a compressed nutrient wafer into every batch of dough could put to rest decades of condemnation and restore the busted staff to its former glory.
Millers balked at enriching flour at first but, like bakers, they eventually saw the advantages. As one millers’ association told its members, in a time when better nutrition was “needed by all Americans to make them rugged and strong for the all-out war emergency,” enrichment offered a chance to reverse decades of declining flour consumption. Because enriched bread and flour had “become corner stones in the national education program for better nutrition,” they could sweep away “the scientific basis for former criticism of [our] fine foods.”20
With bakers and millers on board, the industrial war food machine went into motion. And once it did, alternative health breads like McCay’s Cornell loaf wouldn’t merely be passed over—they could be denounced as national security threats by industry spokespeople along with USDA and FDA officials. Nevertheless, one important question still remained: How best to distribute enriched white bread? While many in the baking industry hoped to build demand for enriched white bread and boost corporate profits by selling premium-priced loaves to affluent tastemakers, nutritional realists rejected this route. True, they argued, poorer consumers might eventually spend more on bread to emulate wealthy eaters, but in a time of war, added nutrition was too important to leave to the whims of market forces and consumer choice. Synthetic vitamins didn’t cost producers much and could easily be added to all bread.21
FOR THE AFFLUENT ONLY?
During the late 1930s and early 1940s, Robert R. Williams, a University of Chicago-trained chemist, found himself at the center of the debate over whether to mandate bread enrichment for all or to sell it as a premium-priced luxury. Born in 1886, the son of missionaries, Williams spent his childhood in southern India, surrounded by hunger. As a young teacher in the Philippines, he was haunted by scenes of deprivation. But it was during a stint as a low-level scientist with the colonial government in Manila that he first came to understand the moral and political weight of malnutrition.22
There, in 1910, spurred by the constant sight of listless, limb-twisted victims of endemic beriberi, Williams had set out to investigate suggestions that polished rice might cause the disease. He wasn’t interested in colonial policy, or in the political reasons why millions of Asians might subsist on rice alone. Williams’s goal was simpler and more technical: find a physiological cause and practical cure for the suffering he saw. After five years’ work, he succeeded at the first part: beriberi was, as he had suspected, the result of malnutrition, caused by the lack of a factor he named “thiamin”—a plentiful substance in rice husks but completely absent in polished grains. The second half of Williams’s dream—finding a cheap, politically viable cure to that problem—would take him more than twenty-five years.
In the meantime, Williams returned to the United States, where he joined the FDA to enforce Pure Foods laws and then helped lead national nutrition campaigns during World War I. During the Roaring Twenties, he entered the private sector as chemical director of the Bell Telephone Laboratories, then a hotbed of innovative applied science. In his spare time, however, Williams still pursued his driving passion. Finally, in 1936, after years of work, Williams announced that he had discovered an inexpensive method for creating thiamin in a lab.
True to his ideals, Williams registered the patent for thiamin synthesis to a nonprofit dedicated to funding humanitarian dietary research and set out to find industry partners who could deliver the product to people needing it most. At first, two companies answered his call: the pharmaceutical giant Merck, which agreed to mass-produce synthetic B1, and General Mills, which agreed to include it in select products. When the two companies approached Williams in 1938 with the idea of licensing synthetic B1 for use in flour, they tantalized him with the promise of massive economies of scale, a national advertising campaign, and direct access to the 15 percent of the U.S. flour market controlled by General Mills—but there was a cost. In return for pioneering the commercialization of B1 flour and enrichment products, General Mills and Merck demanded exclusive rights, a monopoly concession. Unless a government edict mandated enrichment of all flour products, General Mills’ negotiators argued, no company would be willing to take the first step without guaranteed exclusive rights that would allow it to charge a premium price and recoup the costs of innovation.23