In Meat We Trust


by Maureen Ogle


  The national cornucopia wasn’t the only reason for the decline in meat eating. In the early years of the twentieth century, new ideas about diet and nutrition diminished the value of a meat-centric diet. The first blow arrived courtesy of Horace Fletcher, a wealthy businessman who devoted most of his time to exploring dietary fads and ideas. He conducted a series of self-experiments and concluded that eating less was better than eating more, and that one key to optimal health lay in chewing one’s food to a liquid state. Doing so, he claimed, amounted to a predigestive process that boosted stamina and endurance. (Upton Sinclair and his wife tried “Fletcherizing” but gave up on the diet because they lost so much weight.) Fletcher’s work attracted a surprising amount of attention from scientists, among them Yale University faculty member Russell H. Chittenden, who was intrigued by the small amount of protein that Fletcher consumed. For several months, Chittenden fed low-protein diets to a collection of soldiers and athletes, and his subjects demonstrated a marked increase in muscle tone, strength, and endurance. Chittenden concluded that Americans could and should eat less meat, certainly less than nutrition experts advised. The professor’s work, noted one admirer, “herald[ed] the collapse of a fundamental fallacy in diet.”

  Had Chittenden’s research remained closeted in his university ivory tower, it and he might not have mattered. But he was a skilled promoter, and news of his work reached deep into Americans’ daily lives. Those who ate meat at every meal courted disaster, one writer told readers of Cosmopolitan, at the time a national general-interest publication, because excessive meat consumption led to “proteid (meat) poisoning,” and that, in turn, caused “brittle arteries.” A meat-rich diet was “utterly abnormal and must lead inevitably to disaster.” A professor at Columbia University explained to readers of Good Housekeeping that excessive meat consumption led to “gout, rheumatism and other ‘uric acid disorders,’” and, worse, meat proteins putrefied in the intestines, spawning bacteria and “poisonous” byproducts. She advised women to feed their families less meat and more milk, eggs, and cheese. The advice, and its shaky science, earned an official seal of approval from C. F. Langworthy, the director of nutrition investigations at the federal Office of Experiment Stations: meat, he announced in a 1910 report, was “not essential to a well-balanced diet,” and he advised Americans to eat more eggs and cheese, legumes and nuts.

  The onset of World War I hastened the exit of meat from the center of the plate. Herbert Hoover, who served as the Wilson administration’s wartime food manager, asked citizens to support the Allies by changing their eating habits. Europeans needed food, he explained, and it was up to Americans to provide it. “We have a great surplus of potatoes, vegetables, fish, and poultry,” he explained, but those foods were difficult to ship, and the government planned instead to export “concentrated foodstuffs—grain, beef, pork, fats, and sugar.” Fill up on eggs and cheese, he implored the nation, but leave beef and pork for the war. Americans complied (more or less) and learned they could live on less meat than they had once done.

  But vitamins delivered meat its biggest blow. Scientists had long suspected that besides protein, carbohydrates, and fats, foodstuffs also contained a mystery substance that in some equally mysterious manner contributed to health. Scurvy, for example, was cured by eating certain fresh foods, especially citrus fruits. Why? What did those foods contain? In 1912, Polish chemist Casimir Funk offered an answer: a substance that he labeled “vitamines,” the “vita” for life and the “amine” because he believed they were amines. (Some are and some are not, and within a decade or so, the final e vanished from the word.) A year later, several American researchers working in different laboratories demonstrated that milk fat also contains a life-essential substance, what we now call vitamin A, “incontrovertible evidence,” rejoiced one of the men, of a “hitherto unsuspected nutrient indispensable for health and . . . the maintenance of life.” Over the next few years, the vitamin publicity engine cranked into full gear as information once buried in scientific journals became fodder for the pages of newspapers and magazines eager to explain the virtues of spinach, lettuce, tomatoes, and other once-lowly foodstuffs. “Feed your body vitamines,” urged a typical magazine essay published in 1919. The body demands these “Unknowns,” “mysterious substances that have defied chemical isolation and analysis, but which have a powerful and determining effect on growth.” Another writer warned readers by way of example, relating the tale of a mother and father determined to nourish their son, little Willie, with a diet “carefully balanced” in proteins, fats, and carbohydrates. Alas, little Willie “languish[ed]; he failed to grow; he whimpered and fell away.” A friend recommended that the parents feed the boy raw cabbage, advice “the father took for offensive humor and the mother for an insult.” Still, desperation knows no bounds and they piled Willie’s plate with cabbage. Lo and behold, he “began to grow and shout. . . . Little Willie is now nearly six years old, and he shows every promise of becoming a great football player.”

  What was good for the little Willies of America proved a disaster for the meat industry. In 1926, the New York City school board banned frankfurters from lunchrooms on the grounds that the food was “unsuited” to students’ nutritional needs. The board’s lunch director explained that sausage was so “heavy” that when children ate it, they “neglected to eat green stuff” and milk. “This is an open attack on the frankfurter,” fumed a writer for a butchers’ trade newspaper. The lowly frank was one of the meat industry’s “best foods” because it provided “the most nutriment for the money.” The decision “puts us back ten years, at least,” sighed one manufacturer. Children were told by teachers that frankfurters were “unwholesome,” they carried that message home, and their mothers banished sausages from “the home table.” The United States Department of Agriculture sided with the sausage. Frankfurters are “wholesome, appetizing and economical,” said a department spokesman. When served on bread and with a drink, “they provide lunches that are hard to beat when time is a factor and the pangs of hunger are to be fully satisfied.” (The packers, by the way, insisted on the term frankfurter as a more palatable alternative to a moniker they loathed: “hot dog.”)

  The meat industry responded as special interests do when threatened: by launching a campaign to counter what it termed “propaganda.” The Meat Institute, a packers’ trade organization, and the American National Livestock Association mounted a pro-meat publicity campaign, sending lecturers hither and yon, placing pro-meat articles in newspapers and magazines, and bombarding teachers with instructional material. Industry representatives commandeered microphones at the nation’s newest media outlets, radio stations, to tout meat’s virtues. During “Meat for Health” week, the industry’s publicity arm cranked out 3 million pieces of literature, including pamphlets and advertisements and posters for windows and meat wagons, and sponsored informational programs at butcher shops and grocery stores. The Milwaukee Meat Council, made up of packers and retailers, hired an actor to portray a caveman, presumably because such a figure epitomized brute strength and good health. He strolled up and down one of the city’s busiest streets, carrying a cavemanlike club and wearing a “shabby mane and beard, bear skin, [and] sandals” as well as a sandwich-board sign that read “I EAT MEAT.”

  But the industry’s most important efforts unfolded behind the scenes. In the first half of the 1920s, meat makers urged officials at the USDA to promote and protect not just meat, but the entire meat production system. The department obliged, contributing statistics for use on promotional posters, thus giving the “eat meat” campaign an official stamp of approval. The packers’ Meat Institute persuaded eighteen of the nation’s agricultural experiment stations, which were attached to land grant colleges and universities, to fund meat research coordinated by the institute and the USDA. In this way, explained the institute, “the nation’s best brains and equipment [could] be utilized to the full in bringing light to bear on the problems to be studied.” The meat men also put their money where their mouths were, funding university fellowships to support meat-based research.

  These efforts forged a permanent alliance between the meat industry and the land grant establishment, a relationship that would become increasingly important—and problematic—in the years to come. But those links between and among the USDA, schools, farmers, and meat processors also provided the underpinnings of a long-term project that fundamentally altered the way livestock was bred, raised, and fed. Factory farming, as it was called, was intended to support livestock producers and other farmers and satisfy Americans’ demand for low-cost food, especially meat.

  4

  Factories, Farmers, and Chickens

  IN THE MID-1930s, Jesse Jewell, a businessman living in Gainesville, Georgia, was forced to confront an uncomfortable truth: thanks to a series of catastrophes, both natural and man-made, his family’s seed-and-feed company was, as he put it, “shot.” Unless he could find another way to sell what he had to offer—seed stock, livestock feed, fertilizer, basic farm tools—the business would go under and he, like so many other residents of north-central Georgia, would be bankrupt and unemployed. That prospect did not appeal to the ambitious, entrepreneurial Jewell, and he soon latched on to a new approach to profit: he bought a load of live chicks on credit and loaned those, plus bags of feed, to unemployed and mostly destitute local farmers, who raised the chickens to market weight (anywhere from two to four pounds). Jewell then hauled the birds to Atlanta and other regional urban markets, sometimes live and stashed in wooden crates, sometimes slaughtered and packed in ice. When he had sold the birds, he paid the farmers their share of the profit (after deducting payment for chickens and feed).

  The rest was profitable history: over the next three decades, Jewell expanded that unassuming start into a broiler-making empire. (The term broiler referred to a bird’s size. Broilers ranged from two to two and a half pounds, fryers weighed about three pounds, and roasters were anything larger.) He contracted with hundreds of “growers,” as they were called, who agreed to feed thousands of chickens at a time in factorylike conditions. The growers employed automated watering and feeding equipment and used carefully calibrated “inputs” that included commercially manufactured mixtures of feed and antibiotics. Jewell integrated those chicken-growing operations into his larger corporate structure, which included processing plants where his employees slaughtered and packaged the chickens, mills where he manufactured the chickens’ feed, and hatcheries where other Jewell employees cranked out the basic raw materials: chicks and eggs. He diversified beyond the basic bird into ready-made convenience foods such as frozen fried chicken and chicken potpies. By the time he sold J. D. Jewell, Inc., in the early 1960s, both he and the company were worth millions. Along the way, chicken had been transformed from an expensive, seasonal luxury into a dietary staple, and integrated, “industrial” livestock and meat production had migrated from the chicken house to the hog pen and cattle barn.

  Factorylike livestock production extended the food infrastructure pioneered decades earlier by the dressed-beef men. Armour, Swift, and other meatpackers had designed slaughterhouses that emulated factories and incorporated those into complex, nationwide distribution systems. Jesse Jewell and others carried the factory model to the farm and built integrated corporations that connected farm to slaughterhouse to food processor to retailer. Two motives inspired the project of taking the factory to the farm: a desire to keep food costs for consumers low and a need to ensure that farmers enjoyed an adequate standard of living. Both were inextricably linked to the emergence of a consumer economy in the early twentieth century. Because the subject of factorylike livestock production will dominate much of the rest of this book, it’s important to understand the context in which it took shape.

  A consumer economy thrives on the making, selling, and buying of nonessentials—think cars and cosmetics, shoes designed for style rather than function, iPads and televisions. That economy, so familiar to us today, was preceded by and built upon the “producer” economy that dominated the nineteenth century, when Americans built factories where they manufactured goods that furthered industrial development: rail ties and sewer pipes, machine tools and steam engines. By the end of the nineteenth century, that foundational structure was in place, and Americans shifted their attention to manufacturing consumer goods—clothing, cosmetics, radios, and cars. Americans bought such goods prior to the twentieth century, of course, and they continued to invest in and manufacture producer goods after the consumer economy gained supremacy. But in the twentieth century, the economy revolved around making and getting (relatively) unnecessary “stuff.”

  The health of a consumer economy depends on disposable incomes that allow people to spend money on nonessentials. One way to ensure that consumers consume is with credit, which became widely available in the 1920s. General Motors, for example, created the General Motors Acceptance Corporation (GMAC) to provide low-interest loans so that Americans could purchase cars. But another crucial factor in sustaining a consumer economy is ready access to low-cost food. The less money Americans must spend on food, the more they can spend on video games, books, and cell phones. When food is abundant and supplies are greater than demand, consumers enjoy low prices, but food producers—farmers—earn little profit. If the reverse is true and demand outstrips supply, food prices rise. Farmers profit, but consumers howl. Thus the fundamental contradiction of a consumer economy: the paradox of plenty (or, as farmers call it, the pain of plenty). Urbanites demand that farmers produce an abundance of foodstuffs. But if farmers comply, they earn little profit and so either can’t or won’t produce more. And so the consumer economy has grown hand in hand with one of the great balancing acts of American politics: the need to guarantee cheap food on one hand and income parity for farmers on the other, a need that spawned the programs and policies known collectively as “farm subsidies.” This balancing act was and still is complicated by the fact that most Americans live in cities and don’t produce their own food.

  Americans experienced their first significant encounter with the paradox of plenty during an agricultural crisis sparked by World War I and its aftermath, an episode that served as the seedbed for factory farming. The outbreak of war ratcheted up demand for American agricultural products, whether wheat or meat. President Woodrow Wilson, his food administrator, Herbert Hoover, and USDA officials urged farmers to produce, produce, produce; to feed not just Americans but the warring nations of Europe, too. Farmers obliged, increasing their output by the fastest means possible: they bought more land so they could plant more acres or feed more livestock. Most could not afford to pay cash, so they took out mortgages and then borrowed more money to pay for barns and silos, for additional draft animals (tractors had not yet become common), for feed, fuel, and tools. As food czar, Hoover encouraged farmers in that decision, arguing that Germany had “sucked the food and animals from all those masses of people she has dominated and left [them] starving.” Even when the conflict ended, he said, Europeans would need twice as many imported fats and proteins as they had during the war.

  Hoover was wrong. When the fighting stopped, military officials and foreign countries alike canceled contracts for everything from cotton to corn. Desperate to find buyers for what they had produced, farmers dumped crops and livestock on the market, and the ensuing glut caused prices to collapse. The price farmers received for corn fell 78 percent, and returns on both beef and wheat dropped by more than half. Demand vanished, but farmers’ debts did not. They could not sell their output at prices high enough to pay their bills, so they responded in what seemed, to them, a logical manner: they increased their output, hoping that volume would pay off their debts. But because every farmer did the same thing, agricultural products glutted the market and prices plunged again.

  As the crisis deepened, many farmers slid into bankruptcy and threatened the vitality of the banks that had loaned them money. Economists, agricultural experts, sympathetic politicians, and leaders of farmers’ organizations warned that fewer farmers would mean less food and even higher food prices. They argued that if urbanites were entitled to cheap food, farmers were entitled to an adequate return for their labor and an income that would allow them to maintain economic parity with city folks. The moment prompted some advocates, including farmers, implement manufacturers and seed dealers, and agricultural economists, to propose a radical solution: they urged lawmakers to detach agriculture from the free market and use federal legislation to support crop prices and farmer income; to subsidize agriculture so that farmers could enjoy the same standard of living as city people. Many Americans were (and still are) horrified by the idea and railed against the plan of using taxpayer dollars to circumvent marketplace mechanisms, arguing that therein lay the path to socialism, communism, or worse. But supporters pointed out that in war-ravaged Europe, high food prices had already sparked social unrest and contributed to the spread of fascism and communism. If American food prices soared, the same thing could happen in the United States. Government “interference” in agriculture, they argued, would save the republic. In the 1920s, that argument was not enough to carry the day. Urbanites, who made up a majority of the population, wanted nothing to do with the burdens of subsidies. Twice in that decade, Congress passed parity legislation, and twice President Calvin Coolidge vetoed it. As we’ll see, the Great Depression demolished resistance to subsidies, but in the 1920s, many farm advocates touted a different plan for ensuring farmers’ profits and maintaining low food prices: factorylike farming.
