In Meat We Trust

by Maureen Ogle


  The third factor that contributed to enthusiasm for factory farming lay well beyond the chicken coop and battlefield. In postwar America, large grocery chains emerged as major power players in the nation’s food supply system, and factory farming, and especially the integrated broiler industry, was well suited to meet their demands.

  The ascendance of chain grocery stores can be traced back to the agricultural crisis of the 1920s. As beleaguered farmers glutted the market with their corn, cotton, and cattle, prices of those commodities collapsed, and in theory, consumers should have benefited. Instead, food prices soared, and a baffled public demanded an explanation. Dozens of studies examined the entirety of the American food system, from farm to table. Most analysts arrived at the same conclusion: Basic agricultural foodstuffs were cheap—and farmers weren’t making much money—but consumers were paying high prices at grocery stores thanks to two unrelated factors. The first was consumers’ insistence on convenience. One USDA analyst pointed out that more women were working outside the home and they had neither the time nor the inclination to spend hours in the kitchen. A contest between, say, a cooked-from-scratch roast and canned beef stew was no contest at all.

  But convenience was neither cheap nor free, and the demand for “[t]ime-saving, convenience, comfort, and satisfaction,” explained a congressional commission appointed to study rising food costs, had “reached a point where it costs more to distribute and serve [food] than it does to produce [it].” The price of store-bought bread, for example, included “a maze of service costs.” Its manufacturer invested in equipment needed to mix and bake the bread and employed an army of salespeople and advertising copywriters to persuade shoppers to buy it. Moving the bread from factory to table necessitated hiring truck drivers, machine operators, packing crews, and deliverymen. Those layers of expense provided jobs and paychecks, but they also drove up the final price of bread and other foods.

  The second factor driving up the cost of eating was the dismal state of food retailing. In the 1920s, most Americans still shopped for food the same way their grandparents had, buying dry goods like flour and spices at one store; perishables such as potatoes, onions, and apples from another; and meat from a butcher shop. The average food retailer catered to a limited neighborhood clientele and purchased supplies in small lots from multiple food jobbers, each of whom carried a narrow line of goods. One analyst used lettuce to calculate the resulting inefficiencies: Suppose a wholesaler bought a carload of lettuce, or 320 crates. A typical jobber purchased 1/16 of that load; a retailer 1/320; and the consumer “one head or 1/7,680 of a car” (320 crates at two dozen heads apiece). Each subdivision of the original carload added to the cost of the final product. Worse, complained critics, the grocery business was too often the refuge of the incompetent and the inexperienced. A study of grocers in Oshkosh, Wisconsin, revealed that most of them lacked any experience in retailing, and their numbers included a policeman, a shoemaker, and a musician, which, said one observer, explained why so many of them failed. In Louisville, Kentucky, a third of grocers failed after a year; in Buffalo, New York, 60 percent went under.

  These inefficiencies created an opening for large, centrally managed chain grocers who could drive down costs through volume buying and streamlined distribution. Even before the 1920s, a handful of chain grocers had made inroads into retailing, mostly in large cities and mainly on the East Coast. Chief among them was the Great Atlantic and Pacific Tea Company, or A&P, which began life in the nineteenth century as a purveyor of tea and coffee but whose owners gradually expanded their offerings to include a full line of grocery items. By 1900, A&P operated two hundred stores. The original outlets offered delivery and credit, but not self-service; clerks gathered items for shoppers. But in 1913, A&P launched a collection of “Economy Stores.” The new shops abandoned in-home delivery and other frills in exchange for low prices that company executives believed would generate high-volume sales. Manufacturers of national-brand products like canned foods and dry cereals initially objected, arguing that A&P’s policies besmirched hard-won reputations by treating branded goods as cheap stuff. They changed their minds once they recognized that shoppers who patronized modern stores like A&P’s were more willing to try and buy branded goods.

  In the wake of the agricultural crisis of the 1920s, those charged with studying and reforming the American food system touted chain grocery stores as a way to modernize and improve food distribution. The chains streamlined the task of shopping by providing an array of foodstuffs, from produce to canned goods, in a single location. A&P and other grocers also emphasized the pleasures of consumption by providing clean, well-lit environments, wheeled carts, low prices, and, thanks to self-service, maximum convenience. But chains made their biggest impact behind the scenes. Unlike neighborhood shops and independent grocers, chains ordered directly from manufacturers and in bulk, which kept their costs low. Food manufacturers benefited, too; by dealing with a single grocery chain rather than hundreds of individual retailers, they reduced bookkeeping and accounting expenses, to say nothing of costs associated with selling and delivery. All of it added up to efficiency that translated into lower food prices, and by the time World War II ended, grocery chains dominated food retailing.

  Their dominance of meat retailing unfolded more slowly. Back in the 1920s and 1930s, the chains had struggled to learn how to sell meat. As one grocery executive admitted, “[c]hain store merchandising is founded on control,” and meat, with its unwieldy carcasses, fat, gristle, blood, and bone, resisted control. An A&P executive begged the nation’s packers to make meat behave less like itself and more like easy-to-manage canned peas. “It is now possible to buy bread already cut in slices,” he argued, so surely it was “logically and economically” feasible to supply grocery chains with precut, prepackaged meats. If only it were that simple. “The packaging of coffee, crackers, [and] cereals is child’s play compared with packaging of fresh meats,” marveled a reporter in 1929. “A whole new technic [sic] must be worked out.” Even when the packaging succeeded, customers weren’t always sure what to do with it. One retailer recounted the day an angry customer marched into his store and demanded a refund for a package of bacon she’d bought three weeks earlier. The meat was spoiled, she told the man. The puzzled grocer asked her where she’d stored it. On top of her icebox, she replied, but that “should be of no significance” because the bacon was in “a ‘sealed package’ and should not require special care.” (Presumably the woman’s refund included a quick lesson in packaging and refrigeration.)

  By the 1950s, chain grocers had solved those problems, in large part thanks to wartime research that resulted in new packaging materials and improved refrigeration equipment, and so they extended their command and control to meat sales as well. Their power allowed them to dictate terms to meatpackers and processors. That was especially true in the broiler industry. Chain stores nationwide used packaged chicken as “loss leaders,” calculating that shoppers lured by low poultry prices would stick around to load their carts with higher-priced goods. Selling chickens below cost quickly became a requirement for any grocer who wanted to stay competitive. If a competitor across town sold broilers for, say, 29 cents, lamented one grocer, “we’ve got to sell at 29 or lose our customers for everything else we sell in the store.” The more dependent the chains became on cheap chicken, the more they pressured Jewell and other processors to supply broilers at a low price. Jewell had no choice but to comply, and he in turn pressured his growers to increase production. The resulting output glutted the market, and grocers snapped up cheap poultry for use as loss leaders. “This starts the price-cutting sale cycle all over again and everybody gets hurt!” complained a merchandiser for a major chain.

  The pain wouldn’t end anytime soon, as the chickens themselves got a dose of modernity that gave shoppers even more meat for the price. In 1945, the Dekalb Company of Illinois began developing hybrid birds that would mature and feather quickly, produce two hundred or more eggs a year, and provide meaty flesh. They weren’t alone. In 1946, a tour guide at a USDA research facility boasted about the changes scientists there had wrought in the basic bird. “See that batch of pullets over there?” he asked a visitor. “They’re practically all white meat, tender and delicious.” But few did more than A&P to encourage and sustain the broiler boom. The grocer sponsored “Chicken of Tomorrow” contests that rewarded breeding innovations. Thousands of breeders and growers participated, and in 1950, the USDA calculated that nearly 70 percent of the 625 million chickens raised for meat descended from Chicken of Tomorrow bloodlines. By the early 1950s, the American broiler was meaty and big-breasted, boasted hefty drumsticks, and arrived at maturity faster and on less feed than chickens sent to market just a few years earlier. Modern chickens converted feed into meat more efficiently than did cattle, and nearly as efficiently as that master of conversion, the hog; and pound for pound, chicken offered a less costly form of protein than its two competitors.

  During the fifties, in part because of the chains’ demands but also because so many investors wanted a cut of the action, the broiler industry resembled nothing so much as a gold rush, complete with boomtowns, fast wealth, and spectacular collapses. Even so, output soared, and both production costs and the prices consumers paid dropped. Genetics, breeding for meat, and antibiotic-laced feeds helped, of course, but so did the integrated production-processing structure developed by Jesse Jewell. Indeed, the broiler industry was dominated by the same people who had invented it: integrators like Jewell who exercised control from egg to chick to packing plant to grocery store. “Integration has made chicken the cheapest meat on the market,” Jewell said, “and we want chickens to stay cheap.” His centralized decision making also allowed him to manage the price gyrations ignited by grocery chains’ use of broilers as loss leaders. He balanced the up-down prices of basic broilers with more stable income from “value-added” products like chicken sticks (a takeoff on another popular innovation of the 1950s, the fish stick) and chicken-based frozen TV dinners, the latter an innovation introduced during the decade by the Swanson company, another major broiler maker. Frozen, canned, and refrigerated foods satisfied consumers’ demand for convenience and taught shoppers to think of chicken as something other than a basic commodity sold at below-profit prices. Jewell could manufacture those value-added products because he controlled the number and types of chickens that flowed into his processing plants. Analysts applauded Jewell and other integrators, arguing that they were more business-oriented and “cost-conscious” than traditional farmers who operated on “a smaller scale.”

  Here, then, was modern farming of the sort so many people had envisioned back in the 1920s. It was subsidized, thanks to government research and poultry improvement programs. Production was automated and large-scale. Critics complained that poultry growers were mere hired hands rather than farmers. But, for better or for worse, that was precisely the goal: to make the farm function like a factory. Supporters pointed out that contract farming was neither new nor unusual. For decades, commercial vegetable and fruit farmers had raised crops under contract for food processors who required produce of uniform size and grade. Moreover, every farmer who participated in a federal subsidy program worked, in effect, as a contract producer, and as many supporters of integration pointed out, most Americans worked for a contracted price. “As a whole,” commented one analyst, “agriculture stands alone as the only major industry that still clings to its glorious past and holds out for a ‘free price.’” In time, another predicted, integration would “revolutionize the production of animal products,” and contract farming would be the norm rather than the exception.

  The story of the mid-twentieth-century agricultural revolution is less one of malevolent corporate capitalism than it is the struggle to balance the welfare of the producing minority with the demands of the consuming majority. The middlemen in this case were federal policies aimed at protecting the few in order to benefit the many, as well as the tools and ideas aimed at reducing the costs of feeding a nation. For decades, the farmer epitomized the rugged American individualist, living on the land, beholden to no one. But in the 1950s, farmers shouldered heavier burdens, charged as they were with feeding the world, preventing the spread of communism, and paving the way for consumers to spend money on televisions, vacations, and college educations. An urban majority screamed bloody murder if the price of steak rose 5 cents a pound, blamed lazy farmers and ignorant politicians for that woe, and threatened to vote said politicos out of office. So midcentury farmers employed the ideas and tools that empowered them to make low-cost food and earn a decent living doing it. They fed their livestock antibiotics and caged their chickens. They accepted the federal government’s goal of eliminating “marginal” farmers and subsidizing large, efficient ones. In the 1950s, those who wanted to stay on the land had to play by the new rules; that was the price of survival.

  Jewell’s chicken-based empire was just one manifestation of that midcentury revolution. Farmers who started their careers in the 1920s went from horse-drawn plows to agile, powerful tractors. From dousing crops with Paris Green to watching an airplane drench fields with DDT. From shoveling feed and manure to pushing a button and letting a machine do the work. A half-century later, “pro-food” activists would argue that Americans had paid a high price for that revolution, and even at the time, many farmers reacted with trepidation. But far more responded the way Americans would, fifty years later, to the digital age: with amazement, awe, and delight.

  Among those who experienced the revolution was John Davis (born in 1904), who grew up on Corn Belt farms, first in Missouri and then in Iowa. During his childhood, his family relied on horse-drawn plows and cultivators and a steam-powered thresher (which they shared with their neighbors), and they butchered and processed their own hogs. He graduated from high school in 1923, a moment when the farm crisis hammered even relatively prosperous farm families like his. By then, the Davises were living in central Iowa, not far from Iowa State College (now Iowa State University), one of the nation’s premier land grant schools. The proximity allowed Davis to attend school and help on the farm, but after earning a bachelor’s degree in economics, he, like so many other young rural Americans, migrated away from the farm. During the 1930s, he watched the nightmare of the Great Depression from a small Iowa town where he taught school, but in summers he headed to the University of Minnesota for coursework that earned him a master’s degree in agricultural economics. He finished in 1935 and never returned to full-time farming, choosing instead to take a series of white-collar jobs, including two stints at the USDA. In 1954, he accepted a position at Harvard, where he directed an agriculture program at the university’s business school.

  By that time, the fifty-year-old Davis had spent his entire life immersed in agriculture, albeit from a variety of perspectives. Those vantage points inspired a simple, but powerful, observation: farmers and their work could not be isolated from the rest of the economy. Viewing “agriculture as an industry in and of itself” may have made sense a century earlier when most Americans lived and worked on the land, he wrote, but in the mid-twentieth century, that perspective was both foolish and shortsighted. Davis observed that modern farmers had handed over the tasks of “storing, processing, and distributing food and fiber” to “off-the-farm business entities,” like trucking companies and grocery chains. A different set of “off-the-farm” companies designed, built, and manufactured the inputs on which modern farmers relied, whether tractors and combines or antibiotics and poultry cages. Together, these three sectors—agricultural production, processing-distributing, and input manufacturing—constituted an interlocked whole, no part of which could survive without the others. Similarly, the triad’s manufacture and distribution of food and fiber constituted one of the overall economy’s largest components. Davis and a collaborator calculated that in 1954, consumers spent $93 billion on agriculturally based “end products and services”—food, paper, restaurant tabs, textiles, and so forth—a figure that didn’t include dollars spent to grow, process, manufacture, or distribute food and fiber. Americans could not afford to conceptualize agriculture as an enterprise distinct from the rest of the economy.

  Davis also understood that agriculture constituted the weakest link in the triumvirate because its individual units—farmers—lacked the power and the information needed to make market decisions. Input manufacturers and output processors, in contrast, executed financial and production decisions based on internal factors largely under their own control. Even if commodity prices fell, the price a farmer paid for, say, a tractor typically remained high because that price was based on factors controlled by the manufacturer, not the farmer. Davis argued that over time, the market stabilized this “cost-price” squeeze, but always at the expense of farmers and the taxpayers who subsidized that weak link. The more subsidy programs that Congress created to protect farmers, the more those programs entangled other members of the triad. Manufacturers of grain silos, to use a simple example, had a vested interest in ensuring that farmers produced surpluses that needed to be stored in those silos.

  But like other economists and politicians at the time, Davis distinguished between farmers who practiced “commercial” agriculture and those “low-income” farmers who did not. He argued that marginal farmers—men and women who worked the land but would not or could not think like factory managers—contributed nothing substantive by way of food and fiber; as a result, society lost “the value of their productive potential.” Worse, when policymakers and politicians pondered solutions to the “farm problem,” they invariably characterized that quandary in terms of those two poles—commercial farmers on one end and low-income farmers on the other—rather than seeing the two as connected to each other and to the other two members of the triumvirate. Until and unless Americans grasped the complexities of providing food and fiber for a non-agricultural population as well as the intimate connection between agriculture and the economy as a whole, and until they made hard choices about what constituted a “viable” farm, the agricultural problem would never be solved. Davis urged Americans to think of the production of food and fiber not as agriculture but as “agribusiness.” If that included abandoning adulation of the “family farm” and rethinking the myth of the sturdy yeoman, so be it. Only then would the nation come to terms with the fundamental conundrum of how to feed an urban majority and sustain a consumer economy.

 
