In Meat We Trust

by Maureen Ogle


  But all that cost money, and that was the one thing that local farmers didn’t have. That’s why so much of the new poultry industry ended up in the hands of people other than conventional farmers: bankers were willing to offer the loans needed to get the project off the ground, but they preferred dealing with borrowers experienced in business. Enter men like Jesse Jewell. In many respects, his migration from shopkeeping to poultry production amounted to a one-for-one substitution: for decades, merchants like him had provided farmers and sharecroppers with feed, seed, or fertilizer in advance of the growing season, taking payment when the crop was sold. Jewell simply swapped chickens for feed and seed, benefiting from the research and expertise provided by UGA faculty. He also enjoyed assistance provided by the manufacturer of the feed he sold: the company extended credit so that he could buy the feed and chicks that he loaned to local farmers, helped him establish a modern bookkeeping system, and dispatched a salesman to accompany Jewell on tours of the countryside to recruit farmers. Moreover, by the time Jewell settled on chickens as his salvation, there was new cash floating around the region. The AAA had paid out $8 million for the cotton plow-up, money that landed in the pockets of landowners who were looking for ways to invest it. The New Deal also offered other, indirect agricultural subsidies: in 1935, the USDA launched the Poultry Improvement Program and committed manpower and intellectual capital to the study and control of poultry disease. Thanks to the Tennessee Valley Authority and the Roosevelt administration’s investment in rural electrification, southern farmers obtained access to electricity that enabled Jewell’s growers to install automated feeding and watering devices. Those tools, which were often designed at land grant engineering programs, saved labor and allowed growers to increase the number of birds they could feed.

  Other subsidized research also shaped the new broiler industry. In the then-new field of genetics, researchers studying how and why traits passed from parent to offspring often used chickens as their research subjects because they were small and easy to handle, and they matured in weeks rather than months. Compared to other animals, hens have a short reproductive cycle, and their offspring develop not in the womb but in a separate container, the egg. By the 1920s, the chicken was one of the most studied of all domestic livestock. Even before the broiler industry took shape, scientists had learned a great deal about how to increase egg yields and raise healthier chickens. The ubiquity of laboratory chickens also led, inadvertently, to a central component of factorylike livestock production: confinement. Researchers who worked with chickens studied their subjects in a confined setting so that they could control light, temperature, and humidity and monitor the birds’ food intake. Scientists soon realized that confined hens lived longer and laid more and healthier eggs than their free-ranging farmyard cousins. Confinement also eliminated the primary obstacle that stood between poultry producers and efficient, large-scale production: flock size. Chickens are highly susceptible to disease, and every farmer knew that the bigger the flock, the higher the incidence of disease, and the harder those outbreaks are to control. In a confined setting, however, growers could quickly identify and remove sick birds. The major obstacle was that confined birds invariably developed rickets, or “leg weakness” as farmers called it, leaving them unable to walk and preventing chicks from maturing and hens from laying. But in the early 1920s, researchers discovered that adding vitamin D to chicken feed eliminated the need for sunlight. That “phenomenal” discovery, raved one expert, opened the door to “specialization and the application of factory methods” to poultry production.

  All these factors—expertise, research, money—shaped the new broiler industry, thanks to which, wrote one man, chicken farming was headed toward “commercialized production on an efficient and . . . large scale.” Jewell was no farmer, and that was fine with the people who urged him and others to build the new industry. What supporters wanted was an agricultural sector that operated more like a factory than a farm.

  Jewell spent the late 1930s expanding his business. He contracted with more growers and sought out quality chicks for them to raise, and he cultivated outlets where he could sell the birds, which he continued to drive to market himself. But the onset of World War II presented an unexpected opportunity for rapid expansion and solidified the industry’s reliance on contract farming, automation, large scale, and integration of livestock production with processing. As had been true during the previous world war, demand for all goods, especially foodstuffs, rose sharply (and ended the depression), but few industries benefited more than the one devoted to making broilers. At the outset of the conflict, federal officials decided to set aside most of the nation’s pork and beef for the troops, in part because of their value for both nutrition and morale, but also because those meats could be shipped as, say, smoked hams or canned beef stew. But the commercial poultry industry was so new that its infrastructure for packing and processing meat was limited, and bureaucrats charged with managing the nation’s food kept the chickens at home. (Turkeys, however, were shipped to troops for morale-boosting Thanksgiving and Christmas meals on the front lines.) USDA officials launched a “Grow More Poultry Program,” the public feasted on chicken fixed every which way, and military buyers grabbed as much as they could to feed to men and women stationed stateside.

  World War II also drained American agriculture of its labor supply, a fact that, as we’ll see later, would have a profound impact on the way farmers raised livestock. Even before the United States entered the war, factories had geared up to supply warring countries with materiel, and men and women decamped from the farm for jobs those factories provided. In Georgia alone, between 1937 and 1941, 30 percent of agricultural workers left farm for factory. The shortage worsened after the United States declared war in late 1941. Everywhere in rural America, from dairy farms to cattle-feeding operations, from Corn Belt hog lots to rural Georgia chicken coops, labor vanished. When labor cannot be found, humans make a logical decision: they replace it with machinery. Americans had a long-standing tradition of doing so. For most of the nineteenth century, for example, the country suffered chronic shortages of labor that fostered a national passion for mechanization and automation. So, too, in the 1940s. Factory farming already had plenty of support both in and out of agriculture, and World War II affirmed that enthusiasm. Nowhere was this more true than in the broiler industry.

  Faced with record demand on one hand and lack of labor on the other, Jesse Jewell tightened his control over the broiler-making process. Back in the 1930s, his growers had cobbled together coops from scraps of brick, sheet metal, or wood. Now Jewell required them to replace those with purpose-built structures outfitted with electricity and automated feeding and watering systems, buildings that contained thousands of chickens housed in long rows of stacked wire cages. Jewell helped his growers finance those improvements: he borrowed from a local bank and then reloaned funds to growers. He replaced verbal agreements with written contracts that gave him outright ownership of chicks and required farmers to use feed supplied by him. He based payments on the growers’ efficiency as measured by the number and size of birds they raised per pound of feed. Other Georgia broiler makers made the same decisions, and similar contracts ruled in burgeoning poultry industries at Delmarva and in Arkansas and other (mostly southern) states. The change unnerved some observers. A poultry extension agent at the University of Delaware feared that Delmarva growers did not understand the implications of these arrangements. As long as they could pay for feed “and have a few dollars left,” they “figure they are making a little profit,” he complained. They seemed not to realize that their profit had been earned at the expense of interest payments on debt. But faced with relentless demand—and, of course, the potential for profit—integrated poultry production built on the factory model became entrenched.

  The processing side of the equation changed, too. When Jewell signed contracts to supply chicken to military buyers, he also agreed to meet federal regulations in his factory. The contracts mandated, for example, that carcasses be subjected to a high-temperature scald and that he ship packaged, precut pieces rather than whole birds. Eventually the War Food Administration decided that federally purchased poultry must be shipped frozen. All of it required Jewell to redesign his plant and to invest in processing, packaging, sanitizing, and cooling equipment. It wasn’t cheap: the required scalding machinery increased production costs by 50 percent, forcing Jewell to compensate by reducing expenses elsewhere in his facility. But here again, Jewell got help: USDA and military staff provided engineering expertise and ensured that equipment manufacturers, already burdened by war demands and materials shortages, honored nonmilitary contracts such as those from broiler processors. Faculty at land grant schools and extension programs contributed research in the form of new freezing techniques, for example, and disease management strategies. The assistance infuriated small operators who couldn’t (or wouldn’t) shoulder the necessary expense. One man complained that the system was “arranged to help big business and discourage little business.” He was right: the USDA, military buyers, and land grant experts encouraged large-scale, efficient, machine-based production as a way to manage the paradox of plenty and to produce food as inexpensively as possible. By war’s end, broiler production had become a highly mechanized, science-based, integrated industry. The results were extraordinary: In 1939, Georgia producers put 1.6 million chickens on the market, well up from the 400,000 they had raised in 1934. In 1945, they sent nearly 30 million to market.

  But labor wasn’t the only shortage that afflicted agriculture during the war. Everyone who raised livestock, whether chickens, cattle, or hogs, struggled with feed supplies that ranged from scarce to nonexistent. Corn, for example, was in short supply. So, too, were fish meal and cod liver oil: Americans had long imported the former from Japan and the latter from Norway. The Nazi invasion of Norway and the bombing of Pearl Harbor closed supply routes. The results were predictable: farmers spent more to feed livestock, and the relatively scrawny animals they took to market yielded less meat and translated into higher prices at grocery stores. (That, by the way, was one rationale for wartime price controls on meat: without them, retail prices would have soared and shoppers would have raised hell—the last thing any politician wanted, especially those who remembered the food riots that preceded American entry into World War I.) Unless Americans could find substitutes for conventional feedstuffs, warned one official, a “serious bottleneck” would eliminate meat from many tables and from camps on the front lines. Those shortages led directly to the use of antibiotics in livestock production.

  To understand why, we need to look briefly at barnyard nutrition. Single-stomach animals like chickens and hogs thrive when their diets contain animal-derived proteins, such as fish meal and cod liver oil, or “tankage” (the byproducts of rendering plants). Deprived of those proteins, animals are more prone to disease and weigh less at maturity. Less flesh on the animal translates into less meat on the table. By the time World War II began, scientists had been studying the mysteries of animal-derived proteins for more than twenty years, and their research had fostered the development of the commercial feed industry. Ralston-Purina and Quaker Oats, for example, manufactured feedstuffs that included fish meal and cod liver oil. But those ingredients were expensive; if scientists could find substitutes, they could help farmers reduce their production costs.

  Thus the search to understand why animal-based proteins are more powerful than ones derived from plants, and to identify the so-called animal protein factor (APF) that differentiates fish oil, say, from plant-based proteins. The wartime feed shortage intensified the need for answers, and university, corporate, and USDA researchers (themselves short-handed as scientists and graduate students headed off to war) doggedly conducted feeding trials with sweet potatoes and other plants, as well as vitamins, minerals, amino acids—anything that might replicate the effects of APF. An employee at one agricultural experiment station chided his colleagues for their lack of imagination. There was no time “to repeat feeding trials for five successive years before conclusions are drawn.” The emergency of war, he argued, demanded “newer and more effective research.” He urged them to follow the lead of biologists, chemists, and geneticists engaged in basic research, especially those studying the fundamental physiological processes of life. How, precisely, did growth happen? What internal mechanism caused plants, for example, to reach for sunlight? “The ultimate objective” of such research, explained one scientist, was “growth control.” Once humans understood the mechanics of growth, they could manipulate and control it and even encourage “abnormal growth.” That explains the interest in colchicine, a substance derived from the crocus plant. Historically, it had been used to treat gout, but in the 1930s, biologists discovered that it accelerated “evolutionary changes,” transforming conventional plants into “giant” specimens. Some researchers believed that colchicine could have the same effect on animals. In 1940, a scientist at the University of Pittsburgh injected it into chicken eggs. The birds that hatched grew “abnormally large” combs and wattles, and males crowed three months earlier than usual.

  In the late 1940s, scientists finally unraveled the mystery of APF, arriving there as researchers so often do: by accident and via a circuitous path, in this case research aimed at curing pernicious anemia, at the time a deadly global menace. Pernicious anemia cripples its victims, leaving them too weak to move, and eventually attacks the nervous system. At the time, the only way to alleviate or cure it was with hefty doses of liver—a half-pound or more a day—or injections of liver extract, both of which were expensive. (Nor, it must be admitted, was the prospect of eating a half-pound of liver a day particularly inviting.) But as with APF, it wasn’t clear how or why that cure worked, or which component of liver played the crucial role. Scientists knew that if only they could identify that mystery ingredient, they could design a cheaper substitute. The answer came in 1948 when scientists working at the pharmaceutical company Merck announced that they had isolated liver’s anti-anemia ingredient, which they named vitamin B12. A dose weighing less than a single strand of human hair was sufficient to set patients on the road to good health. But even that was expensive: one ton of liver yielded just twenty milligrams of the vitamin. A few months later, scientists at Lederle Laboratories, owned by American Cyanamid, announced that they had extracted the vitamin from common bacteria, and not long after, the Merck group developed a technique for making large quantities at a low price. Merck manufactured antibiotics, bacteria-killing substances that were then relatively new, in enormous vats of fermented microbes. The process generated gallons of waste in the form of organism-soaked residues, and those could be used to make B12.

  The final step in this convoluted chain of discovery came in 1950. Two researchers at American Cyanamid were testing the impact of B12 on livestock with the expectation that the vitamin would improve the animals’ health. It did—and then some. To the men’s astonishment, B12 manufactured from the residue of the antibiotic Aureomycin acted as a superaccelerant. Animals that ate it grew as much as 50 percent faster than animals fed B12 extracted from liver. Nor did it take much to produce that effect: about an ounce of antibiotic per ton of feed. The implications were obvious. Feeds laced with a synthetic vitamin-and-antibiotic product cost less to manufacture than those based on fish meal or tankage, and livestock that ate it would reach maturity faster, which meant farmers could spend less on feed. For broiler producers like Jesse Jewell, the combination produced a 10 percent trifecta: chickens needed 10 percent less time to reach market weight, they ate 10 percent less feed, and mortality rates dropped about 10 percent. The discovery blew “the lid clear off the realm of animal nutrition,” noted the editors of a farming magazine, and left “animal nutritionists gasping with amazement, almost afraid to believe what they had found.” Farmers would “[n]ever again” have to contend with the “severe protein shortages” that plagued them during World War II. From the perspective of both farmers and consumers, antibiotics were as valuable as tractors, combines, and agricultural subsidies.

  Enthusiasm for antibiotics and other components of factory farming increased after World War II thanks to three factors. First was the ongoing shortage of agricultural labor. When the war ended, most men and women did not return to the farm. The cold war, a dire need for housing, and the baby boom pushed the economy into hyperdrive. Factory assembly lines, whether in weaponry, building materials, furniture, or automobiles, absorbed record numbers of workers, as did offices, schools, and other non-agricultural employers. American farmers dumped their wartime profits into technologies that replaced human labor.

  Second, postwar politics transformed agricultural mechanization into a patriotic imperative. From the 1940s on, American food served as a weapon, first against the Axis enemies, and then in the cold war struggle against communism. U.S. General Lucius Clay, who served as the governor of occupied Germany, summed up the equation in blunt terms: one way to “pave the way to a Communist Europe,” he said, was by forcing citizens of the former warring nations to choose “between being a communist on 1500 calories and a believer in democracy on 1000 calories.” If food, scarce nearly everywhere in the world except the United States, could help win this new war, American farmers must do whatever was necessary to support the cause. There was no time for dallying and no place for laggards, of which, economists grumbled, agriculture harbored entirely too many. Most farmers were “moving forward” into more mechanized, factorylike farming, noted a reporter summarizing one of the many hearings and investigations into the problem of agricultural “underemployment.” But many persisted in “standing still” and in relying on the “methods of their grandfathers.” By refusing to do their share, they imposed a “heavy burden” on the nation.

 
