by Maureen Ogle
Beef and pork producers as well as meatpackers took offense at the document because it specifically indicted meat (as a man involved in the fracas put it, all “hell broke loose”), and McGovern released a revised version that avoided that word. But opposition to the report came from more than just the meat industry. In 1980, for example, the National Academy of Sciences published a study that challenged the heart-healthy mantra. Consumer advocates denounced the academy’s findings as biased because one of the report’s authors had once worked as a consultant for the egg industry. The man pointed out the lunacy of that criticism: during his career he’d received a quarter-million dollars in grants from industry sources, but $10 million from government agencies. How could he be a corporate patsy because of $250,000, but not a government stooge thanks to $10 million? (It’s worth mentioning that Hegsted, the Harvard scholar who tutored Mottern in the “correct” view, devoted his later career to research funded in part by Frito-Lay.)
But the damage was done; in the minds of many Americans, beef and pork had become public enemies one and two, and the bad news kept coming. In the early 1980s, several widespread, and widely reported, disease outbreaks were traced back to beef tainted with a newly discovered and exceptionally virulent form of an otherwise common bacterium, Escherichia coli O157:H7. After investigators tracked one of the episodes back to a South Dakota cattle herd, they concluded that feeding antibiotics to livestock had potentially fatal consequences for humans. There could no longer be any doubt, argued the researchers, that “antimicrobial-resistant organisms of animal origin cause serious human illness.” No doubt in their minds, but plenty in other people’s. In the wake of the findings, a consumer advocacy group petitioned the FDA to ban drug additives in livestock feed, but a hearing on the request ended like every other discussion of the subject: it raised more questions than it answered, and the scientists’ seemingly irrefutable evidence proved to be both debatable and refutable.
Between lethal bacteria on one hand and heart disease on the other, beef and pork consumption plunged. A 1983 consumer poll documented pork’s woes. Those surveyed complained that pork contained too much salt, cholesterol, fat, and calories. Forty-five percent said they’d cut back on fresh pork for “health reasons,” and nearly a quarter said they’d reduced their consumption of all pork products, fresh or processed. Even McDonald’s, the wizard of food, couldn’t work its magic on pork. In the summer of 1980, the company began testing a “McRib” sandwich, rib-shaped slabs of ground and chopped pork slathered with barbecue sauce. The pork industry salivated at the potential of this new menu item, but the McRib proved a no-go; the company pulled it from the menu in 1983. Part of the problem lay in preference: Kansas City–style barbecue sauce leans toward sweet, and North Carolina’s toward tart; McDonald’s one-taste-suits-all could not overcome those regional differences. The condiment also made for messy eating, a detriment to Americans accustomed to eating on the run and in their cars. But in the end, McDonald’s conceded that the McRib succumbed to consumer resistance: good taste and low price could not overcome pork’s bad reputation. McDonald’s fared better with its Egg McMuffin, which also contained pork, apparently because it suited a consumer niche: when analysts dissected the ten-pound-per-capita drop in pork consumption, they discovered that Americans would eat pork as long as it was processed and convenient—whether as bacon, “lean” microwaveable sausages, or Egg McMuffins.
The toppling of King Beef was more shocking. Per-capita consumption dropped from 131 pounds in 1976 to 105 in 1980 to 97 pounds a decade later. “A story about the beef industry belongs in the obituary column,” mourned Ken Monfort. A Nebraska cattle raiser agreed. “Nobody eats beef anymore,” he mused. “Sometimes I wonder if I would be better off not getting out of bed in the morning.” Cattle feeders pooled their funds to support pro-beef advertising campaigns—“the Mercedes of Meat” and “Somehow, nothing satisfies like beef”—but those did little to bolster the king’s sagging reputation. In desperation, members of the California Cattlemen’s Association petitioned the national cattlemen’s group to end the use of low-level antibiotics. “We thought everybody would always eat beef,” said the California organization’s director, “but it turned out not to be true”; his group’s members reasoned that eliminating antibiotics might persuade some people to come home to beef. The National Cattlemen’s Association refused to go along, but Paul Engler, whose cattle-feeding operation was by then the largest in the world, announced that he would stop using two controversial antibiotics. He didn’t believe that antibiotic-laced feeds were dangerous, he explained, but many consumers did. The “inference” of danger was already out there, he argued, so “why jeopardize the demand for your product?” “By dropping antibiotics,” added a company vice president, “we are trying to teach the public that beef is healthy.”
But Engler’s decision had no effect on beef sales, although how much was due to fear of fat and calories and how much to potentially lethal bacteria was not clear. A financial analyst warned cattlemen that it was time to accept “the harsh reality that the collapse in consumer taste for beef is permanent.” “It’s a declining industry,” he emphasized, “and the only question is how far it will decline.” Even the president of the National Livestock and Meat Board conceded that the days when meat makers could take consumers for granted were over. “It’s the younger, more highly educated, high-income people who are turning away from beef toward more vegetables and white meat in their diet,” he said. “These are the opinion leaders that are eventually going to be influencing the eating habits of our bread and butter customers.”
It’s not clear what role income and education played in the shift, but he was correct about the ascent of “white meat,” by which he meant chicken. Every report about heart disease, fat, and cholesterol touted the virtues of poultry (and, to a lesser extent, fish) as a healthy alternative to beef and pork. Consumers didn’t need much convincing, in part because chicken consistently cost less than the other two meats, a factor of biology: cattle and hogs needed months of expensive grains to reach market weight, but a broiler was table-ready in eight weeks or less. As important, chicken was everywhere shoppers and diners wanted to be. From its inception, the broiler industry had worked the convenience angle more aggressively than its pork and beef counterparts, and packaged chicken products abounded. Don Tyson hit the broiler jackpot in the early eighties when he won a contract to supply McDonald’s with its newest offering: the Chicken McNugget, which consisted of a bit of chicken, a lot of “filler” and batter, and even more calories and fat. McNuggets were an instant success—and drew instant fire from Michael Jacobson: in a complaint filed with the FTC, Jacobson’s Center for Science in the Public Interest accused the chain of false advertising. McDonald’s described the McNugget’s contents as “delicious chunks of juicy breast and thigh meat,” but Jacobson pointed out that the bites also contained sodium phosphate, chicken skin, and beef fat. Who cared? Want to eat healthy? Eat chicken. Eat a McNugget: a bit of chicken and a lot of calories and fat. All of it added up. In 1960, Americans ate twenty-eight pounds of chicken per capita; by 1970, that had risen to forty. In 1980, they put away forty-eight pounds, and in 1987, broiler makers squawked with delight as poultry toppled King Beef. It’s no accident that in the late eighties, pork producers adopted an ad campaign that touted pork as “the other white meat.”
But even broiler producers couldn’t take their market for granted, not in an era of media bloat and heightened consumer awareness. The final decade of the century dished up plenty of evidence that when it came to meat, whether beef, pork, or poultry, whether on the table or on the farm, something had gone wrong.
In 1993, scores of people became ill, and some died, in a food-poisoning episode traced to undercooked hamburgers purchased at Jack in the Box, a northwestern fast-food chain. The culprit proved to be the same one scientists had linked to bacterial resistance a decade earlier: E. coli O157:H7. The tiny organism became a household name after the Jack in the Box incident, and the tragedy highlighted the flaws in a food safety system designed for a premicrobial era. Federal meat inspection dated back to 1906, when inspectors were trained to look for diseased livestock, not diseased meat. They were right to do so: back then, epizootic diseases routinely ravaged poultry flocks and cattle and hog herds, and the USDA poured money into researching and eradicating those scourges. That work proved so successful that by the 1950s, many once-common vaccinations were no longer necessary. Indeed, researchers theorized that E. coli O157:H7 had flourished because cattle ranchers and feeders had reduced or eliminated once-routine vaccinations. But meat inspection procedures had not kept pace with science. Federal rules allowed inspectors to condemn foods that were “so infected” that eating them might “give rise to food poisoning,” but the men and women charged with monitoring slaughterhouse output had few tools for identifying microorganisms. A 1974 court ruling made that even more difficult: it held that bacteria were not an adulterant and that inspectors were not required to consider their presence when giving a carcass the thumbs-up or -down. But the proliferation of O157:H7 and the Jack in the Box episode amounted to a line in the sand. Critics demanded that meat inspection be overhauled. Strategies that worked back when slaughterhouse lines moved at the pace of a single-load rifle were useless on kill lines that operated at machine-gun pace, and speed, many argued, contributed to cross-contamination that led to tragedies like the one at Jack in the Box.
Leaders of the meat industry’s primary trade group, the American Meat Institute, conceded the point, but they argued that federal inspectors were also part of the problem. “They don’t know where they’ll fit [in a new system],” argued an AMI spokesman. “They’re not microbiologists.” Not so, retorted an official with the inspectors’ union. He and other inspectors were “not against technology, and we’re not against moving forward.” But they objected to proposals that would replace conventional inspection with a “science-based system.” Nor were packing plants the only problem. The Jack in the Box episode was blamed less on E. coli than on line workers who had failed to cook hamburger to the required temperature. But state and city food and restaurant inspection systems suffered from that most common of ailments, lack of funds. Consider the case of Kansas: Its inspectors were charged with traveling to and inspecting thirty-two restaurants a week. Any one inspector might manage a cursory search for, say, cockroaches and overflowing dumpsters, but it’s unlikely they’d have time to do much more.
Changes were needed, but what those should be, and how to implement them, was open to debate. Some critics argued that the USDA should shift its mission from agricultural cheerleading to consumer protection. Easier said than done. For over a century, the department had tried to be, and often succeeded at being, all things to all people: it had led the way in eradicating crop and livestock diseases and in promoting improved agricultural and livestock management. But it was also the cheerleader-in-chief for the nation’s food industries, and employees promoted both production and consumption of everything from steak to broccoli, from poultry to cantaloupe. Asking the department to support and promote the interests of farmers, manufacturers, and consumers was bound to generate power struggles and gridlock. Worse, the work of the FDA and the USDA typically either overlapped or collided. In the wake of the Jack in the Box case, one reporter pointed out the looniness of a food safety system that required the USDA to inspect canned soups that contained meat, and the FDA to inspect soups without. Reforming the status quo was easier to imagine than do; there are few human endeavors more entrenched than bureaucracies.
The Jack in the Box incident cast doubt on the USDA, on meat safety, and on food inspection, but a different disaster highlighted the role that agriculture played in putting meat on the table: the North Carolina manure spills of the late 1990s. Those marked a turning point; after that, it was hard for anyone to ignore the costs of factory farming.
No state had benefited more from the new geography of hog farming than North Carolina, and no group more than Murphy Farms, one of the biggest hog farmers in the United States. Murphy Farms was the brainchild of Wendell Murphy. After graduating from college in 1960, he taught high school briefly, but like that other schoolteacher-turned-agricultural-power-player, Warren Monfort, Wendell Murphy wanted a different life. In the early 1960s, he bought a corn mill, which he operated with his father and brother. From there it was a short leap into feeding hogs. In 1964, the Murphys recruited their first contract “growers,” and as in the broiler industry, the Murphys functioned as banker and coordinator, loaning their growers the money needed to buy piglets and feed, and selling the hogs when they were ready for market. Over the next twenty years, the family embraced confinement production, signed up dozens of contractors, and built farrowing operations. In the late 1980s, they expanded into Iowa because, said Wendell, “we wanted to find out if we ought to be in the hog business in Iowa or North Carolina.” He concluded that North Carolina had more advantages, especially weather, but that didn’t stop the family from establishing additional outposts in Missouri and Illinois. But as had been the case with the Monforts back in the 1950s, the Murphys’ desire for growth collided with lack of outlets: Murphy Farms produced more hogs than North Carolina packers could process.
Deliverance arrived in the form of Smithfield Foods, a Virginia-based hog slaughter and pork-processing company. Like Murphy, Smithfield’s president, Joseph Luter III, wanted to expand but couldn’t lay his hands on as many hogs as he needed. Urban sprawl had devoured Virginia farms, and he was trucking a third of his kill from the Midwest, a logistical burden that raised his costs relative to midwestern slaughtering operations. His problem became the solution to the Murphys’. In early 1990, Luter announced plans to build the world’s largest hog slaughterhouse in North Carolina, not far from where Murphy and other hog-farming giants raised millions of animals. When the new plant opened in 1992, it ignited North Carolina’s already robust hog-farming industry: in 1991, the state turned out 2.8 million hogs; in 1994, the number hit 7 million, nearly all of them clustered in the southeastern corner of the state.
But even in hog-friendly North Carolina, hogs, pork packing, and the jobs both created came at a price, and critics dug in their heels. “We are not against the smaller farmer,” explained a spokesman for the Alliance for a Clean Swine Industry, but he and other opponents objected to “the bondage of feces and urine” created by big hog farms. But what mattered more? Jobs or odor? According to many residents in that part of the state, jobs did. When Bladen County officials held a hearing to consider Smithfield’s request to build, more than a thousand people showed up, many of them wearing “I support Smithfield” buttons, and a “wildly cheering” crowd roared its approval when the state’s commissioner of agriculture urged county officials to let Smithfield move forward. As hogs, and jobs, proliferated, local media tracked the turmoil and the debate. In February 1995, reporters at the Raleigh News & Observer published a series of reports on the impact of the state’s hog industry, little of which was flattering, especially the portrayal of Wendell Murphy as “Boss Hog,” the legislative kingpin who called the shots and built the industry. (Murphy, who served in the state legislature from 1983 to 1992, was not insulted. “All of a sudden, I found myself a hero,” he said later. “It was like all of a sudden people really started coming to me: ‘Man, you are really good. We didn’t know you were doing all this stuff.’”) The series, which won a Pulitzer Prize, described the complaints of people who loathed the industry’s odors and feared the pollution, detailed the way the legislature had smoothed the path for hog farming and the world’s biggest slaughterhouse, and noted the gratitude of those who saw impoverished counties gain jobs and income.
It captured, in short, the complexity, paradox, and unease that was meat in late-twentieth-century America and drew attention to agriculture, an economic sector whose efficiency had rendered it all but invisible in the eyes of the general public. In the wake of the coverage, the state’s legislators pondered changes to the legal structure that had long supported and empowered Murphy and other hog producers.
And then came the storms. In the summer of 1995, and not long after the Pulitzer series ran, torrential, prolonged rainfall inundated large parts of the state. Water swamped a dike on an “industrial swine” farm, and nearly 30 million gallons of feces and urine poured into the New River. Hog waste stood eight inches deep on a nearby road, and sludge coated crops in the fields of nearby farms. “Didn’t nobody mean for it to happen,” said one of the owners of the company that had built the lagoon. “It just happened.” Maybe, maybe not, but reporters discovered that the farm, which was less than two years old, had been the first one built using a new set of strict environmental guidelines designed to protect citizens, land, and water from industrial hog wastes. Over the next few weeks, several more lagoons, including one at a chicken farm, collapsed, washing more waste into waterways. The “environmental Alamo,” said one reporter, destroyed the illusion that giant hog farms were benign. The environmental Alamo had siblings: In 1999, Hurricane Floyd struck the North Carolina coast. Four rivers flooded and thousands of hog carcasses littered the state’s countryside. The ensuing stench and mess, and detailed reporting about it, fueled the debate about livestock production and meat.