Lesser Beasts: A Snout-to-Tail History of the Humble Pig

by Mark Essig


  FIFTEEN

  “A Growing Prejudice Against Pork”

  “The ear was assailed by a most terrifying shriek: the visitors started in alarm, the women turned pale and shrank back.” So begins a famous passage in Upton Sinclair’s 1906 novel The Jungle. The visitors have chosen to take a tour of a slaughterhouse. They watch as a worker hooks chains to the legs of pigs and an overhead rail lifts them into the air: “Another was swung up, and then another, and another, until there was a double line of them, each dangling by a foot and kicking in frenzy.” The narrator is impressed by the efficiency, but also appalled. “It was pork-making by machinery, pork-making by applied mathematics,” he explains, then continues: “And yet somehow the most matter-of-fact person could not help thinking of the hogs; they were so innocent, they came so very trustingly; and they were so very human in their protests—and so perfectly within their rights! . . . One could not stand and watch very long without becoming philosophical, without beginning to deal in symbols and similes, and to hear the hog-squeal of the universe.”

  Most Americans, in fact, could hear such squeals without becoming philosophical. A newspaper described Sinclair’s concern for pig suffering as “nauseous hogwash,” and the author himself later disavowed any such empathy, claiming he had intended the passage as “hilarious farce.”

  Sinclair often struggled to convey his intended messages. A failed author of romantic novels and a recent convert to socialism, he had traveled to Chicago to examine working conditions at slaughterhouses. He hoped that his novel, serialized in a socialist journal in 1905 and published as a book a year later, would expose the plight of exploited workers and prompt a revolution. He failed in that goal. “I aimed at the public’s heart,” he explained, “and by accident I hit it in the stomach.”

  The American people, Sinclair was not the first or last to learn, had a large capacity for ignoring the sufferings of the less fortunate. The food on their plate, however, was a different matter, especially after The Jungle offered this description of sausage making:

  There was never the least attention paid to what was cut up for sausage; there would come all the way back from Europe old sausage that had been rejected, and that was moldy and white—it would be dosed with borax and glycerine, and dumped into the hoppers, and made over again for home consumption. There would be meat that had tumbled out on the floor, in the dirt and sawdust, where the workers had tramped and spit uncounted billions of consumption germs. There would be meat stored in great piles in rooms; and the water from leaky roofs would drip over it, and thousands of rats would race about on it. It was too dark in these storage places to see well, but a man could run his hand over these piles of meat and sweep off handfuls of the dried dung of rats. These rats were nuisances, and the packers would put poisoned bread out for them; they would die, and then rats, bread, and meat would go into the hoppers together.

  From the hoppers emerged sausage that was “sent out to the public’s breakfast.”

  Though The Jungle spelled trouble for the meat industry in general, it was especially bad news for pork. Long regarded as the food of poor people and country folk, it was now increasingly rejected by a nation growing wealthier and more urban. A newer concern was trichinosis, a disease caused by a recently identified parasite that, transmitted through undercooked pork, encysted its larvae in human muscles. That seemed like a big risk for the sake of a pork chop.

  The meat industry fought hard to counteract such prejudices: it cleaned up its plants, worked with the government to craft new rules to combat trichinosis, created new ways to cure and market ham and bacon, and even enlisted the wives of farmers to promote pork in newspaper columns and at grocery stores. Such efforts had mixed results at best.

  Even before The Jungle, the public had grown suspicious of the Chicago meatpackers, who had consolidated into a cartel that fixed prices, set production levels, and divided territory, controlling most of the industry from slaughter to retail. After a rise in retail meat prices—partly beyond the meatpackers’ control—infuriated the public, the federal government in 1902 successfully sued the meat trust under the Sherman Antitrust Act. The victory proved only nominal: the big companies absorbed the costs, skirted the restrictions, and continued to operate as before. The Jungle altered the debate. Previously, consumers had worried about being ripped off. Now they feared being poisoned.

  After the scandals triggered by The Jungle, meatpackers tried to convince the public that their products were sanitary and wholesome—as in this 1912 advertisement showing a fresh-faced blonde girl in white clothes, with ham and bacon carefully wrapped in white paper and proudly displaying the government inspection seal.

  Newspapers and magazines jumped on the muckraking bandwagon and found more evidence of tainted food. Meatpackers had been using borax—a mineral most often found in detergents—to preserve meats, which explained how one firm could advertise an additive that would maintain the freshness of “pork and liver sausage, when exposed on your counter, and in the hottest weather, for at least one week.” Newspaper reports of such practices fueled public outrage. President Theodore Roosevelt, ill-disposed to trusts in general and the meat industry in particular, commissioned an investigation that confirmed most of The Jungle’s allegations. Within four months of the book’s publication, Congress passed the Pure Food and Drug Act and the Meat Inspection Act. The federal government took over inspection of all packing plants, promising that sanitation standards would be enforced and that no “unsound, unwholesome” meat would reach the American public.

  As it turned out, less meat of any sort found its way onto Americans’ plates. In the decade after The Jungle’s publication, per capita meat consumption plunged from 170 to 140 pounds a year, and it would remain relatively low for decades. Two world wars and the Great Depression played a role in that decline, as did the growing number of urbanites who rejected breakfast meats in favor of cereals like Kellogg’s Corn Flakes. Meatpackers, though, blamed The Jungle and what one industry official called “systematic anti-meat propaganda.”

  Beef eventually rebounded from this dip in popularity, but pork continued to struggle. Its associations were largely negative and had deep cultural roots. According to Edward Hitchcock, a chemist and president of Amherst College, bacon “is so extremely undigestible and heavy” that it should be eaten only by the “laboring classes” and shunned by “the sedentary and the literary.” “Fat bacon and pork are peculiarly appropriate for negroes,” a physician explained in 1860. The medical theories of the sixteenth century—that only manual laborers could properly digest pork—were alive and well in the nineteenth. This was thanks in no small part to Sylvester Graham, popular health reformer and father of the eponymous cracker, who embraced Renaissance medical writers in his campaign to persuade Americans to eat less meat.

  Simple dislike buttressed these medical theories. America had inherited from England a hierarchy of meats that placed beef and veal at the top, lamb and mutton next, and pork at the bottom. One cookbook writer dismissed barreled pork as “sea junk”—a reference to its use as a maritime provision—and rejected its taste as “villainous,” while another described pork as “dangerously unwholesome.” An 1893 guide to household management claimed, “A growing prejudice against pork in all its varieties . . . pervades our best classes.”

  Statistics bear out these observations. A 1909 study of 8,000 families in US cities found that wealth shaped the type of meat people ate. The highest income group ate three times as much poultry and 50 percent more beef than the lowest income group. The poor ate the most pork. African Americans ate more pork than whites, and as their income rose, they spent the extra money not on more pork but on beef and chicken. The same was true for southern whites: as they began to earn more money, their pork consumption stayed flat while their consumption of chicken and beef climbed.

  Pork packers, well aware of these trends, responded with new marketing tactics. Rather than selling anonymous barrels of pickled pork, they followed the cultural trend toward national brands marketed directly to consumers. This was one area where pork had an advantage. Beef, because it cured poorly and had to be sold fresh in the butcher case, could not easily embrace this model. Pork products—salted, smoked, and wrapped in branded packaging—could. Customers at butcher shops requested porterhouse steaks or hamburger—company of origin unknown—but they learned to ask by name for Armour’s Star Bacon and Swift’s Premium Ham.

  In their efforts to improve the reputation of pork, meat packers emphasized ham and bacon. The focus on ham was not surprising because it had always been the most prestigious cut. American hams—especially Virginia’s Smithfield variety, from peanut-fed hogs—earned so much praise that Queen Victoria had a standing order for six Smithfield hams a week. Those were dry-cured or country hams, and packers had once produced them in quantity even though the cure took months. In the twentieth century, however, they switched most production to the city ham, wet cured in a brine solution much like barreled pork. To speed the process—lengthy cures tied up a lot of capital—packers started injecting brine into the ham with needles. Later they invented “vein-pumping,” which involved blasting brine from a high-pressure hose into a large vein in the ham and allowing the animal’s circulatory system to spread it through the meat. Such methods cut a three-month cure down to a week, then later to just a few hours. Efficiency triumphed, but flavor suffered. Dry-cured hams achieve greatness over a period of months as enzymes break down proteins into dozens of intense flavor compounds. Wet-cured hams tend to be soggy and insipid.

  Bacon received an even more thorough reinvention. In 1850 the term “bacon” applied to any dry-cured cut of pork; fifty years later, as packers standardized their terminology, “bacon” referred only to belly meat—and nearly all of it was wet cured. The earlier practice of salt-packing pork bellies for more than a month, then smoking them, required too much time for packers operating on an industrial scale. A sweet pickle delivered better results: bellies were dumped into 1,000-gallon vats holding a sugary brine, then drained and moved to the smokehouse. The method could not deliver the intense flavor of dry-cured bacon, but it did reduce labor costs, produce a more consistent product, and give Americans the sweetness they craved even in meat.

  Before World War I, this new type of bacon was cut into slabs of at least four pounds, wrapped in waxed paper, and branded with the company label. Customers sliced it at home. But in 1915 some producers began to sell wet-cured bacon presliced in one-pound packages. By the 1950s, packers had automated their lines so that bellies were pressed to uniform thickness, needle-injected with brine, automatically sliced, shuffled into the familiar shingle-like display, and packed in clear plastic that let customers view the streaks of lean and fat. The automated processes invented to produce modern bacon required heavy capital investment, but they paid off. By 1960 bacon had shed its reputation as a country meat and been reborn as a beloved breakfast staple for all classes.

  Fresh pork experienced no similar resurgence. It didn’t help that consumers were instructed to cook it until well-done—which generally meant bone-dry—in order to kill disease-causing worms. The lifecycle of Trichinella spiralis starts when a host—a person or pig, for instance—eats meat that contains the worm’s encysted larvae. The host’s gut digests the cyst walls, releasing the larvae, which grow into sexually mature adults, mate, and produce more larvae, which enter the bloodstream and then the muscles, where they encyst themselves and wait for another creature to eat them. Cooking meat to 137 degrees Fahrenheit kills the larvae, but not every cook followed that rule, leading to headlines such as “Missouri Town Reports 47 Cases of Trichinosis.”

  Pigs generally got the worms the same way humans did: by eating undercooked pork. Up through the 1950s and 1960s, feeding garbage to pigs on a commercial scale was common. (When in-sink garbage grinders such as the DisposAll became popular in the 1960s, they earned the nickname “mechanical pigs” because they got rid of food waste, a job recently held by genuine pigs.) Garbage-feeding swine clustered around the highest concentrations of garbage, in cities on the East and West Coasts. The American City, a policy journal, surveyed garbage-collection methods in 1920 and found that nearly a third of cities with populations over 100,000 used swine feeding as their primary garbage-disposal method; an even higher percentage of smaller cities did so. The article pointed to the economic benefits: “A ration of 1 pound of marketable pork to 50 pounds of garbage has been established, and with pork at 20 cents a pound on the hoof . . . garbage as feed is worth $8 a ton.” The article cited, as an example, Worcester, Massachusetts, which had a population of 185,000 and kept a herd of 2,000 pigs to process its waste. In a period of just over two years, the city made a profit of $59,000 selling garbage-fed pork.

  The large-scale practice of feeding pigs garbage and then selling the meat became more common during World War II, when corn became expensive and growers sought other sources of feed. Secaucus, New Jersey, home to 250,000 hogs, earned the nickname “Pig Capital of the East.” The farmers collected garbage from Manhattan restaurants each night, fed it to their pigs, and sold the pork back to Manhattan restaurants. The farms survived until 1960, then fell victim to encroachment by the New Jersey Turnpike and neighbors’ complaints about the stench.

  It was a model of efficient waste disposal—except that the garbage often contained pork scraps. Pigs, of course, ate meat, and swine cannibalism on a large scale had been common in America for more than a century because packers sold pork by-products as swine feed. Those by-products, however, had been cooked and dried, rendering them disease-free. Restaurant garbage, by contrast, might contain raw pork, which could transmit not only trichinosis but devastating livestock diseases like hoof-and-mouth and African swine fever. “Human trichinosis is based almost entirely on porcine trichinosis,” a 1942 New York State commission concluded. “And porcine trichinosis is based almost entirely upon feeding hogs raw garbage containing trichinae-infested pork scraps.” In 1952 a national outbreak of a swine disease called vesicular exanthema was traced back to garbage-fed pigs, and the US Department of Agriculture (USDA) took action. A 1952 rule required that all garbage fed to pigs must first be cooked to kill pathogens.

  Garbage-fed pigs constituted a tiny portion of the annual pork crop—less than 2 percent—but they caused an outsized portion of the problems. “Although garbage-fed hogs are daily sold as food universally, there is some aversion to this practice,” one report noted mildly. Meat quality suffered: because pigs lay down fat in much the same form in which they consume it, the taste of their flesh is closely linked to diet. And because the supplies of garbage- and corn-fed hogs were not differentiated in the market, a bad pork chop reflected on the entire industry. Lingering fears of parasites proved even more damaging. As one scientific study noted, “The prevalence of trichinosis in the United States has long cast an unfavorable light on the production of American pork.”

  Despite the best efforts of America’s growing pork industry, the dubious practices of some pig farmers and meatpackers prevented the reputation of pork from rebounding. A 1942 study from the USDA noted “a shift in the consumption of meats as incomes rose, from pork to beef, veal, and lamb.” In other words, despite the remaking of ham and bacon, pork remained the meat of the poor. A 1955 study of urban consumers found that with each step up in income level, beef consumption rose while pork consumption fell. Distinctions between city and country dwellers persisted as well. Urbanites devoted half of their meat consumption to beef and only a quarter to pork, while in the country those percentages were reversed.

  These trends did not bode well for the pork industry. World War II factory jobs accelerated the population shift from country to city, and city dwellers made more money than rural Americans. A booming post–World War II economy raised living standards. After the war, three out of four American families had mechanical refrigerators, which meant they could now keep fresh meat in their homes and had less need to store cured meat in the pantry. Beef, most palatable when fresh, could suddenly be enjoyed with much greater convenience, a shift that undermined one of the greatest reasons for pork’s historic appeal.

  The year 1953 marked the end of an era for the American pig. That year, for the first time, Americans ate more beef than pork: 77.6 pounds of beef per capita, compared to 63.5 pounds of pork. The trend would continue in the decades ahead. By the 1970s, pork consumption had fallen to fifty-one pounds, while beef rose to eighty-six pounds.

  The pork industry took note. The president of the National Pork Producers Council traveled the country to meet with pig farmers and give a talk titled “Improving the Image of Pork and Pork Producers.” In the 1960s the Porkettes, a women’s auxiliary to the Iowa Swine Producers Association, assigned themselves a similar task. They created a mascot, Lady Loinette, who served as a partner to the primary pork mascot, Sir Hamalot. The Iowa Pork Queen, crowned each year by a committee of Porkettes, served as an ambassador for the meat. One queen asked, “Who first but Iowa would envision combining the image of the hog with the enthusiasm of vibrant young women in order to promote the pork industry?” The Porkettes held contests for baking with lard and created a campaign to promote pig leather with the slogan “Pigskin—Our Prettiest Byproduct” (which may have served as a reminder of how distinctly unpretty the other by-products were). The group’s magazine, Ladies Pork Journal, included a feature titled “You’ll Know She’s a Porkette When . . . ,” with such responses as “She passes out new pork recipes to ‘city’ friends.”

  Americans’ pork prejudices persisted nonetheless. The first president of the Porkettes told a story about a “professional man from the city” who visited her farmhouse and told her he did not eat pork because it was “unwholesome.” Another Porkette was conducting a grocery store promotion when an “old grandmotherly lady” explained that she always served applesauce as a side dish “to counteract the poison in pork.”
