Down to Earth: Nature's Role in American History

by Ted Steinberg


  TWO ALL-BEEF PATTIES

  More than anything else, it was the rise in popularity of the fast food hamburger that launched Americans on a bovine extravaganza. The origins of the hamburger are obscure. As far back as the 1830s, something called a hamburg steak was evidently served at Delmonico’s restaurant in New York City. But in all likelihood, the hamburger, as we know it today, emerged in the early twentieth century. White Castle, founded in 1921 in Wichita, Kansas, was the first company to promote the hamburger as a form of fast food. Spreading from Kansas to cities across the Midwest and then to markets in the East, White Castle sold its burgers for five cents apiece and catered mainly to a working-class clientele, often building locations near factories. In one banner week in 1925, the company sold over 84,000 burgers, stuffed into insulated bags with the slogan “Buy ’em by the Sack” printed across them.27

  It was not until the 1950s, however, that fast food restaurants assumed their dominant role in American culture, in large part because of the work of McDonald’s founder, Ray Kroc. After his visit to the famed San Bernardino hamburger bar, Kroc bought the franchise rights from the two McDonald brothers. In 1955, Kroc opened his first restaurant in a suburb of Chicago.

  Kroc realized that if he made his hamburgers bland enough, holding off on seasonings and spicy sauces, he could sell them to a wide segment of the American population, even to children. Indeed, Kroc self-consciously made children, with their unsophisticated palates, the principal target of his advertising, creating the Ronald McDonald clown as a way of pitching his burgers. According to one poll conducted in 1986, 96 percent of children surveyed were able to identify Ronald, with only Santa Claus scoring higher.28

  Kroc also seemed to have a knack when it came to locating his restaurants. In 1959, having opened 100 stores, Kroc hired an airplane to help him identify the best sites for future outlets. He singled out shopping centers and large intersections as he sought to capitalize on the nation’s suburban, highway-oriented culture. Clever placement near housing developments helped him attract overworked mothers. “You Deserve a Break Today,” the company sloganeered. As one foreign observer marveled, “A family of four can save Mother two hours in the kitchen, eat and drink for about $5 and get back into their station wagon in fifteen minutes flat.” On less hectic days, a family might choose to linger at one of the playgrounds that soon became fixtures at McDonald’s and other fast food restaurants, allowing the chains to capitalize on the decline in open space endemic to heavily developed suburban areas.29

  Everything in McDonald’s was (and is) planned, right down to the size of the hamburger patty itself: 3.785 inches across, weighing 1.6 ounces and containing no more than 19 percent fat. Kroc put earlier efficiency experts such as Frederick Taylor to shame. He sought total control, taking apart each and every step in food preparation and service and detailing exactly how it was to be accomplished in a company operations manual. The first such manual was 75 pages in length, spelling out such trivial details as the order in which to flip rows of hamburgers (third row first). Over the years, the manual has grown to the point where today it fills over 600 pages.30

  The assembly-line production of hamburgers was the retail counterpart to the equally efficient cattle disassembly line. In both cases, the introduction of machines and precise instructions allowed employers to hire cheap labor. McDonald’s now trains more people than the entire U.S. Army, primarily young people ages 15 to 19. Like the new generation of meat-packers, McDonald’s has been fiercely anti-union. In the 1970s, the company staved off more than 400 unionization efforts, using lie detector tests in at least one instance to intimidate employees.31

  McDonald’s has experienced extraordinary success. In 1972, the company became the largest meal-serving organization in the nation. In 1976, when beef eating in the United States peaked (it has since declined because of worries about its health effects), McDonald’s was selling more than six million hamburgers a day. By the late 1990s, one in seven visits out to eat found the American consumer headed for its Golden Arches. The orgy of hamburger-eating has helped make McDonald’s the world’s largest beef buyer, relying on the slaughter of three million cattle each year in the United States alone.32

  “OVER 17 BILLION SERVED”

  Drawn in 1974, near the peak in American beef consumption, this sketch of Americans eating hamburgers inside the Statue of Liberty pictures a small “M” for McDonald’s on the crown. (Library of Congress)

  The rise in hamburger-eating was of course good news for America’s beef industry. Clever marketing by fast food companies helped boost their market share, but so did earlier initiatives in Washington. In 1946, the USDA legally defined what a hamburger could be and, under its definition, a “hamburger” could contain nothing but ground beef and beef fat. Although any kind of fat would do the job of binding together a burger to keep it from falling apart on the grill, the agency, by decreeing that only cattle fat could be used, gave the beef industry a veritable patent on perhaps the only food product more American than apple pie. Because cattle set free on open pastures and fed on grass alone do not develop enough fat, the fast food companies turned to sedentary feedlot cattle, which had a thick layer of fat carved off at slaughter. In the end, the fast food companies received the ingredients for making inexpensive hamburgers, while the beef industry received a monopoly on the nation’s most popular food item.33

  The beef industry has continued to exert tremendous clout in Washington. Beginning in the mid-1950s, at precisely the time that Americans sat poised to go on a beef-eating splurge, scientists uncovered a relationship between diets high in fat and heart disease. With evidence of fat’s harmful effects mounting, liberal Senator George McGovern opened hearings in 1977 on the relationship between food and chronic diseases. His committee’s report recommended that Americans “decrease consumption of meat.” The National Cattlemen’s Association, however, objected to the government’s advice. Shying away from doing battle with the powerful beef lobby, McGovern’s committee revised its recommendation. Instead of a blanket statement advising citizens to eat less meat, the committee weakened its position, counseling consumers to decrease the consumption of “animal fat, and [to] choose meats, poultry, and fish which will reduce saturated fat intake.”34

  In the early 1990s, when the USDA determined that American consumers needed more guidance in choosing healthy food, the beef industry again intervened. This time the agency came up with a food pyramid. Grains and cereals occupied the longest band at the base, vegetables and fruits occupied the layer above, and meat and dairy products were the next layer up, with fats and sugary items capping off the chart. The National Cattlemen’s Association cried out that the guide unfairly stigmatized beef. That caused the USDA to backpedal and ultimately to postpone publication of the pyramid. When it finally released the guide, the USDA revised its accompanying recommendations and urged consumers to eat two to three portions of meat and dairy each day, the same advice it had been giving Americans since as far back as 1958. But what really pleased the industry was the agency’s modification of its stance on the upper limits of meat eating. In 1990, it had advised Americans to consume no more than six ounces of meat per day. Under the revised food pyramid provision, however, the upper limit was bumped up to seven ounces—yet another triumph for the beef lobby.35

  OIL, WATER, GRASS

  The problems with beef extend well beyond the threat that it poses (at least if consumed in large amounts) to human health. Raising cattle on feedlots is an immensely energy-intensive enterprise. Where some of the energy goes is readily apparent, as in the fuel needed to run farm equipment. But it also takes a tremendous amount of energy to produce the fertilizer that farmers use to produce corn, which in the United States is grown largely to fatten livestock. In 1990, America’s cornfields used roughly 40 percent of the nitrogen fertilizer consumed in the nation. One study (from 1980) demonstrated that it took 17,000 kilocalories of energy to produce a kilogram of beef. That is the energy in about a half-gallon of gasoline, all for just 2.2 pounds of meat.36
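The gasoline comparison can be checked with a little arithmetic. A minimal sketch, assuming a typical energy density of roughly 31,000 kilocalories per U.S. gallon of gasoline (a standard engineering figure, not given in the text):

```python
# Check the text's claim that the 17,000 kcal needed to produce
# a kilogram of beef equals the energy in about half a gallon of gasoline.
# Assumption: gasoline holds roughly 31,000 kcal per U.S. gallon.

KCAL_PER_KG_BEEF = 17_000            # from the 1980 study cited in the text
KCAL_PER_GALLON_GASOLINE = 31_000    # assumed energy density of gasoline

gallons_equivalent = KCAL_PER_KG_BEEF / KCAL_PER_GALLON_GASOLINE
print(f"1 kg of beef ~ {gallons_equivalent:.2f} gallons of gasoline")
```

The ratio works out to about 0.55, confirming the book's "about a half-gallon" figure under that assumption.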

  Apart from being a heavy drain on the nation’s energy supply, beef producers are also gluttons when it comes to water. When you add together all the water it takes to produce a pound of beef—to irrigate grain, water the stock, and process the cattle—the total comes to 360 gallons. This demand for water is especially problematic on the Great Plains, home to the nation’s feedlots and beef processors. At IBP’s Holcomb, Kansas, plant, 400 gallons of water are needed to slaughter and process just one animal. That translates into 600 million gallons of water every year to process one million head of cattle. The water comes from the Ogallala aquifer, a 174,000 square mile underground reservoir that once contained nearly 10 trillion gallons. More than half of that magnificent supply is now gone, and if present levels of consumption continue, it is quite conceivable that the Ogallala will be tapped out in just a few more decades.37

  Beyond these ecological impacts is the effect that cattle-raising has had on the vast landholdings of the federal government, both in national forests and in other prime pasture areas. In 1934, Congress enacted the Taylor Grazing Act to bring some semblance of order to the public domain. The act set up a system for leasing the land to ranchers and established a National Grazing Service (which later became the U.S. Bureau of Land Management) to supervise the cowboys. Although explicitly set up as a rental arrangement, the system has led permit holders to treat the leases as a form of private property. In other words, ranchers have at times sold something that was not theirs to sell, the right to use government-owned land to graze cattle.38

  The low price charged ranchers for grazing permits has long been a bone of contention. The price has often been just a fraction of what the land would lease for if it were privately owned. And since the government’s price is calculated per animal unit month—the forage it would take to feed a cow and her calf for 30 days—ranchers have had an economic incentive to overstock the public range. To some, the grazing program amounts to little more than a government handout, or “cowboy welfare” in the words of the radical environmentalist Edward Abbey.39

  The predictable result has been rampant overgrazing, especially along streams, where cattle congregate to access water and forage on level ground. A 1990 study by the Bureau of Land Management and the U.S. Forest Service (the other organization that oversees public grazing lands) revealed that only a third of the nearly 58 million acres the two agencies oversee were in either good or excellent ecological condition.40

  In the intermountain West—the area between the Rockies and the Sierras and Cascades—cattle have had some unforeseen consequences. Cattle are heavy, exerting on the order of 24 pounds per square inch of land. Their sheer weight, combined with their constant grazing of the native bunchgrasses, has severely disrupted the soil in vast stretches of the public range. Into this environment in the 1890s came an Old World plant that westerners called cheatgrass, a species so pernicious that it robbed farmers of their livelihood. Stockmen initially welcomed the nonnative plant, which thrives in disturbed soils. It soon became clear, however, that cheatgrass had limited value as forage. It dies quickly, and the dead grass has little nutritional content. Worse still, the plant, in part because it does not stay green for long, promotes the spread of wildfires. Cheatgrass is now the single most common plant species in this region.41

  If all these impacts were not enough, there is one final irony. Although cattle ranchers in the intermountain West make use of some 300 million acres of federal land, roughly 16 percent of the entire continental United States, they produce a mere fraction (three percent in the early 1980s) of the beef consumed in this country. The rest is either raised on private lands by stockmen in eastern states or imported.42

  THE CONFINEMENT

  Despite all the changes outlined here, there is one aspect of the steer’s existence that has not changed all that much: steers still spend nearly all their lives outside grazing, save for the last three months spent on feedlots. The lives of chickens and pigs, however, have changed far more dramatically. Since the 1950s, the industrial paradigm has been applied even more thoroughly to U.S. poultry and hog farming. The quaint image of Old MacDonald’s farm, a peaceful scene where roosters followed cows around the barnyard, has given way to large-scale operations founded on the confinement of huge numbers of chickens and hogs in indoor quarters. The animals’ lives are closely monitored and controlled, from birth to death. And with hog and poultry production concentrated in the hands of just a few large corporations, modern animal agriculture has emerged as one of the nation’s most formidable environmental challenges.

  Back before the 1930s, chicken was considerably less popular than it is today. The meat, which was dry and unappetizing, came primarily from sterile old hens. But the development of the broiler industry in the Delmarva Peninsula (where Delaware, Maryland, and Virginia come together) introduced Americans to modern chicken-eating as we know it. Broilers were tender, young roosters, suitable, as the name suggests, for broiling (the old chickens were normally fried). The market for broilers boomed during World War II as the government’s “Food for Freedom” program urged Americans to eat more chicken and leave the beef and pork for the troops.43

  In the 1950s, when antibiotics became widely available, farmers began moving chickens from the barnyard into indoor facilities. By the early 1960s, large, vertically integrated corporations had displaced thousands of small farmers and poultry processors, controlling all phases of broiler production (hatching, feeding, and ultimately slaughtering the animals) and marketing individual brands sold directly to consumers. In 1968, Frank Perdue, whose father had entered the Delmarva broiler business during the 1930s, went on television himself to attest to the quality of his birds. The Perdue company fed its birds xanthophyll, derived from marigold petals, to turn their skins from white to yellow, making them better looking and tastier. By 1970, Perdue controlled more than 15 percent of New York City’s broiler market.44

  Perdue started out processing about 18 birds per minute. The birds were hung by their feet and stunned before having their throats slit, heads and feet severed, and lungs vacuumed out. After that, they were cut up, wrapped in plastic, and ready to ship. By 1979, production had increased to some 300 birds per minute. For the disassembly line to run efficiently, all the birds had to have the same body shape, making it necessary for Perdue and the other major poultry companies to control all aspects of the growth and production process, beginning with the bird’s genetic stock. The broilers were genetically engineered to grow larger thighs and breasts than the wild chickens from which they descended. They were also programmed to grow quickly, reaching a weight ripe for slaughter in half the time (just seven weeks) it had once taken the birds to mature.45

  Broiler companies concentrated in the South, where the warm climate cut down on barn-heating costs. In Arkansas, packing plants sprouted near where farmers raised the birds (under contract with the companies) in large confinement barns. By locating in anti-union states such as Arkansas, the companies also held down their labor costs. The rise of large-scale broiler companies has almost completely eliminated the small chicken farmer. Between 1974 and 1992, the share of national sales accounted for by broiler producers selling at least 100,000 birds rose from 70 to 97 percent. Today, the chicken has taken the place of the passenger pigeon as the most populous bird in the nation.46

  Virtually the same set of changes unfolded in the pork industry. Antibiotic use allowed farmers to confine the animals in football-field-sized buildings containing concrete and steel pens. The creatures were genetically identical and programmed to produce a leaner meat to compete with chicken in both fat and cholesterol content. Once a breeding sow delivered piglets (about every five months), they were shipped off to a nursery farm and eventually to a finishing farm, reaching a marketable weight of 250 pounds in only six months.

 
  The man generally credited with industrializing hog farming is a North Carolinian named Wendell Murphy. Beginning in 1969, after an epidemic of cholera caused state officials to quarantine his pig herd, Murphy convinced his neighbors to take over the risk of raising hogs. He provided the pigs and feed; the farmers put up the land and labor. Murphy agreed to pay the farmers a specified price for each pig they raised to market weight. If hog prices shot up, Murphy gained; if they went down, he took a loss. But by only paying for live pigs—it was too bad for the farmers if the pigs died—Murphy shifted the risk of hog raising to others. To contract with Murphy, farmers had to build large confinement barns, structures that have become more automated and expensive over the years. Predictably, the number of hog farms nationwide has declined precipitously (from 600,000 to 157,000 between 1984 and 1999) as the animals are concentrated in fewer, larger indoor settings.47

  In the year 2000, Smithfield Foods acquired Murphy Family Farms and became the largest hog raiser and pork producer in the world. Taking its cue from the giant chicken firms, Perdue and Tyson, Smithfield employs vertical integration, controlling all aspects of pork production from the pig’s birth to its conversion into bacon and other products. “There’s only one way to get consistency—that’s to have common genetics, feed the animals the same way and process them the same way,” remarked Smithfield’s chief executive officer Joseph Luter.48

 