Food Fight


by Mckay Jenkins


  Pro-labeling groups consider this move a joke. Bar codes directing you to the Internet make abstract what ought to be utterly present and clear: Does the package in your hand contain GMO ingredients, or not? If you actually take the time to navigate to a company’s website, you might (perhaps) find somewhere (in small print) that yes, Coca-Cola uses GM corn to make its high fructose corn syrup; or yes, children’s breakfast cereals are sweetened with crystals made from GM sugar beets; or yes, Crisco oil uses GM soybeans. But who’s actually going to go to all that trouble? Add to this the fact that 50 percent of the country’s poor and 65 percent of the elderly do not even own smartphones, and you have to wonder: Is the goal of this move broad public awareness of what goes into food, or another way for companies to obscure what they are feeding us?

  As with the regulation of toxic chemicals in products like cosmetics or baby bottles, companies have also worked hard to limit the size of their battlefield: a single piece of legislation in Congress is a lot easier to manipulate than bills passing through dozens of state legislatures. In 2014, in the midst of major GMO labeling battles in places like California and Vermont, Rep. Mike Pompeo (R-Kansas) introduced a federal bill seeking to prohibit states from requiring GMO labels on food. Opponents of the measure dubbed it the “DARK” Act, for “Deny Americans the Right to Know,” and hundreds of thousands of people signed petitions opposing the bill. “If the DARK Act becomes law, a veil of secrecy will cloak ingredients, leaving consumers with no way to know what’s in their food,” said Scott Faber, senior vice-president of government affairs for the Environmental Working Group. “Consumers in sixty-four countries, including Saudi Arabia and China, have the right to know if their food contains GMOs. Why shouldn’t Americans have the same right?”

  Opponents also considered Pompeo’s bill a gift to Big Food, and indeed, the Pompeo campaign’s top individual contributor has been Koch Industries Inc., the energy, agricultural chemical and fertilizer conglomerate run by billionaire brothers Charles and David H. Koch, who are known for their extensive support of conservative political causes.

  In the end, Big Food won. In the summer of 2016, President Obama signed the Senate version of Pompeo’s bill into law. Although the administration pitched the move as a step forward in the march toward consumer information, the law accomplished most of what Big Food desired: it keeps labeling rules in the hands of a single federal agency, which will decide what percentage of GMOs in a food product will require labeling; it allows for the use of obscure QR codes rather than clear labels on food packages; and most important, it kills far stricter rules written by states like Vermont, Connecticut, and Maine.

  The Obama administration’s fraught decision notwithstanding, the labeling debate continues to raise deeper questions about the ways our food is made. Do you really care only that a food was genetically engineered? Or would you also like to know that it was sprayed with an herbicide that is known to be carcinogenic to humans, or with another chemical known to destroy the plants that monarch butterflies need to survive? That it was sprayed with an insecticide known to kill bees? That it was grown in a monoculture field that is destroying biodiversity generally, or is polluting drinking water supplies? How far do you want to go with this?

  When it comes to food labels, everything comes down to your level of risk tolerance, Jim Carrington, the president of the Danforth Center and a forceful proponent of the safety and benefits of genetic engineering, told me. Table salt is dangerous if used too much, and every year some people die from drinking too much water. Celery, broccoli, potatoes—lots of plants contain natural toxins that help them survive. Does that mean they deserve labels?

  “The question is not whether something has the potential to cause cancer. There is nothing that is not in that category,” Carrington said. “A rooster crows every morning and then the sun comes up. Association does not equal causation.”

  Carrington’s view is that food production depends on all kinds of processes and ingredients that can be delivered in ways that are better or worse, and GMOs are no different.

  “So let’s say we label something that has a GMO ingredient,” Carrington said, a note of sarcasm creeping into his voice. “If we require that, you know what I want to require? I want to know every input that went into that product. I’m concerned about water, and soil erosion, and nitrogen leaching into the waterways. That’s all big-time environmentalism. Show me a label for everything in that box. Show me how much water the crops required, how much fertilizer ran into the nearest waterway or aquifer. But don’t stop there. I want to know how many gallons of fuel were used per pound of produce, what the miles per gallon were for that tractor, whether or not there were any farm animals within two miles because I want to know about E. coli.

  “Marking GMO ingredients as ‘different’ is marking something that in fact has no impact on what’s in the box,” he continued. “There is no substantive difference that will affect you. What I’m saying is, if you get to label something that has no bearing on your health or safety, I say let’s go all the way. Show me every bit of information about how that product was produced so I, as a consumer, can make an informed choice. If you force a label on something that doesn’t matter for reasons you say do matter—‘I want to protect my children’—then I want to claim every bit of every other thing I’m concerned about. It’s not rational, it’s arbitrary, and it has negative consequences.”

  In a way, Carrington’s modest proposal—labeling everything that goes into making our food—precisely reflects the sentiments of people who completely disagree with him about GMOs. It may be that our desire for labels is simply shorthand for our collective desire to know more about a food system that—to most of us—has become utterly industrial, technological, and abstract. We are given so little information about the way our food is grown and have so little contact with people or places that actually grow it. Perhaps the entire debate about GMOs may just be evidence of our cumulative ignorance about one of the most intimate things in our lives: the way we eat.

  So how did we lose our way?

  2.

  The Long, Paved Road to Industrial Food, and the Disappearance of the American Farmer

  The road we have traveled to our current state of eating is actually a very long, interconnected highway. After World War II, American national security strategists decided that protecting the homeland required building a network of broad interstates that mirrored the German Autobahn. This monumental road-building project—now close to 47,000 miles long—was initially conceived as a way to efficiently move troops and military machinery, but it has also had dramatic peacetime consequences for the American landscape, and for the American diet.

  Suddenly, big, safe interstates—and the millions of miles of ring roads, state roads, and town roads they encouraged—allowed people to live farther and farther from the cities where they worked. People moved out of cities in droves, looking for new places to live. Land prices outside cities skyrocketed, and small farmers occupying that land had a hard time resisting when real estate developers came to call.

  Suburban development hit small American farms like a virus. In the 1950s alone, some 10 million people left family farms. Chances are, your grandparents (or even your parents) can tell you stories about all those farms in your area that over the last few decades have been turned into subdivisions and shopping malls. In Maryland, where I live, suburban development has replaced 900,000 acres of farmland (and 500,000 acres of forest) in just the last forty years.

  All these new roads, and the suburbs and industries to which they gave birth, caused a second tectonic shift in American culture: in the way we came to eat. Car-friendly fast-food chains like McDonald’s and Carl’s Jr. and Burger King started popping up along the new highways like weeds. By the early 1960s, Kentucky Fried Chicken was the largest restaurant chain in the United States.

  These restaurants did not cook, exactly; what they did was heat up highly processed, prepackaged foods that tasted exactly the same, whether you were in Dallas or Des Moines. The ingredients didn’t need to be fresh, they needed to be uniform, and storable, and—most important, given skyrocketing demand—they needed to be provided in vast quantities.

  Fast-food joints didn’t need local asparagus from New Jersey or collard greens from Georgia or one-of-a-kind apples grown in small orchards in New York. They needed commodity grains to sweeten their sodas, fry their fries, and feed the animals that could be turned into hamburgers and hot dogs and fried chicken. What these restaurants needed was corn, and wheat, and soybeans. And lots of them.

  As small family farms near population centers went bankrupt or sold their land to developers, and as the American diet started demanding processed meals, food production flowed like beads of mercury to the control of larger and larger industrial farm operations in the Midwest. As food production became centralized, companies that controlled the grains, chemicals, and processing factories became bigger and much more politically powerful. Thanks to intensive lobbying, tens of billions of dollars in federal farm subsidies began flowing to giant agribusinesses that were driving the development of the industrial food system. As early as the 1970s, farmers around the country were being told (in the words of President Nixon’s Agriculture Secretary Earl “Rusty” Butz) to “get big or get out.”

  Most farmers got out. A little over a hundred years ago, there were 38 million people living in the United States, and 50 percent of them worked on a farm. Today, we have 300 million people. How many work on farms? Two percent.

  Today, if you drive across the grain belt—Pennsylvania, Ohio, Indiana, Illinois, Iowa, Nebraska, Missouri, Kansas—you will spend many, many hours crossing an ocean of just three crops: corn, wheat, and soybeans. They are being grown by farmers you will likely never meet, processed in factories you will likely never see, into packaged foods containing ingredients that look nothing like the crops from which they were made. You won’t see it, but your soda will be sweetened with high-fructose corn syrup, which replaced sugar in the 1980s. Your fries will be dunked in boiling soybean oil. And your burgers and nuggets and sliced turkey breast will all be processed from animals fed corn or soybeans, or both.

  What you most likely won’t see, out along the great American road system, are regional food specialties, or the mom-and-pop diners and restaurants that used to serve them. New England clam chowder, New Orleans gumbo, Maryland crab bisque: all these foods require local ingredients, which (by definition) giant farms in Iowa or Kansas are unable to provide. Replacing them has been the food that these farms can provide: Fast food. Processed food. Soda. Pizza. Chicken nuggets. Cheap hamburgers. A vast culinary sameness, all essentially built out of two or three crops, controlled by a small handful of companies. All available twenty-four hours a day in any restaurant, dining hall, or gas station in the country.

  It wasn’t just fast-food restaurants pushing this new food system. Food-processing giants like ADM, ConAgra, and Cargill learned to take monoculture corn and soybeans and turn them into the raw ingredients that could be made into just about anything a supermarket shopper wanted. Companies like General Mills or Coca-Cola could take a few cents’ worth of wheat or corn and process it into Cocoa Puffs or a two-liter bottle of soda and sell it for a few dollars. As food scientists became more creative, they learned how to take wheat and corn and soy and turn them (along with the secret “fragrances” and “flavors” whose provenance only the food scientists seem to know) into limitless quantities of foods sold in suburban supermarkets—as often as not built on top of former farms.

  These new foods were cheap to make, enormously profitable, and consumers seemed to love them. Americans spent $6 billion a year on fast food in 1970. By 2014, they were spending more than $117 billion. Today, Americans drink about 56 gallons of soda a year—about 600 cans per person—and every month, 90 percent of American children visit a McDonald’s.

  As industrial farms continued to grow, they gobbled up not just good land but marginal land, changing the face of millions upon millions of acres of forest, grasslands, hillsides, even wetlands. The strange thing was that the plants they grew—corn, soy, wheat—didn’t seem to mind this change. The plants could grow, weed-like, even in marginal soil.

  So, for better or worse, could the animals. Industrial feedlots across the Midwest began buying trainloads of corn and soybeans to feed an industry that now slaughters 9 billion animals a year.

  As farms consolidated and grew, and as industrial processors increased their demand for ingredients that could be turned into shelf-stable food, farmers responded by growing what the market demanded—and eliminating what the market did not. Over the course of the twentieth century, the varieties of fruits and vegetables being sold by commercial U.S. seed houses dropped by 97 percent. Varieties of cabbage dropped from 544 to 28; carrots from 287 to 21; cauliflower from 158 to 9; tomatoes from 408 to 79; garden peas from 408 to 25. Of more than 7,000 varieties of apples, more than 6,200 have been lost.

  —

  THE DEVELOPMENT of American highways and suburbs caused one of the most dramatic changes in land use in the history of the world. But running parallel to this was an equally momentous shift in agricultural technology, which grew up fast to supply the rapidly changing American diet. In the 1930s, a plant breeder named Henry A. Wallace began boasting of the benefits of crossbred or “hybrid” corn, which he had meticulously developed to produce unprecedented yields. Even Wallace knew he was on to something dramatic. “We hear a great deal these days about atomic energy,” he said. “Yet I am convinced that historians will rank the harnessing of hybrid power as equally significant.”

  Wallace was right. Corn yields doubled—from around 25 bushels per acre to 50 bushels per acre—in ten years. From 1934 to 1944—even before the postwar boom in agribusiness—hybrid corn seed sales jumped from near zero to more than $70 million, and rapidly replaced the enormous variety of seeds farmers had saved and traded for generations. By 1969, yields were up to 80 bushels an acre, and fully 71 percent of the corn grown in the United States was being grown from just a half-dozen types of hybrid seed. Industrial monoculture had arrived. Wallace’s Hi-Bred Corn Company became Pioneer Hi-Bred International, America’s largest seed company.

  Since the 1960s, corn yields have doubled again, and now stand, in some places, close to 200 bushels per acre—nearly a tenfold increase in a single century. This phenomenal increase in production was dramatically accelerated by the invention, in the early twentieth century, of the Haber-Bosch process, which won its German inventors Nobel Prizes for discovering how to convert atmospheric nitrogen into ammonia. The ability to synthesize ammonia—routinely called the most important invention of the twentieth century—made it possible for industry to mass-produce two things that changed the world: explosives during the war and synthetic fertilizers after the war.

  By the late 1940s, the war over, American industries found themselves with an enormous surplus of ammonium nitrate, the primary ingredient used to make TNT and other explosives. Since the synthetic compound also proved to be an excellent source of nitrates for plants, the U.S. Department of Agriculture (USDA) started encouraging the use of these chemicals on American farmland.

  Suddenly, farmers (and their crops) shifted from a reliance on energy from the sun (in the form of nitrogen-fixing legumes or plant-based manure) to a reliance on energy from fossil fuels. Liberated from the old biological constraints, farms “could now be managed on industrial principles, as a factory transforming inputs of raw material—chemical fertilizer—into outputs of corn,” Michael Pollan writes in The Omnivore’s Dilemma. “Fixing nitrogen allowed the food chain to turn from the logic of biology and embrace the logic of industry. Instead of eating exclusively from the sun, humanity now began to sip petroleum.”

  A similar pattern emerged for the poison gases that industry had developed for the war: they were repurposed as agricultural pesticides and herbicides. Monsanto had begun the twentieth century making things like aspirin. In 1945, the company began making herbicides like 2,4-D, which would become a prime ingredient in Agent Orange, and is now one of the most popular farm sprays in the world. Monsanto also spent decades making PCBs, a compound used in both pesticides and electrical transformers (and long since banned as a dangerous carcinogen). By the 1960s, Monsanto was making a whole host of pesticides, with tough-sounding cowboy names like Lasso, Lariat, and Bullet. But the company’s star product was Roundup, the glyphosate-based herbicide that is now the most popular in the world—and which, in a few short years, would be the star player in the growth of GMOs.

  DuPont, Dow, Syngenta, Bayer, BASF—all the world’s largest chemical companies made fortunes manufacturing compounds like DDT, atrazine, and scores of other farm chemicals. Today, the six top chemical companies control nearly 75 percent of the world’s pesticide market.

  This transition, from wartime chemicals to petroleum-based farm chemicals that now cover hundreds of millions of acres in the United States alone, has proven a double-edged sword for the world’s farmers, and for the rest of us. For one thing, it means that most of us, in the words of the Indian food activist Vandana Shiva, are “still eating the leftovers of World War II.”

  True, it cranked up the amount of food farmers could grow, but it also (in the words of Czech-Canadian scientist Vaclav Smil) “detonated the population explosion.” Farmers could now grow a lot more food, but suddenly—thanks in no small part to all this extra food—there were a lot more people to feed. Since the end of World War II, chemical fertilizer production jumped from 17 million tons per year to more than 200 million tons. Excess fertilizers and pesticides that are not taken up by plants seep into the rivers and bays, where they contaminate drinking water and cause algae blooms (and aquatic dead zones) so large they can be seen from space. They evaporate into the air, where they serve as major contributors to climate change.
