Numbers kept rising in the nineteenth and twentieth centuries, after a German chemist, Justus von Liebig, discovered that plant growth was limited by the supply of nitrogen. Without nitrogen, neither plants nor the mammals that eat plants can create proteins, or for that matter the DNA and RNA that direct their production. Pure nitrogen gas (N2) is plentiful in the air but plants are unable to absorb it, because the two nitrogen atoms in N2 are welded so tightly together that plants cannot split them apart for use. Instead, plants take in nitrogen only when it is combined with hydrogen, oxygen, and other elements. To restore exhausted soil, traditional farmers grew peas, beans, lentils, and other pulses. (They never knew why these “green manures” replenished the land. Today we know that their roots contain special bacteria that convert useless N2 into “bio-available” nitrogen compounds.) After Liebig, European and American growers replaced those crops with high-intensity fertilizer—nitrogen-rich guano from Peru at first, then nitrates from mines in Chile. Yields soared. But supplies were much more limited than farmers liked. So intense was the competition for fertilizer that a guano war erupted in 1879, engulfing much of western South America. Almost 3,000 people died.
Two more German chemists, Fritz Haber and Carl Bosch, came to the rescue, discovering the key steps to making synthetic fertilizer from fossil fuels. (The process combines nitrogen from the air with hydrogen derived from natural gas to form ammonia, which is then used to create nitrogenous compounds that plants can absorb.) Haber and Bosch are not nearly as well known as they should be; their discovery, the Haber-Bosch process, has literally changed the chemical composition of the earth, a feat previously reserved for microorganisms. Farmers have injected so much synthetic fertilizer into the soil that soil and groundwater nitrogen levels have risen worldwide. Today, roughly a third of all the protein (animal and vegetable) consumed by humankind is derived from synthetic nitrogen fertilizer. Another way of putting this is to say that Haber and Bosch enabled Homo sapiens to extract about 2 billion people’s worth of food from the same amount of available land.
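The chemistry at the heart of their achievement can be written in a single line, though the industrial conditions that make it work (an iron catalyst, intense heat, and crushing pressure) took years to master:

N2 + 3H2 → 2NH3

One molecule of inert nitrogen and three of hydrogen yield two of ammonia, the feedstock for virtually all synthetic nitrogen fertilizer.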
The improved wheat, rice, and (to a lesser extent) maize varieties developed by plant breeders in the 1950s and 1960s are often said to have prevented another billion deaths. Antibiotics, vaccines, and water-treatment plants also saved lives by pushing back humankind’s bacterial, viral, and fungal enemies. With almost no surviving biological competition, humankind had ever more unhindered access to the planetary petri dish: in the past two hundred years, the number of humans walking the planet ballooned from 1 to 7 billion, with a few billion more expected in coming decades.
Rocketing up the growth curve, human beings “now appropriate nearly 40% … of potential terrestrial productivity.” This figure dates from 1986—a famous estimate by a team of Stanford biologists. Ten years later, a second Stanford team calculated that the “fraction of the land’s biological production that is used or dominated” by our species had risen to as much as 50 percent. In 2000, the chemist Paul Crutzen gave a name to our time: the “Anthropocene,” the era in which Homo sapiens became a force operating on a planetary scale. That year, half of the world’s accessible fresh water was consumed by human beings.
Lynn Margulis, it seems safe to say, would have scoffed at these assessments of human domination over the natural world, which, in every case I know of, do not take into account the enormous impact of the microworld. But she would not have disputed the central idea: Homo sapiens has become a successful species, and is growing accordingly.
If we follow Gause’s pattern, growth will continue at a delirious speed until we hit the second inflection point. At that time we will have exhausted the resources of the global petri dish, or effectively made the atmosphere toxic with our carbon-dioxide waste, or both. After that, human life will be, briefly, a Hobbesian nightmare, the living overwhelmed by the dead. When the king falls, so do his minions; it is possible that our fall might also take down most mammals and many plants. Possibly sooner, quite likely later, in this scenario, the earth will again be a choir of bacteria, fungi, and insects, as it has been through most of its history.
It would be foolish to expect anything else, Margulis thought. More than that, it would be unnatural.
As Plastic as Canby
In The Phantom Tollbooth, Norton Juster’s classic, pun-filled adventure tale, the young Milo and his faithful companions unexpectedly find themselves transported to a bleak, mysterious island. Encountering a man in a tweed jacket and beanie, Milo asks him where they are. The man replies by asking if they know who he is—the man is, apparently, confused on the subject. Milo and his friends confer, then ask if he can describe himself.
“Yes, indeed,” the man replied happily. “I’m as tall as can be”—and he grew straight up until all that could be seen of him were his shoes and stockings—“and I’m as short as can be”—and he shrank down to the size of a pebble. “I’m as generous as can be,” he said, handing each of them a large red apple, “and I’m as selfish as can be,” he snarled, grabbing them back again.
In short order, the companions learn that the man is as strong as can be, weak as can be, smart as can be, stupid as can be, graceful as can be, clumsy as—you get the picture. “Is that any help to you?” he asks. Again, Milo and his friends confer, and realize that the answer is actually quite simple:
“Without a doubt,” Milo concluded brightly, “you must be Canby.”
“Of course, yes, of course,” the man shouted. “Why didn’t I think of that? I’m as happy as can be.”
With Canby, Juster presumably meant to mock a certain kind of babyish, uncommitted man-child. But I can’t help thinking of poor old Canby as exemplifying one of humankind’s greatest attributes: behavioral plasticity. The term was coined in 1890 by the pioneering psychologist William James, who defined it as “the possession of a structure weak enough to yield to an influence, but strong enough not to yield all at once.” Behavioral plasticity, a defining feature of Homo sapiens’ big brain, means that humans can change their habits; almost as a matter of course, people change careers, quit smoking or take up vegetarianism, convert to new religions, and migrate to distant lands where they must learn strange languages. This plasticity, this Canby-hood, is the hallmark of our transformation from anatomically modern Homo sapiens to behaviorally modern Homo sapiens—and the reason, perhaps, we were able to survive when Toba reconfigured the landscape.
Other creatures are much less flexible. Like apartment-dwelling cats that compulsively hide in the closet when visitors arrive, they have limited capacity to welcome new phenomena and change in response. Human beings, by contrast, are so exceptionally plastic that vast swaths of neuroscience are devoted to trying to explain how this could come about. (Nobody knows for certain, but some researchers now think that particular genes give their possessors a heightened, inborn awareness of their environment, which can lead both to useless, neurotic sensitivity and to a greater ability to detect and adapt to new situations.)
Plasticity in individuals is mirrored by plasticity on a societal level. The caste system in social species like honeybees is elaborate and finely tuned but fixed, as if in amber, in the loops of their DNA. Some leafcutter ants are said to have, next to human beings, the biggest and most complex societies on earth, with elaborately coded behavior that reaches from disposal of the dead to complex agricultural systems. Housing millions of individuals in inconceivably ramose subterranean networks, leafcutter colonies are “Earth’s ultimate superorganisms,” Edward O. Wilson has written. But they are incapable of fundamental change. The centrality and authority of the queen cannot be challenged; the tiny minority of males, used only to inseminate queens, will never acquire new responsibilities.
Human societies are far more varied than their insect cousins, of course. But the true difference is their plasticity. It is why humankind, a species of Canbys, has been able to move into every corner of the earth, and to control what we find there. Our ability to change ourselves to extract resources from our surroundings with ever-increasing efficiency is what has made Homo sapiens a successful species. It is our greatest blessing.
Or was our greatest blessing, anyway.
Discount Rates
By 2050, demographers predict, as many as 10 billion human beings will walk the earth, 3 billion more than today. Not only will more people exist than ever before, they will be richer than ever before. In the last three decades hundreds of millions in China, India, and other formerly poor places have lifted themselves from destitution—arguably the most important, and certainly the most heartening, accomplishment of our time. Yet, like all human enterprises, this great success will pose great difficulties.
In the past, rising incomes have invariably prompted rising demand for goods and services. Billions more jobs, homes, cars, fancy electronics—these are things the newly prosperous will want. (Why shouldn’t they?) But the greatest challenge may be the most basic of all: feeding these extra mouths. To agronomists, the prospect is sobering. The newly affluent will not want their ancestors’ gruel. Instead they will ask for pork and beef and lamb. Salmon will sizzle on their outdoor grills. In winter, they will want strawberries, like people in New York and London, and clean Bibb lettuce from hydroponic gardens.
All of these, each and every one, require vastly more resources to produce than simple peasant agriculture. Already 35 percent of the world’s grain harvest is used to feed livestock. The process is terribly inefficient: between seven and ten kilograms of grain are required to produce one kilogram of beef. Not only will the world’s farmers have to produce enough wheat and maize to feed 3 billion more people, they will have to produce enough to give them all hamburgers and steaks. Given present patterns of food consumption, economists believe, we will need to produce about 40 percent more grain in 2050 than we do today.
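A back-of-envelope calculation suggests the scale of the problem (the per-person figure here is purely illustrative): if each of those 3 billion newcomers added just twenty kilograms of beef a year to the table, that would be 60 billion kilograms of beef annually, which at seven to ten kilograms of grain per kilogram of beef translates into roughly 420 to 600 billion kilograms of extra grain every year, a sizable bite out of today’s entire harvest.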
How can we provide these things for all these new people? That is only part of the question. The full question is: How can we provide them without wrecking the natural systems on which all depend?
Scientists, activists, and politicians have proposed many solutions, each from a different ideological and moral perspective. Some argue that we must drastically throttle industrial civilization. (Stop energy-intensive, chemical-based farming today! Eliminate fossil fuels to halt climate change!) Others claim that only intense exploitation of scientific knowledge can save us. (Plant super-productive, genetically modified crops now! Switch to nuclear power to halt climate change!) No matter which course is chosen, though, it will require radical, large-scale transformations in the human enterprise—a daunting, hideously expensive task.
Worse, the ship is too large to turn quickly. The world’s food supply cannot be decoupled rapidly from industrial agriculture, if that is seen as the answer. Aquifers cannot be recharged with a snap of the fingers. If the high-tech route is chosen, genetically modified crops cannot be bred and tested overnight. Similarly, carbon-sequestration techniques and nuclear power plants cannot be deployed instantly. Changes must be planned and executed decades in advance of the usual signals of crisis, but that’s like asking healthy, happy sixteen-year-olds to write living wills.
Not only is the task daunting, it’s strange. In the name of nature, we are asking human beings to do something deeply unnatural, something no other species has ever done or could ever do: constrain its own growth (at least in some ways). Zebra mussels in the Great Lakes, brown tree snakes in Guam, water hyacinth in African rivers, gypsy moths in the northeastern United States, rabbits in Australia, Burmese pythons in Florida—all these successful species have overrun their environments, heedlessly wiping out other creatures. Like Gause’s protozoans, they are racing to find the edges of their petri dish. Not one has voluntarily turned back. Now we are asking Homo sapiens to fence itself in.
What a peculiar thing to ask! Economists like to talk about the “discount rate,” which is their term for preferring a bird in hand today over two in the bush tomorrow. The term sums up part of our human nature as well. Evolving in small, constantly moving bands, we are as hard-wired to focus on the immediate and local over the long-term and faraway as we are to prefer park-like savannas to deep dark forests. Thus, we care more about the broken stoplight up the street today than conditions next year in Croatia, Cambodia, or the Congo. Rightly so, evolutionists point out: Americans are far more likely to be killed at that stoplight today than in the Congo next year. Yet here we are asking governments to focus on potential planetary boundaries that may not be reached for decades. Given the discount rate, nothing could be more understandable than the U.S. Congress’s failure to grapple with, say, climate change. From this perspective, is there any reason to imagine that Homo sapiens, unlike mussels, snakes, and moths, can exempt itself from the natural fate of all successful species?
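The arithmetic of discounting makes the pull of the present concrete (the 5 percent figure is merely illustrative): at a 5 percent annual discount rate, a benefit arriving fifty years from now is worth less than a tenth of the same benefit delivered today, because 1.05 multiplied by itself fifty times comes to more than eleven. Asking a legislature to pay full price now for a payoff its grandchildren will collect is, by that math, asking it to overpay elevenfold.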
To biologists like Margulis, who spend their careers arguing that humans are simply part of the natural order, the answer should be clear. All life is similar at base. All species seek without pause to make more of themselves—that is their goal. By multiplying till we reach our maximum possible numbers, even as we take out much of the planet, we are fulfilling our destiny.
From this vantage, the answer to the question whether we are doomed to destroy ourselves is yes. It should be obvious.
Should be—but perhaps is not.
Hara Hachi Bu
When I imagine the profound social transformation necessary to avoid calamity, I think about Robinson Crusoe, hero of Daniel Defoe’s famous novel. Defoe clearly intended his hero to be an exemplary man. Shipwrecked on an uninhabited island off Venezuela in 1659, Crusoe is an impressive example of behavioral plasticity. During his twenty-seven-year exile he learns to catch fish, hunt rabbits and turtles, tame and pasture island goats, prune and support local citrus trees, and create “plantations” of barley and rice from seeds that he salvaged from the wreck. (Defoe apparently didn’t know that citrus and goats were not native to the Americas and thus Crusoe probably wouldn’t have found them there.) Rescue comes at last in the form of a shipful of ragged mutineers, who plan to maroon their captain on the supposedly empty island. Crusoe helps the captain recapture his ship and offers the defeated mutineers a choice: trial in England or permanent banishment to the island. All choose the latter. Crusoe has harnessed so much of the island’s productive power to human use that even a gaggle of inept seamen can survive there in comfort.
To get Crusoe on his unlucky voyage, Defoe made him an officer on a slave ship, transporting captured Africans to South America. Today, no writer would make a slave seller the admirable hero of a novel. But in 1719, when Defoe published Robinson Crusoe, no readers said boo about Crusoe’s occupation, because slavery was the norm from one end of the world to another. Rules and names differed from place to place, but coerced labor was everywhere, building roads, serving aristocrats, and fighting wars. Slaves teemed in the Ottoman Empire, Mughal India, and Ming China. Unfree hands were less common in continental Europe, but Portugal, Spain, France, England, and the Netherlands happily exploited slaves by the million in their American colonies. Few protests were heard; slavery had been part of the fabric of life since the Code of Hammurabi.
Then, in the space of a few decades in the nineteenth century, slavery, one of humankind’s most enduring institutions, almost vanished.
The sheer implausibility of this change is staggering. In 1860, slaves were, collectively, the single most valuable economic asset in the United States, worth an estimated $3 billion, a vast sum in those days (equivalent, as a share of the national economy, to something like $10 trillion today). Rather than investing in factories like northern entrepreneurs, southern businessmen had sunk their capital into slaves. And from their perspective, correctly so—masses of enchained men and women had made the region politically powerful, and gave social status to an entire class of poor whites. Slavery was the foundation of the social order. It was, thundered the South Carolina senator and former vice president John C. Calhoun, “instead of an evil, a good—a positive good.” Yet within a generation of Calhoun’s speech, part of the United States set out to destroy this institution, wrecking much of the national economy and killing half a million citizens along the way.
Incredibly, the turn against slavery was as universal as slavery itself. Great Britain, the world’s biggest human trafficker, abolished its slave trade in 1807, though it was among the nation’s most profitable industries. The Netherlands, France, Spain, and Portugal soon followed. Like stars winking out at the approach of dawn, cultures across the globe removed themselves from the previously universal exchange of human cargo. Slavery still exists here and there, but in no society anywhere is it formally accepted as part of the social fabric.
Historians have provided many reasons for this extraordinary transition. But one of the most important is that abolitionists had convinced huge numbers of ordinary people around the world that slavery was a moral disaster. An institution fundamental to human society for millennia was swiftly dismantled by ideas and a call to action, loudly repeated.
In the last few centuries, such profound changes have occurred repeatedly. Since the beginning of our species, for instance, every known society has been based on the domination of women by men. (Rumors of past matriarchal societies abound, but few archaeologists believe them.) In the long view, women’s lack of liberty has been as central to the human enterprise as gravitation is to the celestial order. The degree of suppression varied from time to time and place to place, but women never had an equal voice; indeed, some evidence exists that the penalty for possession of two X chromosomes increased with technological progress. Even as the industrial North and agricultural South warred over the treatment of Africans, they regarded women identically: in neither half of the nation, with rare exceptions, could women attend college, control a bank account, or own property in their own name. Equally confining were women’s lives in Europe, Asia, and Africa. Nowadays women are the majority of U.S. college students, the majority of the workforce, and the majority of voters. Again, historians assign multiple causes to this shift in the human condition, rapid in time, staggering in scope. But one of the most important was the power of ideas—the voices, actions, and examples of suffragists, who through decades of ridicule and harassment pressed their case. In recent years something similar seems to have occurred with gay rights: first a few lonely advocates, censured and mocked; then victories in the social and legal sphere; finally, perhaps, a slow movement to equality.