Another popular choice that I have previously advocated dates the start of the Anthropocene to the first atomic bomb tests in the 1940s. This seems correct both scientifically and poetically. These explosions left clear isotopic and geologic signatures that will forever uniquely mark a moment in the rock record of Earth. The hundreds of above-ground nuclear tests conducted from 1945 until the atmospheric test ban in 1963 left an identifiable trace of long-lived plutonium and other radioactive isotopes in sediments around the globe. This provides what geologists call a “golden spike,” a unique marker associated with an event or transition that can be identified widely around the world.2 This nuclear horizon also conveniently coincides closely with the beginning of the postwar Great Acceleration, in which so many human impacts began to surge.
Fallout from the first nuclear explosions records a historical moment when an awareness of our awesome and fearsome new powers as world changers became unavoidable. As Robert Oppenheimer memorably remarked, reflecting on the Trinity explosion, the first nuclear bomb test on July 16, 1945,
We knew the world would not be the same. A few people laughed, a few people cried, most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita… “Now I am become Death, the destroyer of worlds.” I suppose we all thought that, one way or another.
If there was ever a time when we started to see we were all in the same boat, shooting holes in the hull, it was the dawn of the nuclear age. It really is an excellent choice for the beginning of the Anthropocene because it’s the moment we saw what we’d become. It also left an indelible isotopic marker. It’s a point where geology and history merge.
The best argument against the nuclear test horizon for the Anthropocene’s beginning is that humans had already radically altered the planet thousands of years earlier. Geographer Erle Ellis and several colleagues have forcefully argued for what they call the “Early Anthropocene.” They point out that massive planetary-scale changes had been caused by human activities long before the industrial era. These include the extinction of most very large land animals (including woolly mammoths, giant ground sloths, woolly rhinos, and cave bears) between fifty thousand and twelve thousand years ago, and the clearance of forests and spread of agriculture across our planet’s arable lands beginning around seven thousand years ago, which caused CO2 levels to rise. Methane levels rose because of rice agriculture and livestock around five thousand years ago, and many scientists believe that the earliest episode of human-induced global climate change can be traced to this period.
Another provocative proposal is to start the Anthropocene in the year 1610, when the amount of CO2, as found in ice cores, actually decreased noticeably. This has been linked to the Columbian Exchange, the massive biological disruption that occurred after Europeans reached the Americas and trade across the Atlantic caused a sudden interchange of species between hemispheres. In terms of the swapping and mixing of biological information that occurred, it was as if the two hemispheres had sex. You cannot say it was consensual. The sudden drop in CO2 at this time has a surprising and tragic origin. Roughly fifty million indigenous people died, largely from the introduction of pathogens to which they had no immune defenses. The collapse of Native American civilizations allowed the reversion of massive areas of abandoned cultivated fields back to forests. This caused the observed drawdown of CO2 between 1570 and 1620, and a consequent global cooling.3 As with the nuclear bomb horizon, this proposed golden spike, the climate signal of genocide, serves as a permanent marker of the destructive potential of human ingenuity.
A lot of ink and electrons have been spent arguing over when the Anthropocene began, but in a certain way this is like having a protracted argument about the day your daughter finally grew up. Was it her bat mitzvah? Her first day of high school? Her first unchaperoned date? The day she got her driver’s license? Her high school graduation? Likewise, although the argument over the origins of the Anthropocene has been framed as a debate over when it started, what’s really at issue is what “it” is, which is a more interesting question. The debate is most valuable if we read it as a protracted dialogue on how humanity has journeyed from being just another hominid species in East Africa to the global force we are today. Regardless of who wins the argument over a start date, it has gotten us all talking about the history of our planetary-scale influence.
What was the question again? Is it when human activity first became detectable in the geological record? When humans first left a clear global marker, or golden spike? When human-induced change to the landscape became obvious? When we became an important agent of climate change? When we started to be the dominant geological force or pushed multiple environmental variables outside their normal ranges? Each of these would give us a different answer.
When we look at it in this way, we see there is no one beginning. None of the proposals is wrong. Each marks a different stage in the “hominization” of the planet. So rather than insist on a choice, I like to view them as a set. Together they describe a series of interesting waypoints in the development of the changing and increasing human influence.
Entering our time officially into the geologic record invites us to imagine these rocks being dug up and studied someday in the far future—but when and by whom? Imagine a geologist or astrobiologist coming to Earth a few hundred million years from now and digging up the rocks of our time, or finding them conveniently exposed in an area that has been uplifted and cut by a steep river valley. What would those scientists find? They might see novel “plastiglomerate” rocks, the isotope signatures of nuclear tests, or sudden changes in fossil species marking the disappearance of most large land mammals or coral reefs. They might denote the time span of our influence as a specific colored band in their maps. Truthfully, the choice between our various proposed Anthropocene markers may really not matter because, geologically, it’s all the same moment. It’s all right now. Ten thousand years is nothing. It’s less than the width of the line drawn between most of the epochs in the geologic timescale. Many geologic transitions are not resolved within millennial timescales, and almost none of them is resolved within centuries. Those far-future geologists mapping out our time won’t care which date we choose.
Who are these future geologists? Are they human beings? That would mean our species did not cause our own extinction or create machines that made us obsolete. And if in that distant time those scientists are still using our current stratigraphic system, that would mean our culture and our science somehow survived to be passed down, intact, through the ages. Can it? By implicitly posing these questions, the idea of including the Anthropocene epoch in our geologic timescale has value. This symbolism may be its main benefit. Recently some scientists proposed that rather than formally adopt the Anthropocene, the scientific community should simply continue to use it as an informal term. This seems to be what is happening anyway. We can break it up into subdivisions that make clearer which phase we’re talking about: the “paleo-Anthropocene” or “early Anthropocene” or “industrial” or “modern” Anthropocene.
Now let me offer my own modest proposal for a “golden spike” to mark the Anthropocene. If we must choose a geological deposit to mark our time, one that is uniquely human-born, I would suggest the area of Mare Tranquillitatis, the Sea of Tranquility, on the Moon, where the Apollo 11 astronauts first stepped onto the soil of another world, hopped about, did experiments, took rocks and soil, and left behind machines, flags, and footprints. Those boot marks will fade in a few million years as micrometeorites grind them into the surrounding dust, but the overall disturbance of this site, including the alien artifacts we left there, will surely be detectable for as long as there is an Earth and a Moon. This could not have been produced by any other species. In symbolic potency, it matches the isotopic signature of the first nuclear bomb tests. These two markers, one terrestrial and one lunar, are tied together, stemming from the same conflict, because it was the competitive rocketry of the Cold War that carried us to the Moon. Yet our lunar excursions were also fueled by our age-old wandering spirit, the same drive that caused some of us to first leave Africa seventy thousand years ago. These disturbances are the product of our human propensity to explore in teams, to develop new tools to expand our domain to places that are not part of our “natural” habitat. This could not have been done by a species that had not developed world-changing technology. The unique evolutionary significance of this step was captured by Arthur C. Clarke in his short story “The Sentinel” (which became the basis for the novel and film 2001: A Space Odyssey), in which advanced extraterrestrials had left a signaling device on the Moon to serve as a kind of tripwire to alert them when Earth life reached a level of technological maturity worthy of their notice.
This altered lunar landscape also captures the moment we first looked back and saw the unity of our home and our common destiny with all Earth life in the great cosmic loneliness. The beginning of the space age is at least as good a contender as any other for the beginning of the Anthropocene.
Of course, as an actual proposal for correlating geological events on Earth, these Anthropocene lunar disturbances are ridiculously impractical, completely useless. But so what? There is nothing practical about the decision to formalize the Anthropocene epoch. Those geologists millions of years in the future will surely have science and technology that (as Clarke also said) would seem to us like magic. They may wish to detect and map out our early influence, but somehow I don’t think they will need our help. So, I know I’ll lose, but I vote for Tranquility Base.
A 3.6-million-year-old footprint of Australopithecus in Tanzania, next to an astronaut’s footprint on the Moon.
The Mature Anthropocene
When did our world gain a quality that is uniquely human? Many species have had a major influence on the globe, but they don’t each get their own planetary transition in the geologic timescale.* When did we begin changing things in a way that no other species has ever changed Earth before? Certainly making massive changes in landscapes is not new. Beavers do plenty of that, for example, when they build dams, alter streams, cut down forests, and create new meadows. Even changing global climate, and initiating mass extinction, is not a human first. What really distinguishes us from the other life-forms that have come along before and profoundly changed the world? Clearly it has something to do with our great cleverness and adaptability; the power that comes from communicating, planning, and working in social groups; transmitting knowledge from one generation to the next; and applying these skills toward altering our surroundings and expanding our habitable domains. This has been going on for tens of thousands of years, and has resulted in the many environmental modifications proposed as different markers of the Anthropocene start date. Yet, up until now, those causing the disturbances had no way of recognizing or even conceiving of a global change. Yes, humans have been altering our planet for millennia, but there is something new going on now that was not happening when we started doing all that world changing.
To me, what makes the Anthropocene unprecedented and perhaps fully worthy of the name is precisely our growing knowledge of what we are doing to this world. Such self-conscious global change is a completely new phenomenon on this planet. This puts us in a category all our own, and therefore, I believe, it is the best criterion for the real start of the era. The Anthropocene begins when we start realizing it has begun. This also provides a new angle on the long-vexing question of what truly differentiates humanity from other life. Perhaps more than anything else, it is this self-aware world changing that really marks us as something new on this Earth. What are we? We are the species that can change the world and come to see what we’re doing.
By this alternative criterion, the true Anthropocene, what we might call the “mature Anthropocene,” is just now beginning. And what we have experienced so far, all the different stages that have been suggested as start dates (the early, or paleo, or Columbian, or industrial beginnings), all this unconscious human remaking of Earth, collectively forms a kind of preamble.
Defined this way, the true Anthropocene has not yet begun in earnest, or is only now getting started. It begins when we acquire the ability to become a lasting presence on this world. The mature Anthropocene arrives with mass awareness of our role in changing the planet. This is what will allow us to transition from blundering through global changes of the third kind to deliberately making global changes of the fourth kind. It starts with the end of our innocence.
How do we lose our innocence? By developing “situational awareness,” by becoming cognizant of how we are behaving on a planetary scale, in space and time, and integrating that knowledge into our actions. This will not require altruism or idealism or self-sacrifice, only accurate self-perception and “enlightened self-interest.” Responsible global behavior is ultimately simply an act of self-preservation of, by, and for the global beast that modern technological humanity has become.
There’s never before been a geological epoch brought about by a force aware of its own actions. Humanity has at least a dim, and growing, cognizance of the effects of its presence on this planet. The possibility that we might integrate that awareness into how we interface with the Earth system is one that should give us hope. No force of nature has ever decided to change course before. If we do not like some aspects of how this epoch is getting started, its outcome is not set in stone.
What makes the Anthropocene a new type of geological age is our growing knowledge of it. So, I propose that we call this time we’ve been living through so far, in which we’ve been accidentally tinkering with planetary evolution, the “proto-Anthropocene.” We can regard this phase, where the world is largely being altered by our unconscious planetary changes of the third kind, as a first step in realizing our lasting role on Earth. It may be a necessary prelude to the mature Anthropocene, when we fully incorporate our uniquely human powers of imagination, abstraction, and foresight into our role as an integral part of the planetary system. The mature Anthropocene differentiates conscious, purposeful global change from the inadvertent, random changes that have largely brought us to this point.
Viewed this way, the Anthropocene is something to welcome, to strive for.
One Galactic Year from Now
We pace our lives to astronomical tempos: we rise and sleep with Earth’s spin; we count our days in months derived from the Moon’s orbit; and we plant and harvest, measure our lives, and record history by Earth’s orbit around the Sun. Other cosmic cycles have shaped us in profound ways that were completely invisible to our prescientific ancestors. Through the Milankovič cycles, Earth’s climate and (it turns out) human evolution have been pushed and pulled by the motions of the planets. We are also traveling, along with our Sun and the rest of the solar system, on a much longer orbit around the center of the Milky Way galaxy.4 We don’t feel this motion—good thing, since it is carrying us along at about 490,000 miles per hour—but we can measure it. In our current portion of this circuit, we are speeding almost right toward the star Vega, the second-brightest star in the northern sky, and away from Sirius, the brightest star in our sky. This orbit defines another timescale that was hidden to us before the twentieth century: we complete one lap around our galaxy about every 225 million years.5
We can assemble a scrapbook of our cosmic history measured out in these galactic, or “cosmic,” years. Our universe seems to have been around for about sixty-one of them,* and Earth has almost reached the galactic age of twenty-one. As a biosphere, we’re still a teenager of sixteen or seventeen galactic years.
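The scrapbook arithmetic above is simple division by the orbital period. As a quick sanity check, here is a short sketch; the 225-million-year galactic year comes from the text, while the ages plugged in for the universe, Earth, and the biosphere (about 13.8, 4.5, and 3.8 billion years) are round-number assumptions of mine, so the results land near, not exactly on, the figures quoted:

```python
# Convert rough cosmic ages into "galactic years," assuming one lap of
# the Sun around the Milky Way takes about 225 million years (per the text).
GALACTIC_YEAR = 225e6  # years per galactic orbit

ages_in_years = {
    "universe": 13.8e9,   # assumed age of the universe
    "Earth": 4.54e9,      # assumed age of Earth
    "biosphere": 3.8e9,   # assumed age of the earliest life
}

for name, age in ages_in_years.items():
    print(f"{name}: about {age / GALACTIC_YEAR:.1f} galactic years old")
```

With these inputs the universe comes out at roughly sixty-one galactic years and the biosphere at about seventeen; Earth lands near twenty, or closer to twenty-one with slightly different assumed values for its age or the orbital period.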
We find ourselves embedded in an evolutionary sequence encompassing the entire history of the universe, which has gathered itself over billions of years into a succession of forms, increasing in both size and complexity. The story of cosmic evolution is one that natural philosophers and scientists have been telling and refining for centuries, and historians have recently joined in, adding a more human focus and calling it “Big History.”6 Cosmic evolution describes the entire history of the universe in terms of the major transitions it has experienced.
Out of the Big Bang came a dense fog of racing, swarming particles. As the universe expanded and cooled, they clumped into small atoms of hydrogen. Gravity pulled these together into stars, where they fused into a wider diversity of atomic elements. Stars burned and exploded, spewing their guts into vast, diffuse clouds of dust enriched with heavier, chemically reactive elements, where simple molecules formed. Gravity again gathered these clouds into new stars, and planets. On some of these planets, more complex molecules formed.
Then, on at least one planet, the story kept going, and a whole new saga started in some dirty, warm ponds where organic molecules experimented upon themselves. At some point, around seventeen cosmic years back, one of these assemblages of chemicals hit upon the ability to produce exact copies of itself. This was a newfound kind of stability, a way to preserve survival of type through successive generations, so the design persisted even as the individual expired. This step was the origin of life, a method of not only perpetuation but improvement. The designs get better because not all copies are exact. Typos in the genetic code ensure that mistakes are made, some of which are keepers, with increased ability to survive and propagate. By the next spin around the galaxy, complex molecules had begotten life. After that, it was more than ten more galactic years before cellular life formed into complex and giant (compared to bacteria) plants and animals.
If we were more attuned to this galactic cadence, we might now be commemorating the anniversary of a great tragedy. Just over one galactic year ago our biosphere was reeling from its greatest trauma: the Permo-Triassic extinction, or the “Great Dying,” which I write about in chapter 3.
Earth in Human Hands