An Anthropocene Journey

My reporting career has taken me from smoldering, fresh-cut roadsides in the Amazon rain forest to the thinning sea ice around the North Pole, from the White House and Vatican to Nairobi’s vast, still-unlit slums. Throughout most of it, I thought I was writing about environmental and social problems and solutions.

Lately I’ve come to realize that my lifelong beat, in essence, has been one species’ growing pains. After tens of thousands of years of scrabbling by, spreading around the planet, and developing tools of increasing sophistication, humans are in surge mode and have only just started to become aware that something profound is going on. The upside has been astounding. Child and maternal mortality rates have plunged. Access to education has soared. Deep poverty is in sharp retreat. Despite the 24/7 distilled drama online and on TV, violence on scales from war to homicide has been in a long decline.

It’s been only a few decades since science began building a picture of the back story to this spectacular ascent. It’s a story about how humans became such a potent environmental influence that a signature of our doings, for good or ill, will be measurable in layered rock for millions of years to come. By altering climate, landscapes, and seascapes as well as flows of species, genes, energy, and materials, we are sealing the fates of myriad other species. And, without a big shift from business-as-usual, we will undermine our own long-term welfare as well.

In 2000, after a century of earlier efforts by scholars, scientists, and at least one journalist (me) to give a name to humanity’s emerging role as a planet-scale force, one word emerged in a heated moment at a global change conference in Cuernavaca, Mexico—anthropocene.

It appears to be here for the long haul. After 16 years of percolation and debate, anthropocene has become the closest thing there is to common shorthand for this turbulent, momentous, unpredictable, hopeless, hopeful time—duration and scope still unknown.

The word is still so novel that no one has even settled on how to pronounce it; the British stress the second syllable and Americans the first. That seems appropriate, given that reactions to the emergence of the term—let alone the actual environmental changes it aims to describe—have come in all colors and flavors. There’s even been a spirited push for alternatives, some rather biting.

I imagine you’ve heard some of the competing words that have bubbled up. We’re actually in the greed-driven Capitalocene, the trash-choked Plasticene, the combustible Pyrocene, the self-loathing Misanthropocene, the testosterone-dominated Manthropocene—even the Obscene. There’s some merit as well as weakness in every label, including the word that sparked it all.

The anthropocene (both the word and the unfolding age) has so much Rorschach-like plasticity that all I can offer as guidance are my informed but subjective reflections based on what I’ve learned and unlearned in my long, quirky journey. I’d argue that what matters most is not resolving some common meaning so much as engaging in deeply felt discussions, fresh lines of inquiry, and new proposals for sustaining the human journey—all of which have been sparked by the emergence of this concept.

Origin Story

To navigate this terrain, it’s best to start with the foundational anthropocene idea, as blurted out in February 2000 during a scientific meeting on human-caused global change. A prominent participant was Paul J. Crutzen, who’d won a Nobel Prize for helping identify the threat certain synthetic chemicals posed to the planet’s protective ozone layer. At the meeting, his frustration grew as peers described momentous shifts in Earth’s operating systems but always anchored them in time by mentioning the Holocene, the formal name for the “wholly recent” epoch of planetary history that began at the end of the last ice age 11,700 years ago.

At one point, Crutzen couldn’t hold back. He interrupted a colleague, as the scientist Will Steffen later described: “Stop using the word Holocene. We’re not in the Holocene any more. We’re in the … the … the … (searching for the right word) … the Anthropocene!”

In his 2014 book The Anthropocene, Christian Schwägerl describes how the room fell silent at first, and then the word became the center of conversation. “The scientists in that conference room in Mexico were profoundly shaken,” Schwägerl wrote. “[O]ne of the most frequently cited natural scientists in the world … was not only describing the past with this new term (something to which geologists are accustomed), but he was also redefining and connecting to the future … a new Earth sculpted by humans.”

Shortly after that meeting, Crutzen learned that Eugene F. Stoermer, an admired analyst of tiny lakebed diatom fossils, had used the word in the 1980s. The two scientists collaborated on an essay for a newsletter for Earth systems scientists. They laid out a scientific rationale for the term and explained why, even though there was no tradition of naming geological spans for their causative elements, in this case it was justified:

Considering these … major and still growing impacts of human activities on Earth and atmosphere, and at all, including global, scales, it seems to us more than appropriate to emphasize the central role of mankind in geology and ecology by proposing to use the term “anthropocene” for the current geological epoch.

Crutzen and several collaborators refined the concept in subsequent papers. The term quickly spread, propelled in a dizzying array of directions as if filling a linguistic vacuum. It began popping up in peer-reviewed literature in a variety of disciplines and eventually spawned at least three scientific journals (and one magazine) using “Anthropocene” in their titles.

It’s not hard to see why reverberations, pro and con, built so quickly. It was an audacious notion to recommend that a human age deserved to join the Paleocene, Eocene, Oligocene, Miocene, Pliocene, Pleistocene, and Holocene as the epochs of geological history that make up the Age of Mammals. This stretch of time, more formally called the Cenozoic Era, began 65 million years ago, after the mass extinction that ended the dinosaurs’ age and enabled ours. And it could continue for a very long time—if the most powerful mammal, Homo sapiens, demonstrates it can turn the sapience in its name into a sustainable journey.

The proposal of an Anthropocene epoch was particularly audacious because it came from a chemist and an ecologist, not a stratigrapher. Stratigraphy is the discipline within geology that develops and maintains the official Geologic Time Scale and International Chronostratigraphic Chart.

In 2008, a group of stratigraphers and other earth scientists, led by Jan Zalasiewicz of the University of Leicester, published the first careful assessment of the intriguing Crutzen-Stoermer hypothesis. Indeed, they found a concrete and durable human signature—literally. Tens of billions of tons of concrete are part of that signature, along with vast amounts of smelted aluminum and more exotic alloys, distinctive spherical particles of fly ash from power plants, bomb radioisotopes, 6 billion tons (and counting) of plastic, and so much more. In that paper, Zalasiewicz and his coauthors concluded that there appeared to be “sufficient evidence” for an Anthropocene epoch to be considered for formalization by the international geological community.

But a long road lay ahead. The following year, Zalasiewicz and some colleagues began assembling a working group on the “Anthropocene” at the invitation of one of the 16 subcommissions of the International Commission on Stratigraphy. Those quotation marks around “Anthropocene” in the group’s name won’t disappear until some final judgment on the validity of a new epoch is reached.

In 2010 I was invited to join the working group, largely because of a quirky role I had played in the evolution of this anthropocene idea in 1992, when I essentially predicted Crutzen’s Mexico moment and what has unfolded since. Since 1985, I’d been writing articles about human impacts on the climate system. In 1991, I finally got a chance to synthesize what I’d been learning, in a short book that would accompany the first major museum exhibition on global warming, at the American Museum of Natural History. Closing out a chapter on the growing human impact on Earth, I typed an almost offhand proposal that we’d jolted the planet out of the Holocene:

Perhaps earth scientists of the future will name this new post-Holocene era for its causative element—for us. We are entering an age that might someday be referred to as, say, the Anthrocene. After all, it is a geological age of our own making. The challenge now is to find a way to act that will make geologists of the future look upon this age as a remarkable time, a time in which a species began to take into account the long-term impact of its actions. The alternative will be to leave a legacy of irresponsibility and neglect that will manifest itself in the fossil record as just one more mass extinction—like the record of bones and empty footprints left behind by the dinosaurs.

I vaguely recall musing on how to spell my passing reference to a name for this age. (I can’t probe the floppy disks on which any trace of that process sits.) “Anthrocene” seemed more streamlined than other choices, and I was pretty naïve when it came to word roots in scientific terminology. It didn’t really matter. The book was published shortly after the end of the Persian Gulf War and the planet-cooling eruption of Mount Pinatubo. Public attention was focused elsewhere. I’m sure no more than a few thousand people read it, certainly not Crutzen or Stoermer. It now floats on Amazon.com’s used listings for as little as one US cent (plus shipping, of course)—another kind of anthropocene shard, in a way.

Reflecting on this now, I’m quite certain that when I wrote “earth scientists of the future,” I was thinking generations, if not centuries, into the future. But it took just eight years for scientific rigor to be applied to the idea of an anthropogenic geological age. We do live in fast-forward times.

A fifth-grade classroom in Emerson School, Ann Arbor, Michigan ©Connie Weber

Language constantly evolves. In 2014, the word passed a significant milestone. The Oxford English Dictionary (OED) adds batches of words four times a year. The 171 words added in June that year included all manner of obscurities (“cholestasis”), words reflecting trends of the moment (“selfie,” “flexitarian”), and “Anthropocene.”

According to the dictionary’s definition, the Anthropocene is “the era of geological time during which human activity is considered to be the dominant influence on the environment, climate, and ecology of the earth.”

Before including it, the OED editors had wisely let the word percolate for 14 years after it first entered widespread discourse. But I’d argue that they jumped the gun in one important technical way and missed the main, grander meaning of the word. That second point is not a criticism; it just reflects the plasticity and richness of this still-emerging neologism.

The technical problem with the definition? The word, despite having roots springing so directly from stratigraphic nomenclature, could still end up rejected as a formal “era of geological time.”

The Upper-Case Anthropocene

Golden spike emplaced in the bed that serves as the Global Boundary Stratotype Section and Point (GSSP) for the Thanetian Stage ©Stan Finney and Lucy Edwards

It was one thing for a couple of environment-oriented scientists, however lauded, to propose a new addition to those colorful stratigraphic charts familiar to millions of earth science students. It’s another thing entirely to gain the approval of a 60-percent supermajority of the leadership of the International Commission on Stratigraphy (ICS), the sage timekeepers of geology—and then the grand overarching body, the International Union of Geological Sciences. The working group’s recommendation to the ICS, when completed, would be just the first step in that process.

Of course, the Geologic Time Scale is always evolving, given the continuing emergence of new discoveries in the field. But the charts are living documents in the same way the United States Constitution is a living document: changes are made only with extreme care and conservatism and following strict protocols.

Many influential stratigraphers have expressed deep skepticism that the Anthropocene deserves formal standing. For one thing, any new addition to the time scale must be useful to science. Calling an abrupt end to the Holocene could achieve the opposite, creating confusion in the literature. There are significant debates over when to mark the starting point or lower boundary of the Anthropocene in the time scale.

Other scientists are concerned about all those flavors and colors of meaning that surround the word outside of geology—potentially tainting the time scale with environmental messaging. One of the starkest challenges came last spring in a critique written by two influential geologists, Stanley C. Finney and Lucy E. Edwards. Its title laid out what they saw as a murky and open question: The “Anthropocene” epoch: Scientific decision or political statement?

Jan Zalasiewicz at an Anthropocene Working Group meeting ©Andrew Revkin

There was some basis for such concerns. Many scientists and others pressing for a more sustainable human relationship with the environment had latched onto the word and idea as a rallying point. In a 2011 interview with Elizabeth Kolbert for National Geographic, Crutzen had put it plainly: “What I hope … is that the term ‘Anthropocene’ will be a warning to the world.”

Now in its seventh year, the working group has been under pressure to complete its formal recommendation to the stratigraphic commission. Almost daily, emails fly back and forth among its 35 members, refining drafts of papers (including a response to Finney and Edwards) and planning next steps. There have been three face-to-face meetings of the group’s members, most recently in Oslo in April 2016.

Coincidentally, that meeting kicked off on the 46th Earth Day. We gathered around a long table in an ornate room at the Fridtjof Nansen Institute in a mansion built a century ago by the famed Arctic explorer for whom the institute was named. For two long days, discussions led by Zalasiewicz and Colin Waters of the British Geological Survey centered on a review of the “arguments against formalization.” The 17 bullet points ranged from the technical and straightforward—“stratigraphic record is minimal … based on predictions … ”—to the testy and provocative—“[T]he Anthropocene is political, not scientific.” As if to remind participants of the gravity of the task, there was a plastic-laminated copy of the scale itself at each seat, along with the usual array of writing pads and pens.

My lack of familiarity with norms of stratigraphy prevented me from engaging too deeply, although I’ve been a minor coauthor on several of the group’s papers. What I think I’ve brought to the table is context. In a presentation, I urged the geologists to take comfort in knowing they’re hardly the first discipline to be thrust into policy relevance or to have their norms shaken by disruptive change. I clicked to a slide showing how the “tree of life” envisioned by Darwin had been utterly disrupted now that DNA sequencing allows a more complete view, particularly of microbes. Just days before the Oslo meeting, a new “tree” had been published in which, as Carl Zimmer noted in the New York Times, “All the eukaryotes, from humans to flowers to amoebae, fit on a slender twig” compared to a dizzying spray of lines of bacteria.

And now the revolutionary genetic editing tool CRISPR is poised to imprint humans’ ambitions on that tree at least as profoundly as fossil fuels have changed the physical world. I also noted that the sparring in the stratigraphy community strongly echoed fights that had first erupted in meteorology and climate science 25 years ago, as new lines of evidence and new tools, such as global climate models, pointed to a growing and disruptive human warming influence. “You’re not alone,” I said. But I stressed, using climate change as an example, that it is possible to separate the “is” of science from the “ought” of society’s choices. With some bumps and bruises, the Intergovernmental Panel on Climate Change had found a way forward. Now it was geology’s turn.

There was some irony in the stroll each day between our hotel and the Nansen Institute. It took us along the shore in front of a giant Jenga-block scramble of horizontal white towers that belong to Statoil. Norway’s mostly state-owned oil company has contributed substantially not only to Norway’s economy but also to global climate change. Even as Norway was adding incentives for drivers to buy electric vehicles to take advantage of ample domestic hydro-electric power, the company announced plans to expand drilling in the Barents Sea to boost fossil-fuel exports. One got the impression that decisions made in that building would have a bigger impact on world affairs than any conclusions we produced.

But there was a second layer of irony there on the windswept shores of the fjord. The grassy stretch along the sinuous path was also a sculpture park. A vertical slab rose from the grass directly in front of the Statoil building, imprinted with an image of one of Easter Island’s moai—the haunting stone figures carved at the potent pinnacle of the great, but vanished, Rapa Nui civilization.

As August drew to a close, Colin Waters headed to the 35th Congress of the International Union of Geological Sciences in Cape Town, South Africa, to summarize the group’s findings, including the results of a vote of members on critical aspects of the evidence for an Anthropocene Epoch. The key points? “Is the Anthropocene stratigraphically real?” Thirty-four yes, one abstention. “Should the Anthropocene be formalized?” Thirty yes, three no, two abstentions (one of which was me).

In deference to the long chain of approvals that lay ahead, he stressed the work plan, which includes a global quest for an appropriate site for a “golden spike”—an actual physical point displaying the evidence for a Holocene-Anthropocene transition.

While many geologists worry that a human-etched epoch grants us too much power on the basis of too little evidence, a few think the proponents of the geological Anthropocene are thinking way too small. One such expert is Jay Quade of the University of Arizona. After decades of fieldwork and lab analysis on six continents, Quade—whose father and grandfather were geologists—seems to live, breathe, and eat insights from ancient rock. I met him in June at a Santa Fe, New Mexico, gathering of scientists focused on the Quaternary Period. He credited the efforts of Crutzen and scientists such as those in the “Anthropocene” working group for all that they were doing but said his reading of the evidence pointed to an even more massive unfolding geological transition. It could, he believed, be akin to—if not bigger than—the Permian-Triassic mass extinction 250 million years ago and the Cretaceous-Tertiary extinction that cleared out the dinosaurs and led to the Age of Mammals—and us.

In his keynote talk, he described the human-driven changes under way on Earth as “creating the mother of all stratigraphic marker horizons.” One slide took the audience 50 million years into the future, projecting what the human imprint would look like after such a span—kind of like what geologists see now in probing previous great events. Our anthropocene moment appears as a brief pulse of trash, rare earths, and the like—along with a profound constriction of mammal species—followed in future ages by a flourishing of surviving and newly evolved mammals. Are humans among them to assess that record? Time will tell.

The Lower-Case Anthropocene

Copenhagen 2009 climate talks ©Andrew C. Revkin

To me, the geological discussion, while vital, is not nearly as important as the wider discourse that has emerged around the word and its implications.

What makes this point in entwined human and planetary history special, and has made this word controversial, isn’t our potency. Cyanobacteria, through the evolution of photosynthesis, started flooding the atmosphere with oxygen more than 2.3 billion years ago. Some earth scientists call that the Great Oxygen Catastrophe. The result was a mass extinction followed, over millions of years, by an extraordinary flourishing of life attuned to that new atmosphere.

But cyanobacteria, as far as we know, weren’t aware of their power. And we are, at least haltingly, starting to recognize ours. It remains to be seen whether the current surge of human-generated carbon dioxide, along with our other environmental impacts, creates what future civilizations might call the Great Carbon Dioxide Catastrophe—or not. The wild card is us. The broader meaning of anthropocene, not captured in the Oxford English Dictionary, centers on how awareness (in theory) comes with responsibility.

Is this the beginning of our end, as some have argued, or the turbulent beginning of a potential new age of enlightened cultural and physical evolution? Can the anthropocene, or Anthropocene, be good?

In June 2014, New Yorker staff writer Elizabeth Kolbert addressed this question in a Twitter post. (The following year, she would win a Pulitzer Prize for The Sixth Extinction.) She’d read “The Delusion of the Good Anthropocene,” Clive Hamilton’s biting critique of a talk I’d given on the prospect of a “good” Anthropocene at Pace University.

Hamilton, known for a dark view and a sharp scalpel, is a professor of public ethics at Australia’s Charles Sturt University and the author of Requiem for a Species: Why We Resist the Truth about Climate Change, among other books.

Kolbert’s tweet distilled much: “2 words that probably should not be used in sequence: ‘good’ & ‘anthropocene.’”

In a subsequent conversation facilitated by the fine Grist blogger Nathanael Johnson, Hamilton and I clarified differences and found lots of common ground. He wasn’t seeking a “bad” anthropocene, for instance, and I didn’t see this as a good time for global ecology. But we agreed on the uniquely consequential nature of this moment and the value of discourse in search of common ground.

We also agreed that the broader implications of humanity’s surging planet-scale impacts can be obscured by technical struggles or disciplinary turf battles over stratigraphic signals. As Hamilton wrote in a commentary in Nature in August 2016, “The new geological epoch does not concern soils, the landscape, or the environment, except inasmuch as they are changed as part of a massive shock to the functioning of Earth as a whole.”

The idea of the anthropocene resonates loudest within circles tussling over the best ways to chart a sustainable human journey. A leading proponent of the “Capitalocene” alternative, Binghamton University sociologist Jason W. Moore, has written that a focus on the anthropocene could “obscure more than it illuminates.” However well intended its supporters may be, they are—by presenting humanity as a single entity—glossing over the real drivers of both environmental and social degradation: inequality, commodification, imperialism, and more. His pitch for Capitalocene leaves out environmental ravages committed under Communist regimes in the Soviet Union and China, where destructive policies began under Mao well before that country’s own capitalist tilt. But his point will be vital to consider as discussions flow forward. Who is the “we” when we talk of common human responsibilities?

It took me more than 20 years of regularly using the word “we” in articles or talks on new scientific insights (“we’ve learned”) or global trends (“we’re changing the climate”) before I fully absorbed that, in several important contexts, there is no “we.”

It’s humbling for me now to reflect on the naïve, preachy way I framed my “anthrocene” notion in that 1992 climate book. The passage reads like a sermon. Who was the “we” in that paragraph? Did it include Pacific islanders or rural villagers in India and Africa who scrabble to make a living facing today’s climatic and coastal threats and who contribute no meaningful amount of greenhouse gases to the atmosphere?

The simplest sources of human variability are geographic and economic. If you’re poor and vulnerable or prosperous and protected, an epic storm has a completely different meaning. In 2007, I was the lead writer of a special New York Times report describing humanity’s “climate divide” along these lines. A Dutch woman who had bought a riverside house that floats safely off its foundation in a severe flood said, “We’re looking forward to floating,” as if it were an amusement park ride. A farmer in northeastern India had a very different reaction as he surveyed waterlogged fields following an early spring flood on the Baghmati River. Three acres of wheat—a third of his income—were gone. Barley, mustard, and peas were ruined.

But I also learned in examining behavioral studies that there can be fundamental differences—shaped by deep-rooted behavioral traits—in how individuals, rich or poor, north or south, perceive environmental change. Are you an edge pusher or group hugger? You know the answer. The person next to you likely has a different answer. One body of research calls the source of differences “cultural cognition.” This is why there’ll never be a common comfort level with a word like anthropocene, or with the signals emerging from the biogeophysical world, or with what to do about it. In essence, we are all on different journeys through this consequential juncture in the intertwined history of human beings and their home planet.

There is one other area where the “we” question has emerged. Who is the “we” who should be making judgments on the anthropocene, even within the constrained scientific debate? Just as Jason Moore found fault with an overly simplified anthropocene distillation of human civilization, British environmental economist Kate Raworth found fault with the composition of the “Anthropocene” working group itself.

I had written a blog post following the second meeting of the group, in Berlin in 2014; I noted, somewhat in passing, that it was “very white, [w]estern, and male.” Raworth fired an apt salvo on Twitter: “The Anthropocene is bad enough. Spare us a Manthropocene.” She included a photo gallery she’d created of nine female experts in global change. In a welcome move, more women have since been added to the group, including Naomi Oreskes of Harvard University, who combines a geologist’s and a historian’s perspectives.

It’s important not to get too caught up in this rarefied level of discussion. In the real world, however discomfiting this might be to those of us engaged with the word, I’d aggressively wager that at least 90 percent, maybe 95 percent, of humanity has not yet heard the word or considered its implications. Wealthy world citizens are insulated from environmental risk. The poorest are so caught up in survival that the future has little meaning. I haven’t found any polls yet testing awareness of the word “anthropocene.” But try a Google Trends search of “anthropocene, global warming, ISIS” and you can see the relative levels of attention.

In the meantime: Whatever you call this period of history, the biogeophysical and increasingly technological reality is playing out on scales that aren’t amenable to old ways of managing risks and opportunities.

Brad Allenby, a longtime analyst of sustainability and technology at Arizona State University, rejects the term Anthropocene entirely because it’s not nearly big enough to encompass what’s going on. He feels referencing geologic time presumes far too much stability and knowledge. “[A]s humans increasingly integrate with the technology around them, and as the evolution of that technology continues to accelerate, it is questionable that what we will have in 50 or 100 years will still be anything like ‘anthro,’” he wrote on the aptly named Future Tense blog earlier this year. “We are trying to tie geologic time to a windstorm.”

A few years ago, after Allenby and I had an onstage discussion of the Anthropocene at Arizona State, a member of the audience proposed a hopeful architecture for the coming decades:

“The way I would like to see it … in, say, 100 years in the future, the London Geological Society will look back and consider this period … a transition from the lesser Anthropocene to the greater Anthropocene.”

That has a nice feel to it. Fully integrating this awareness into our personal choices and societal norms and policies will take time. I’ve taken to encouraging people to meld urgency and patience, however irrational that might feel.

Reflecting on all that has passed and is to come, I see the prospect of slow but substantial and productive shifts in the human enterprise. They will come along with a rich array of perceptions and responses among and within communities—from the scale of global society to that of the stratigraphic community.

Will this happen fast enough? Who knows. But this is the human way. A big part of engaging with the anthropocene, to my eye, is engaging with and even embracing ourselves as individuals and as a flawed and variegated yet amazing species. In 2003, biologists identified “response diversity” as a source of resilience in ecosystems. I’d assert that the same characteristic is an asset in societies as long as they work to level playing fields, foster education and transparency—and communicate.

Perhaps the last thing the world needs is another word. But in 2011, I offered a name for that kind of engagement. It might make you chuckle, given my earlier effort at naming something, but here goes. Anthropophilia.

Edward O. Wilson’s Biophilia was a powerful look outward at the characteristics of the natural world that we inherently cherish. Now we need a dose of what I’ve taken to calling anthropophilia as well. We have to accept ourselves, flaws and all, in order to move beyond what has been something of an unconscious, species-scale pubescent growth spurt enabled by fossil fuels in place of testosterone.

In The World without Us, Alan Weisman created a haunting, best-selling thought experiment—imagining a planet awakening after the vanishing of its human tormentor. The challenge: there is a real experiment well under way, and we’re all in the test tube.

We’re stuck with the story of The World with Us. It’s time to grasp that uncomfortable, but ultimately hopeful, idea. Shall we form an Anthropophilia working group?

___________________________

Andrew C. Revkin has reported on science and the environment for more than three decades, including 14 years at The New York Times. He now writes the Dot Earth blog for the Times. He is the recipient of a Guggenheim Fellowship, serves as the Senior Fellow for Environmental Understanding at Pace University, and is a performing songwriter.

___________________________

About the Header Image: JR is a French photographer and artist who claims to own the biggest art gallery in the world—the streets of the world. His work is designed to catch the attention of people who are not typical museum visitors, mixing art and action and speaking to commitment, freedom, identity, and limits.
The image used here is part of the Women Are Heroes project and was created during an action in Phnom Penh: Open Eyes, Cambodge, 2009.
©jr-art.net