Last summer, when The New York Times Magazine devoted an entire issue to Nathaniel Rich’s essay “Losing Earth: The Decade We Almost Stopped Climate Change,” I resisted reading it. It was early August, and dozens of cities around the world had recently reported record-breaking high temperatures. A heat wave in Japan had killed sixty-five people during a single week, and hospitalized tens of thousands more; roads and rooftops were melting in the United Kingdom; in Finland, north of the Arctic Circle, temperatures had approached 90 degrees Fahrenheit. The real-time effects of climate change were—and are—all around us, and it is highly likely that they are going to get much worse. Must we also torture ourselves with reminders of missed opportunities?
Alas, yes. Rich’s Losing Earth: A Recent History—a slightly expanded version of his article—makes a strong case for the value of might-have-beens. In Rich’s telling, the story of climate politics between 1979 and 1989, both in the United States and internationally, is one of great possibility and almost total failure. While limiting the devastating effects of carbon emissions was more difficult during the 1980s than Rich suggests, he effectively excavates an era when alliances were unsettled, minds were far more open to change, and a determined, well-informed effort nevertheless came to naught.
Today, the most obvious enemy of meaningful action on climate change is the fossil fuel industry, which has emphasized the complexity of the Earth’s climate in order to divide the public and immobilize our politics. But in 1979, as Rich points out, the basic science of climate change was not considered especially complicated—or especially controversial. Many government scientists, and researchers at companies such as Exxon, understood and accepted that the carbon dioxide produced by fossil fuel combustion was radically transforming the atmosphere and heating up the planet. Scientists, after all, had been predicting this since 1896, when the Swedish scientist Svante Arrhenius calculated that a threefold increase in the concentration of carbon dioxide in the atmosphere would boost average surface temperatures in the Arctic by 8 to 9 degrees Celsius. Predicting the precise effects of climate change—exactly what will happen where and when—is complex because the global climate system is extremely complex. But for more than a century, the general consequences of loading the atmosphere with carbon dioxide have been about as debatable as gravity.
In the spring of 1979, when a thirty-two-year-old Cornell graduate named Rafe Pomerance, then the deputy legislative director of the environmental organization Friends of the Earth, stumbled on a brief reference to climate change in a government report, it wasn’t difficult for him to grasp the implications. The report, an Environmental Protection Agency analysis of the future of coal as an energy source, mentioned in passing that the continued use of fossil fuels could lead to “significant and damaging” changes to the global atmosphere within two or three decades. Pomerance had a degree in history, not science, but as Rich recounts, he was immediately struck by the possibility that humankind was knowingly destroying the conditions required for its own survival. Why didn’t he know about this? Why didn’t everyone?
Rich follows Pomerance as he chases down the available evidence for climate change, beginning with a 1979 report to the Department of Energy by the Jasons, a semisecret team of elite scientists established in 1960 and periodically convened to find scientific solutions to US national security problems. (Originally known as Project Sunrise, the Jasons named themselves after the mythological hero at the suggestion of a founding member’s wife.) Arrhenius had suggested that human industry could increase atmospheric carbon dioxide concentrations “to a noticeable degree” within a few centuries; the Jasons concluded that carbon dioxide levels would double as early as 2035 and no later than 2060. They predicted that this would increase average global surface temperatures by 2 to 3 degrees Celsius, create Dust Bowl conditions across North America, Asia, and Africa, and cause famines and droughts so severe and long-lasting that they would bring about mass human migration. The warming, the Jasons wrote, would also lead to the “ominous feature” of rapid ice melt at the poles—releasing enough water, Rich tells us, to raise the oceans by sixteen feet.
The Jasons had already sent the report to dozens of government agencies, industry groups, and individual scientists in the US and abroad, but no action had been taken. Pomerance arranged for the report’s lead scientist, Gordon MacDonald, to give a series of informal briefings to senior government officials, and soon learned that few, if any, had grasped the import of the Jasons’ findings. Even President Carter’s chief scientist, Frank Press, who was familiar with the carbon dioxide issue, had told Carter that the “present state of knowledge” did not justify taking action. When MacDonald spoke to Press and the staff of the president’s Office of Science and Technology Policy, he warned of a snowless New England, flooded coastal cities, and a 40 percent drop in US wheat production within his listeners’ lifetimes. He said that the administration’s support for synthetic fuels—liquid fuels synthesized from coal or natural gas—was a step in exactly the wrong direction. Coal production, he added, would ultimately have to end. MacDonald’s recommendations were, to say the least, politically unattractive, but his vivid description of the costs of inaction convinced Press to request a full assessment of the carbon dioxide problem from the National Academy of Sciences.
When the National Academy team convened in Woods Hole, on Cape Cod, in the summer of 1979, they called the NASA researcher James Hansen, who at the time was one of a very few scientists studying the effects of carbon emissions using computer models of the global climate. The predictions of Hansen and others led the team to conclude that the Jasons had been optimistic: according to Rich, their results showed that “when carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome falling in the middle: a warming of 3 degrees.” The last time the planet had been so warm was during the Pliocene, when the seas were eighty feet higher and beech trees were growing in Antarctica. In its report, the National Academy team warned that “a wait-and-see policy may mean waiting until it is too late.”
The National Academy report was generally accepted as authoritative, so much so that the fossil fuel industry recognized, Rich writes, that a “formal consensus about the nature of the crisis had cohered.” Exxon, along with the American Petroleum Institute, had been studying the effects of carbon dioxide emissions since the mid-1950s, but lack of government concern had made it easy for the industry to justify inaction. After the release of the National Academy report, however, Exxon research laboratory manager Henry Shaw recommended to his superiors that the company “start a very aggressive defensive program, because there is a good probability that legislation affecting our business will be passed.” Exxon executives created a new climate research program with an annual budget of $600,000, charging it with quantifying the company’s responsibility for climate change—and ultimately minimizing the regulatory burden on the company.
Within a few years, the industry’s posture would shift from “aggressive defensive” to simply aggressive, and Exxon and its allies would launch an all-out attack on both climate legislation and the science supporting it. But in 1980 both Shaw and his bosses believed that wary cooperation was wiser than defiance: Congress had just held its first hearing on climate change; Carter had ordered another, more comprehensive climate change report from the National Academy; and the National Commission on Air Quality had invited Shaw and two dozen other climate and energy experts to a meeting to help develop climate legislation.
In October 1980 Shaw and the other experts met in a hotel on a barrier island off Florida’s Gulf Coast. Pomerance, who attended the meeting, had high hopes: after a year and a half of discussing the effects of climate change, he was finally going to hear a conversation about how to prevent it. The meeting got off to a strong start, with general agreement on the severity of the problem and the need to reduce carbon emissions. Even Shaw concurred, stipulating only that there would need to be a “very orderly transition” from fossil fuels to renewable energy.
But when it came time to commit to specific solutions, the experts began to hesitate. China, the Soviet Union, and the United States were all accelerating coal production; Carter was planning to invest $80 billion in synthetic fuels. Proposed laws or regulations would focus attention on the costs of emissions reduction, instantly politicizing the issue. “We are talking about some major fights in this country,” said the economist Thomas Waltz. “We had better be thinking this thing through.” By the third day, Rich recounts, the experts had abandoned solutions and were even reconsidering their statement of the problem, loading it with caveats. (Were climatic changes “highly likely” or “almost surely” to occur? Were said changes of an “undetermined” or “little-understood” nature?) In the end, the meeting’s final statement was weaker than the language the commission had used to announce the workshop, and Pomerance left in frustration.
When President Ronald Reagan was inaugurated in January 1981, he began a wide-ranging attack on US environmental policy, appointing zealously antiregulation partisans to head the Environmental Protection Agency and the Department of the Interior and threatening to open public lands to more mining, drilling, and logging. The nation’s embryonic climate policy, however, was largely left alone: the administration’s standard response to questions about the connection between rising global temperatures and carbon emissions was that no governmental action would be taken until the National Academy completed its second climate change report, the comprehensive analysis of social and economic effects commissioned by President Carter. Meanwhile, Rich writes, climate change “continued to vibrate at the periphery of public consciousness,” as a succession of major studies—and news headlines—confirmed the conclusions of the first report.
The second National Academy report was released in October 1983, and while its overall tone was cautious, it was punctuated with grim warnings. “We are deeply concerned about environmental changes of this magnitude,” the authors stated in their executive summary. “We may get into trouble in ways that we have barely imagined.” The report quoted historian Barbara Tuchman on the consequences of climate changes in Europe during the Middle Ages—“famine, the dark horseman of the Apocalypse, became familiar to all”—and recommended that present-day researchers prioritize work on renewable fuels: “The potential disruptions associated with CO2-induced climate change are sufficiently serious to make us lean away from fossil fuel energy options, if other things are equal.”
In interviews, however, lead author William Nierenberg and his coauthors emphasized the need for “caution, not panic,” and predicted that the climate problem would be “manageable in the next hundred or so years.” Like many scientists of his era, Nierenberg believed that ingenuity—especially American technological ingenuity, which had won World War II and developed the aerospace and computer industries—would protect humanity from worst-case scenarios. Headlines reflected the interviews, not the contents of the report itself, and both the Reagan administration and the fossil fuel industry readily accepted the authors’ reassuring interpretation.
For a time, Rich writes, Pomerance and others were able to link climate change to the thinning of the ozone layer, after images of an ozone “hole” created by the use of chlorofluorocarbons in aerosols and refrigerants gripped the public imagination in the mid-1980s. Climate change made headlines again during the sweltering summer of 1988, when wildfires raced through a third of Yellowstone National Park, ducks from the lower forty-eight states fled en masse to cooler climes in Alaska—temporarily swelling the state’s pintail population fifteenfold—and James Hansen testified during a televised congressional hearing that warming of the planet caused by humans was “already happening now.”
But climate change never recaptured the sustained attention it had received earlier that decade. During his 1988 presidential campaign, George H.W. Bush promised to be an “environmental president.” “Those who think we are powerless to do anything about the greenhouse effect,” he told supporters at a campaign stop in Michigan, “are forgetting about the White House effect.” Once in office, however, Bush proved to have only a passing interest in climate change, and his chief of staff, John Sununu, was suspicious of environmentalists and environmental policy. Sununu, an MIT graduate who liked to call himself an “old engineer,” had a rudimentary climate model installed on his desktop computer and, after unsuccessfully attempting to replicate Hansen’s conclusions, declared them to be “poppycock.” He told James Baker, Bush’s secretary of state, to “stay clear of this greenhouse effect nonsense,” and issued a similarly stern warning to EPA administrator Bill Reilly, a lawyer and urban planner whose support for emissions reductions was soon drowned out by Sununu and his supporters.
In November 1989, when the world’s environmental ministers gathered in the Netherlands to agree on a framework for a global emissions treaty, US representatives sabotaged the negotiations, forcing the group to abandon any hard limits on emissions of greenhouse gases and diluting the meeting’s final statement to a vague call for reducing emissions “to a level consistent with the natural capacity of the planet.” The decade of possibility was over, and Sununu, who presided over its ignominious end, could easily be blamed for its failure. But as Rich points out, Sununu’s success was made possible by the weakness of US public and political support for climate action: by 1989, after a succession of halfhearted expert warnings, the once-widespread concern about climate change had subsided into complacency.
Losing Earth argues convincingly that during the 1980s, many people from various political backgrounds were willing to consider some sort of action on climate change. But it is an overstatement to say, as Rich does in his introduction, that we had an “excellent chance” of solving the problem in the 1980s, and that the “conditions for success were so favorable that they have the quality of a fable.” Members of the Carter administration were genuinely concerned about the issue, but in the end the administration did little more than commission reports. Some moderate Republican senators and Bush administration officials supported action, but most prominent Republicans were uninterested in or actively opposed to limiting fossil fuel production and use. And while the fossil fuel industry signaled its willingness to participate in climate policymaking, that willingness was never tested by substantive proposals. The conditions for success seemed favorable only because success was still so very far away.
What is clear from Rich’s story, and what remains relevant today, is that many of those who were concerned about climate change gave up before they even got started: anticipating opposition, they shied away from specifics, declining even to articulate the magnitude of the problem and the range of possible solutions. The heroes in Rich’s book—people like James Hansen, Rafe Pomerance, and Gordon MacDonald—didn’t risk their careers, much less their lives; they simply spoke clearly and forcefully, not only about the existing scientific evidence but also about the need to act on that evidence. The trouble was, and is, that not everyone with their level of knowledge and influence followed their example. That hesitation gave everyone else—individuals, governments, corporations—a welcome excuse for inaction.
Today, there is another layer of resistance—the climate change denialism created and encouraged by fossil fuel companies. Exxon continued its tacit cooperation with policymakers until after the 1989 conference in the Netherlands, when federal regulations had slipped from unstoppable to unlikely. Then Exxon and its competitors began to support an American Petroleum Institute (API) press campaign that paid scientists to write op-eds emphasizing the uncertainties in climate science. (As Stanford researcher Ben Franta noted after the publication of Rich’s magazine story, the API was downplaying the dangers of climate change as early as 1980.) The press campaign was so successful and so cheap to run that it quickly expanded. By the early 2000s, API-supported groups were questioning not only the accuracy of climate change predictions but the basic science that Svante Arrhenius had described in 1896.
Now those tactics are institutionalized: Trump administration appointees have eliminated the use of some long-range climate models, dropped worst-case scenarios from the quadrennial National Climate Assessment, and proposed a “climate review panel” that would question the work of government climate scientists. The panel would be headed by William Happer, a physicist who once compared the “demonization” of carbon dioxide to Hitler’s genocide of the Jews.
During the eight months between the publication of the article “Losing Earth” and the book Losing Earth, the Intergovernmental Panel on Climate Change (IPCC) predicted that within roughly twelve years, barring radical changes in energy use, humanity would have committed itself to at least 1.5 degrees Celsius of warming—and to all the catastrophes that come with it, from sea-level rise to increasingly severe wildfires and hurricanes. A weather station in New South Wales recorded the hottest night in Australian history. And my ten-year-old daughter suddenly became a climate activist.
For several years, she had been so disturbed by grownups’ offhand comments about “the end of the world” that she chose not to learn anything about climate change—a choice I respected. But earlier this year, her class watched Our Climate Our Future, an intelligent video about the science and politics of climate change produced by the Alliance for Climate Education. Then she read about Greta Thunberg, the sixteen-year-old whose solo sit-in outside the Swedish Parliament last year began a global wave of student demonstrations calling for action on climate change. Before long, my daughter and her friends were holding placards in front of our town hall in southern Washington State.
“Greta says we have twelve years left to fix it,” she told me. “What if we don’t?” What I told her was, I hope, both accurate and reassuring: people in power had waited much too long to act on climate change, but there were still a lot of ways to make the future better, and there would be for a long time to come. What I haven’t yet told her was that the IPCC report—as well as the sweeping UN assessment of global biodiversity released in early May, whose summary findings estimate that a million species are now threatened with extinction owing to habitat loss, climate change, and other factors—has made me ask myself more or less the same question: What if we don’t? I’ve been writing about climate change for almost two decades, but as Rich puts it in Losing Earth, I’ve only just begun to “seriously consider the prospect of failure.”
Those in power in the 1980s might have fixed the climate problem by confidently facing the facts and offering their sustained support to emissions reductions policies and other technical fixes. It’s certainly still possible to cut greenhouse gas emissions in half before the year 2030, with the right investments and policy incentives, but after decades of denialism, it’s not likely. What’s needed now is a speedy transformation of public opinion, and in the past, Rich writes, such transformations have been accomplished not with political or economic arguments but “on the strength of a moral claim that persuades enough voters to see the issue in human, rather than political, terms.” Like David Wallace-Wells in The Uninhabitable Earth (2019), Rich does not hesitate to define the stakes:
We know that the transformations of our planet, which will come gradually and suddenly, will reconfigure the political world order. We know that if we don’t sharply reduce emissions, we risk the collapse of civilization…. We also know that the coming changes will be worse for our children, worse yet for their children, and even worse still for their children’s children, whose lives, our actions have demonstrated, mean nothing to us.
If we don’t fix it, then, the Earth will be uninhabitable for future generations. And that prospect has profound consequences for us. In the weeks after the publication of the IPCC report, I ran across a short book called Death and the Afterlife (2013), based on a series of lectures by the philosopher Samuel Scheffler. Scheffler is less interested in the belief in a life after death—what he calls the “personal afterlife”—than in our assumption that humanity will endure long after our individual selves are gone. This sense of a “collective afterlife,” he argues, matters much more to us than we realize. What if you knew that thirty days after your own death the planet would be destroyed by an asteroid, Scheffler asks, or if the entire human population were rendered infertile, as in the P.D. James novel The Children of Men? Which of your current projects and activities would still seem worth pursuing?
“In some very basic respects, our own survival, and even the survival of those we love and care about most deeply, matters less to us than the survival of strangers, the survival of humanity itself,” writes Scheffler. “The prospect of the imminent disappearance of the race poses a far greater threat to our ability to treat other things as mattering to us and, in so doing, it poses a far greater threat to our continued ability to live value-laden lives.” That is, ultimately, what we stand to lose if we don’t respond to the threat of climate change: not only a habitable Earth, but also the value of our own still-unfolding lives.