Science: The Coming Century

Fifty years ago no one could confidently have predicted the geopolitical landscape of today. And scientific forecasting is just as hazardous. Three of today’s most remarkable technologies had their gestation in the 1950s. But nobody could then have guessed how pervasively they would shape our lives. It was in 1958 that Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor built the first integrated circuit—the precursor of today’s ubiquitous silicon chips, each containing billions of microscopic circuit elements. This was perhaps the most transformative single invention of the past century. A second technology with huge potential began in Cambridge in the 1950s, when James Watson and Francis Crick discovered the bedrock mechanism of heredity—the famous double helix. This discovery launched the science of molecular biology, opening exciting prospects in genomics and synthetic biology.

And it’s just over fifty years since the launch of Sputnik. This event started the “space race,” and led President Kennedy to inaugurate the program to land men on the moon. Kennedy’s prime motive was of course superpower rivalry—cynics could deride it as a stunt. But it was an extraordinary technical triumph—especially as NASA’s total computing power then was far less than that of a single mobile phone today. And it had an inspirational aspect too: it offered a new perspective on our planet. Distant images of earth—its delicate biosphere of clouds, land, and oceans contrasting with the sterile moonscape where the astronauts left their footprints—have, ever since the 1960s, been iconic for environmentalists.

To young people today, however, this is ancient history: they know that the Americans went to the moon, just as they know the Egyptians built pyramids, but the motives for these two enterprises may seem equally baffling. There was no real follow-up after Apollo: there is no practical or scientific motive adequate to justify the huge expense of NASA-style manned spaceflight, and it has lost its glamour. But unmanned space technology has flourished, giving us GPS, global communications, environmental monitoring, and other everyday benefits, as well as an immense scientific yield. There is, of course, a dark side: the initial motivation for rocket technology was to provide missiles to carry nuclear weapons. And those weapons were themselves the outcome of a huge enterprise, the Manhattan Project, that was even more intense and focused than the Apollo program.

Soon after World War II, some physicists who had been involved in the Manhattan Project founded a journal called the Bulletin of the Atomic Scientists, aimed at promoting arms control. The logo on the Bulletin’s cover is a clock, the closeness of whose hands to midnight indicates the Editorial Board’s judgment on how precarious the world situation is. Every year or two, the minute hand is shifted, either forward or backward.

It was closest to midnight at the time of the Cuban missile crisis. Robert McNamara spoke frankly about that episode in Errol Morris’s documentary The Fog of War. He said that “we came within a hairbreadth of nuclear war without realizing it. It’s no credit to us that we escaped—Khrushchev and Kennedy were lucky as well as wise.” Indeed, on several occasions during the cold war the superpowers could have stumbled toward Armageddon.

When the cold war ended, the Bulletin’s clock was put back to seventeen minutes to midnight. There is now far less risk of tens of thousands of H-bombs devastating our civilization. The greatest peril to confront the world from the 1950s to the 1980s—massive nuclear annihilation—has diminished. But the clock has been creeping forward again. There is increasing concern about nuclear proliferation, and about nuclear weapons being deployed in a localized conflict. And al-Qaeda-style terrorists might some day acquire a nuclear weapon. If they did, they would willingly detonate it in a city, killing tens of thousands along with themselves, and millions would acclaim them as heroes.

And the threat of a global nuclear catastrophe could be merely in temporary abeyance. During this century, geopolitical realignments could be as drastic as those during the last century, and could lead to a nuclear standoff between new superpowers that might be handled less well—or less luckily—than the Cuban crisis was.

The nuclear age inaugurated an era when humans could threaten the entire earth’s future—what some have called the “Anthropocene” era. We’ll never be completely rid of the nuclear threat. But the twenty-first century confronts us with new perils as grave as the bomb. They may not threaten a sudden worldwide catastrophe—the doomsday clock is not such a good metaphor—but they are, in aggregate, worrying and challenging.

Energy and Climate

High on the global agenda are energy supply and energy security. These are crucial for economic and political stability, and linked of course to the grave issue of long-term climate change.

Human actions—mainly the burning of fossil fuels—have already raised the carbon dioxide concentration in the atmosphere higher than it’s ever been in the last half-million years. Moreover, according to projections based on “business as usual” practices—i.e., the continued growth of carbon emissions at current rates—atmospheric CO2 will reach twice the pre-industrial level by 2050, and three times the pre-industrial level later in the century. This much is entirely uncontroversial. Nor is there significant doubt that CO2 is a greenhouse gas, and that the higher its concentration rises, the greater the warming—and, more important still, the greater the chance of triggering something grave and irreversible: rising sea levels due to the melting of Greenland’s icecap, runaway greenhouse warming due to release of methane in the tundra, and so forth.

There is substantial uncertainty about just how sensitive the temperature is to the level of CO2 in the atmosphere. The climate models can, however, assess the likelihood of a range of temperature rises. It is the “high-end tail” of the probability distribution that should worry us most—the small probability of a really drastic climatic shift. Climate scientists now aim to refine their calculations, and to address questions like: Where will the flood risks be concentrated? Which parts of Africa will suffer the severest droughts? Where will the worst hurricanes strike?
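
To make the worry about the high-end tail concrete, here is a minimal sketch, not a climate model: it treats the eventual temperature rise as a lognormal random variable whose median and spread are illustrative assumptions, and estimates the chance of exceeding various thresholds.

```python
import numpy as np

# Illustrative only: the median rise and the spread are assumptions chosen
# for this sketch, not outputs of any climate model.
rng = np.random.default_rng(seed=1)
median_rise = 3.0   # assumed median warming, in degrees Centigrade
log_spread = 0.4    # assumed standard deviation of log(warming)

# Draw a large sample of possible temperature rises.
rises = rng.lognormal(mean=np.log(median_rise), sigma=log_spread, size=1_000_000)

for threshold in (2.0, 3.0, 5.0):
    prob = (rises > threshold).mean()
    print(f"chance of a rise above {threshold:.0f} degrees: roughly {prob:.0%}")
```

Even with a moderate-looking median, the skewed upper tail leaves a small but far from negligible chance of a really drastic rise, which is precisely the risk that should worry us most.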

The “headline figures” that the climate modelers quote—a 2, 3, or 5 degrees Centigrade rise in the mean global temperature—might seem too small to fuss about. But two comments should put them into perspective. First, even in the depth of the last ice age the mean global temperature was lower by just 5 degrees. Second, these predictions do not imply a uniform warming: the land warms more than the sea, and high latitudes more than low. Quoting a single figure glosses over shifts in global weather patterns that will be more drastic in some regions than in others, and could involve relatively sudden “flips” rather than steady changes. Nations can adapt to some of the adverse effects of warming. But the most vulnerable people—in, for instance, Africa or Bangladesh—are the least able to adapt.

The science of climate change is intricate. But it’s straightforward compared to the economics and politics. Global warming poses a unique political challenge for two reasons. First, the effect is nonlocalized: British CO2 emissions have no more effect in Britain than they do in Australia, and vice versa. That means that any credible regime whereby the “polluter pays” has to be broadly international.

Second, there are long time-lags—it takes decades for the oceans to adjust to a new equilibrium, and centuries for ice sheets to melt completely. So the main downsides of global warming lie a century or more in the future. Concepts of intergenerational justice then come into play: How should we rate the rights and interests of future generations compared to our own? What discount rate should we apply?

In his influential 2006 report for the UK government, Nicholas Stern argued that equity to future generations renders a “commercial” discount rate quite inappropriate. Largely on that basis, he argued that we should commit substantial resources now, to preempt much greater costs in future decades.
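
The arithmetic behind the dispute over discounting is elementary. The sketch below uses an assumed damage figure and assumed rates (the low rate is only in the spirit of Stern’s low discounting, and the higher one stands in for a “commercial” rate) to show how strongly the choice governs what a distant harm is worth in today’s terms.

```python
# Present value of a cost incurred T years from now at annual discount rate r:
#   PV = cost / (1 + r) ** T
# The damage figure and the rates are illustrative assumptions, not numbers
# taken from the Stern report.
damage = 1_000_000_000_000   # assume $1 trillion of climate damage
years = 100                  # incurred a century from now

for rate in (0.05, 0.014, 0.0):   # a "commercial" rate, a low rate, and zero
    present_value = damage / (1 + rate) ** years
    print(f"discount rate {rate:.1%}: present value about ${present_value / 1e9:,.0f} billion")
```

At the commercial-style rate the trillion-dollar harm is worth only a few billion dollars today, hardly enough to justify major spending now; at a near-zero rate it keeps most of its weight. The choice of discount rate, as much as the climate science, is what drives Stern’s conclusion.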

There are of course precedents for long-term altruism. Indeed, in discussing the safe disposal of nuclear waste, experts talk with a straight face about what might happen more than 10,000 years from now, thereby implicitly applying a zero discount rate. To concern ourselves with such a remote “post-human” era might seem bizarre. But all of us can surely empathize at least a century ahead. Especially in Europe, we’re mindful of the heritage we owe to centuries past; history will judge us harshly if we discount too heavily what might happen when our grandchildren grow old.

To ensure a better-than-even chance of avoiding a potentially dangerous “tipping point,” global CO2 emissions must, by 2050, be brought down to half the 1990 level. This is the target espoused by the G8. It corresponds to two tons of CO2 per year from each person on the planet. For comparison, current CO2 emissions for the US are about twenty tons per person, the average European figure is about ten, and the Chinese level is already four. To achieve this target without stifling economic growth—to turn around the curve of CO2 emissions well before 2050—is a huge challenge. The debates during the July meeting of the G8 in Japan indicated the problems—especially how to get the cooperation of India and China. The great emerging economies have so far made minor contributions to the atmospheric “stock” of CO2, but if they develop in as carbon-intensive a way as ours did, they could swamp and negate any measures taken by the G8 alone.
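
Taking only the round per-capita figures quoted above, a short calculation shows the scale of the cuts the two-ton target implies; the inputs are this article’s round numbers, not precise national inventories.

```python
# Per-capita CO2 emissions quoted above, in tons per person per year.
current_emissions = {"US": 20, "Europe": 10, "China": 4}
target = 2   # tons per person per year, the 2050 target quoted above

for region, tons in current_emissions.items():
    factor = tons / target        # how many times over the target the region is
    cut = 1 - target / tons       # fractional reduction needed to reach it
    print(f"{region}: {tons} t/person now, needs roughly a {factor:.0f}-fold reduction ({cut:.0%} cut)")
```

On these figures the US would need roughly a tenfold cut and Europe a fivefold cut, while even China is already at double the target, which is why turning the emissions curve around well before 2050 is such a huge challenge.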

Realistically, however, there is no chance of reaching this target, or of achieving real energy security, without radically new technologies. I’m confident that these will have emerged by the second half of the century; the worry is that this may not be soon enough. Efforts to develop a whole raft of techniques for economizing on energy, storing it, and generating it by “clean” or low-carbon methods deserve a priority and commitment from governments akin to that accorded to the Manhattan Project or the Apollo moon landing. Current R&D falls far short of the scale and urgency that the problem demands.[1] To speed things up, we need a “shotgun approach”—trying all the options. And we can afford it: the stakes are colossal. The world spends around $7 trillion per year on energy and its infrastructure. The US imports $500 billion worth of oil each year.

I can’t think of anything that could do more to attract the brightest and best into science than a strongly proclaimed commitment—led by the US and Europe—to provide clean and sustainable energy for the developing and the developed world. Even optimists about the prospects for solar energy, advanced biofuels, fusion, and other alternatives to fossil fuels have to acknowledge that it will be at least forty years before they can fully “take over.” Coal, oil, and gas seem set to dominate the world’s ever-growing energy needs for at least that long. Last year the Chinese built a hundred coal-fired power stations. Coal deposits representing a million years’ accumulation of primeval forest are now being burned in a single year.

[1] There have been some futuristic proposals to counteract climate change by global-scale projects to attenuate the sunlight reaching the ground, or to “soak up” atmospheric CO2. The huge costs and unpredictable downsides of such “geoengineering” would render it less politically acceptable than more conventional measures. But this approach deserves some study, provided that the focus is not diverted from reductions of CO2 emissions and from adaptation.
