
Bonestell LLC

Chesley Bonestell: Saturn as Seen from Titan [Its Moon], 1944; from Michael Benson’s Cosmigraphics: Picturing Space Through Time, to be published by Abrams in November. ‘Along with French illustrator and astronomer Lucien Rudaux,’ Benson writes, Bonestell ‘pioneered a genre of speculative solar system landscapes sometimes called “space art”…We now know that Titan’s atmosphere is so thick, a view like this would be impossible, which takes nothing away from the power of Bonestell’s achievement.’

Rien ne dure que le provisoire.
—French proverb

The current misuse of scientific findings can be tragic. At 3:32 AM on April 6, 2009, a devastating earthquake that measured 6.3 on the Richter scale rocked the medieval Italian town of L’Aquila, killing about three hundred people and leveling many buildings. Residents had experienced about thirty small tremors in the preceding three months and had become very apprehensive. A week before the quake, a meeting that included leading seismologists and public officials was held to evaluate the situation. According to seismologists, it is impossible to know with certainty whether small quakes are foreshocks of a larger tremor.

One of the expert seismologists at the assessment meeting, Enzo Boschi, drew attention to this scientific uncertainty and noted that while a large earthquake was “unlikely,” the possibility could not be excluded. Despite this, when the vice-director of Italy’s civil protection agency, Bernardo De Bernardinis, emerged from the meeting, he assured locals that the tremors were routine and simply symptomatic of the earth releasing pent-up energy.

When the jolt of a quake woke up his two teenage children, a local resident, Giustino Parisse, trusting the report he had heard earlier on TV, calmed them down and put them back to sleep. Later that night, his house was leveled, killing both his children. Parisse and a group of residents sued the scientists and the local public officials for failing to warn them. The failure of these estimates of risk by the National Commission for the Forecast and Prevention of Major Risks led to those expert scientists being convicted of providing “inexact, incomplete and contradictory” information about the danger; they were each given six-year jail terms in October 2012.

Closer to home, on June 12, 2012, the North Carolina Senate passed a law that effectively prohibited the use of any data about sea-level changes in determining coastal policy in the state. The law was drafted in response to a report from the state-appointed North Carolina Coastal Resources Commission’s expert scientists, who advised that sea-level rises of about thirty-nine inches could be expected in the next hundred years, putting coastal communities in the Outer Banks region at grave risk. The law, formulated to regulate development permits, discounts these projections and prescribes a new method—rejected by most qualified scientists—for calculating sea-level rises.

There is, on the contrary, near-universal agreement among climate scientists that the sea will probably rise a good meter or more within the next hundred years, potentially submerging all low-lying coastal areas around the globe. But supporters of the legislation, developers concerned about the economic consequences of basing regulations on the predicted sea-level rise, found a novel way to circumvent the scientific assessment: by simply making the use of current measurements illegal.

The law now forbids the use of any new data, allowing only historical data in estimating sea-level rise when awarding permits for the next four years. According to the law, measurements taken in 1900 will form the baseline from which only linear extrapolations to the present day will be allowed. Nature, though, appears to be mocking North Carolina lawmakers. Two weeks after the law’s passage, a new study of tide gauge records revealed that the fastest sea-level rises in North America since 1980 have been along the coast from North Carolina to Massachusetts.

What’s depressing about these two cases is the misconception of science they reflect. Much of the public clearly does not know what to make of scientific research and has a poor understanding of how findings are reached, especially when it comes to assessing future risk. This seems to be true in all countries, but it is particularly striking in the United States, where so much of today’s scientific research originates. This paradox is worth exploring.

Polls in the US regularly show nearly unanimous support for improving the quality of science education, which is perceived as being important to the country’s ability to compete globally. A poll by the Pew Research Center in 2009 found that most Americans—84 percent—saw science as a positive force in society. Yet it also found that while people under thirty were more science-savvy than those over sixty-five, all age groups had a rather flimsy grasp of simple scientific concepts, even those taught in most public high schools, such as gravity or the structure of the atom.


A recent survey by the National Science Foundation found that a quarter of Americans did not know whether the earth moves around the sun or vice versa. Meanwhile, 33 percent of Americans deny the reality of evolution and still believe that humans and the rest of the animal kingdom have always existed in their present form. Americans have extremely high expectations of and confidence in science and technology and think of them as a national priority—yet they also distrust their results. How to explain this?

One view is that Americans are simply ignorant and lack an understanding of basic science and mathematics. The assumption is that if these skills were improved, the public would become more appreciative of science. Yet recent research by Professor Dan Kahan at Yale suggests that the rejection of science is only weakly correlated with scientific literacy and numeracy. His data find a much higher correlation with Americans’ general political and cultural outlook. Kahan’s research indicates that, even controlling for differences in math and science skills, people with different cultural values—individualists compared with egalitarians, for example—disagree sharply about how serious a threat climate change is. Kahan’s results also show that people who identify with the Tea Party have a slightly higher level of science comprehension (it’s a tiny effect but it is there) than the average American, according to a nationally representative sample of US adults.

Illuminating as it is, though, Kahan’s research does not address the degree to which people understand the scientific method—not whether they know what protons or logarithms are, but whether they have an adequate sense of what a scientific theory is, how evidence for it is collected and evaluated, how uncertainty (which is inevitable) is measured, and how one theory can displace another, either by offering a more economical, elegant, honed, and general explanation of phenomena or, in rare cases, by clearly falsifying it. The L’Aquila case shows that many people expect science to provide 100 percent certainty, while the North Carolina case reveals how any residual uncertainty can be seized upon to dismiss a theory as false, or as no better than any other.

In a word, the general public has trouble understanding the provisionality of science. Provisionality refers to the state of knowledge at a given time. Newton’s laws of gravity, which we all learn in school, were once thought to be complete and comprehensive. Now we know that while those laws offer an accurate understanding of how fast an apple falls from a tree or how friction helps us take a curve in the road, they are inadequate to describe the motion of subatomic particles or the flight of satellites in space. For these we needed Einstein’s new conceptions.

Take, for example, the Global Positioning System (GPS) that many of us use when driving. GPS is based on a fleet of at least twenty-four satellites orbiting the earth, each carrying a precise atomic clock. A GPS receiver on an iPhone picks up radio signals from at least four of the satellites overhead and computes the user’s position to within a few meters. As predicted by Einstein’s theory of special relativity, the satellite clocks, circling at about 14,000 kilometers per hour, tick more slowly than clocks on earth, losing about seven microseconds per day. However, since the clocks are some 20,000 kilometers above the earth’s surface, and since, according to Einstein’s general theory of relativity, gravity curves space and time, a clock orbiting at that height should tick slightly faster. The combination of the two effects is a net speeding up, so that a GPS satellite clock runs faster than one on earth by about thirty-eight microseconds per day. To achieve navigational accuracy, this net speeding up predicted by Einstein must be compensated for.
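The arithmetic behind these figures is simple enough to sketch. The short calculation below is my own illustration, not anything drawn from the article or the books under review; it uses the standard textbook expressions for the two relativistic effects and typical values for the GPS orbit, and the names and numbers are approximate.

```python
import math

# Physical constants and orbital parameters (SI units); standard textbook values.
C = 299_792_458.0          # speed of light, m/s
GM_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean radius of Earth, m
SECONDS_PER_DAY = 86_400.0

# GPS satellites orbit roughly 20,200 km above the surface.
r_orbit = R_EARTH + 2.02e7               # orbital radius, m
v_orbit = math.sqrt(GM_EARTH / r_orbit)  # circular orbital speed, ~3.9 km/s (~14,000 km/h)

# Special relativity: a moving clock runs slow by a fraction of roughly v^2 / (2 c^2).
sr_shift = -v_orbit**2 / (2 * C**2)

# General relativity: a clock higher in Earth's gravity well runs fast by the
# difference in gravitational potential divided by c^2.
gr_shift = (GM_EARTH / R_EARTH - GM_EARTH / r_orbit) / C**2

net_shift = sr_shift + gr_shift          # net fractional rate difference


def microseconds_per_day(fractional_rate):
    """Convert a fractional clock-rate difference into microseconds gained or lost per day."""
    return fractional_rate * SECONDS_PER_DAY * 1e6


print(f"slowing from orbital speed : {microseconds_per_day(sr_shift):+.1f} microseconds/day")
print(f"speedup from weaker gravity: {microseconds_per_day(gr_shift):+.1f} microseconds/day")
print(f"net offset                 : {microseconds_per_day(net_shift):+.1f} microseconds/day")
# Prints roughly -7, +46, and +38 microseconds per day, in line with the figures above.
```

Left uncorrected, an offset of about thirty-eight microseconds a day would translate into navigational errors accumulating at roughly ten kilometers per day, which is why the satellite clocks are adjusted to compensate.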

Einstein’s theories did not refute Newton’s; they simply absorbed them into a more comprehensive theory of gravity and motion. Newton’s theory has its place and it offers an adequate and accurate description, albeit in a limited sphere. As Einstein himself once put it, “The most beautiful fate of a physical theory is to point the way to the establishment of a more inclusive theory, in which it lives as a limiting case.” It is this continuously evolving nature of knowledge that makes science always provisional.

How could the public be better educated about the nature of scientific inquiry? Three recent books, read together, point us in a new direction. These books lay bare the provisionality of science and may, paradoxically, actually help us find a way to address rampant denialism. Rather than focus single-mindedly on the technical aspects of science or the need to improve basic skills, they focus our attention on the psychology of science—the drives that inspire us to inquire into nature, and the limits that our minds necessarily impose on our knowledge.


In Curiosity: How Science Became Interested in Everything, the science writer Philip Ball, a former editor at Nature, reveals how curiosity, combined with wonder, has driven the scientific enterprise since the seventeenth century, and how the ever-transmuting nature of curiosity shifted the practice of science to the highly specialized and impersonal activity that it is perceived as today. Ball traces the intellectual history of curiosity, from the Renaissance cabinets of curiosity to the Large Hadron Collider at CERN, which harks back to a view of nature as holding secrets that must be teased out with experimental apparatuses. He shows how curiosity went from being seen as a vice in medieval Catholic Europe, to a shallow form of inquisitiveness that inspired learned societies like the London philosophical club, and then, in the latter half of the seventeenth century, was recast as a virtue. Changes in the notion of curiosity from vice to virtue, he argues, have gone hand in hand with the development of empirical methods in science.

Ball provides one of the clearest explications of the provisional nature of science by tracing the development of the currently accepted germ theory of disease. He shows how the invention of the microscope, which opened up an entirely new, formerly invisible realm, first led to the idea of “animalcules” (developed by Anton van Leeuwenhoek, Robert Boyle, and Robert Hooke), which was refined by Louis Pasteur and others in the nineteenth century, leading to our present view of pathogens as the agents of disease. Ball traces the entire process from the early proposition and its subsequent refinements, showing clearly what provisionality means—a slow and gradual honing and growing sophistication of our understanding, driven by accumulating data enabled by the invention of ever-newer instruments.


This does not mean that theories are mere placeholders waiting to be overthrown (in fact, that happens extremely rarely), but rather that as empirical evidence accumulates they aim at a more comprehensive explanation that subsumes earlier views. Although Ball’s interesting case studies extend only up to the nineteenth century, he successfully demolishes the fallacy that provisionality implies that any theory is as good as another, and illuminates how our best current understanding gets gradually altered.

Nonetheless, Ball laments that the scientific enterprise is often seen as a large and dispassionate machine in which objective scientists seek cold facts from experiments in an impersonal and flawless process of discovery. This leaves out the excitement, awe, and wonder that motivate many scientists and that only appear in popularizations of scientific discoveries today. “We first emancipated curiosity at the expense of wonder, and then re-admitted wonder to take care of public relations.” This may help explain the public’s contradictory feelings about contemporary science: an attraction to the romance of discovery and a distrust of provisional scientific results.

Besides curiosity and wonder, two other nonrational forces that condition science are serendipity and ignorance. In Ignorance: How It Drives Science, Stuart Firestein goes so far as to claim that ignorance is the main force driving scientific pursuit. Firestein, a popular professor of neurobiology at Columbia, admits at the outset that he uses “the word ignorance at least in part to be intentionally provocative” and clarifies that for him it denotes a “communal gap in knowledge.” He describes clearly how scientists continually uncover new facts that confront them with the extent of their ignorance, and how they successfully grapple with uncertainty in their daily research work. With ample examples from neuroscience he describes the limits of what we currently know, what the uncertainties are, and why they arise especially in the study of complex systems like the brain, the olfactory system, human vision, climate change, and earthquakes.

Especially valuable is Firestein’s ability to capture how science gets done in fits and starts. One example is the discovery of thermophiles, microorganisms that can survive at very high temperatures. Originally regarded as a mere oddity of nature, they yielded the heat-stable enzymes that made possible the polymerase chain reaction (PCR) technique that is fundamental to most of today’s biotechnology experiments. He demystifies the day-to-day activities of research scientists across a variety of disciplines with case studies illustrating how breakthroughs in understanding, however humble or grand, are essentially unforeseeable even to a seasoned mind. Another example is the serendipitous discovery of the cosmic microwave background radiation—the hiss of the Big Bang—which came about as a by-product of building a radio telescope. Yet serendipity is not entirely serendipitous; it depends on curiosity and keeping an open mind, since we are “not smart enough to predict how things should be” and just need to explore.

A slightly different take on how science works comes from the astrophysicist Mario Livio in his new book Brilliant Blunders.1 Like Firestein, Livio debunks the idea that science is a methodical enterprise that produces fixed truths, and shows how dependent it is on wrong turns and dead ends in research. But by “scientific blunders” Livio means serious conceptual errors that might have held back science. Through exquisitely rendered case studies, he shows how even towering intellectual giants of science—Charles Darwin, Lord Kelvin, Linus Pauling, Fred Hoyle, and Albert Einstein—made grave errors in reasoning. He skillfully peels back the emotional layers of the scientific enterprise and discusses its social setting, showing how eminent researchers can be held captive by their entrenched intuitions and refuse to accept new ideas until they are faced with overwhelming empirical evidence contradicting their views. Even geniuses, it turns out, have trouble recognizing the inherent provisionality of science.

Yet throughout the book Livio stresses that blunders are not only inevitable but an essential part of scientific progress—and have, in fact, led to some of the most impressive intellectual breakthroughs. Take, for instance, Einstein’s firm belief in a static universe, a belief that was motivated in part by purely aesthetic feelings. Applying his general theory of relativity to the universe in 1917, he suggested that a homogeneous, static, and spatially curved model was the relevant solution to the equations governing our universe. His hypothesis, however, had a fatal flaw: in the absence of any other forces Einstein’s universe would simply collapse under gravity’s power. In order to retain the elegance and stability that a static universe provides, he went so far as to add an extraneous term called the “cosmological constant” into his mathematical equations.
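The underlying logic can be put in standard textbook form; the summary that follows is mine, not Livio’s. For a homogeneous universe filled with pressureless matter of density ρ, general relativity gives an acceleration equation for the cosmic scale factor a(t), to which Einstein added his new term Λ:

```latex
% Acceleration of the cosmic scale factor a(t) for pressureless matter of
% density \rho, with Einstein's added cosmological-constant term \Lambda:
\[
  \frac{\ddot{a}}{a} \,=\, -\frac{4\pi G}{3}\,\rho \,+\, \frac{\Lambda c^{2}}{3}
\]
% Without the \Lambda term the right-hand side is always negative, so a
% static universe is impossible: gravity alone makes it contract.
% A static solution, with \ddot{a} = 0, exists only for the finely tuned value
\[
  \Lambda \,=\, \frac{4\pi G\,\rho}{c^{2}}
\]
% and even that balance is unstable: the slightest perturbation tips the
% model into runaway expansion or collapse.
```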

After remaining firm for more than a decade, Einstein finally conceded the validity of the theory of an expanding universe in 1931, but only in the face of overwhelming data from the astronomer Edwin Hubble. Hubble had found that galaxies were receding from us with speeds proportional to their distance, thereby conclusively ruling out a static universe model. In collaboration with another physicist, Willem de Sitter, Einstein then went on to propose, in 1932, an eternally expanding universe that no longer required the artifice of the cosmological constant.

All three of these books offer a ringside view of how science actually works, demystifying it in the process. This is all to the good, since demystification might actually reduce the misunderstanding and distrust of science.

In his disturbing book Denialism, the New Yorker writer Michael Specter made the persuasive claim that the accelerating pace of change wrought by scientific and technological progress, and the resultant sense of destabilization, have evoked fear in the public at large.2 Having to contend with more complicated truths, coupled with the fact that scientific progress has brought peril as well—Chernobyl, the thalidomide disaster, mad cow disease—has exacerbated the widespread distrust of science. The instinct is to turn away from a complex reality and yearn for a simpler life.

How can that be addressed? The biologist Jon D. Miller of the University of Michigan has been advocating a new standard of “civic scientific literacy,” by which he means a basic level of scientific understanding that would be necessary to make sense of public policy issues involving science or technology. He proposes the teaching of scientific concepts rather than the retention of information. Since the old model is ill-suited to the current pace of scientific and technological progress, the public needs to learn how to reason in the evidence-based manner that is central to science.

Demystifying the practice of science and describing how scientists themselves deal with uncertainty and provisionality could also help, by humanizing the enterprise, although conveying the information necessary to understand scientific problems and procedures accurately is never easy. Still, acquainting the public with the power and limits of curiosity, and with how scientists are persuaded to accept new ideas in the face of accumulating evidence, as Livio does, could prove useful. It could lessen the disorienting shock when the “best current understanding” of a certain phenomenon changes. The best way to shore up respect for science may be not to defend it as a fortress but to show its exciting provisionality—while maintaining that it’s the best thing we’ve got.