“If disease is an expression of individual life under unfavorable conditions, then epidemics must be indicators of mass disturbances in mass life.”

—Rudolph Virchow

1.

Begin with a thought experiment: What might it take to produce a virus with the potential to eliminate Homo sapiens? For a start, it should be one with which we are unfamiliar; our physical naiveté ensures only perfunctory resistance to virulent infection. To preserve the element of surprise, the virus must cross to humans from another species. Airborne transmission would encourage such a leap: a cough or simply sharing a breath, especially if only a tiny amount of virus were needed to establish a human foothold. Once inside us, the virus must multiply with extraordinary rapidity, producing catastrophic and irreversible damage to all major organs: liver, heart, lungs, brain, kidneys, and gut. During this phase of fertile proliferation, subtle but significant changes to its structure (mutation) would enable the virus to evade any rear-guard attempt by our immune system to reestablish control. To give the virus the ultimate upper hand, we should possess neither drug nor vaccine to challenge the infection. Finally, we should be denied the means to restrain viral spread, an easy condition to fulfill if one is ignorant of where it normally (and peacefully) resides.1

If this wish list of virulence sounds improbable, Richard Preston will quickly extinguish your skepticism. With almost unseemly relish, he describes the dramatic emergence from Africa of two viruses—Marburg and Ebola—that fit our “perfect” virus rather too well. For instance,

Ebola…triggers a creeping, spotty necrosis that spreads through all the internal organs. The liver bulges up and turns yellow, begins to liquefy, and then it cracks apart…. The kidneys become jammed with blood clots and dead cells, and cease functioning. As the kidneys fail, the blood becomes toxic with urine. The spleen turns into a single huge, hard blood clot the size of a baseball. The intestines may fill up completely with blood. The lining of the gut dies and sloughs off into the bowels and is defecated along with large amounts of blood.

The strain of Ebola found in Sudan, and first discovered in 1976, is twice as lethal as Marburg, killing half of those it infects. The Zaire strain of Ebola is nearly twice as lethal as its Sudanese counterpart. In The Coming Plague, Laurie Garrett recounts the details of these discoveries. In the Yambuku Mission Hospital in northern Zaire, Belgian nuns gave out injections of antimalarial drugs with unsterilized needles. Thirteen days after Mabalo Lokela, a schoolteacher recently returned from vacation, received such an injection, he became the first known fatality from Ebola Zaire. Eighteen of his family members and friends perished soon after. The virus proceeded to spread through the hospital and surrounding villages. Thirty-eight of the Yambuku staff died, including all of the missionary nurses. A single needle at Yambuku had magnified this chance tragedy into a devastating epidemic.

The first Western physician to be notified of this public health emergency by Zaire’s Minister of Health was Dr. William Close (whose daughter is the actress Glenn Close). He immediately informed the US Centers for Disease Control (CDC) in Atlanta. Samples of blood and tissue from the Ebola victims were distributed to laboratories throughout the world for analysis. At the University of Anvers in Antwerp, a youthful Peter Piot (who was recently appointed director of the new and streamlined United Nations Program on HIV/AIDS2) and his colleagues discovered an unknown virus in the Yambuku samples. With an irony that did not go unnoticed, the microscopic image of the virus assumed the appearance of a “?”.

A small team of virologists, including Piot, was sent to Zaire to investigate. They found that villagers had soon recognized the highly contagious nature of Ebola. Families had been quarantined, the dead had been buried far away from villages, and strict roadblocks had been placed between settlements. The international team of scientists visited 34,000 families in more than 500 villages. Of 358 confirmed episodes of infection, there were 325 deaths. Yet despite intense study, the origin of Ebola remained a mystery. Where was it hiding? Most likely in a nonhuman host, perhaps a spider, a bat, or a monkey. But as a World Health Organization report later noted, “As in the case of Marburg virus, the source of Ebola virus is completely unknown beyond the simple fact that it is African in origin.”

Certainly these events were alarming, but they were also remote. To most people living in the northern hemisphere, the risks that these new viruses posed seemed distant. But an unknown strain of Ebola turned up in Reston, Virginia, a few miles outside Washington, DC, in 1989. One hundred crab-eating monkeys had arrived at the Reston Primate Quarantine Unit from the coastal forests of the Philippines on October 4. Two monkeys were dead on arrival and a further score died during the next few days. Such evidence of the unpredictable and far-flung meanderings of Ebola caused panic.


The laboratory’s veterinarian suspected a common monkey virus, but soon, and contrary to all expectations, the illness spread to the unit’s non-Philippine monkeys. Samples of monkey tissue were sent at once to the US Army Medical Research Institute of Infectious Diseases for examination. When Thomas Geisbert looked into his microscope, he saw the tell-tale “?”-like virus. To the horror of the American team, their tests proved positive for Ebola Zaire. The investigation at Reston quickly moved from an intriguing diagnostic treasure hunt to a potentially volatile political crisis. The quarantine unit was a hot zone containing an organism classified as “biosafety level four”: one for which neither cure nor vaccine existed. Ebola Zaire was now only a few yards away from the Beltway. While the military was being alerted, more Reston monkeys were dying in rooms well away from those housing the Philippine shipment, strongly suggesting airborne spread. The biocontainment mission began.3

At this point one might wonder what the press was doing while these events were taking place. The genuine sense of panic and anxiety that surrounded Reston is difficult to convey, even after the passage of only a few years. While dozens of monkeys were being slaughtered as a precaution, most military scientists believed that they were dealing with the world’s most dangerous virus, Ebola Zaire. What was their response, given that these infected animals had been in the US for almost eight weeks? Were nearby residents evacuated? No. Was a surveillance center opened to catch early cases of infection? No.

Instead, the military instituted a planned policy of disinformation, even though two men had already been taken ill, diagnosis unknown. One man was thought likely to have Ebola infection. He was taken to a local community hospital. By complete contrast with the Zaire government’s policy of openhanded collaboration, the American response was to mislead. To stave off unwanted attention from the mass media, children were allowed to play freely around the Reston unit; press inquiries were complacently deflected; and local communities and hospitals were recklessly exposed to danger by medical authorities, who were portrayed as heroic when the story of the virus was finally released.

By a stroke of unbelievable and unexpected good fortune, the Reston strain of Ebola affected only monkeys. Four of the animal workers in the quarantine unit developed symptoms of viral infection, and three could have acquired the virus only through the air. If that virus had proven to be Ebola Zaire, the consequences for Reston—and Washington, DC—would have been unimaginable.

Richard Preston and Laurie Garrett have both collected an impressive amount of evidence proving the global importance of newly emerging infections, of which Ebola and Marburg are, perhaps, the most dramatic instances. Preston adopts a narrative style of reportage which does not sacrifice scientific accuracy. His tale began as an article published in The New Yorker in 1992, and it benefits from expansion. By rooting the events of Reston in the lives of its central characters, Preston is able to convey the startling fears and uncertainties that these scientists felt as the crisis unfolded. He succeeds in translating the sober facts of research literature, which usually provide post hoc rationalizations and justifications for often hazardous decision-making, into a perceptive account of an emergency.

In describing the discovery of Marburg he has much to say about the lives (and deaths) of two victims who were infected by the virus in 1980 and 1987. Remarkably, a careful exploration of their histories suggested that both might have been infected in a single cave near Mount Elgon on the Uganda-Kenya border. Such odd events wrap further layers of mystery around the origin of these novel viruses. Garrett applies a more conventional, though no less persuasive, journalistic method to her subject, explaining step by step the discovery and evolution of particular viral and bacterial diseases, and considering the various possible ways of controlling them.

Still, in their praise of the physicians and scientists who work on these infections, Preston and Garrett ignore one striking truth: the single occasion of a potentially species-threatening event in the US produced a disquieting response, one that exposed remarkable passivity and arrogance in the American research community. Little has happened since to suggest that the same mistake would not be repeated. The rule that secrecy equals safety still holds in the government’s public health service.

2.

The increasingly vociferous message about infectious disease is one of impending apocalypse. Here Garrett brilliantly develops her theme that rapidly increasing dangers are being ignored. Her investigations have taken over a decade to complete, and her findings are meticulously discussed and distilled. Her book is a manifesto for those who see our biological future from a somewhat pessimistic perspective. According to Garrett,


While the human race battles itself, fighting over ever more crowded turf and scarcer resources, the advantage moves to the microbes’ court. They are our predators and they will be victorious if we…do not learn how to live in a rational global village that affords the microbes few opportunities.

But what, one might ask, of cataclysmic epidemics of the past? If the ancient Athenians had understood the meaning of microbes, I suspect that they would have shared Garrett’s gloomy view. Of the plague that struck Athens in 430 BC, Thucydides wrote:

Those with naturally strong constitutions were no better able than the weak to resist the disease, which carried away all alike, even those who were treated and dieted with the greatest care. The most terrible thing of all was the despair into which people fell when they realized that they had caught the plague; for they would immediately adopt an attitude of utter hopelessness, and, by giving in in this way, would lose their powers of resistance. Terrible, too, was the sight of people dying like sheep through having caught the disease as a result of nursing others. This indeed caused more deaths than anything else.4

The Plague of Athens originated in Ethiopia and spread rapidly to Greece. Victims were overtaken by violent fever and declined toward death within a week. By the end of the fourth year of plague, one quarter of the population had perished. The cause of the plague remains a matter of speculation, with smallpox, measles, and typhus the leading candidates. Its origin in Africa—indeed, in a country bordering Sudan—raises the additional possibility of a Marburg- or Ebola-like agent. The unusual velocity of physical disintegration, together with symptoms and mortality rates not dissimilar to those of Marburg, adds weight to this possibility.

Our most alarming notions of plague come from the two episodes of Black Death which cut through Europe from the Byzantine era onward. The Plague of Justinian arrived in Europe in 547 and continued to recur sporadically for the next two hundred years. As much as one quarter of the Roman Empire might have been wiped out during this epidemic. A second wave persisted for some four hundred years, beginning near the Caspian Sea in 1346.

Black Death, or bubonic plague, is caused by a bacterium—Yersinia pestis (named after Alexandre Yersin, a student of Louis Pasteur)—transmitted to human beings by human fleas and by fleas from the black rat, Rattus rattus. From eastern Mongolia, plague spread to Constantinople and then to Europe along trade routes. Over half of those infected with bubonic plague died within ten days; those who contracted the pneumonic form of Yersinia—spread by airborne bacteria—had no chance of survival. The response of the Christian Church was to claim plague as God’s punishment and to accept death as unavoidable penance. (There is a familiar echo here in the pronouncements about AIDS from the likes of Jesse Helms.) By contrast, the response of most people was summed up in Guy de Chauliac’s suggestion to “flee quickly, go far, and come back slowly,” advice that was heeded most eagerly of all by physicians.

The few doctors who remained conjured up some fanciful theories about the sources of the plague. When Philip VI of France ordered his chief physicians to investigate them, the message came back that the conjunction of Saturn, Jupiter, and Mars at one PM on March 20, 1345, was the indubitable explanation. Endemic plague also led to some startling instances of improvisation. Perhaps the first recorded episode of biological warfare took place in 1346, when furious Mongolian soldiers hurled plague-ridden cadavers over the walls of a neighboring Italian trading post.

Despite the lack of any germ theory of disease (Galenic humors were still the fashion), Europeans understood the concept of plague’s transmissibility only too well. The Archbishop of Milan noted that “contagion can occur by contact or by breath.” Primitive anti-contagion policies were widely introduced. Cities were quarantined (the word derives from the isolation of incoming vessels for forty days in port), the sick were separated from the healthy in pesthouses (lazarettos), trading was tightly regulated, and burials were carefully monitored.

Human losses remained calamitous. Italy had one of the most advanced public health systems in Europe—each city had a Magistrate for Health—and the effects of plague were recorded carefully. Between 1600 and 1650 the population of Italy actually fell, from 13.1 million to 11.4 million. In Venice, an average of six hundred bodies were collected daily on barges. More than 50,000 Venetians died in the plague of 1630–1631, leaving a population smaller than at any time during the fifteenth century. The Baroque masterpiece that presides over the entrance to the Grand Canal, Santa Maria della Salute, was commissioned in 1630 and dedicated to health and salvation. On November 21 each year Venetians still cross a bridge of boats to celebrate mass and commemorate their deliverance from pestilence. It was Florence, however, that bore the brunt of the earlier, fourteenth-century epidemic. Giovanni Boccaccio wrote of the 1348 plague that

many breathed their last in the open street, whilst other many, for all they died in their houses, made it known to the neighbors that they were dead rather by the stench of their rotting bodies than otherwise…It is believed for certain that upward of a hundred thousand human beings perished within the walls of the city of Florence, which, peradventure, before the advent of that death-dealing calamity, had not been believed to hold so many….

During this entire period, the practice of medicine made little headway. Gastaldi—a Roman cardinal responsible for his local health board in 1656–1657—wryly commented that “the writings of doctors on the cure of plague produce much smoke and offer little light. Medical remedies against the plague have been proven by practice to be of no use and at times dangerous.” Despite huge bribes, prosperous physicians refused to work in the lazarettos. When they did submit to the public authorities, they offered advice at a conveniently discreet distance: the surgeon would call out the patient’s history from a window of the pesthouse, while the physician would shout back the treatment.

If epidemics are ancient phenomena, why the growing sense that there is an unprecedented threat now? Jonathan Mann, a former director of the WHO’s Global Program on AIDS, answers this question in his preface to Garrett’s account:

The world has rapidly become much more vulnerable to the eruption and, most critically, to the widespread and even global spread of both new and old infectious diseases. This new and heightened vulnerability is not mysterious. The dramatic increases in worldwide movement of people, goods, and ideas is the driving force behind the globalization of disease. For not only do people travel increasingly, but they travel much more rapidly, and go to many more places than ever before.

The threat from viruses has received the greatest attention, with HIV rightly demanding our deepest concern. But there are lesser dangers. Hantavirus arrived in the Four Corners area of the southwestern United States (where Colorado, Utah, Arizona, and New Mexico converge) in May 1993. This rodent-borne virus has infected almost one hundred people in the US, over half of whom have died. There is no definitive treatment. Rabies, the most lethal virus affecting humans (death almost always follows soon after infection takes hold), has recently become widely prevalent among wild raccoons in the northeastern United States.5 The WHO records about 50,000 cases of human rabies annually, with 30,000 deaths; there have been only twenty-one reported cases in the US since 1980. Infected insectivorous bats also pose a serious threat. The first rabid bat bite in the US was described in 1953. Since then, over five hundred cases of exposure to rabid bats are recorded annually.6 Nine of the twenty-one US deaths are thought to be due to bat-associated virus. The resurgence of rabies in a largely urban and suburban animal population raises new anxieties that most of the public (and many physicians) are unaware of.

Ordinary bacteria also deserve scrutiny. The specter of tuberculosis continues to cast a somber shadow over global health. Not long ago many experts thought that TB would be controlled by antibiotics, but the latest reports estimate that the decade of the 1990s will bring 90 million new cases of tuberculosis and 30 million deaths. The WHO declared tuberculosis an international health emergency in 1993. The most dramatic increases in incidence have been in Africa, Southeast Asia, the western Pacific, and the eastern Mediterranean. Ninety-five percent of cases occur in developing nations, where tuberculosis is now the major cause of disease. The economic effects on these low-income nations are devastating: 80 percent of cases are among working people between fifteen and fifty-nine years old. In these countries HIV is endemic, and it is largely responsible for the huge burden of tuberculous illness, since the breakdown of the immune system leaves those infected vulnerable to TB. Such synergy between infectious agents is an especially worrisome feature of emerging modern infections.

More worrying still, particularly in developed countries, is the emergence of drug-resistant tuberculosis. In the US, almost three hundred cases of multiple drug-resistant tuberculosis have been diagnosed in hospitals and prisons, mostly in patients with HIV infection. With the incidence of HIV likely to double or even triple in non-industrialized regions of the world, the global incidence of resistant strains of tuberculosis is also likely to rise substantially. Over two thirds of those with drug-resistant infection die. In addition to HIV and drug resistance, the third stimulus to recent tuberculosis outbreaks has been imported infection. In 1991, over two thirds of all cases of tuberculosis in the US were among ethnic minority groups.7 The lack of adequate access to health care for these groups raises the risk not only of widespread transmission but also of inadequate treatment once the disease is diagnosed, itself a further factor in producing drug-resistant strains.8 The problem of drug resistance also affects many other bacterial infections.9

The same story applies to fungi. A growing diversity of pathogenic fungi, emerging resistance to the drugs usually prescribed to treat them, and a lack of new antifungal agents—all these present dangers that parallel those for viruses and bacteria. The fungal threat is greatest in hospital settings. When CDC experts reviewed their figures from 1980 to 1990, they found that hospital-acquired fungal infections had doubled.10

3.

Why have infectious diseases emerged again as such a major threat to the human species? Part of the explanation has come from Lee Reichman, the director of the National Tuberculosis Center at New Jersey Medical School, who, as Garrett points out, was one of the first to sound the alarm about a new epidemic of tuberculosis. He has described both medical and public complacency as a U-shaped curve of concern:11 initial successes in improving public health gave way rapidly to a rhetoric of false hope, and the relaxation in surveillance of common diseases, such as tuberculosis, permitted their phenomenal resurgence. The decline in US tuberculosis rates between 1950 and 1980 eased the pressure on those responsible for funding tuberculosis elimination programs, with damaging consequences. We may already be moving into the trough of another U-shaped curve.

The 1950s notion of a “health transition” to a new state of physical well-being was based on the almost continual success with which scientists developed new drugs and vaccines against common infectious agents. The eradication of microbial disease was widely foreseen; hyperbole was piled on hyperbole by organizations that should have known better. For instance, the WHO announced that a time was soon approaching when malaria would be “no longer of major importance.” Such a statement would now seem laughable if it were not so tragically wrongheaded. Malaria currently affects over 300 million people worldwide; many of them suffer from strains of the malarial parasite Plasmodium that have developed resistance to antimalarial drugs. One and a half million people die from the disease annually, of whom two thirds are young children. Yet twenty years ago, with the gradual eradication of smallpox, such optimism did not seem far-fetched. Only when HIV emerged into public consciousness in 1981 was the “health transition” finally exposed as an embarrassing myth.

The history of HIV points to the second reason for a recrudescence of infection. Richard Preston claims that the paving of the Kinshasa Highway, which traverses sub-Saharan Africa, was one of the most significant events of the twentieth century. This transportation artery allowed HIV to be swept out of central Africa and distributed worldwide.12 Our continued disruption and pollution of ancient ecosystems have led to the rapid displacement of unfamiliar organisms into more immediate human environments.

The huge population pressure that we face—we are likely to double our numbers between now and 2050—has produced our own global hot zone. The world’s population currently stands at 5.6 billion, but Ismail Serageldin, a vice-president of the World Bank, estimates that it will stabilize somewhere between 8.5 and 12 billion during the next century. Ninety-five percent of these newcomers will live in the world’s poorest countries. The convergence of these demographic trends, atmospheric warming, widespread chemical pollution, environmental destruction, the consequences of war (famine and refugee populations approached 18 million by the end of 1992), and our own vastly increased mobility has created a boiling broth of infection.

Garrett quotes Rita Colwell of the University of Maryland, who estimates that we have identified only 1 percent of up to one million types of bacteria and only 4 percent of five thousand species of virus. A global redistribution of these organisms among biologically vulnerable animal populations, including Homo sapiens, that have little herd immunity to new infectious agents will have profound effects. In truth, despite the recent awakening of many scientists to this threat, the coordinated efforts that would be needed to control these organisms, together with the vast regulatory apparatus and enforcement strategies required to quell persistent ecological imbalance, seem almost impossible to achieve. The threat is just too distant to be real, the world too fragmented to act as the “rational global village” Ms. Garrett hopes will come into being.

In any case, denial remains common. At a world gathering of over twenty thousand gastroenterologists in 1994, three plenary sessions were devoted to infectious diarrheal disease, which accounts for more than nine thousand deaths per day worldwide. Despite such prominence at an international gathering, fewer than thirty people attended each session, suggesting how little concern these physicians and scientists feel about the most important public health issue facing their specialty.

Nonindustrialized countries are particularly ill-prepared to deal with these new (and old) infectious threats. Political instability, economic chaos, huge foreign debts, poorly developed health-care systems, and the persistence of war have all led to what Garrett calls a “paradigm of perpetual poverty.” Onto this background has been projected a process of relentless urbanization. A British scientist, John Cairns, has called cities “the graveyards of mankind,” and with ample reason. Continuous rural depopulation has increased the number of city dwellers from 275 million in 1955 to 750 million in 1985. This figure is projected to increase to 1.3 billion by the year 2000. The number of megacities with populations greater than ten million will rise from two in 1950 (London and New York) to an estimated twenty-four in 2000, including Bombay, Calcutta, Delhi, Madras, Karachi, Dhaka, and Bangkok. Urban growth is now often a sign of economic and social regress. Hand in hand with urbanization have come epidemics of diseases that heretofore were usually confined to rural areas, such as tapeworms, roundworms, schistosomiasis, trypanosomiasis, and dengue.

These social changes are also affecting industrialized countries. Poverty, unemployment, malnutrition, drug use, homelessness, and lack of access to health care have all led to a process of “third-worldization.” The health effects are striking. In San Francisco, a 1991 study found that 11 percent of the homeless were HIV positive. Immunization campaigns have stumbled (new cases of measles rose by half from 1989 to 1990), and multiple drug-resistant infections are common. For example, each year in the US, almost seventy thousand adults die from three illnesses that are largely preventable by vaccines: influenza, hepatitis B, and pneumococcal infection. The US public health system is falling apart. In 1988, a committee of the US Institute of Medicine concluded that “we have let down our public health guard as a nation and the health of the public is unnecessarily threatened as a result.”

What can be done? If the coming plagues are embedded in deep-rooted sociocultural trends and distorted economic development, the inevitable answer is, not much. This conclusion may sound defeatist, but it is implicit in the warning of Jonathan Mann quoted by Laurie Garrett:

A worldwide “early-warning system” is needed to detect quickly the eruption of new diseases or the unusual spread of old diseases. Without such a system, operating at a truly global level, we are essentially defenseless, relying on good luck to protect us.

Luck? This sounds as if we are back in the court of Philip VI, relying on celestial prophecy. Laurie Garrett devotes twenty-nine pages of her 622-page text to possible solutions. And rather thin they are too. The experience at Reston proved how unprepared the US was for a potentially lethal epidemic. In the most recent laboratory scare, on August 8, 1994, a scientist at the internationally recognized Yale Arbovirus Research Unit became infected with Sabia, a newly emerging virus that had been discovered in Brazil only two years earlier. The scientist inhaled virus material from a leaking plastic centrifuge tube. An inquiry by the CDC charged the researcher with misconduct and the unit with poor emergency response protocols.13 Shouldn’t the public be informed about the dangers of this research before it is allowed to continue?

In India, the two epidemics of Yersinia pestis in the fall of 1994 (one in Maharashtra State and the other in Surat) both became more and more acute despite clear indications, present for many weeks, of the disease—widespread deaths among the local rat population.14 Two hundred people died of plague before the disease was confirmed. Two hundred thousand people fled the area, some of whom were infected, spreading Yersinia to Delhi and Calcutta. Even though the large number of suspected cases has not been confirmed, poor surveillance combined with poor communication led to unwarranted panic. No single agency—CDC, WHO, the military, or a non-governmental organization (e.g., Médecins Sans Frontières)—currently has the resources, staff, or equipment to act as a rapid-response strike force during a civilian health emergency. Any proposed early warning system will require excellent surveillance in regions of highest risk (e.g., in places where the ecosystem is severely disrupted, as by deforestation) and highly efficient communication. Surveillance will be hampered if there is poor indigenous primary health care or inadequate health education. Efforts to streamline communication systems have made some progress with the use of satellite technology. In January this year, the CDC began an international on-line journal on emerging infectious diseases, receiving reports of new infections filed from the field. The collection and dissemination of information about new outbreaks of infectious disease will greatly help scientists to track, isolate, identify, and control these agents.

There is a continuing and urgent need for research and more research. Garrett argues for inquiries into ways to change social behavior (e.g., how to empower women in the developing countries in their relations with men; how to introduce successful needle-exchange schemes) as well as into microbial ecology (patterns of disease that follow urbanization and “third-worldization”). Both kinds of inquiry cross traditional disciplines of sociology, biology, microbiology, and environmental science. This dual strategy has been forcefully endorsed in a recent report from the US Institute of Medicine:

The lesson of history is that prevention of infectious diseases by prophylaxis or immunization will be only partially effective in the absence of changes in human behavior and ecology.15

It is also being recognized that past vaccine research has been misdirected.16 In the rush to develop new vaccines, scientists have only belatedly understood that their knowledge of the cellular and molecular processes the body uses to protect itself from invading pathogens has failed to keep pace with their technical ability to mass-produce vaccines. The most appalling example of this failure was the abandonment of several clinical trials of candidate HIV vaccines in the US at a late stage in their development because of their poor efficacy.17

All these strategies—an anti-infective strike force, surveillance, education, improved health-care systems, and research—depend on funding. And funding depends on politicians. Here is one of the worst fears associated with the prospect of a plague: Any protection we might conceivably design depends on the foresight and commitment of politicians. As a recent report from the Brookings Institution accurately pointed out, politicians frequently have to make decisions under circumstances of deep uncertainty, and lack of information can produce disastrous consequences.18 For instance, the swine-flu vaccine program was authorized in 1976 after advice indicating that the US was soon to face a substantial epidemic. No such epidemic ensued, although fifty-eight people died from Guillain-Barré syndrome, an unforeseen complication of administering the vaccine. The omens are not good. A US Congressional bill to trim the Health and Human Services budget for 1995 proposes to cut CDC funding by $32 million, including a $2.8 million reduction in the amount appropriated for the CDC’s infectious disease program. Effective reporting of the risks from infectious diseases is critical if politicians are to understand the seriousness of this threat. The Reston debacle provides a cautionary lesson here.

Finally, should we begin to confront the unthinkable? Are our efforts at prevention largely inadequate measures to deal with an increasingly destructive process over which we have little or no control? Why do we believe that we alone of all animals can alter the pace of this emerging threat? In a provocative analysis, Marc Lappé invites us to view the infectious epidemic in an evolutionary light. He writes:

At the root of the resurgence of old infectious diseases is an evolutionary paradox: the more vigorously we have assailed the world of microorganisms, the more varied the repertoire of bacterial and viral strains thrown up against us.19

Lappé claims that our perception of man as the dominant and most successful animal species is mistaken. Rather than viewing organisms in relation to Homo sapiens, we could regard human beings as a part of their evolution. Our relentless reductionist scientific focus on the minute biochemical and genetic details of these organisms may prevent us from observing a larger truth: that our efforts to control the environment have produced challengers to our species that are more and more resistant to control. Human societies are responsible for the accelerated evolution of infectious diseases. It may be that only wholesale reversal of our social development, in a direction we can hardly imagine, would check this process. But such a reversal will never happen.

Yet we can also recall, however grimly, that bubonic plague once proved to have positive as well as negative consequences. The postmedieval public health tradition was strengthened, an organized impulse to care for the sick took hold, a revolution in scholarship took place, and medical science was pursued with new energy. Will it once again take a species-threatening epidemic to provide the opportunity for vigorous human renewal?
