The US eradicated malaria in 1951. Until then, this parasitic disease, transmitted largely by infected mosquitoes, had been endemic across much of the country. In the Tennessee River Valley, for example, malaria affected almost a third of the population in 1933. By the time the US National Malaria Eradication Program was launched on July 1, 1947, malaria had become concentrated in thirteen southeastern states. The program was led by the newly created federal Communicable Disease Center (now the Centers for Disease Control and Prevention, or CDC) based in Atlanta.
Justin M. Andrews, the CDC’s director at the time, was also Georgia’s chief malariologist. The CDC had itself evolved from the Office of Malaria Control in War Areas, which had been created to defeat malaria in the United States during World War II. Perhaps surprisingly to a modern audience that thinks of it as a disease of poor countries, the histories of American health and malaria are tightly bound. As the historian Margaret Humphreys has revealed, malaria “shaped southern and western [American] history in particular through its impact on labor patterns, mortality rates, and settlement choices.”1
It is easy to forget today how dangerous malaria continues to be. The populations of ninety-nine countries (40 percent of the world’s population, or about three billion people) live under the threat of malaria. The World Health Organization (WHO) reported 225 million cases worldwide in 2008, with 781,000 deaths. These figures are almost certainly underestimates. Most deaths—85 percent—are in children under five years of age.
For a disease that exacts such an enormous toll of human death and misery, it remains shocking that so little has been done by affected countries and large international donors to control malaria. This long epoch of neglect is gradually coming to an end. As Bill Shore explains in his survey of “baffling and surprising” strategies to eradicate the world’s most devastating parasite, “a small number of heroic idealists” are beginning to reverse decades of failure. They have recognized that traditional approaches to malaria control “always fall short.” Instead, defeating malaria requires “moral vision and imagination,” “a deeply intrinsic drive to achieve what others have dismissed as unachievable,” “a willingness to take risks,” and “irrational self-confidence.”
But Shore also shows an aspect of the organizations concerned with malaria that is less heroic, less moral, and certainly not at all idealistic. He exposes how a spirited culture of creativity, confidence, and competition in malaria research too often expresses itself as hyperbole, hubris, and personal enmity. There are frequent examples of scientists who confidently ridicule the work of fellow scientists: “Rival researchers are polite but mostly dismissive of one another,” Shore notes. As he concludes …
1 See Margaret Humphreys, Malaria: Poverty, Race, and Public Health in the United States (Johns Hopkins University Press, 2001), p. 1.