Human history is full of blood, and it has not only been spilled on battlefields and in dark alleys. Blood itself has had an active part in world events. In his fascinating book Blood: A History of Medicine and Commerce, Douglas Starr explains, among other things, how American expertise in blood banking helped the Allies win World War II in Europe and how controversy over a shady plasma bank in Managua, known locally as the casa de vampiros, sparked the Nicaraguan civil war in the 1970s.

Blood, the book, contains many images of squeamish fascination: about “sixteen million gallons of blood and plasma are collected annually worldwide,” Starr informs us, “the equivalent of thirty-two Olympic-size swimming pools.” Later, he describes how a doctor “sliced through the skin with a scalpel, snipping through a layer of glistening pink connective tissue and exposing the tiny vein, matchstick thin with a reddish-blue tinge.”

If these images make you queasy, you are not alone. Fear of blood is a common reaction, an ancient, evolved response to threat, and red is the universal symbol of danger. There is nothing irrational about this. We have every reason to be afraid of it. Starr’s book is the story of blood, but it is also the story of money, and the dance of death the two of them have lately been doing.

The story really begins in the seventeenth century, when William Harvey discovered how circulation works: the heart pumps blood in one direction, from arteries through capillaries to veins and back to the heart. It would be some time before the real function of blood, to carry oxygen and other nutrients around the body and to help defend it against disease, was clearly and widely understood, and even longer before anything useful could be derived from this knowledge. This did not stop physicians from at least trying to use blood to treat a variety of afflictions, both physical and spiritual. Bloodletting, for example, is perhaps the most ancient and international medical art. Starr tells us it

originated in the ancient civilizations of Egypt and Greece, persisted through the medieval, Renaissance, and Enlightenment periods, and lasted through the second Industrial Revolution. It flourished in Arabic and Indian medicine….

Doctors bled patients for every ailment imaginable. They bled for pneumonia, fevers, and back pain; for diseases of the liver and spleen; for rheumatism; for a nonspecific ailment known as “going into a decline”; for headaches and melancholia, hypertension and apoplexy. They bled to heal bone fractures, to stop other wounds from bleeding, and simply to maintain bodily tone…. And yet there was never any evidence that bloodletting did any good.

One of the most famous American practitioners of bleeding was Benjamin Rush, a signer of the Declaration of Independence and a great, although misguided, humanitarian. It was during a yellow fever epidemic in Philadelphia in 1793 that he became convinced of how well bloodletting worked. He went from house to house with his bloody lancet, draining liters of blood from hundreds of yellow fever sufferers into bowls or onto the floor. Many of his patients died, but many also survived. In fact yellow fever is not always fatal, and Rush gave himself credit for curing those cases that almost certainly would have subsided anyway. The number of dead was undoubtedly higher than it would have been had he simply advised people to rest.

At the time, however, many doctors swore by bloodletting, which in the days before modern medicine was one of the few treatments they could offer. “Bleed in an upright posture to fainting” was the arrogant, or simply ignorant, but widely administered prescription of the day. It is not known how many people died from the procedure. In 1799, General George Washington, suffering from strep throat, was bled to death by a team of well-meaning Virginia medics. A few decades later, the esteemed British physicians’ journal The Lancet was named for the preferred bloodletting instrument. It wasn’t really until doctors started counting the number of patients who died from it that bleeding went into decline. Statistics was surely, along with antibiotics, vaccination, and open-heart surgery, one of the most important life-saving techniques ever invented.

Doctors would not only take blood out; they also liked to put other fluids, most commonly animal blood, back into the circulatory system and watch what happened. Rudimentary transfusion techniques were invented in the mid-seventeenth century. At the time it was supposed that souls and character traits might also be transferred through blood. The French transfusion pioneer Jean-Baptiste Denis wrote that “sadness, Envy, Anger, Melancholy, Disquiet…corrupt the whole substance of the blood.” He prescribed transfusions of the “mild and laudable blood” of animals. Calf’s blood was transfused to make madmen sane, lamb’s blood to calm the souls of sick children. When foreign substances like animal blood enter into the human circulatory system, the immune system reacts by sending the body into a state of shock. Circulation of the blood comes almost to a halt, blood pressure falls dangerously low, the kidneys clog up with dead calf’s blood cells, and a high fever develops. Even though death was common after animal blood transfusions, the treatment remained popular for nearly two centuries.

There must have been something appealing about the idea that blood carried some profound, if hard to define, power distinguishing classes, families, and races and the personalities thought to go with them. Vitalism, or the belief that blood serves more than a mere mechanical function, may sound old-fashioned, but even in the late 1950s it was a misdemeanor in the states of Louisiana and Arkansas for a doctor to transfuse blood from a black donor into a white person without the recipient’s permission. Scientists knew at the time that the blood of blacks and whites was technically indistinguishable, so it is not clear, although one can guess, on what grounds black blood was deemed to be dangerous to whites.

Transfusion of human blood was in fact a great medical advance. Without enough blood to nourish their cells, our organs suffocate and die. Wounded people or people undergoing operations need new blood to replace what they have lost; hemophiliacs need concentrates derived from blood that will cause clotting—“clotting factors”—because they don’t have the genes to make their own and would otherwise bleed to death; people at risk of infectious diseases need extra gamma globulin to boost their immune systems.

A major figure in the discovery of blood transfusion was Alexis Carrel (1873-1944). Carrel, Starr tells us, was a technically brilliant surgeon who trained in his native France and was able, through the expert teachings of a seamstress named Madame Leroudier, to stitch even tiny arteries and veins together. After emigrating to the US in his early thirties, he made his first dramatic attempt to save human beings with his techniques during the early hours of a Sunday morning in 1908 in an apartment on West 36th Street in New York. There a newborn baby was dying, and her father, also a doctor, believed that a blood transfusion would save her. Carrel stitched one of the father’s arteries to one of the child’s veins and the ashen baby, who had seemed about to die, suddenly began to glow and wail. The baby survived, the tabloid newspapers made Carrel a hero, and the new age of blood transfusion dawned.

Carrel was not only a brave and wise scientist, but also a confirmed believer in faith healing and mysticism. His ideas formed a link between past and future, between the mystical view of the body as a symphony of diverse and erratic humors and the prosaic modern view of the body as something more like a car, with components that sometimes need replacing. At the same time blood came to be seen less like the soul and more as a commodity, like tires or brake pads, that could be bought and sold.

Unfortunately, as soon as blood became a commodity, squalor accumulated around it. Blood and blood products now save many thousands of lives every year. There is no question that blood technology is a great gift to mankind, and that we owe the scientists, doctors, engineers, and even the businessmen who have helped create it tremendous gratitude. However, as Starr’s story shows, the history of blood transfusion has been at times ugly and frightening.

My only criticism of Mr. Starr is that sometimes he seems too nice a man, excusing some of the darker figures in his story. For example, Starr tells us about the Japanese blood transfusion doctor Ryoichi Naito, who during World War II was an advisor to the notorious Japanese Army Unit 731, which carried out atrocities in the occupied city of Harbin, Manchuria. There prisoners, resistance fighters, and civilian Russians and Chinese, including women and children, were used

to study frostbite and starvation, allowing the conditions to progress until death. They tested such new weapons as flame-throwers on the prisoners—and new forms of transfusion, such as emptying their circulatory systems and refilling them with horse blood. Most important, however, was the use of these people for the development of microbial weapons. Each time the researchers devised a new way to spread lethal pathogens, they would infect the prisoners with injections or aerosols, with contaminated feathers brushed under their noses or bacteria poured into their food, and then observe the progress of the disease…. Alternatively, [the] staff would dissect the subjects of the experiment—whether or not the prisoners were still alive.

After the war Naito, in Starr’s opinion, changed. He became a Roman Catholic, retreated to the countryside, and became “a humble, devoted, and honorable” doctor, bicycling from one patient to another in the small village where he settled. According to Starr, this new style of life was a way of coming to terms with his deep sense of humiliation about Japan’s defeat in the war. Remaining obscure may also have been a way of avoiding retribution for what he had done during the war. Whatever the reason, he did not remain in the village for long. “Working among the poor,” says Starr, “he saw how desperately they needed transfusions, and how all too often they could not receive them.” Naito later founded the Japan Blood Bank, subsequently renamed the Green Cross, which became Japan’s biggest blood bank, worth, at its peak, $1.5 billion. Starr believes Naito’s blood-banking ambitions arose from a quiet but profound change of heart, brought on by his work among his defeated nation’s poor. There is perhaps another interpretation. In war, as well as in peace, perhaps Naito knew an opportunity for power when he saw one.

Naito, like virtually everyone else in the postwar blood-banking business throughout the world, was reluctant to admit the dangers of blood transfusion. Hepatitis B is a viral infection that causes jaundice, nausea, and fever and can never be fully cured. In many people who carry the infection, the liver is eventually destroyed or becomes cancerous. Much of the world’s blood supply was infected with the hepatitis B virus in the 1960s. It was particularly common in blood banks, like Japan’s, that purchased blood from their donors, rather than relying on freely given donations. This is so because people who donate blood voluntarily tend to come from the middle class, and do so out of a sense of charity. On the other hand, those who earn money from donating blood tend to be those who have nothing else to sell. Commercial blood banks often attract a disheveled clientele consisting of drug addicts, drunks, criminals, and the destitute, whose blood is dangerous because they have much higher rates of many infections, including syphilis, hepatitis, and HIV.

Richard Titmuss, in his book The Gift Relationship: From Human Blood to Social Policy, contrasts blood systems like Britain’s, which relied on voluntary blood donations, with those, like America’s, which relied heavily on paid donations.1 After examining both systems in detail, Titmuss concluded that those in which donors were paid were at the same time more prone to blood shortages, and also far more wasteful. Most worrying, paid systems were associated with much higher rates of transfusion-related infections. The practice of soliciting blood from paid donors stopped in the late 1980s in most industrialized countries; but it continues today in some developing countries, including India and Pakistan, and in many parts of Africa, where testing for pathogens is often at best intermittent.

In the early 1960s there were many calls for Japan to clean up its blood supply. However, it was not until an American ambassador was infected with hepatitis from a blood transfusion that Japan banned the collection of blood for money. The ban, however, created a severe shortage of blood products, and by the mid-1970s Japan was importing, mainly from America, nearly all of its refined plasma products, from clotting factors to gamma globulin to albumin. At least some of America’s blood in turn came from prisons, impoverished paid donors in the third world, and other risky sources. The Japanese had not cleaned up their blood supply; they had merely shifted the responsibility for keeping it safe onto others, or so they thought.

In the US, Canada, Switzerland, France, and Japan, blood collection systems generally failed to heed the warning of hepatitis, and the horror of HIV transmission through blood transfusion and blood products dawned late on institutions everywhere. Throughout the world, as many as three million people were probably infected with HIV through blood products, including 40,000 hemophiliacs. Many of these cases could not have been avoided. AIDS has a long incubation period, and when the first cases of the disease started turning up in the early 1980s, no one knew where it came from or how serious the epidemic would turn out to be. By that time, many people had already been infected. However, by early 1983, once the disease had been identified and demonstrated to be transmissible through blood, every blood bank in the world should have been placed on a war footing.

Even though there was no definitive HIV test at the time, there were other ways of protecting the blood supply. People who have a high risk of hepatitis infection also tend to have a high risk of contracting HIV, so screening for hepatitis, for which there was a test, might have helped prevent the transmission of both diseases through the blood supply; so would avoiding donations from the groups in which most AIDS cases were then occurring, such as gay men, intravenous drug users, and prisoners, as well as from anyone who sold blood for money. Nearly all institutions that could have done something, such as blood banks, hospitals, and even hemophilia societies, were complacent and stubborn, and some were much worse than this. Thousands of people were needlessly infected and died. Even now two hemophiliacs infected in the 1980s die of AIDS in America every day.

The Japanese case illustrates how money and institutional inertia exacerbated the spread of HIV in hemophiliacs. By the spring of 1983 the American company Hyland, a subsidiary of Baxter Laboratories, had developed a technique to make clotting factors safe from hepatitis, no matter where they came from, by treating them with heat and chemicals. HIV had not yet been discovered, but by this time it was widely predicted that a virus like hepatitis would turn out to be the cause of AIDS, in which case heat treatment would almost certainly eliminate it as well. The Green Cross, by this time run by a man named Renzo Matsushita, and advised by Takeshi Abe, could have begun importing this safer material in mid-1983. Instead the Japanese waited two years, during which time they continued to import dangerous, untreated clotting factors that would cause nearly two thousand extra AIDS cases among Japanese hemophiliacs. Batches of untreated clotting factors were still being sold in 1987. The problem seemed to lie partly with Japanese national pride. “The ‘not invented here’ syndrome existed in Green Cross to the nth degree,” said an American scientist who had helped develop the heat-treatment process. It was not until the Japanese had developed and licensed their own heat-treatment technology that they agreed also to license the American version.

The list of countries and institutions that could have done better to protect people from HIV in the blood supply is a long one. The French authorities did rely on voluntary blood donations from the public, but these donors often included prisoners and others considered to be at high risk of various infections. Even after blood product-related HIV cases were detected in France, the National Center for Blood Transfusion, like its counterparts in Japan, resisted providing safer, heat-treated clotting factors to hemophiliacs for much the same reasons as the Japanese. In addition, after an American company developed an HIV test that could be used for screening blood, both the French and the British postponed registering it until their own scientists had developed French and British blood tests that could be marketed instead.

Meanwhile, in both nations, unsafe blood continued to be transfused to patients. In Britain, there had long been a campaign to avoid using blood products from America, which even in the late 1970s were already considered suspect. Hospitals did in fact collect enough blood from well-screened British volunteers for the nation to be self-sufficient. However, in order to process the blood into clotting factors, gamma globulin, and other products, the work at Britain’s largest processing plant, which was state-owned, would have had to be stepped up. That in turn would have meant negotiating with the unions, something Margaret Thatcher was not prepared to do. Describing the episode, Starr writes that we still don’t know how many of the 32 percent of British hemophiliacs who contracted HIV did so as a result of the government’s decisions.

In the US, even after a reliable HIV-screening test was available, large quantities of untested blood products were sold to other countries, mostly in the developing world, that had not yet banned unscreened material. The hemophilia societies set up to protect the interests of hemophiliacs, such as the National Hemophilia Foundation in the US and the French Hemophilia Association, obtained most of their funding from the blood products industry. Even when the signs were becoming very clear, these societies continued to encourage hemophiliacs to use untested clotting factors, rather than advise them to stop until the situation was better understood. Clotting factors permit hemophiliacs to live nearly normal lives, so that they do not have to worry about skinned knees and bleeding joints. Some hemophiliacs would have objected to being more or less housebound for about two years, waiting for doctors to find ways to keep HIV out of the blood supply; but many would have found this preferable to becoming one of the 74 percent of American hemophiliacs who became HIV-positive.

Titmuss saw the inefficiency and danger of systems in which blood was exchanged for money as an indictment of the idea that societies are best governed by free market capitalism alone. The kind of altruism that encourages people to donate blood free of charge to complete strangers in need was, for Titmuss, vitally important. Now we know that a voluntary blood donation system alone was not enough to keep HIV out of France’s blood supply. However, perhaps we can say that whenever money and blood get mixed up, everyone had better beware. In light of the worldwide trend toward increasing privatization of health care systems, the emergence of new and horrifying diseases like Creutzfeldt-Jakob disease, and the increasing demand not only for blood but for organs, which can also carry disease, this is a worrying message indeed.

In the forest, a herbalist once told me, poisonous plants tend to grow very near the ones that can be used for healing. In the same way, most modern medicines are drawn from the very things that do us harm. Vaccines are killed or weakened versions of deadly viruses. Antibiotics that cure bacterial infections mainly come from other microbes. Clotting factors, plasma, and whole blood can save many lives, but they may also transmit hepatitis, syphilis, and AIDS; and it seems possible that they may also transmit the Creutzfeldt-Jakob syndrome and other diseases we don’t even know about yet. The relationship between the things in nature that happen to kill us and the things that happen to save us is an intricate one, and we do not fully understand it. The blood banks have learned some lessons from the AIDS experience, and blood products today are probably safer than they have ever been, at least in Western countries. Even so, every year in the US around seventy cases of HIV infection occur through transfusion because there is a period of several weeks between the time when a person becomes infected with HIV and when a test shows that he is indeed positive. If a person donates blood during this time, the infection will not be detected and will enter the blood supply.

Why has the message about the danger of blood taken so many centuries to sink in? During the early 1980s, Starr writes, such scientists as Donald Francis, Bruce Evatt, and Edward Shanbrom tried to raise the alarm about HIV, but the institutions (blood banks, hospitals, and so on) moved much more slowly. Taking safety measures, screening out risky donors, conducting tests, and throwing out old stocks seemed excessively expensive and difficult, like “making a U-turn with the Titanic,” Starr says. So blood was decreed safe even when many people had strong doubts about it. Nearly every institution seemed to harbor a will not to know. In 1992 four French doctors and bureaucrats in the blood-banking industry were found guilty of deception, failure to inform their hemophiliac patients of the risks they were taking, and “non-assistance to persons in danger.” These were intelligent men, entrusted with important responsibilities because of their expertise. Why did they fail to use the heating techniques and blood tests that were available to them? And what have we to fear from the rest of the medical industry, the food industry, or any other industry?

Perhaps a great deal. The blood bankers were prepared to take risks with other people’s lives in order to protect their hospitals, blood banks, and companies from expense and inconvenience. The anthropologist Mary Douglas has explored how cultures and institutions shape our perception of risk. In her essay “The Self as Risk-Taker: A cultural theory of contagion in relation to AIDS,” Douglas tries to explain this particular form of denial. The members of particular groups, she writes, are “tempted to pay more attention to protecting [their own communities] than to protecting the vulnerable points of access in the body itself.”2

For Douglas, the community implies all institutions, including, presumably, the blood industry. Starr’s book describes how that industry developed its own culture, one that always believed it could save lives and money at the same time. This combination of beliefs blinded everyone to the risks. “Let us be careful not to idealize the community,” Douglas warns. “It does not always deal kindly with its members.”

This Issue

February 4, 1999