Here in Britain we are all criminals: guilty of conniving at a crime against humanity committed by a government that is polluting the Irish Sea, the British Isles, and the entire globe with the radioactive discharges from its nuclear plants at Sellafield, a village in northwest England, on the Irish Sea. According to Marilynne Robinson, the author of the novel Housekeeping and now of the book under review, “The earth has been under nuclear attack [from Sellafield] for almost half a century.” This book is aflame with indignation at the diabolical practices of the British Atomic Energy Authority, at the irresponsibility of our National Radiological Protection Board, at the careless indifference of our venal members of Parliament and of the British public, at the American press for failing to warn unsuspecting tourists of the deadly dangers threatening their health if they set foot on these poisoned isles, and at the American government for wasting its armed forces on their protection.

Since reports of scandalous happenings that at first seemed beyond belief have often turned out to be true, I approached these accusations, which have been taken seriously in some reviews of the book, with an open mind. I had read of an accidental release of radioactive smoke from Sellafield and of radioactive wastes being discharged into the Irish Sea, but without knowing how much these discharges had added to the natural radioactivity that surrounds us, I had not been able to judge how dangerous they were.

The nuclear plants at Sellafield were constructed shortly after the end of World War II by the Labour government of Clement Attlee, in the first instance to produce plutonium for atomic bombs. Attlee and a few of his close associates reached that decision because the war had left Britain without allies. The United States had entered the war against Germany only after being attacked by Japan, and the war had ended without any treaty pledging the United States and Britain to come to each other’s aid in case of another attack. Attlee feared that Britain might again find itself alone, as it did in 1940, and decided that having the ultimate weapon was essential for its security.

Under an agreement between Franklin Roosevelt and Winston Churchill signed at Quebec in August 1943, the first atomic bomb was developed at Los Alamos by a joint Anglo-American-Canadian team. According to this agreement,

any post-war advantages of an industrial or commercial character should be dealt with as between the United States and Britain on terms to be specified by the President of the United States to the Prime Minister of Great Britain.

Doubts about postwar collaboration left by this agreement were allayed by an aide-mémoire signed by Roosevelt and Churchill at Hyde Park in September 1944, promising that full atomic collaboration between the two countries for military and commercial purposes should continue after the war, unless and until terminated by joint agreement. Seven months later Roosevelt died, and it seems that no other American officials knew of that agreement until they were told of it by the British. After the victory over Japan, Attlee and President Truman signed another document stating: “We desire that there should be full and effective cooperation in the field of atomic energy between the United States, the United Kingdom and Canada,” but the following year Congress made most forms of atomic collaboration with other countries, including Britain and Canada, illegal.1

Nuclear reactors use the fission of uranium atoms to produce heat and plutonium. Natural uranium consists of two kinds of atoms, one having 235 and the other 238 times the weight of a hydrogen atom. For each atom of the former there are 140 atoms of the latter. Every so often an atom of uranium 235 splits up spontaneously into two lighter atoms with the emission of neutrons. If one of these neutrons collides with and is absorbed by another atom of uranium 235, that atom in turn splits, with the emission of more neutrons. In a large lump of pure uranium 235 this sets up an uncontrolled chain reaction leading to an atomic explosion.
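The distinction between a bomb and a reactor can be put in one line of arithmetic. If k denotes the average number of neutrons from each fission that go on to cause another fission, then

$$
k > 1 \;\text{(runaway chain reaction)}, \qquad k = 1 \;\text{(steady, controlled reaction)}, \qquad k < 1 \;\text{(reaction dies out)}.
$$

In a large lump of pure uranium 235, k exceeds 1 and the reaction grows exponentially; the art of reactor design lies in holding k at exactly 1.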

In natural uranium, chain reactions do not occur, at least not on Earth in the present age, because the atoms of uranium 235 are too thinly spread and most of the neutrons emitted by them travel so fast that they escape without being absorbed. In nuclear reactors that escape is prevented by a “moderator,” a substance made of light atoms that bounce the neutrons back and forth until they have lost most of their speed and therefore have a better chance of being absorbed. The first American reactor for plutonium production at Hanford in the state of Washington consisted of a pile of uranium rods immersed in water that acted both as a moderator and as a coolant, and thus allowed a controlled chain reaction to take place. In that reaction neutrons captured by uranium 235 generated more neutrons, together with radioactive fission products and energy, while neutrons captured by uranium 238 generated plutonium that was later extracted from the uranium rods in a chemical processing plant. The reactor required a large supply of very pure water, a safe way of discharging it, and a safe distance from large centers of population. No suitable site of this kind could be found in Britain.


The British team that returned from Los Alamos had to design their first atomic piles and the chemical separation plant for the extraction of plutonium with knowledge of only part of the American experience. They decided on an as yet untried system: a pile of uranium rods interspersed with rods of graphite (pure carbon) as a moderator, cooled by a stream of air drawn in from below the reactor and discharged, after being filtered, from 400-foot-high chimney stacks. The atomic piles were built at Windscale, the site of a wartime ordnance factory near the village of Sellafield, on the Cumberland coast. The first pile went into operation in November 1950, and the first British atomic bomb was exploded in the Monte Bello Islands off Australia in October 1952, a month before the first American hydrogen bomb test.

Under the neutron bombardment the graphite rods in the Sellafield plant gradually became brittle. That brittleness could be cured by allowing the pile to warm up above its normal working temperature for several hours (a procedure known as annealing). In 1957, during one such operation, some of the fuel rods overheated and caught fire. While the operators tried to cool the rods by blowing more air through the pile, highly radioactive vapor escaped through the chimney stacks; finally the fire was extinguished by flooding the pile with water. Most of the dangerous radioactivity that resulted came in the form of radioiodine, which contaminated the nearby countryside and made the milk from the cows grazing there unfit to drink for several weeks. A further hazard came in the form of polonium (the radioactive element Marie Curie named after her native land). At the request of the prime minister, Harold Macmillan, the Medical Research Council (an autonomous body equivalent to the National Institutes of Health) set up an independent committee to consider the consequences of the accident for the workers at Windscale and for the public, but the committee was not told about the release of polonium.

Radioiodine can give rise to cancer of the thyroid, but monitoring of the radioactivity of the thyroids of workers at Windscale and of people living nearby showed that none of them had received dangerous doses. The committee concluded “that it is in the highest degree unlikely that any harm has been done to anyone in the course of this incident.”2

Before 1957 exposure to radioactivity below a certain threshold was generally believed to be harmless, but in the years that followed scientists became increasingly concerned about the biological effects of the radioactive fallout from atomic weapons tests. They found that the probability of a mouse developing cancer, or of a fruit fly’s offspring being affected by a genetic mutation, increased if it received a dose of radiation, however small. The added risk to any one individual may be minute—say one chance in 500,000—but it means that absorption of the same small dose by each of 50 million people may give rise to a hundred additional cases of cancer.3
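The arithmetic of this “collective risk” argument, with the illustrative numbers just given, runs:

$$
\text{additional cases} = N \times \Delta p = 5\times10^{7} \times \frac{1}{500{,}000} = 100.
$$

However small the added individual risk may be, the linear assumption makes the expected number of cases grow in proportion to the number of people exposed.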

In the light of these findings the National Radiological Protection Board, an autonomous body set up by the British government in 1970, later reevaluated the likely aftereffects of the Windscale fire. A plume of radioactive iodine and polonium spreading out from Windscale over parts of Britain and Northern Europe would have caused many people to take up traces of radioiodine in their thyroid glands and traces of polonium in their lungs. Even though most of them would have received only minute doses of each, the probability that some of them would later develop cancer was thereby increased. Calculations showed that, in the forty years following the fire, there might be about 260 cases of thyroid cancer over and above the 27,000 or so naturally occurring ones in the affected populations. Of these additional cases about thirteen might prove fatal. Nine cases of other fatal cancers might be caused by the fallout of polonium.4

However, according to Rosalyn Yalow, the American physicist who received the Nobel Prize for Medicine for her invention of the radioimmunoassay, an important and widely used tool in diagnostic medicine, there is no trustworthy experimental evidence to support these views. On the contrary, a great variety of observations indicate that our bodies are well equipped to withstand moderate doses of radiation. For example, no increased incidence of cancer or genetic abnormalities has been found in populations living in regions where the natural background radiation is abnormally high.

People in the Rocky Mountain states in the US receive twice as much natural radiation as the rest of the American population, but cancer rates there are lower than average. In certain districts of India and Brazil, people’s exposure to natural background radiation over a period of twenty-five years equals the acute exposure of Hiroshima and Nagasaki survivors, yet no deleterious health effects could be found there.5 If Rosalyn Yalow is right, there would have been no additional cases of cancer, nor any other deleterious effects as a result of the Windscale fire.


Marilynne Robinson writes that the Windscale fire bore “an uncanny, not to say unnerving, similarity” to the nuclear accident at Chernobyl. In fact, the two reactors were quite different and so were the accidents. The atomic piles at Windscale were air-cooled, while those at Chernobyl were cooled by water under high pressure. At Chernobyl the cooling water turned into steam that reacted with hot metals and graphite rods, producing hydrogen and carbon monoxide, while the nuclear reaction was still continuing. The hydrogen and carbon monoxide ignited, causing a tremendous explosion that lifted the roof off the building. There followed a meltdown of the reactor that could have contaminated the ground water of the region had it not been contained by the heroic efforts of the workers who excavated a tunnel underneath the reactor and filled it with concrete.6 At Windscale the uranium and graphite rods caught fire after the nuclear reaction had already been shut down, and the smoke from the fire escaped through the chimney stacks. There was no explosion and no meltdown. Extinguishing the Windscale fire with water could have initiated the same dangerous reaction between the steam and the graphite rods as at Chernobyl, but fortunately it did not, and the fire was put out.

Doses of ionizing radiation are now measured in units called sieverts (after the Swedish radiation physicist Rolf Sievert). The collective dose received by an entire population is obtained by multiplying the average dose received by one individual by the number of people in the population; the product is expressed in man-sieverts.7 On that basis the collective dose released by the accident at Three Mile Island amounted to 20 man-sieverts, the one at Windscale to 1,300 man-sieverts, and the one at Chernobyl to 150,000 man-sieverts, or over a hundred times greater than at Windscale.
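In symbols, the collective dose S is the sum of the individual doses, or approximately the average dose times the number of people exposed; a rough check of the ratio just quoted (my own arithmetic):

$$
S=\sum_i H_i \approx N\,\bar H, \qquad \frac{S_{\text{Chernobyl}}}{S_{\text{Windscale}}}=\frac{150{,}000}{1{,}300}\approx 115.
$$

If one further assumes, as contemporary risk estimates roughly did, about one fatal cancer per hundred man-sieverts, then 1,300 man-sieverts imply on the order of a dozen deaths—consistent with the Board’s estimate of thirteen fatal thyroid cancers and nine other fatal cancers.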

So much for the unnerving similarity between the two accidents. For comparison, in 1963, the year of the atmospheric test ban, radioactive fallout from nuclear weapons tests delivered a collective dose of 500,000 man-sieverts to people at ground level, and in 1986 it was still delivering 50,000 man-sieverts. The global exposure to natural background radiation amounts to twelve million man-sieverts per year. Distributed evenly over the world’s 6,000 million people, this gives each of them an annual dose of two millisieverts, but in fact the distribution is very uneven. The International Commission on Radiological Protection recommends that the average annual exposure of members of the public to man-made radiation should not exceed one millisievert per person and that the annual dose to the most exposed workers should, on average, not exceed fifteen millisieverts.
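The per capita figure follows directly:

$$
\frac{1.2\times10^{7}\ \text{man-sieverts per year}}{6\times10^{9}\ \text{people}} = 2\times10^{-3}\ \text{sievert} = 2\ \text{millisieverts per person per year}.
$$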

Robinson is indignant that no one was evacuated from Windscale. She writes:

Comparison in this regard is to the advantage of the Russians who only delayed evacuation and who only temporized for a few days about the severity of the accident.

In fact there was no case for evacuation, and there is none today, not even with hindsight, because even those most exposed did not receive more than ten millisieverts. Robinson alleges that the staff at the reactor had undertaken an experiment whose nature has never been revealed. As we have seen, the cause of the fire was a routine maintenance procedure whose danger was not appreciated; it was described in detail in the White Paper published in 1957. Robinson writes that the Magnox reactors at Calder Hall next to Sellafield are similar to the one at Chernobyl. This is untrue: the piles of the Magnox reactors are cooled with carbon dioxide, a gas used to extinguish fires, thus avoiding the danger of fire as well as of explosion, while the Russian reactors are cooled by water under high pressure. According to Robinson the type of reactor that caught fire is still in use. In fact that reactor was never repaired, and its twin was closed down immediately after the fire.

Originally the reprocessing plant at Sellafield was constructed to separate plutonium for military purposes only, but it was used later also to reprocess the spent fuel of civil reactors in Britain and other countries. The ensuing radioactive waste is separated into three categories of different radioactivity: high, intermediate, and low. The first two categories are stored. After treatment and further reduction of radioactivity, low-level liquid waste is discharged through a two-mile-long pipe into the Irish Sea. One of its components is plutonium, whose compounds are practically insoluble in water, are as dense as gold, and were expected to sink to the bottom and get covered with sediment. Another component is caesium 137, which resembles sodium in its chemical properties. Its salts are soluble and were expected to be diluted and dispersed without causing any perceptible rise in radioactivity of the oceans.

Between 1957 and 1982 British Nuclear Fuels discharged into the Irish Sea about a quarter of a ton of plutonium dioxide, corresponding to 17,000 curies (not 50,000, as Robinson writes), and 650,000 curies of caesium 137, in addition to smaller quantities of long-lived fission products and other radioactive elements. In 1982 the Atomic Energy Establishment at Harwell and the National Radiological Protection Board discovered that measurable quantities of plutonium and americium were being washed ashore and carried inland by the wind, even though their concentration in seawater is very low. Within one mile of Sellafield along the coast, the concentration of plutonium deposited on land was seventy times greater than that left by the atmospheric nuclear weapons tests of the 1950s and early 1960s; three miles away it was ten times greater, seven miles away five times greater, and twenty miles away and beyond it was undetectable. The excess of caesium 137 was only five times above background level at its highest and fell to that level three to nine miles inland and beyond. In 1982 the total radioactivity deposited on land amounted to twice that deposited on the same small area by the nuclear weapons tests.8
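The curie figure can be checked roughly. Assuming the discharged plutonium was mostly plutonium 239, whose specific activity is about 0.06 curie per gram, and noting that plutonium makes up about 88 percent of plutonium dioxide by weight:

$$
250\ \text{kg PuO}_2 \times 0.88 \approx 220\ \text{kg Pu}; \qquad 2.2\times10^{5}\ \text{g} \times 0.06\ \text{Ci/g} \approx 13{,}000\ \text{Ci},
$$

which is of the same order as the 17,000 curies quoted; the admixture of plutonium 240, whose specific activity is several times higher, would presumably make up the difference.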

In response to concern that radioactive elements might be taken up and concentrated by marine life, the Minister of Agriculture, Fisheries and Food commissioned regular annual studies of the fish, crabs, mussels, and seaweed near Sellafield. These showed the concentrations of plutonium in shellfish and seaweed to be up to a thousand times greater than in the seawater. Even so, heavy consumers of seafood caught near Sellafield would have been exposed to only about a third of the annual dose of one millisievert recommended as a safe limit by the International Commission on Radiological Protection.9

What about the caesium 137 that was poured into the Irish Sea? In 1987 contamination of the Irish Sea with caesium 137 produced a radioactivity of one tenth of a becquerel per kilogram of seawater, except near the northwest coast of England, where the activity rose to half a becquerel per kilogram.10 For comparison, the natural radioactivity of seawater amounts to twelve becquerels per kilogram, nearly all of it from potassium 40. Hence the discharge from Sellafield has increased the radioactivity of the Irish Sea by just under 1 percent, which can hardly be called a danger.11
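The percentage follows at once:

$$
\frac{0.1\ \text{becquerel per kilogram}}{12\ \text{becquerels per kilogram}} \approx 0.8\ \text{percent}.
$$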

All the same, the buildup of radioactivity could not be allowed to continue. The Sellafield plants are now being modernized and the outflux of radioactive waste is being reduced to near zero at a projected cost of over three billion dollars.

Robinson reports that in the village of Seascale, a few miles from Sellafield, one child in sixty died of lymphoid leukemia. In fact, between 1955 and 1984 there were seven deaths from leukemia among 1,068 children born there, or one in 152. This is ten times the national average. On the other hand, mortality among 1,546 children living there but born elsewhere was normal: no case of leukemia or lymphoma was reported among them, and nine of the ten deaths that did occur were caused by accidents. The additional frequency of lymphoid leukemia expected from the levels of radioactivity determined at Sellafield in a long series of painstaking measurements should be not one in 152, but one in 50,000.12
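The scale of the discrepancy is best seen by turning these rates into expected numbers of cases among Seascale’s 1,068 native-born children:

$$
\text{observed: } 7; \qquad \text{expected at the national rate: } \tfrac{7}{10}=0.7; \qquad \text{expected from the measured radioactivity: } \tfrac{1{,}068}{50{,}000}\approx 0.02.
$$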

This disturbing discrepancy has stimulated statistical analyses of the incidence of leukemia near and far from nuclear installations. One study, headed by Sir Richard Doll, the distinguished epidemiologist and codiscoverer of the association between smoking and lung cancer, showed a significantly raised incidence of lymphoid leukemia near Sellafield and Dounreay, two nuclear installations built before 1955, and a significantly lowered incidence near other nuclear installations. The increased incidences of lymphoid leukemia were too large to be attributable to chance, but they could not be explained by the observed levels of radiation.13 Nevertheless, in response to an advertisement in which a lawyer offered his services, several of the Sellafield families whose children contracted cancer will now sue British Nuclear Fuels for damages.

Robinson heaps scorn on an enquiry headed by Sir Douglas Black,14 a former chief medical officer of health, whose “line of reasoning,” she writes, “was ingenious rather than persuasive,” because he argued that the very low level of additional radiation from the nuclear plant could not account for the high incidence of leukemia. She chooses to ignore the results of two other independent inquiries that confirmed his conclusions.15

Despite these findings, public suspicion that some hitherto undiscovered effect of radiation is responsible may persist until an alternative explanation has been found. Leo Kinlen of the Cancer Research Campaign Epidemiology Unit in Edinburgh has advanced such an explanation and tested it by making a bold prediction.16 Both Sellafield and Dounreay were small isolated villages until the building of the nuclear plants brought large influxes of people. Such movements of people into isolated rural areas are liable to bring in infectious diseases to which larger populations have become immune. If, as has often been suspected, childhood leukemia is caused by some unidentified virus, then an influx of population into an isolated rural community distant from any nuclear installation should have given rise to an increased incidence of leukemia similar to that seen at Sellafield and Dounreay.

Before 1948, Kirkcaldy in Scotland, with a population of 1,100 people, was such a relatively isolated rural community, far from any nuclear installation. By 1961, the founding of the new town of Glenrothes had raised the population to 12,750. Kinlen predicted that, if his hypothesis were right, this rise should have led to an increase in childhood leukemia. Examination of the medical records did indeed show a significant excess of leukemia deaths below the age of twenty-five: ten observed deaths compared to 3.6 expected, seven of them below the age of five, and six occurring between 1954 and 1959. After 1968 there was no excess, and indeed a significant deficit. Kinlen found no such excess in other regions where the population had not increased. The cluster of cases of childhood leukemia in Glenrothes is the first to have been predicted by a hypothesis formulated before the data were collected. There is as yet no direct evidence for a viral origin of most human leukemias, but their similarity to animal leukemias known to be caused by viruses has long been suggestive. Kinlen’s important result will intensify the search for a possible virus, especially since unexplained clusters of lymphoid leukemia have also been found in other places far from nuclear installations.17
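A back-of-the-envelope significance check (mine, not Kinlen’s): if the 3.6 expected deaths are taken as the mean of a Poisson distribution, the chance of observing ten or more is

$$
P(X\ge 10 \mid \lambda=3.6) \;=\; 1-\sum_{k=0}^{9} e^{-3.6}\,\frac{3.6^{k}}{k!} \;\approx\; 0.004,
$$

about four in a thousand—small enough to justify the word “significant.”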

In 1985 a committee at the Department of Health and Social Security discussed another conceivable explanation. Perhaps children had some special pathway, not found in adults, that would cause traces of plutonium to be selectively absorbed and concentrated in their bone marrow, where it would give rise to leukemia. The minutes of that meeting were leaked to Greenpeace, which informed the House of Commons Environment Committee: “Unbelievably, it was suggested that Cumbrian children should be fed contaminated food and monitored to see what effect it had on them in terms of concentration within their bodies.” The Environment Committee reports: “Not surprisingly, we were very shocked by this.” Journalists were equally shocked. The Times carried a front-page article headlined “‘PLUTONIUM FOOD’ SOUGHT FOR CHILDREN”; the Daily Mail’s headline was “SHOCK OVER ‘NUCLEAR TEST’ CHILDREN.” Other articles followed. The Committee’s report states:

We questioned Greenpeace witnesses closely on their statement and found that under examination they began to shift their ground. The experiments became “voluntary”—as if parents would submit their children to these risks. However we were assured by Greenpeace that their claim could be fully substantiated. At our insistence they sent us a confidential copy of minutes taken at the meeting in the DHSS at which the proposal for the experiment was allegedly made. We examined these carefully and could find no reference which could be construed as supporting the claim. The nearest we could come to it was a discussion that the only way in which incontrovertible evidence could be obtained of the effects of ingestion of contaminated shellfish on the human system was by finding a group which had never eaten shellfish, such as children. But, it was added, such an experiment would be wholly unacceptable. Thus on the most generous of interpretations, Greenpeace stretched a passing reference to the point of extreme distortion, just for the sake of sensation or, more seriously, in order to mislead the Committee.

The report of the House of Commons Committee has been published,18 but this does not deter Robinson from gleefully citing the reported intention to feed plutonium to children as a prime example of “the moral aphasia” of British society, and from alleging Britain to be so contaminated with plutonium dust that many children will have eaten some already.

Is Britain really “befouled” by radioactivity? Table 1 shows that the exposure to radioactivity of the average American is half as much again as that of the average Briton, because in the US exposure is greater both to medical X-rays and to radon.

Table 1

Radon is a natural radioactive gas given off by certain rocks. Recent research has shown that it can accumulate dangerously in people’s houses. Radioactive effluents and fumes from nuclear installations account for only 0.02 percent of the exposure in either country. Radon in houses, not nuclear effluent, presents the greatest single radiation risk in both countries.19

Some of Robinson’s most venomous diatribes are directed at the British National Radiological Protection Board. She calls it “the incredibly feckless agency responsible for monitoring public exposure to radiation,” “small and besieged,” “struggling with a shrinking budget,”

a creature of the state, funded, shielded and patronized by the government and flourishing in the balmy atmosphere of Crown Immunity, where no acts of parliament apply, and under the protection of laws affecting national defense and commercial confidentiality as well as the Official Secrets Act.

In fact the board is an independent advisory body set up by an act of Parliament, and answerable for its own actions. It does not have crown immunity and is only partly funded by the government. The board regularly publishes detailed reports of radioactivity throughout the British Isles, and all its publications are freely available in the Cambridge University Library, where I went to study them.20 Robinson alleges that British doctors “are legally prohibited from giving out information that is not officially authorized.” In fact, British doctors’ contracts with the Department of Health contain no such restrictions; doctors do not have to sign the Official Secrets Act. Robinson’s book abounds with scientific errors and unfounded allegations, reminding me of the lines in Heinrich von Kleist’s play The Broken Jug, where the judge says to the defendant: “In your head science and error are kneaded together intimately as in a dough; with every slice you give me some of each.”

I wonder why Robinson turned a blind eye to American nuclear plants that have polluted the countryside with radiochemicals. I read that at Hanford an estimated 15 million gallons of high-level liquid waste containing plutonium have been pumped into water-saturated rocks beneath the Hanford reservation. Last year water from a local spring that flows into the Columbia River was found to contain 350 becquerels of plutonium per liter. Compare this with one tenth of a becquerel of caesium and one thousandth of a becquerel of plutonium per liter in the Irish Sea.21 I was alerted to the American discharges by a British newspaper report alleging the discharge of 4 million kilos of plutonium into the rocks below Hanford, which seemed absurd. When I remonstrated with the editor, he checked with his Washington correspondent, who told him that the true figure was four kilos. Nearly all of Robinson’s more than three hundred cited sources are newspaper reports, but she apparently never checked what she read there.
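Written out, the comparison between the two plutonium concentrations is:

$$
\frac{350\ \text{becquerels per liter (Hanford spring)}}{0.001\ \text{becquerel per liter (Irish Sea)}} = 350{,}000.
$$

On Robinson’s own logic, the water feeding the Columbia River was hundreds of thousands of times more contaminated with plutonium than the sea she declares befouled.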

There is much to be criticized about the operation of the plutonium factories at Sellafield and the misleading information they repeatedly issued about their radioactive discharges. Both the original piles and the reprocessing plant were built hurriedly and suffered from technical defects. The government’s original decision to pour low-level radioactive waste into the Irish Sea was taken in 1950, at a time when less was known than today about the harmful biological effects of radiation and the possible buildup of radionuclides in living creatures, but the discharge should not have been continued for over thirty years. It was inexcusable that the government concealed the escape of polonium during the Windscale fire from the Medical Research Council committee set up in 1957 to study the health effects of the accident and from the National Radiological Protection Board’s reassessment of its impact in 1982. That escape became known only after publication of the board’s report, and was the subject of an addendum to the report published in 1983, twenty-six years after the event.22

In 1983 Greenpeace found radioactive debris washed up on a beach near Sellafield. This turned out to have been caused by faulty separation of the effluent from the reprocessing plant, about which the firm had kept quiet. It was the National Radiological Protection Board—the very body Robinson so maligns—that told the government to warn the public about the contamination of the beaches and urged it to clean them up. These and other scandalous malpractices shook public confidence in the management of Sellafield, but none of them had the severe ecological consequences of global significance that Robinson attributes to them.

Robinson’s account of Sellafield forms the second part of her book. In the first part she presents a social history of England from the fourteenth century to the present day, intended as background for her discussion of contamination at Sellafield. She writes:

For decades the British Government has presided over the release of deadly toxins into its own environment…. Such behaviour…has…a history in which the inhibitions which expedite it and the relations it expresses evolve together…. The core of British culture is the Poor Law…. A very important article of faith was that wages of workers should not exceed subsistence.

This last claim is broadly true. In the seventeenth century Sir William Temple said that high wages would make the poor “loose, debauched, insolent, idle and licentious,” and in the next century the agricultural writer Arthur Young declared that “everyone but an idiot knows that the lower classes must be kept poor or they will never be industrious.” He might have added “ignorant,” for England did not introduce universal schooling until 1880, whereas Prussia had done so as early as 1763.

However, these snobbish attitudes were not the real cause of widespread poverty. In preindustrial times most people were poor everywhere because insufficient wealth was produced to keep everyone housed, fed, and clothed. The Englishman Gregory King showed this vividly in the “Scheme of the Income and Expense of the Several Families of England” that he published in 1688.23 At the top of the scale, he lists 186 families of spiritual and temporal Lords with annual incomes of $6,000 per head (calculated on the basis of one pound sterling in 1688 being equivalent to $75 at present prices). At the bottom there are 850,000 families of laborers, servants, cottagers, paupers, soldiers, and seamen with yearly incomes of between $150 and $500 per head, and finally 30,000 vagrants, gypsies, thieves, and beggars without any income. Only 2,700,000 people earned more than they needed for their bare subsistence, while 2,800,000 earned less and had to be helped by the others.

The redistribution of as much as four fifths of the income of the 60,000 wealthiest people would have raised the annual income of the nearly three million poorest by only $200 per head. This might have doubled the income of some of them, but it would still have left them desperately poor. In 1622 a preacher declared that laborers “are scarce able to put bread into their mouths at the week’s end, and clothes on their backs at the year’s end.” The Cambridge historian Peter Laslett writes:

It is probably safe to assume that at all times before the beginning of industrialization, a good half of all those living were judged by their contemporaries to be poor, and their standards must have been extremely harsh, even in comparison with those laid down by the Victorian poor law authorities.

This was true of much of the rest of Europe, where the annual GNP per person was about the same as in India today ($235). The Poor Law, contrary to Ms. Robinson’s view, was a safety net, designed to keep people within the village community from starvation; it was a Christian institution that compelled the more fortunate half of the population to help the other half to survive.24

Robinson alleges that high infant mortality rates and short life expectancy have been particularly characteristic of England, but we know them to have been universal before the present century. In 1693 the English astronomer Edmond Halley published a study of life expectancy in the German city of Breslau, where good records of births and deaths were kept. Of every hundred children born, only fifty-one were alive at the age of ten and only thirty-six survived to the age of forty.25 Most cities used to be death traps, where life expectancy was no more than twenty years, because people lived crowded together and perished from infections. In England, child mortality was lower and life expectancy longer than in Breslau: between 1550 and 1800 about three quarters of English children born survived to the age of ten, probably because well over four fifths of the population lived in villages, where people were less exposed to infections. Robinson’s scathing picture of nineteenth-century England would lead one to expect that Americans lived longer than Englishmen. To my surprise I found the contrary. In 1850 the average life expectancy in the state of Massachusetts was 38.3 years for males and 40.5 for females;26 in England it was 40 years for males and 42 for females.27

Robinson alleges that the present Welfare State defrauds the poor. According to her,

The British government turns a profit on the National Insurance System which goes into the treasury. So those who pay National Insurance [which includes all those employed] are taxed at a rate that subsidizes other activities of government.

The truth is that National Insurance contributions almost exactly balance the benefits they pay for, which include pensions, unemployment and sickness pay, and others. The National Health Service, from which everyone benefits, is financed almost entirely out of taxes, which the poorest do not pay.

Robinson’s social history lacks historical perspective, because she fails to compare social conditions and attitudes in England with those prevalent throughout Christian Europe at the time. Through much of her history, Robinson confuses literary impressions and distortions with historical fact and ignores modern research based on numerical analyses of historical records. Sneers about every aspect of English character and institutions fill page after dreary page, with tedium enough to turn an IRA man into a Loyalist. Her account would make one believe that social deprivation has never existed in the United States. Has she never read John Steinbeck’s The Grapes of Wrath?

Robinson’s account of Sellafield is based on press reports and antinuclear pamphlets. Knowing no science, she has spurned study of the abundant technical literature that would have saved her from her monstrous exaggerations of the dangers presented by Sellafield. In the mid-Seventies, when discharges from Sellafield were at their height, the collective annual dose from its atmospheric discharges amounted to six man-sieverts,28 compared to about 100,000 man-sieverts from the atmospheric atomic weapons tests carried out until 1963. We have seen that the discharges into the Irish Sea have raised its radioactivity by less than 1 percent. That can hardly be called “a nuclear attack” on our planet. Nor is Britain “the largest source, by far, of radioactive contamination of the World’s environment.” That source is natural radioactivity; next comes fallout from nuclear weapons tests and from Chernobyl. By contrast, radioactive contamination from nuclear plants accounts for less than one thousandth of the average Briton’s radioactive exposure, and away from the British Isles it is barely detectable. Ms. Robinson, as we might expect, fails to tell her readers that the discharges from Sellafield are now being reduced to near zero. She should have stuck to writing novels.

November 23, 1989