
Doing Without Nuclear Power

The case for nuclear power has always rested on two claims: that reactors were reasonably safe and that they were indispensable as a source of energy. Now the accident at Three Mile Island has shaken the first claim, and we will soon have to face the flaws in the second. The result should be the abandonment of nuclear power and the emergence of a more rational energy policy, based on measures to improve the efficiency with which energy from fossil fuels is used.

The dangers of nuclear power have been greatly underestimated, while its potential to replace oil as the world’s primary energy source has been vastly exaggerated. Rather than being indispensable, nuclear power can make, at best, only a modest and easily replaceable contribution to future energy requirements.

Nuclear power plants currently generate 13 percent of the electricity produced in the United States, and slightly smaller percentages in Western Europe and Japan. (Countries such as Sweden and Switzerland, which depend on nuclear reactors for 20 percent of their electricity, are exceptions.) Since electricity accounts for only 30 percent of the total energy supply, nuclear power provides less than 4 percent of the overall energy of the industrial countries. More surprisingly, as the economist Vince Taylor has shown in a report to the US Arms Control and Disarmament Agency, nuclear power will provide, at most, only a 10 to 15 percent share of the energy supply of the advanced countries by the year 2000.[1]
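As a rough check of the “less than 4 percent” figure, multiplying the two shares cited above gives

$$
0.13 \times 0.30 \approx 0.039,
$$

or about 3.9 percent of total energy supply.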

Taylor’s argument begins with the fact that nuclear power provides only electricity—an expensive form of energy whose generation absorbs only 10 percent of the oil used in the US and other advanced countries. The great hopes for nuclear energy were based on the possibility that electricity could be substituted for oil in processes where electric power had not been heavily used. In fact, little such substitution has occurred. The petrochemical and transport industries, including automobiles, now use 50 percent of available oil. There is no foreseeable technical possibility of electrifying large proportions of these industries. The remaining 40 percent of oil is used for heating and for industrial energy. In both cases, major electrification is ruled out by nuclear power’s high cost.

What are the relative costs of energy from nuclear power? They are rarely compared to the costs of burning oil in engines or furnaces, yet such a comparison is central if we are to weigh the prospects of substituting nuclear power for oil. If we carefully estimate the different kinds of costs involved in producing kilowatts at nuclear plants and in obtaining barrels of oil, we find that the cost of energy from a nuclear plant built today can be calculated at $100 for the “heat equivalent of a barrel of oil.” This figure reflects fuel, maintenance, and—most important—the cost of constructing the nuclear plant itself. It is over four times the cost of heat from oil at OPEC prices.

The heat in electricity, of course, can be used more efficiently than heat from oil and gas—as much as twice as efficiently. For example, electric furnaces produce about 50 percent more glass, per unit of applied heat, than gas-burning furnaces. When we allow for this, nuclear electricity still remains two to three times as expensive as oil. As a result, the market for nuclear electricity has slowly but inexorably been drying up. Notwithstanding plans to build hundreds of reactors, the projected demand for nuclear power—and with it the financial backing for its expansion—has not materialized.
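To make the premium explicit: the OPEC heat price implied by the “over four times” comparison above is on the order of $17 to $25 per heat-equivalent barrel (an inference from that comparison, not a quoted figure). Crediting nuclear electricity with up to twice the end-use efficiency gives

$$
\frac{\$100 / 2}{\$17 \text{ to } \$25} \approx 2 \text{ to } 3,
$$

which is the two-to-three-fold premium cited.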

It wasn’t always this way. At the start of 1972, the first plants purchased by the electric utilities on a commercial basis were entering service at a cost of $225 million for a standard thousand-megawatt generator. This cost converts to the equivalent of about $25 per barrel of oil. Costs were expected to fall as more plants were built. The tripling of prices for crude oil and coal in the wake of the 1973 Arab oil embargo was expected to put nuclear power in a commanding market position. Not only was it apparently cheaper than coal as a source of electricity, but nuclear electricity seemed competitive with oil both for heating and for industrial purposes. Nuclear electricity cost only twice as much per British thermal unit—i.e., per unit of delivered heat—and since this heat could be used more efficiently, it was close to oil in effective energy value. Moreover, nuclear electricity was expected to decline in cost while oil rose. Few doubted the Atomic Energy Commission’s projection in 1974 that 1,000 thousand-megawatt reactors would be built by the end of the century—capable of producing roughly the equivalent of the entire US energy supply in 1972.
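The conversion from plant costs to an oil-equivalent price can be sketched as follows, assuming the standard heat content of about 5.8 million Btu per barrel of oil and 3,412 Btu per kilowatt-hour, and taking the roughly 1.5 cents per kilowatt-hour attributed to these early plants (see footnote 2):

$$
\frac{5.8 \times 10^{6} \text{ Btu}}{3{,}412 \text{ Btu/kWh}} \approx 1{,}700 \text{ kWh}, \qquad 1{,}700 \text{ kWh} \times \$0.015/\text{kWh} \approx \$25,
$$

in line with the $25-per-barrel equivalent given above.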

Instead, nuclear costs have soared as the industry expanded, and primarily for two reasons: safety systems had to be added to meet rising public concerns; and improvements in design were needed to correct defects that emerged as operating experience increased. By the end of 1977, a thousand-megawatt reactor cost $900 million, four times the cost of a 1972 plant. The rate of increase was an astounding 26 percent per year. This was triple the rate of general US inflation and half again as great as the increase in construction costs of power generators using coal, notwithstanding new emission controls. It was about the same rate of increase as that for the price of oil, despite the actions of the OPEC cartel. Nuclear power had thus lost its edge over coal in the electric power market, and had lost the opportunity to make inroads on the larger market occupied by oil and gas.
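The 26 percent annual rate follows directly from the quadrupling of costs over the roughly six years separating the 1972 plants from those of late 1977:

$$
\left( \frac{\$900 \text{ million}}{\$225 \text{ million}} \right)^{1/6} = 4^{1/6} \approx 1.26,
$$

that is, about 26 percent per year.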

To be sure, these figures represent the findings of the author’s own cost research. They have not as yet been digested by the power industry, which still claims nuclear reactors to be the least costly source of electricity.[2] But the impact of steeply rising costs and concomitant delays in licensing and constructing reactors has not been lost on nuclear advocates. Before the breakdown at Harrisburg, they were stridently demanding that Congress and the regulatory authorities “stabilize” reactor design standards as a solution to rising costs. Now the Harrisburg accident has raised the prospect of a reactor disaster from an infinitesimally remote possibility to a reality. This guarantees that safety requirements will be stiffened so that costs will continue to rise sharply, offsetting probable increases in the cost of oil. The events at Harrisburg have pushed nuclear power beyond the brink of economic acceptability.

Moreover, the accident has undermined confidence in the nuclear industry. As the laxity of safety regulations at Three Mile Island becomes widely known, it may undermine confidence in the Nuclear Regulatory Commission as well. The Commission had composed a supposedly exhaustive list of possible “initiating chains” for nuclear accidents and concluded that, in view of its precautions, US reactors were statistically safe. The sequence of events which caused the Harrisburg accident was not among these “chains.” Henceforth every major decision about nuclear power, especially those concerned with disposal of radioactive wastes, will require so much public scrutiny that delays and costs will become intolerable. Well before the Harrisburg events, seven state legislatures had already passed laws to stop, or control, the storage and transport of nuclear wastes. We can now expect that pressure for more stringent legislation will mount.

Of course, for many people the prospect of permanent oil shortages causing economic stagnation is worse than unsafe and expensive nuclear power. And, in fact, nuclear power will be justified in the months ahead by references to the world’s limited store of easily extractable oil. The median estimate of the total quantity of world oil that remains to be exploited is one-and-a-half to two trillion barrels. This would be enough for a one-hundred-year supply at the current rate of consumption. It will not suffice if world needs grow at several percent per year—the rate of energy growth generally considered necessary to support healthy economic growth. At 3 percent annual energy growth, for example, this supply would be exhausted in fifty years. And, well before then, physical limits on the potential rate of discovery and extraction would force the level of oil output below demand.
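The fifty-year figure can be verified directly. If consumption begins at the current annual rate C and grows 3 percent per year, cumulative use exhausts the hundred-year reserve of 100C when

$$
C \cdot \frac{1.03^{T} - 1}{0.03} = 100C \quad\Longrightarrow\quad 1.03^{T} = 4 \quad\Longrightarrow\quad T = \frac{\ln 4}{\ln 1.03} \approx 47 \text{ years}.
$$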

The possibility that potential oil resources may be greater than is now anticipated does not significantly alter this pessimistic picture. Aside from the doubtful wisdom of tying the world’s economic prospects to unexpected discoveries, even a doubling of total oil resources would add only another twenty years of supply, assuming 3 percent annual growth in consumption. Oil is simply incapable of sustaining world energy growth for more than a few decades.
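By the same arithmetic, doubling the reserve to 200C merely moves the exhaustion point to

$$
1.03^{T} = 1 + 0.03 \times 200 = 7 \quad\Longrightarrow\quad T = \frac{\ln 7}{\ln 1.03} \approx 66 \text{ years},
$$

about twenty years beyond the forty-seven computed above.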

Nor are coal or natural gas, the other fossil fuels, likely to take up much of the slack. Natural gas is less abundant than oil. Moreover, continued growth of gas consumption would require that it be transported in liquefied form—a matter of growing controversy—and would maintain the present, undesirable level of dependence on imports from the Middle East.

The world’s recoverable coal reserves are several times those of oil and will now be increasingly exploited in the US. But environmental and cost constraints analogous to those impeding nuclear power will probably prohibit developing coal on the scale necessary to increase energy growth significantly. During the past decade, efforts to reduce injuries and deaths in US mines, and to contain pollution from coal burning, have caused the cost of coal-fired electricity to increase at twice the general inflation rate. Any increase in the rate of expansion of the coal industry would probably accelerate this trend.

Moreover, hopes of manufacturing large quantities of synthetic oil and gas from coal or shale appear doomed by both inherent inefficiencies and environmental effects; the costs of dealing with them would be staggering. According to a study by energy specialists at the Massachusetts Institute of Technology,[3] the production of synthetic fuels equivalent to only one-tenth of the current US energy supply would require mining the equivalent of the entire US coal output in 1975. To process this coal would require forty-two gasification facilities and six oil shale plants, costing over $1 billion each, as well as forty-seven liquefaction facilities, each costing one-half billion dollars. Oil and gas from such facilities would be several times as costly as OPEC oil and gas.
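Taking the study’s plant counts and unit costs at face value, the capital bill alone for that one-tenth share comes to at least

$$
(42 + 6) \times \$1 \text{ billion} + 47 \times \$0.5 \text{ billion} \approx \$71.5 \text{ billion}
$$

in late-1970s dollars, before any allowance for environmental costs.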

Furthermore, since no commercial-size synthetic fuel plants are operating as yet, little attention has been devoted to the environmental effects of these technologies. If we follow the known pattern for energy projects with potentially large effects on the environment, we can estimate that the construction costs currently cited for commercial synthetic fuel plants could easily double or triple, ensuring that such fuels would remain uneconomical.

Similar considerations apply to nuclear fusion and solar power satellites. Even if the considerable scientific and engineering problems can be solved, high costs will consign these “exotic” high-technology schemes to a minuscule share of energy supply.

Notes

1. Vince Taylor, Energy: The Easy Path (Pan Heuristics, Suite 1221, 1801 Avenue of the Stars, Los Angeles, CA 90067, January 1979).

2. The Edison Electric Institute claims that power from nuclear reactors costs 1.5 cents per kilowatt-hour, as compared to 2 cents for power from coal and 3.9 cents for power from oil. But this claim is based on a sample dominated by nuclear plants completed before expensive safety standards were imposed, and it includes coal and oil plants operating at low efficiencies because of the surplus of generating capacity. More important, the comparison pertains only to the 10 percent of oil which is used to generate electricity.

3. Carroll L. Wilson, ed., Energy: Global Prospects 1985–2000, Report of the Workshop on Alternative Energy Strategies (McGraw-Hill, 1977).
