At the heart of modern theories of nuclear strategy lies the premise of accuracy. The great debates of recent years—over the SALT II treaty, deployment of the MX missile, and the limited nuclear war scenarios most recently considered in Presidential Directive 59—all take for granted the assumption that a missile fired 6,000 miles can land within 600 feet of a target no more than fifty yards in diameter.1

In the view of a former member of the Carter administration, the concession that such accuracy was possible was “the greatest single mistake of the Carter administration’s defense policy.”2 It led to the decision to build the MX; it severely weakened the case for SALT II; it added significantly to the peril of a nuclear conflict actually breaking out. For, once it was accepted that a US or a Russian missile could strike a missile silo in enemy territory, the specter of “vulnerability” began to dominate debate and propelled huge new defense appropriations through Congress.

The presidential campaign of 1980 has contained, in the debate between the two main contenders, no doves on the matter of strategic nuclear armament, no dissent on the issue of vulnerability.3 Nor were the premises of that debate ever challenged at their most sensitive point: accuracy. There are some obvious reasons for this silence. Data about the accuracy of US missiles, and hence many of the suppositions about the performance of their Soviet equivalents, are drawn from test results which—along with codes—are among the most highly classified secrets of the government. The propositions and conclusions that follow are based on extensive interviews with scientists and former government officials.4 They deal with a problem that has led a semi-secret existence for a decade. The shorthand phrase often used to describe this problem is “the bias factor.” Bias is a term used to describe the distance between the center of a “scatter” of missiles and the center of the intended target. This distance, the most important component in the computation of missile accuracy, accounts for the fact that the predictions of missile accuracy cited above are impossible to achieve with any certainty, hence that the premises behind “vulnerability,” the MX, and Presidential Directive 59 are expensively and dangerously misleading.

The strategists assume that only a missile fired from a precisely surveyed site on dry land has the accuracy to hit a missile silo in the enemy’s territory. Such land-based missiles are, in the jargon of the strategists, called “counter-force.” A missile fired from a submarine is not expected to hit anything smaller than a city, inhabited by people. It is therefore called a “counter-value” weapon.

Here, according to the Defense Department, is how a consequent scenario could unfold: a surprise Soviet missile attack wipes out US land-based (i.e. counter-force) missiles. The only response that the president can order is a counter-value strike by US submarine-launched missiles against Russian cities. But the president will also know that the Russians will have enough missiles left to destroy American cities. Therefore, in the view of the Defense Department, a US president contemplating the certain destruction of his cities might well prefer to surrender.5

To avoid such a contingency the Defense Department produced, and the president and Congress accepted, the plan to develop the mobile MX, which proposes to make US land-based missiles movable and hence immune to a surprise Soviet attack, or “first strike.” The assumptions of accuracy attributed to the Russians apply to US missiles as well. One of the missions of the land-based Minutemen is to destroy Soviet missile silos. The MX missiles, carrying up to ten warheads apiece, are planned to have even greater accuracy.

Fears of Soviet accuracy and confidence in our own were the burden of the secret Presidential Directive 59, whose outline was first leaked in late July of this year.6 Secretary of Defense Harold Brown, addressing the Naval War College on August 20, discussed the new strategy and said, “For planning purposes, therefore, we must assume that the ICBM leg of our triad [the other components of which are submarines and bombers] could be destroyed within a very short time as one result of a Soviet attack.” A week earlier President Carter had dispatched a letter to the delegates assembled in Madison Square Garden for the Democratic Convention, urging them to defeat a motion opposing the MX on the grounds that “it is crucial that our strategic nuclear forces not be vulnerable to a pre-emptive Soviet attack.”

In fact the Presidential Directive does not, on the basis of all that is known about it, promulgate a new strategy but merely ratifies an old one in force since the early 1950s. The US Air Force has always conceived of the Strategic Air Command’s mission as “counter force,” aimed at Soviet military targets. US targeting strategy is exactly the imputed Soviet strategy in reverse. That is, the US should be capable of picking off Soviet targets, whether missile silos, Brezhnev’s bunker, or the Kama River truck plant, at will.

The Carter administration’s surrender to the notion of the vulnerability of its land-based missiles has been a costly one. The bill for the MX alone is officially $34 billion, and unofficially reckoned to be at least twice that sum. The total bill for upgrading US strategic forces is authoritatively calculated at $100 billion over the next five years.7 The SALT II treaty was, even before President Carter shelved it in January, in peril of defeat in the Senate because of these growing fears of vulnerability.

The talk about accuracy and precision has had a sanitizing effect on the perception of what nuclear warfare would actually be like: not the holocaust of Hiroshima but something comfortably exact and controlled; not the incineration of New York or Moscow, but the elimination of a silo in North Dakota or central Siberia. Nuclear war becomes, in the minds of its theorists and programmers, a feasible extension of conventional war, with strikes and counter strikes, losers and winners; not the instantaneous Götterdämmerung embalmed in the concept of Mutual Assured Destruction.

Among the chief proponents of “limited” or “controlled” nuclear war are, and always have been, the US Air Force generals, flanked by their academic secretariat in the think tanks. Despite the negative evidence of World War II and Vietnam,8 the air force’s institutional confidence in bombing has been unimpaired, thanks to two sorties over Hiroshima and Nagasaki (although the Nagasaki bomb was well off target). In the wake of those two explosions the generally prevailing belief was that nuclear warfare—if the unthinkable came to pass—would be a relatively crude affair in which planes would ferry nuclear bombs over Russia and drop them on cities. The air force had different ideas. From the earliest days of the nuclear age it adopted the notion of “counter force”—long before it became a matter for public discussion in the early 1970s. Participants in air force planning scenarios in the 1950s recall that the targets were exclusively military and military-industrial, and numbered upward of 20,000.9

The first serious challenge to the air force’s dominance in maintaining the strategic arsenal came with the introduction of submarine-launched missiles, under the control of the US Navy. The submarine-launched missile constitutes the very antithesis of a controllable nuclear engagement of the kind espoused by the air force.

Two facts about submarine-launched missiles are of paramount importance. The submarine is invulnerable to the threat of a pre-emptive first strike by the enemy. This has encouraged the suggestion—keenly advanced by the navy as well as by many in the arms control community—that the nuclear deterrent could perfectly well be preserved by removing all missiles from the continental United States and putting them on submarines. Thus the Russians would have no incentive to attack US territory and its inhabitants.

The second fact about submarine-launched missiles is that they are acknowledged to be inaccurate, useful only for “counter-value” strikes against cities. This inaccuracy stems from navigational uncertainty about the precise location, within the necessary margin of error, of the submarine at the moment the missile is fired.10 To ward off the threat posed to its strategic dominance by the navy, the air force promulgated with even greater vigor its policy of counter-force targeting, with accompanying assertions of accuracy, essential if that policy was to survive challenge.

The air force has accordingly devoted strenuous efforts in the last twenty years to improving the accuracy of its missiles. There are two ways in which “silo-killing accuracy”—as the professionals term it—is described. The first is “CEP” or “Circular Error Probable.” This is defined as the radius of a circle, centered on the intended end point, in which one half of all re-entry vehicles are expected to land. The second way is to describe the percentage probability of “killing” a silo with one or more shots by calculating the CEP together with the yield of the warhead and “hardness” of the target. Thus the Titan II intercontinental ballistic missile, first introduced in 1962 and carrying a nine megaton warhead, is now officially deemed to have a CEP of 1,400 yards, or an 11 percent probability of “killing” the silo with one shot, and a 21 percent probability with two.

The Titan II was followed by the Minuteman series, the latest of which—Minuteman III—carries three much smaller warheads of 335 kilotons each, and is officially estimated to have a CEP of 200 yards and a 91 percent chance of killing the silo with two shots. The MX, if armed with the Advanced Ballistic Re-entry Vehicle, will—it can be deduced from Defense Department figures—purportedly have a single shot kill probability of 92 percent.11 Such statistics have an awesome air of certainty and are accepted by both the propagandists for and opponents of counter-force theology. By friend and foe alike they are exempted from challenge.
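The relation between such figures can be illustrated with a short calculation. The sketch below rests on the textbook idealization—an isotropic, bias-free circular scatter and statistically independent shots, precisely the assumptions questioned in what follows; the 11 percent single-shot figure is the official Titan II estimate cited above.

```python
import math
import random

# (a) CEP as the median miss distance: for shots scattered as an
# isotropic 2D Gaussian with standard deviation sigma per axis,
# half the impacts land within CEP = sigma * sqrt(2 * ln 2).
random.seed(0)
sigma = 1.0
misses = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
          for _ in range(100_000)]
misses.sort()
empirical_cep = misses[len(misses) // 2]       # median miss distance
theoretical_cep = sigma * math.sqrt(2 * math.log(2))
print(round(empirical_cep, 2), round(theoretical_cep, 2))

# (b) Two independent shots: if one shot kills with probability p,
# two shots kill with probability 1 - (1 - p)^2.
p_single = 0.11                    # official Titan II single-shot figure
p_double = 1 - (1 - p_single) ** 2
print(round(p_double, 2))          # matches the stated 21 percent
```

The two-shot arithmetic reproduces the official 21 percent exactly—which shows only that the published probabilities are internally consistent, not that the underlying scatter behaves as the model assumes.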

A sense of proportion should be maintained all the same. In 1966 the air force introduced the Minuteman II. This had an official CEP rating of 600 yards. In January, 1967, Ernest Fitzgerald was the deputy for management systems in the US Air Force. In his memoir, The High Priests of Waste,12 Fitzgerald recalls a high-level meeting at the Pentagon to discuss the status of the Minuteman II program:

The program director, Brigadier General Arthur Cruikshank, came to the meeting in the Secretary of the Air Force’s large and impressive conference room armed with the usual collection of Vu-Graphs and charts…. After a very half-hearted attempt to follow the standard evasion procedures Art Cruikshank broke down and told all. “Look,” he said, “I’ve only been in charge of this program since last summer. When I came aboard I was told that the Minuteman was the best program in the Air Force. There are no problems, it was in great shape, I was told. The briefings I got sounded too good to be true. And guess what? They weren’t true.”…I then asked, “But what are you going to do? Isn’t a big portion of the Minuteman fleet out of commission?”

“Yes, about 40 percent of the new missiles are down [i.e. out of action],” he answered.

Under further interrogation by Fitzgerald and others it emerged that 40 percent of the missiles were down because of failures in their guidance system.

Missiles, as Fitzgerald’s account suggests, do not exist in the orderly universe of the strategic theologians but in the actual world of contract mismanagement, faulty parts, slipshod maintenance, bureaucratic cover-up, and the accidents that have afflicted military equipment since the world’s first bow string got wet in the rain. The nuclear scenarists are impatient at citation of such quotidian mishaps which, to be sure, do not refute but merely impair the claim that a missile will successfully strike its objective 6,000 miles away.

As prologue to consideration of just this claim, it is useful to bear in mind the accuracy being envisaged. By the standards demanded of an ICBM fired from the United States to the Soviet Union (or vice versa), a shell from an artillery piece—fired with no preliminary spotting rounds—would fall no more than one yard from a target thirty miles away.
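The arithmetic behind that comparison can be checked directly; the miss distance scales linearly with range, since the angular error is the same:

```python
# Check the artillery analogy: a 600-foot miss over 6,000 miles
# corresponds to what miss distance over 30 miles?
FEET_PER_YARD = 3

miss_feet = 600          # the claimed ICBM accuracy, in feet
icbm_range_miles = 6000
shell_range_miles = 30

# Angular error is constant, so the miss scales with range.
equivalent_miss_feet = miss_feet * shell_range_miles / icbm_range_miles
print(equivalent_miss_feet / FEET_PER_YARD)   # miss in yards: 1.0
```

Six hundred feet over 6,000 miles is three feet—one yard—over thirty miles, as stated.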

An intercontinental ballistic missile is far more like an artillery shell than might be supposed. What happens when a missile is fired? It is boosted into space by three or four rocket stages which fall away as they burn out. After the last stage burns out, or is turned off, the warhead (or “re-entry vehicle”) continues in free flight through space until it re-enters the atmosphere and descends toward the target. With the moment of final burn-out, or termination of the rocket, vanishes the last chance for the warhead to change direction. It is on its own, as is a shell leaving the muzzle of an artillery piece, or a stone leaving a person’s hand.

From the moment the missile leaves its silo, no exterior system is guiding its course. No human hand, back on the ground, can interfere or correct its flight. Nor is any piece of equipment inside the missile taking bearings from some external point of reference. The missile depends for its guidance on inertial sensing. The simplest way of understanding this is to think of yourself sitting in an airplane looking at a glass of water. The movement of the water will reflect the angular movements of the plane. All that the inertial system in a missile essentially does is to compare the movements of a more sophisticated equivalent of the water in the glass with its programmed version of what those movements should be, if the rocket is going in the right direction. Undesirable variations are corrected accordingly, up until the moment the rocket motor burns out.

Once that happens, there is nothing that can be done. The warhead can receive no signal, and contains no targeting mechanism of its own. As can be understood, everything depends on the accuracy of the programmed data in the missile’s computer. If the missile’s flight were to take place in entirely predictable conditions, the inertial system would be perfectly satisfactory and no silo would be safe. But in reality the missile’s journey takes it through forces that either cannot be compensated for, or are entirely unpredictable, or are not understood.
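A toy calculation illustrates how unforgiving this arrangement is. Any unmodeled acceleration—an instrument error, or an unprogrammed gravity anomaly, which the system cannot tell apart—is integrated twice into position, so the error grows with the square of time. The instrument figures below are purely hypothetical, chosen for illustration:

```python
# Position error from a constant unmodeled acceleration acting over
# the powered phase of flight: integrated twice, it grows as t^2.
def position_error(accel_error_g, seconds):
    """Miss distance in feet from a constant unmodeled acceleration,
    given in multiples of g, acting for the stated time."""
    G_FEET_PER_S2 = 32.174
    a = accel_error_g * G_FEET_PER_S2
    return 0.5 * a * seconds ** 2

# Hypothetical numbers: a ten-millionths-of-a-g error over a
# 300-second powered flight.
err = position_error(10e-6, 300)
print(round(err, 1))   # feet of error at motor cutoff
```

Even these tiny illustrative numbers yield an error of roughly fourteen feet at cutoff—and a cutoff error in velocity, not treated here, is then multiplied over the half hour of unguided ballistic flight that follows.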

During the flight of the missile, from launch to target, it is under the influence of two principal external factors: the pull of gravity and the drag of the atmosphere. Since the earth is not a perfect sphere and varies in density, its gravitational field is not constant. The inertial guidance system cannot tell the difference between the effect of its own motion and the effect of gravity.

If an unprogrammed variation in the earth’s gravitational force pulls the missile fractionally down, the guidance system has no way of distinguishing that movement from an equivalent force produced by an upward motion of the missile. Detecting what it records as an unprogrammed upward motion, the system adjusts the missile’s trajectory accordingly—off course.

Extensive satellite observations enable the missile’s programmers to supply its computer with reasonably precise information about the gravitational forces it will meet while traveling over the North Pole toward the Soviet Union (or vice versa). Even so, scientists operating in this field concede that enough anomalies in the earth’s gravitational field occur for precise prediction to be impossible, as demonstrated by uncertainties about the position of satellites now aloft.

A solution being considered is for the missile to carry a device known as a gravity gradiometer, which would measure such anomalies during the flight, and pass the information to the guidance system. Although such an instrument is theoretically possible, and indeed a large-scale working model has been demonstrated in the benign conditions of the Draper Laboratory at MIT, even scientists intimately involved with the program doubt whether a practical model of the size and sturdiness necessary for operational use can ever be developed.

The atmospheric forces affecting the missile present even greater problems, which appear insuperable even on a theoretical level. Detached from its rocket, the warhead hurtles toward its target at an initial speed of some 12,000 miles per hour, descending into the atmosphere at a relatively shallow angle. The atmosphere extends upward in irregular contours to anywhere from fifty to one hundred miles above the earth’s surface. Given its re-entry angle of about 25° the warhead has to penetrate the atmosphere for a distance ranging between 120 and 240 miles. This atmosphere is far from placid, and is largely unpredictable.
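The slant-path figures follow from simple trigonometry on the numbers just given:

```python
import math

# Slant path through the atmosphere for a re-entry angle of about
# 25 degrees, with the atmosphere reckoned at 50 to 100 miles deep.
REENTRY_ANGLE_DEG = 25
paths = {depth: depth / math.sin(math.radians(REENTRY_ANGLE_DEG))
         for depth in (50, 100)}
for depth, slant in paths.items():
    print(depth, round(slant))   # roughly 120 and 240 miles of atmosphere
```

Every mile of that slant path is flown through the shifting, unpredictable medium described below.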

To take one example, which produces some twenty possible variables: a solar flare will drastically affect the density of the atmosphere. But the effect will not be constant; it will be determined by the season of the year, whether it is day or night, whether the patch of atmosphere in question is over snow or ice, and by meteorological conditions generally. Since it is impossible to isolate the effect of one variable—the interaction, let us say, between the solar flare and a positively charged cloud—from all the other variables (day/night, etc.) which condition the effect of the flare on the atmosphere, it is impossible even in theory to construct an accurate profile of the atmosphere.

Yet such a profile is essential if the guidance system is to release the warhead at the correct instant and angle to secure the desired results. What is true of solar flares is true of innumerable other, constantly changing meteorological events: the jet streams above 30,000 feet, barometric pressure, a thunderstorm anywhere along the re-entry trajectory, and so forth.

A layman might regard the variations in wind and atmospheric density as immaterial to the direction of a missile decelerating from 12,000 miles per hour toward its target. This view is not shared by guidance specialists,13 who have long acknowledged that one of the main problems in designing a re-entry vehicle is to strike a balance between streamlined high-speed shapes (which cause fatal overheating) and blunt, low-speed shapes, which can be blown disastrously off course by the wind.14

Such are the conditions and uncertainties which determine the flight and affect the accuracy of a missile. The testing of these missiles, a process which produces the apparently confident CEPs and kill probabilities, has not alleviated the problem.

US strategic land-based missiles are test-fired from Vandenberg Air Force Base in southern California to Kwajalein lagoon in the Marshall Islands. Soviet missiles are test-fired from northern European Russia to the Kamchatka peninsula, at the eastern end of Siberia, or beyond this point into the northern Pacific.

Testing conditions are very different from the realities of a nuclear exchange, and the problems have been pithily summed up for us by Dr. Richard Garwin, of the IBM Research Center, former presidential scientific adviser, currently a member of the Defense Science Board, and professor of public policy at Harvard University:

In every ICBM you have an inertial package. Accelerometers and gyros and things like that are mounted in your missile. You’ve got to fire your missiles from operational silos to points in your enemy’s country. Now, obviously you’ve never done this before and so you have to base your calculation on test shots—in our case from Vandenberg to Kwajalein lagoon, that is, east to west; and in the Russians’ case from northern European Russia to Kamchatka in the northern Pacific, west to east. Judging from how far each test shot falls from the target, you adjust your accelerometer or your gyro, to compensate for the inaccuracy, until in the end your test shots are landing within the prescribed area. But every time you fire a new-model missile over the same range or the same missile over a slightly different range, the bias changes. Sometimes it is greater, sometimes it is smaller, but it never has been calculated beforehand.

So you have to go back to readjusting the gyros and so on, to try and eliminate the novel bias. But if we were firing operationally, both we and the Russians would be firing over a new range in an untried direction—north. And a whole new set of random factors would come into play—anomalies in the earth’s gravitational field, varying densities of the upper atmosphere or unknown wind velocities. They may adjust and readjust in testing and eventually they might feel sure that they have eliminated the bias. But they can never be absolutely certain. We certainly cannot be: and although we are less well informed about the Russian ICBM test program than our own, there is no reason to suspect that they are any more successful than we are at dealing with the problem. If you cannot be sure that you would be able to hit the enemy’s silos, then there is no point in even trying—because the idea is that one side could wipe out the other’s missiles before they are launched in a first strike.15

It goes without saying that there is one further difficulty. The laborious adjustments necessary to reduce bias described by Dr. Garwin will be difficult to achieve over the novel northern trajectory, since no previous test data are available. The first shot is presumptively the only shot.
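The practical meaning of bias can be shown with a small Monte Carlo sketch (the units and figures are arbitrary, chosen only for illustration): a scatter may be tight—a small CEP—yet centered well away from the aim point, in which case the kill probability collapses no matter how “accurate” the missile is rated.

```python
import math
import random

# Bias versus CEP: a tight scatter (small CEP) centered off the
# target is worth far less than the CEP alone suggests.
random.seed(1)

def kill_fraction(bias, sigma, lethal_radius, shots=100_000):
    """Fraction of shots landing within lethal_radius of the target
    when the whole scatter is centered `bias` away from the aim point."""
    hits = 0
    for _ in range(shots):
        x = bias + random.gauss(0, sigma)
        y = random.gauss(0, sigma)
        if math.hypot(x, y) <= lethal_radius:
            hits += 1
    return hits / shots

sigma = 1.0      # scatter per axis (arbitrary units)
lethal = 1.5     # "lethal radius" around the silo (same units)
p_zero_bias = kill_fraction(0.0, sigma, lethal)   # bias eliminated in testing
p_biased = kill_fraction(3.0, sigma, lethal)      # bias twice the lethal radius
print(round(p_zero_bias, 2), round(p_biased, 2))
```

With the bias eliminated, most shots kill; with a bias of twice the lethal radius—and an untested northern trajectory offers no way to know the bias in advance—only a small fraction do, though the scatter, and hence the advertised CEP, is unchanged.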

Such plain speech as that of Dr. Garwin is rare. Imputations of inaccuracy and discussion of the bias factor do not regularly take their place among the public pronouncements of the Defense Department. To a recent inquiry Colonel Alan MacLaren (USAF), in the Office of Dr. Seymour Zeiberg, deputy under-secretary for Strategic and Space Systems, replied, “I do not know what prompts Dr. Garwin to make these statements. You cannot say that these factors [gravity, atmosphere] can be neglected, but they are unimportant if we have done our homework right…. We understand these factors well enough so that they are not so serious that we do not have [silo-busting] accuracy.” But it appears unlikely that Dr. Garwin’s views will be lightly dismissed. Dr. Albert C. Vosburgh, former deputy for Strategic and Space Systems, Office of the Assistant Secretary of the Air Force for Research, Development, and Logistics, said recently, “Dr. Garwin is a brilliant scientist, whose arguments on any scientific or technical subject should be taken seriously.”

The problems have been forcefully acknowledged by one senior Defense Department official in a position to know. On March 4, 1974, in secret testimony before the Arms Control Subcommittee of the Senate Foreign Relations Committee, subsequently declassified, then Secretary of Defense James Schlesinger said:

I believe that there is some misunderstanding about the degree of reliability and accuracy of missiles…. It is impossible for either side to acquire the degree of accuracy that would give them a high confidence first strike, because we will not know what the actual accuracy would be like in a real world context. As you know, we have acquired from the western test range a fairly precise accuracy, but in the real world we would have to fly from operational bases to targets in the Soviet Union. The parameters of the flight from the western test range are not really very helpful in determining those accuracies to the Soviet Union. We can never know what degrees of accuracy would be achieved in the real world….

The point I would like to make is that if you have any degradation in operational accuracy, American counter-force capability goes to the dogs very quickly. We know that, and the Soviets should know it, and that is one of the reasons that I can publicly state that neither side can acquire a high confidence first strike capability. I want the President of the United States to know that for all the future years, and I want the Soviet leadership to know that for all the future years.

The situation, as Dr. Garwin acknowledges, has not changed significantly since Schlesinger made these amazingly forthright observations.

The only way out of the impasse currently being considered is to contrive a method of guiding the warhead through the terminal stages of its descent. One method is to employ a navigational system (known as NAVSTAR, or Global Positioning System) now under development for nonstrategic military and civilian purposes. The system is to employ a series of satellites, transmitting radio signals which will allow the receiver to determine its exact position. A maneuverable warhead, equipped with such a receiver, could correct its course as it descended. In fact two missile-borne receiver sets have reportedly flown on Minuteman test missiles.16

Alluring as such a solution may seem, it runs into the objections which forced the reliance on an inertial guidance system for ICBMs, rather than on exterior radio communication of the sort used to control the initial ascent of the German V2 during the Second World War. Such exterior communications are inevitably vulnerable to interference by the enemy, in the form of jamming. Since it is unlikely that the satellite could generate power greater than 200 watts for transmission of the signals, a series of ground-based jammers, of one kilowatt each and a hundred miles apart, could effectively neutralize the system for guiding re-entry vehicles right to their targets. Dr. Garwin believes that before the re-entry vehicles are released from the missile the NAVSTAR could be used to give submarine and other mobile missiles accuracies at least as great as present land-based missiles. However, this could make the missile crucially dependent on a satellite system itself highly vulnerable to enemy action.

Other schemes have been proposed to overcome accuracy deficiencies of the self-contained inertial navigation approach. Most of these rely on having the warhead of the missile home in on some “signature” of the target—for example, radar, infrared, or optical images of the area to be hit. None of these schemes has demonstrated that it could cope with simple enemy “spoofing” or jamming of the target signature—countermeasures that are vastly less expensive than the guidance system being proposed. Nor is the solution to be found in a bigger warhead to compensate for inaccuracy. An unverifiable amount of bias uncertainty remains, with the enormous added expense of building one launcher for each warhead—in the case of the MX, two thousand instead of two hundred.

We have concentrated on accuracy in guidance, for this is essential to all conceptions of nuclear war which go beyond the simple objective of leveling a city. Such a focus should not allow many other “data free” premises of the strategic thinkers to go unchallenged.

For example, the nuclear explosive deemed necessary to destroy a silo is a purely theoretical calculation. No practical tests have ever been carried out. The precise power of Soviet warheads, which inspire estimates of US silo vulnerability, is unknown. It is hard to find a convincing refutation of “fratricide”—the effect on an incoming warhead of the thermonuclear explosion caused by one that has already landed.17

So long as US and Soviet leaders pay heed to Schlesinger’s warning of 1974 there remains some hope that they will appreciate that there are no certainties in strategic nuclear warfare. Yet amid the mounting alarmism which has characterized the defense debate in the years since Schlesinger testified, and which has reached a peak of frenzy in this election year, it is hard to retain even that minimal confidence.

This Issue

November 20, 1980