The Myth of Missile Accuracy

At the heart of modern theories of nuclear strategy lies the premise of accuracy. The great debates of recent years—over the SALT II treaty, deployment of the MX missile, and the limited nuclear war scenarios most recently considered in Presidential Directive 59—all take for granted the assumption that a missile fired 6,000 miles can land within 600 feet of a target no more than fifty yards in diameter.1

In the view of a former member of the Carter administration, the concession that such accuracy was possible was “the greatest single mistake of the Carter administration’s defense policy.”2 It led to the decision to build the MX; it severely weakened the case for SALT II; it added significantly to the peril of a nuclear conflict actually breaking out. For once it was accepted that a US or a Russian missile could strike a missile silo in enemy territory, the specter of “vulnerability” began to dominate debate and propelled huge new defense appropriations through Congress.

The presidential campaign of 1980 has contained, in the debate between the two main contenders, no doves on the matter of strategic nuclear armament, no dissent on the issue of vulnerability.3 Nor were the premises of that debate ever challenged at their most sensitive point: accuracy. There are some obvious reasons for this silence. Data about the accuracy of US missiles, and hence many of the suppositions about the performance of their Soviet equivalents, are drawn from test results which—along with codes—are among the most highly classified secrets of the government. The propositions and conclusions that follow are based on extensive interviews with scientists and former government officials.4 They deal with a problem that has led a semi-secret existence for a decade. The shorthand phrase often used to describe this problem is “the bias factor.” Bias is a term used to describe the distance between the center of a “scatter” of missiles and the center of the intended target. This distance, the most important component in the computation of missile accuracy, accounts for the fact that the predictions of missile accuracy cited above are impossible to achieve with any certainty, hence that the premises behind “vulnerability,” the MX, and Presidential Directive 59 are expensively and dangerously misleading.
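
To make the distinction concrete, here is a minimal numerical sketch, using invented impact coordinates rather than any test data: a salvo can be tightly grouped, with a small scatter and hence a small CEP, and still be systematically off target, which is what a large bias means.

```python
# A minimal sketch, with made-up impact points, of bias versus random scatter.
# Bias is the offset of the center of the scatter from the aim point;
# the CEP-style figure measures the spread of the shots around their own center.
import math

aim_point = (0.0, 0.0)                      # intended target (hypothetical coordinates, feet)
impacts = [(640.0, 210.0), (580.0, 330.0),  # hypothetical impact points for four shots
           (720.0, 150.0), (610.0, 270.0)]

# Center of the scatter: the mean impact point.
cx = sum(x for x, _ in impacts) / len(impacts)
cy = sum(y for _, y in impacts) / len(impacts)

# Bias: distance from the aim point to the center of the scatter.
bias = math.hypot(cx - aim_point[0], cy - aim_point[1])

# Scatter: the median distance of the shots from their own center,
# i.e. the radius within which half of them fall (the idea behind CEP).
radii = sorted(math.hypot(x - cx, y - cy) for x, y in impacts)
scatter = radii[len(radii) // 2]

print(f"bias = {bias:.0f} feet, scatter radius = {scatter:.0f} feet")
```

In this toy example the shots cluster tightly around their own center while that center sits hundreds of feet from the aim point: the scatter looks reassuringly small, yet every shot misses the target by roughly the bias.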

The strategists assume that only a missile fired from a precisely surveyed site on dry land has the accuracy to hit a missile silo in the enemy’s territory. Such land-based missiles are, in the jargon of the strategists, called “counter-force.” A missile fired from a submarine is not expected to hit anything smaller than a city, inhabited by people. It is therefore called a “counter-value” weapon.

Here, according to the Defense Department, is how a consequent scenario could unfold: a surprise Soviet missile attack wipes out US land-based (i.e. counter-force) missiles. The only response that the president can order is a counter-value strike by US submarine-launched missiles against Russian cities. But the president will also know that the Russians will have enough missiles left to destroy American cities. Therefore, in the view of the Defense Department, a US president contemplating the certain destruction of his cities might well prefer to surrender.5

To avoid such a contingency the Defense Department produced, and the president and Congress accepted, the plan to develop the mobile MX, which proposes to make US land-based missiles movable and hence immune to a surprise Soviet attack, or “first strike.” The assumptions of accuracy attributed to the Russians apply to US missiles as well. One of the missions of the land-based Minutemen is to destroy Soviet missile silos. The MX missiles, carrying up to ten warheads apiece, are planned to have even greater accuracy.

Fears of Soviet accuracy and confidence in our own were the burden of the secret Presidential Directive 59, whose outline was first leaked in late July of this year.6 Secretary of Defense Harold Brown, addressing the Naval War College on August 20, discussed the new strategy and said, “For planning purposes, therefore, we must assume that the ICBM leg of our triad [the other components of which are submarines and bombers] could be destroyed within a very short time as one result of a Soviet attack.” A week earlier President Carter had dispatched a letter to the delegates assembled in Madison Square Garden for the Democratic Convention, urging them to defeat a motion opposing the MX on the grounds that “it is crucial that our strategic nuclear forces not be vulnerable to a pre-emptive Soviet attack.”

In fact the Presidential Directive does not, on the basis of all that is known about it, promulgate a new strategy but merely ratifies an old one in force since the early 1950s. The US Air Force has always conceived of the Strategic Air Command’s mission as “counter force,” aimed at Soviet military targets. US targeting strategy is exactly the imputed Soviet strategy in reverse. That is, the US should be capable of picking off Soviet targets, whether missile silos, Brezhnev’s bunker, or the Kama River truck plant, at will.

The Carter administration’s surrender to the notion of the vulnerability of its land-based missiles has been a costly one. The bill for the MX alone is officially $34 billion, and unofficially reckoned to be at least twice that sum. The total bill for upgrading US strategic forces is authoritatively calculated at $100 billion over the next five years.7 The SALT II treaty was, even before President Carter shelved it in January, in peril of defeat in the Senate because of these growing fears of vulnerability.

The talk about accuracy and precision has had a sanitizing effect on the perception of what nuclear warfare would actually be like: not the holocaust of Hiroshima but something comfortably exact and controlled; not the incineration of New York or Moscow, but the elimination of a silo in North Dakota or central Siberia. Nuclear war becomes, in the minds of its theorists and programmers, a feasible extension of conventional war, with strikes and counter strikes, losers and winners; not the instantaneous Götterdämmerung embalmed in the concept of Mutual Assured Destruction.

Among the chief proponents of “limited” or “controlled” nuclear war are, and always have been, the US Air Force generals, flanked by their academic secretariat in the think tanks. Despite the negative evidence of World War II and Vietnam,8 the air force’s institutional confidence in bombing has been unimpaired, thanks to two sorties over Hiroshima and Nagasaki (although the Nagasaki bomb was well off target). In the wake of those two explosions the generally prevailing belief was that nuclear warfare—if the unthinkable came to pass—would be a relatively crude affair in which planes would ferry nuclear bombs over Russia and drop them on cities. The air force had different ideas. From the earliest days of the nuclear age it adopted the notion of “counter force”—long before it became a matter for public discussion in the early 1970s. Participants in air force planning scenarios in the 1950s recall that the targets were exclusively military and military-industrial, and numbered upward of 20,000.9

The first serious challenge to the air force’s dominance in maintaining the strategic arsenal came with the introduction of submarine-launched missiles, under the control of the US Navy. The submarine-launched missile constitutes the very antithesis of a controllable nuclear engagement of the kind espoused by the air force.

Two facts about submarine-launched missiles are of paramount importance. The first is that the submarine is invulnerable to the threat of a pre-emptive first strike by the enemy. This has encouraged the suggestion—keenly advanced by the navy as well as by many in the arms control community—that the nuclear deterrent could perfectly well be preserved by removing all missiles from the continental United States and putting them on submarines. Thus the Russians would have no incentive to attack US territory and its inhabitants.

The second fact about submarine-launched missiles is that they are acknowledged to be inaccurate, useful only for “counter-value” strikes against cities. This inaccuracy stems from navigational uncertainty about the precise location, within the necessary margin of error, of the submarine at the moment the missile is fired.10 To ward off the threat posed to its strategic dominance by the navy, the air force promulgated with even greater vigor its policy of counter-force targeting, with accompanying assertions of accuracy, essential if that policy was to survive challenge.

The air force has accordingly devoted strenuous efforts in the last twenty years to improving the accuracy of its missiles. There are two ways in which “silo-killing accuracy”—as the professionals term it—is described. The first is “CEP” or “Circular Error Probable.” This is defined as the radius of a circle, centered on the intended end point, in which one half of all re-entry vehicles are expected to land. The second way is to describe the percentage probability of “killing” a silo with one or more shots by calculating the CEP together with the yield of the warhead and “hardness” of the target. Thus the Titan II intercontinental ballistic missile, first introduced in 1962 and carrying a nine megaton warhead, is now officially deemed to have a CEP of 1,400 yards, or an 11 percent probability of “killing” the silo with one shot, and a 21 percent probability with two.
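
As a rough illustration of how such percentages are generated, the sketch below uses the standard textbook approximation relating CEP to single-shot kill probability. The lethal radius is an assumed input, chosen only so that the arithmetic reproduces the Titan II figures quoted above; it is not an official estimate.

```python
import math

def kill_probability(lethal_radius: float, cep: float, shots: int = 1) -> float:
    """Textbook approximation: a shot kills the silo if it lands within the
    lethal radius; shots are treated as independent, and by definition of CEP
    half of them fall inside a circle of radius CEP around the aim point."""
    single_shot = 1.0 - 0.5 ** ((lethal_radius / cep) ** 2)
    return 1.0 - (1.0 - single_shot) ** shots

# Assumed lethal radius (yards), picked only to reproduce the Titan II figures
# quoted in the text (CEP of 1,400 yards; 11 percent one shot, 21 percent two).
lethal_radius = 575.0
print(f"one shot:  {kill_probability(lethal_radius, 1400.0):.0%}")     # ~11%
print(f"two shots: {kill_probability(lethal_radius, 1400.0, 2):.0%}")  # ~21%
```

The crucial point is that the entire calculation presumes the shots scatter evenly around the aim point; a systematic bias shifts the whole pattern and the tidy percentages with it.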

The Titan II was followed by the Minuteman series, the latest of which—Minuteman III—carries three much smaller warheads of 335 kilotons each, and is officially estimated to have a CEP of 200 yards and a 91 percent chance of killing the silo with two shots. The MX, if armed with the Advanced Ballistic Re-Entry Vehicle, will—it can be deduced from Defense Department figures—purportedly have a single shot kill probability of 92 percent.11 Such statistics have an awesome air of certainty and are accepted by both the propagandists for and the opponents of counter-force theology. By friend and foe alike they are exempted from challenge.
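
A point worth noting about these figures (our own arithmetic, not the Defense Department’s): if two independent shots are to give a 91 percent chance of killing the silo, each shot must succeed roughly 70 percent of the time, since only then does the chance of both missing fall to about 9 percent.

```python
# Back out the single-shot probability implied by the quoted two-shot figure
# for Minuteman III (91 percent with two shots). Our arithmetic, not DoD's.
two_shot = 0.91
single_shot = 1.0 - (1.0 - two_shot) ** 0.5   # invert P2 = 1 - (1 - p)**2
print(f"implied single-shot kill probability: {single_shot:.0%}")   # 70%
```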

A sense of proportion should be maintained all the same. In 1966 the air force introduced the Minuteman II. This had an official CEP rating of 600 yards. In January, 1967, Ernest Fitzgerald was the deputy for management systems in the US Air Force. In his memoir, The High Priests of Waste,12 Fitzgerald recalls a high-level meeting at the Pentagon to discuss the status of the Minuteman II program:

The program director, Brigadier General Arthur Cruikshank, came to the meeting in the Secretary of the Air Force’s large and impressive conference room armed with the usual collection of Vu-Graphs and charts…. After a very half-hearted attempt to follow the standard evasion procedures Art Cruikshank broke down and told all. “Look,” he said, “I’ve only been in charge of this program since last summer. When I came aboard I was told that the Minuteman was the best program in the Air Force. There are no problems, it was in great shape, I was told. The briefings I got sounded too good to be true. And guess what? They weren’t true.”…I then asked, “But what are you going to do? Isn’t a big portion of the Minuteman fleet out of commission?”

“Yes, about 40 percent of the new missiles are down [i.e. out of action],” he answered.

1. Calculation based on a Mark 12A warhead on a Minuteman III missile.

2. Leslie Gelb, former director of the Politico-Military Bureau of the State Department. Statement to the authors, October 10, 1980.

3. See, for example, the debate between defense spokesmen for Carter, Reagan, and Anderson in The New York Times “Week in Review” for October 12, 1980. David Aaron (for Carter): “Clearly the Soviet capacity to attack land-based missiles now…requires deployment of the MX.” William Van Cleave (for Reagan): “We’ve allowed our deterrent forces to become vulnerable.” Alton Frye (for Anderson): “We would like to overcome a serious problem of ICBM vulnerability.”

4. Among them, Dr. Herbert York, former director of defense research and engineering, and science adviser to President Eisenhower; Dr. Richard Garwin, IBM Research, professor of public policy at Harvard, member of the Defense Science Board; Dr. Herbert Scoville, former head of scientific intelligence at the CIA; Dr. James Schlesinger, former secretary of defense.

5. There are innumerable expositions of this scenario. See, for example, the testimony before the House Armed Services Research and Development Sub-committee, on March 6, 1978, of General Alton D. Slay (US Government Printing Office, 1978), p. 862.

6. See William Beecher’s story in the Boston Globe, July 27, 1980. Richard Burt of The New York Times reported on the Directive ten days later.

7. See Aviation Week and Space Technology, June 16, 1980, pp. 56-63.

8. For the failure of British bombing in World War II, see The Broken Wing—A Study in the British Exercise of Air Power, by David Divine (Hutchinson, 1966). The US Strategic Bombing Survey (superintended in part by George Ball and Paul Nitze) concluded that the US bombing offensive had been no more efficacious than the British effort in defeating Germany. For Vietnam, see Pierre M. Sprey, “Impact of Avionics on Tactical Air Effectiveness,” a staff study prepared for the Office of the Assistant Secretary of Defense for Systems Analysis, 1968; declassified, 1974.

9. Authors’ conversations with Dr. Herbert York and Daniel Ellsberg, at that time strategic analyst with RAND.

10. The navy has tried. Its efforts have been directed toward achieving “stellar reference”; that is, equipment of the missile with a device to take bearings from the stars. These efforts have not so far met with any significant success.

11. These figures, based on data from the Institute of Strategic Studies and from Defense Department sources, are conveniently tabulated in Dubious Specter, by Fred M. Kaplan (Institute for Policy Studies, 1980).

12. Ernest Fitzgerald, The High Priests of Waste (Norton, 1972).
