What reasons do they give to believe that the principle of laissez innover will normally function for the benefit of mankind rather than, say, merely for the benefit of the immediate practitioners of technology, their managerial cronies, and for the profits accruing to their corporations? As Mesthene and other writers of his school are aware, this is a very real problem, for they all believe that the normal tendency of technology is, and ought to be, the increasing concentration of decision-making power in the hands of larger and larger scientific-technical bureaucracies. In principle, their solution is relatively simple, though not often explicitly stated.4

Their argument goes as follows: the men and women who are elevated by technology into commanding positions within various decision-making bureaucracies exhibit no generalized drive for power such as characterized, say, the landed gentry of pre-industrial Europe or the capitalist entrepreneur of the last century. For their social and institutional position and its supporting culture as well are defined solely by the fact that these men are problem solvers. (Organized knowledge for practical purposes again.) That is, they gain advantage and reward only to the extent that they can bring specific technical knowledge to bear on the solution of specific technical problems. Any more general drive for power would undercut the bases of their usefulness and legitimacy.

Moreover, their specific training and professional commitment to solving technical problems creates a bias against ideologies in general, which inhibits any attempt to formulate a justifying ideology for the group. Consequently, they do not constitute a class and have no general interests antagonistic to those of their problem-beset clients. We may refer to all of this as the disinterested character of the scientific-technical decision-maker, or, more briefly and cynically, as the principle of the Altruistic Bureaucrat.

As if not satisfied by the force of this (unstated) principle, Mesthene, like many of his school fellows, spends many pages commenting around the belief that the concentration of power at the top of technology’s organizations is a problem, but that, like other problems, technology should be able to solve it successfully through institutional innovation. You may trust in it; the principle of laissez innover knows no logical or other hurdle.

This combination of guileless optimism with scientific tough-mindedness might seem to be no more than an eccentric delusion were the American technology it supports not moving in directions that are strongly anti-democratic. To show why this is so we must examine more closely Mesthene’s seemingly innocuous distinction between technology’s positive opportunities and its “negative externalities.” In order to do this I will make use of an example drawn from the very frontier of American technology, the Vietnam War.

II

At least two fundamentally different bombing programs are now being carried out in South Vietnam. The first consists of fairly conventional attacks against targets such as identified enemy troops, fortifications, medical centers, vessels, and so forth. The other program is quite different and, at least since March, 1968, infinitely more important. With some oversimplification it can be described as follows:

Intelligence data is gathered from all kinds of sources, of all degrees of reliability, on all manner of subjects, and fed into a computer complex located, I believe, at Bien Hoa. From this data and using mathematical models developed for the purpose, the computer then assigns probabilities to a range of potential targets, probabilities which represent the likelihood that the latter contain enemy forces or supplies. These potential targets might include: a canal-river crossing known to be used occasionally by the NLF; a section of trail which would have to be used to attack such and such an American base, now overdue for attack; a square mile of plain rumored to contain enemy troops; a mountainside from which camp fire smoke was seen rising. Again using models developed for the purpose, the computer divides pre-programmed levels of bombardment among those potential targets which have the highest probability of containing actual targets. Following the raids, data provided by further reconnaissance is fed into the computer and conclusions are drawn (usually optimistic ones) on the effectiveness of the raids. This estimate of effectiveness then becomes part of the data governing current and future operations, and so on.
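
Read abstractly, the cycle just described is a simple loop: score candidate targets from intelligence of mixed reliability, divide a pre-set level of bombardment among the highest-probability candidates, then feed post-raid reconnaissance back into the next round. Purely as a reading aid, the following is a minimal sketch of such a loop; the names, the scoring formula, and the update rule are invented for illustration and make no claim to describe the actual system.

```python
# Illustrative sketch only: a toy version of the score-allocate-update loop
# described above. All names, the scoring model, the allocation rule, and the
# figures are invented; they are not drawn from any actual military system.
from dataclasses import dataclass, field

@dataclass
class PotentialTarget:
    name: str
    reports: list = field(default_factory=list)   # (reliability, strength) pairs
    probability: float = 0.0                      # estimated chance of enemy presence

def estimate_probability(target):
    """Crude stand-in for the scoring models: combine reports, weighted by
    reliability, into a single probability of enemy forces or supplies."""
    if not target.reports:
        return 0.0
    weighted = sum(rel * strength for rel, strength in target.reports)
    return min(1.0, weighted / len(target.reports))

def allocate_sorties(targets, total_sorties, top_n=2):
    """Divide a pre-programmed level of bombardment among the candidates with
    the highest estimated probabilities (here, in proportion to them)."""
    for t in targets:
        t.probability = estimate_probability(t)
    ranked = sorted(targets, key=lambda t: t.probability, reverse=True)[:top_n]
    total_p = sum(t.probability for t in ranked) or 1.0
    return {t.name: round(total_sorties * t.probability / total_p) for t in ranked}

def incorporate_reconnaissance(target, post_raid_report):
    """Feed post-raid reconnaissance back in, so the estimate of effectiveness
    becomes part of the data governing the next cycle."""
    target.reports.append(post_raid_report)

# One pass of the loop: score, allocate, raid (not modeled), update, repeat.
targets = [
    PotentialTarget("canal-river crossing", reports=[(0.6, 0.4)]),
    PotentialTarget("trail section near base", reports=[(0.8, 0.7)]),
    PotentialTarget("rumored plain", reports=[(0.3, 0.5)]),
]
print(allocate_sorties(targets, total_sorties=100))
incorporate_reconnaissance(targets[0], (0.5, 0.2))
print(allocate_sorties(targets, total_sorties=100))
```

The deliberately crude scoring function stands in for the “mathematical models developed for the purpose”; the property worth noticing is the closed feedback loop, in which the system’s own estimate of effectiveness becomes part of the data governing the next round.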

Two features must be noted regarding this program, features which are superficially hinted at but fundamentally obscured by Mesthene’s distinction between the abstractions of positive opportunity and “negative externality.” First, when considered from the standpoint of its planners, the bombing program is extraordinarily rational, for it creates previously unavailable “opportunities” to pursue their goals in Vietnam. It would make no sense to bomb South Vietnam simply at random, and no serious person or Air Force General would care to mount the effort to do so. So the system employed in Vietnam significantly reduces, though it does not eliminate, that randomness. That canal-river crossing which is bombed at least once every eleven days or so is a very poor target compared to an NLF battalion observed in a village. But it is an infinitely more promising target than would be selected by throwing a dart at a grid map of South Vietnam. In addition to bombing the battalion, why not bomb the canal crossing to the frequency and extent that it might be used by enemy troops?

Even when we take into account the crudity of the mathematical models and the consequent slapstick way in which poor information is evaluated, it is a “good” program. No single raid will definitely kill an enemy soldier but a whole series of them increases the “opportunity” to kill a calculable number of them (as well, of course, as a calculable but not calculated number of non-soldiers). This is the most rational bombing system to follow if American lives are very expensive and American weapons and Vietnamese lives very cheap. Which, of course, is the case.
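
One stylized way to see that “calculable number” (the symbols and the independence assumption here are mine, not the essay’s): suppose each raid strikes a target which holds enemy troops with probability p, killing on average k of them when they are present, and that non-soldiers are present with probability q, with c of them killed when they are. Over a series of n raids the expected totals are

\[
E[\text{enemy killed}] = n\,p\,k, \qquad E[\text{non-soldiers killed}] = n\,q\,c .
\]

No single raid is decisive, but the series yields a body count that is calculable on both sides of the ledger, whether or not the second figure is ever actually calculated.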

Secondly, however, considered from the standpoint of goals and values not programmed in by its designers, the bombing program is incredibly irrational. In Mesthene’s terms, these “negative externalities” would include, in the present case, the lives and well-being of various Vietnamese as well as the feelings and opinions of some less important Americans. Significantly, this exclusion of the interests of people not among the managerial class is based quite as much on the so-called “technical” means being employed as on the political goals of the system. In the particular case of the Vietnamese bombing system, its political goals clearly exclude the interests of certain Vietnamese. After all, the victims of the bombardment are communists or their supporters; they are our enemies; they resist US intervention. In short, their interests are fully antagonistic to the goals of the program and simply must be excluded from consideration. The technical reasons for this exclusion, being less familiar and more important, require explanation, especially in the light of Mesthene’s belief in the malleability of technological systems.

Advanced technological systems such as those employed in the bombardment of South Vietnam make use not only of extremely complex and expensive equipment but, quite as important, of large numbers of relatively scarce and expensive-to-train technicians. They have immense capital costs: a thousand aircraft of a very advanced type, literally hundreds of thousands of spare parts, enormous stocks of rockets, bombs, shells and bullets, in addition to tens of thousands of technical specialists: pilots, bombardiers, navigators, radar operators, computer programmers, accountants, engineers, electronic and mechanical technicians, to name only a few. In short, they are “capital intensive.”

Moreover, the coordination of this immense mass of esoteric equipment and its operators in the most effective possible way depends upon an extremely highly developed technique both in the employment of each piece of equipment by a specific team of operators and in the management of the program itself. Of course, all large organizations standardize their operating procedures, but it is peculiar to advanced technological systems that their operating procedures embody a very high degree of information drawn from the physical sciences, while their managerial procedures are equally dependent on information drawn from the social sciences. We may describe this situation by saying that advanced technological systems are both “technique intensive” and “management intensive.”

It should be clear, moreover, even to the most casual observer that such intensive use of capital, technique, and management spills over into almost every area touched by the technological system in question. An attack program delivering 330,000 tons of munitions more or less selectively to several thousand different targets monthly would be an anomaly if forced to rely on sporadic intelligence data, erratic maintenance systems, or a fluctuating and unpredictable supply of heavy bombs, rockets, jet fuel, and napalm tanks. Thus it is precisely because the bombing program requires an intensive use of capital, technique, and management that the same properties are normally transferred to the intelligence, maintenance, supply, coordination and training systems which support it. Accordingly, each of these supporting systems is subject to sharp pressures to improve and rationalize the performance of its machines and men, the reliability of its techniques, and the efficiency and sensitivity of the management controls under which it operates. Within integrated technical systems, higher levels of technology drive out lower, and the normal tendency is to integrate systems.

From this perverse Gresham’s Law of Technology follow some of the main social and organizational characteristics of contemporary technological systems: the radical increase in the scale and complexity of operations that they demand and encourage; the rapid and widespread diffusion of technology to new areas; the great diversity of activities which can be directed by central management; an increase in the ambition of management’s goals; and, as a corollary, especially to the last, growing resistance to the influence of so-called “negative externalities.”

Complex technological systems are extraordinarily resistant to intervention by persons or problems operating outside or below their managing groups, and this is so regardless of the “politics” of a given situation. Technology creates its own politics. The point of such advanced systems is to minimize the incidence of personal or social behavior which is erratic or otherwise not easily classified, of tools and equipment with poor performance, of improvisory techniques, and of unresponsiveness to central management.

For example, enlisted men who are “unrealistically soft” on the subject of civilian casualties and farmers in contested districts pose a mortal threat to the integral character of systems like that used in Vietnam. In the case of the soldier this means he must be kept under tight military discipline. In the case of the farmer, he must be easily placed in one of two categories: collaborator or enemy. This is done by assigning a probability to him, his hamlet, his village, or his district, and by incorporating that probability into the targeting plans of the bombing system. Then the enlisted man may be controlled by training and indoctrination as well as by highly developed techniques of command and coercion, and the farmers may be bombed according to the most advanced statistical models. In both cases the system’s authority over its farmer subjects or enlisted men is a technical one. The technical means which make that system rational and efficient in its aggregate terms, i.e., as viewed from the top, themselves tend by design to filter out the “non-rational” or “non-efficient” elements of its components and subjects, i.e., those rising from the bottom.
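
The forced two-way classification described here can be made concrete with a toy sketch; the 0.5 threshold, the names, and the figures below are invented and describe no actual system. The point is only that a probability estimate leaves no room for the case that is “erratic or otherwise not easily classified.”

```python
# Toy sketch: an estimated probability forces every person or place into one
# of exactly two bins. The threshold and all figures are invented.
def classify(p_enemy: float) -> str:
    """Collapse an estimated probability of enemy presence into one of two
    categories; nothing may remain ambiguous."""
    return "enemy" if p_enemy >= 0.5 else "collaborator"

hamlets = {"hamlet A": 0.12, "hamlet B": 0.51, "hamlet C": 0.49}
for name, p in hamlets.items():
    print(f"{name}: {classify(p)}")
```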

4. For a more complete statement of the argument which follows, see Suzanne Keller, Beyond the Ruling Class (Random House, 1963).
