What is the meaning of the election of Ronald Reagan? The initial reaction of most commentators was that Reagan’s victory was evidence of a “conservative tide,” a tide so strong and deep that some described it as “revolutionary,” others as a “turning point” to the “right.” The most telling evidence for this interpretation was the results of the congressional elections. For the first time in a quarter-century, the Republicans gained control of the Senate while substantially reducing the Democratic majority in the House. Several prominent liberal senators and representatives were defeated, many of them after having been singled out by rabid conservative action groups. One major consequence is that conservatives will become chairmen of some key committees in the two houses.

There can be little doubt that the election of Reagan and the new composition of the Congress will make a genuine difference in some important matters of public policy and in future appointments, especially to the judiciary. Unemployment, environmental protection, nuclear energy, aid to the cities, SALT, and civil liberties will be approached differently, both in spirit and in substance. The ethos will probably be reminiscent of the first Nixon administration.

While “conservative” might fairly describe what is likely to be done by the Reagan team, it does not capture the significance of the election and may even be deeply misleading. Some important facts count against the thesis that we have just witnessed a conservative revolution. Only slightly more than 50 percent of those eligible voted, and of these only about 10 percent described themselves as “true conservatives.” In addition, we should remember that the campaign failed to dramatize the election as a stark contest between liberals and conservatives; that the president was the most conservative Democratic candidate since John W. Davis in 1924; that as the campaign wore on, Reagan gradually separated himself from the rhetoric and issues dear to the rabid right; and that the most frequent experience of the voters was uncertainty, even anguish at the alternatives. Apathy, uncertainty, and indistinct choices are not the stuff of Thermidor.

The meaning of the election has to do with other questions. What is portended by the deeply antipolitical quality of the election—the desultory, unfocused campaign, the apathetic turnout, the apolitical images cast by the protagonists? Did the election mark the demise of liberalism rather than the triumph of conservatism? Was Reagan less the symbol of conservatism than of traditionalism and, if so, what are the prospects of a revival of such traditional values as family, religion, and simple morality?

The proper setting for addressing these questions is the great changes that have taken place in the system of national political institutions during this century. Stated briefly, there has been an evolution from a loose structure of “government” to something like a state system. A state exists when power and authority are centralized; when their scope and application are, in principle and for the most part, unlimited except by procedural requirements; and when the basic tendency is toward the integration of the various branches of government rather than toward their separation. It is, I would emphasize, these basic tendencies, not the perfect realization of them, that warrant the description “state system.”

If we bear these tendencies in mind, some of the questions raised by the election can be resolved. The apathetic electorate ceases to be an anomaly and appears, instead, as a necessary condition for the legitimation of the state, whose effectiveness would be impaired if the electorate were to be seized by an extended fit of participatory zeal. The state needs taxpayers and soldiers, not active citizens. It requires occasional citizens in order to lend plausibility to the fiction that the state is based upon democratic consent and that its actions are therefore legitimate. But in a world of nuclear weapons, a rapidly changing international economy, and uncertain power relations throughout the world, the state must be free in order to act quickly and rationally. The unspoken assumption of its leaders is that it neither needs nor can function with the uncertainties and divisions inherent in a democratic politics.

The presidency has been a powerful factor in perpetuating the illusion of democracy and thereby legitimating the exercise of state power. As the only official who can credibly claim to be both selected and elected by the society as a whole, the president has assumed the role of the people’s mediator, persuading a parochial Congress to rise to the level of the general good, energizing a stodgy bureaucracy, and defending the people from foreign powers and dominations. The Congress can only bring a limited legitimation to the state. It is unable to represent the democratic principle except in a fragmented and provincial form. In contrast, the president seems to embody the seamless authority of the sovereign people itself.


But in reality, the power of the twentieth-century president has been amassed at the expense both of the electorate—he is the sublimation and the manipulation of their political impulses—and of the Congress, whose legislative enactments invariably delegate large discretionary “law-making” powers to the president. If the president is the personification of the state, he is also the personification of its internal contradiction. The power of the state is legitimated by appealing to the principle of democracy and thereby implying that the state is a democratic state. But it is a commonplace that the modern state precludes continuous political participation by the citizens as well as genuine self-government. The modern state is operated by technicians according to the hierarchical model of administrative management, rather than by equal participants according to a model of deliberation and persuasion. During the past two decades a growing trend among politicians, administrators, influential private citizens, and academic political scientists has been to urge the liberation of the state from the few remaining democratic constraints. As we shall see, President Carter made a historic contribution toward ending the tension between democracy and the state, thereby helping to prepare the way for a new principle of legitimation.

The concept of the state does not, however, adequately describe the American system of power and hence the meaning of the election cannot be fully understood without first characterizing that system. The most appropriate phrase for it is “political economy.” The political element is represented by the state, especially the president and the giant military-administrative establishment of which he is the formal head. The “economy” is a system for the production, consumption, and use of goods and services. Within it, disproportionately large amounts of power, resources, and money are distributed among, or owned by, a relatively small number of giant corporations. The economy of a highly developed society, such as our own, is distinctive for its integration of science, technology, and production. This is the source of the dynamic, expansionist character of so-called “advanced societies.” For, in principle, there are no “natural” limits to the increase of scientific knowledge or to technological innovation. They represent the possibility of infinite power, subject only to the limitations of finite resources. Modernization is not synonymous with ordinary notions of “growth” which assume that living things have an “appointed end,” that they grow to a certain size or shape and then cease to grow. Modernization, in contrast, is “growth” without end.

A political economy is thus a state grounded in an economy of potentially infinite power. In the capitalist political economies of the twentieth century, a division of labor has developed in which the so-called “private sector” of the economy has provided much—but by no means all—of the stimulus for “modernization.” Although the state has had to take the initiative in many important areas—nuclear power and space exploration are only the most recent examples—and, since the beginning of the republic, has subsidized private enterprise with public money, the distinctive qualities of innovation, risk-taking, and experimentation (new processes, products, and markets) are, as Marx recognized long ago, characteristic of the competitive dynamic of capitalism.

But innovation is destructive: machines take the place of workers and their skills; factories deplete villages. Ideally innovation should be “creative destruction” (Schumpeter); the danger is that it will be self-destructive. So innovation needs to be encouraged yet restrained from excess. Although capitalism has experimented with a wide variety of self-restraints, from cartels to trade associations, from divisions of the market to price-fixing agreements, these have had only limited success. The reason is simple: a system which is built upon self-interest and relentlessly cultivates that motive cannot plausibly foster disinterestedness or civic virtue. Yet capitalism needs some power that can stabilize the conditions needed for innovation and competition. It needs regulation of money, credit, investment markets, tariffs, contracts, patents, and some types of fraud. It needs, too, an assured supply of educated (but not overly educated) workers, trained technicians, and a few scientific geniuses. It needs a system of social control, perhaps even pacification, to assure peace and order and to balance off resentment by motivation. Thus a modernizing economy needs rationalization of the conditions that make innovation possible.

Rationalization has been the primary responsibility of the federal bureaucracy: of the great departments (Defense, Treasury, Commerce, Agriculture, etc.); executive offices (Management and Budget, Council of Economic Advisers); regulatory commissions (FTC, SEC, ICC, etc.); and the Federal Reserve Board. Although currently there is considerable discussion of the merits of “deregulation” and even some progress toward it, the predominant view among corporate leaders and high administrators is that general deregulation would produce uncertainty, cutthroat competition, and instability throughout the market.

The legendary vices of bureaucracy—to be slow-moving, unimaginative, obsessed by routine and legal norms—are, in effect, the virtues needed if it is to perform the function of rationalizing and controlling economic life. To expect the opposite virtues is to confuse the public domain of rationalization with the private domain of innovation. In a political economy the state rationalizes, the economy modernizes.


During the last two decades the political economy has been subjected to increasing strains. At the risk of being too schematic, one can interpret the 1960s as the decade that saw the political side of the economy challenged, while the 1970s saw widespread doubts about the future of the economic side of the polity. The various protest movements of the 1960s and the search for participatory forms reflected a general awareness that the major institutions and processes of American politics were at odds with certain basic democratic ideas about self-government, freedom, equality, and shared power. Since the 1960s, however, the antidemocratic character of the political system has become, if anything, more pronounced. The enormous federal bureaucracy is reared on principles of hierarchy, authority, secrecy, lack of accountability, the use of technical knowledge to mystify the public, and the treatment of people as though they were categories rather than autonomous beings. The Congress, instead of being the instrument of popular control, has adapted itself to bureaucratic politics. Each senator and representative is surrounded (some would say imprisoned) by his staff; every committee has its staff to carry on negotiations with the bureaucracy.

By rejecting the prevailing dogma, which equated elections and representation with democracy, the critical voices of the 1960s helped to create an ideological quandary for the system: if elections couldn’t supply democratic legitimacy to the decisions of state, where could legitimacy be found? Equally crucial, if democratic ideas and sentiments were nurtured independently of elections and representation, they might pose a threat to a state which had evolved to a point where it could not qualify as democratic even by crude criteria. The strategic problem was to discourage democratic tendencies without depriving the state of legitimacy and hampering its work of rationalization.

Beginning with the 1968 election, the preferred solution has been to use elections to define the people. Vast sums of money are collected and spent on television and organization to create an “electorate” that is worked over intensively during the long primary campaigns and on into the election period. Once the election is over, the electorate disappears, leaving behind a “mandate” that stands as the legitimating principle for a new “administration.” What is important is that the “decision” of the electorate is understood as endowing the new officers of the state with legitimate authority. The smashing Nixon victory over McGovern in 1972 was the system’s response to the 1960s. It was, however, an incomplete victory that threatened to be undone by the Watergate revelations and the disgrace of the president. That victory was preserved and completed by the election of Jimmy Carter.

As was pointed out in these columns a few years ago, all of Jimmy Carter’s political impulses were managerial in nature, making him the ideal choice to continue the process of rationalization. Despite his early efforts at civil service reforms, his major contribution was to discredit the idea of democracy. Early on in his campaign he flirted with the rhetoric of “populism,” making calculated use of “love,” “open government,” “participation,” and Bob Dylan. This version of populism had nothing to do with the great historical movements of that name. It was, instead, a media code-word, a pejorative substitute term for “democracy.” Carter exploited the sentimentalism surrounding populism to obscure the fact that on most issues—unemployment, inflation, fiscal policy, defense spending, aid to cities, and welfare reform—his views were indistinguishable from those of many sober conservatives.

By mouthing pseudo-democratic slogans (“a government as good as its people”), vague warnings about the machinations of sinister “interests” and the dangers of a government “remote” from the people, expressions of “compassion” for the poor, and then compiling a record of presidential performance that was nothing short of a national embarrassment, he achieved something that was beyond the abilities of Richard Nixon: he taught the citizens to distrust professions of democracy and to associate democracy with incompetence, empty vision, and self-righteousness. To gain a hearing for democratic changes in the aftermath of Jimmy Carter will not be easy. He has been the instrument of making America safe for the state at the expense of an already enfeebled democratic tradition.

If the weakening of democratic beliefs and institutions is the necessary condition for the rationalization of the political economy, the destruction of traditional values is also the condition for the innovating economy to operate freely. The modernizing economy is voracious, not only of natural resources, but of the human resources summed up in traditions: skill, craftsmanship, domesticity, personal ties, and common morality. An economy that is at once innovative, competitive, and profit-driven needs a workforce that is adaptable, submissive, and mobile; that will easily forget former ways, places, and people; and that will be without the inner moral and political resources to protest and disobey. Despite all of the born-again Christians who are ready for a crusade in defense of the free market, it should not be forgotten that the innovating economy is the quintessence of exactly that “modernism” which new popes and presbyters profess to hate.

The ethos surrounding the advanced economies is wholly secular in outlook; no executive prays to Vulcan and no manager seeks other than rational solutions, preferably with as much statistical evidence as possible. The fundamental presupposition, that every problem has a solution, even though the impossible may take a little longer, would provide as fine a text for a sermon on superbia as any Augustinian could wish. The modernizing economy looks out on a universe that it perceives as unmysterious, knowable, and indifferent. Pascal’s wager is incomprehensible, not nearly as good a bet as giving campaign contributions to both candidates.

Just as Jimmy Carter proved to be the perfect vehicle for removing the hindrances of democracy from the path of the rationalizing state, so Ronald Reagan will be the ideal instrument for discrediting the traditional values cherished by the middle, lower middle, and working classes. For a half-century he has portrayed those values; he has been decent, clean-living, respectful of good girls and religion, patriotic, and manly. He has been careful in his friends, preferring the old-fashioned, self-made businessman to the corporate executive. His acceptance speech at the Detroit convention was redolent with nostalgia for pre-corporate America. It made only disparaging references to economic terms (“monetary tinkering” and “fiscal sleight-of-hand”) and none at all to the trendy language of first- and second-strike “capabilities.” He spoke, instead, of “teach[ing] our children the virtues handed down to us by our families…” and of the Mayflower Compact, the Declaration of Independence, and Lincoln. He referred to “family, work, neighborhood, peace, and freedom,” and assured his audience that America could be turned around if we “simply apply to government the common sense that we all use in our daily lives.” No fancy talk about trade balances, triggering, money supply, or zero-sums. During the primaries he had even made an issue of the Trilateral Commission and let it be known that he did not intend to come to power with the aid of the great corporations.

It was all pure hokum except to the faithful, to all those millions who still believed in the traditional virtues and pieties, but who could express their dismay only in a twisted and mean-spirited morality. The election of 1980 may mark one of the great calamities of American history, when the moral indignation and political despair of citizens could find no other means of expression than that of the vengeful groups whose members so doubted the truth of their own morality that they had to insist they were a majority; who were so unmindful of the pitfalls of hypocrisy that they were as zealous for the duty of capital punishment and the necessity of producing greater weapons of destruction as they were for the right to life; so quick to decry the murder of the unborn that they seemed indifferent to the living who subsist in the squalor of the ghettos; so grateful to the god who allows them to relieve their guilt and retain their wealth.

Traditional morality is, however, as anachronistic to the needs of the modernizing economy as democratic values are to the needs of the rationalizing state. When asked after the election whether the Moral Majority would have a voice in the appointment of his cabinet, Reagan replied that while their “input” would be welcome, the cabinet would be selected according to “who are the best we can find with qualifications for the job….” It is no accident that the most powerful man in the Reagan entourage, Ed Meese, is a devout believer in managerialism. The likelihood is that as the Reagan administration has to deal with the everyday realities of the political economy, the language of traditional morality, like the language of democracy, will cease to matter. As one of the principal (and self-described) spokesmen of the establishment has told us, the battle lines are already being drawn between the competing economic philosophies within the new administration. On one side, “Big Business and Big Finance,” multinational and international in its outlook; on the other, “The Populists” (sic) such as Kemp, Roth, Professor Laffer, and others who “even” regard Big Business as “a kind of conspiracy with foreign powers, undermining native American institutions, and threatening the spirit of private enterprise, as manifested in small- and medium-sized businesses.”*

The Reagan years promise a political economy in which the state will seek to ground its claims to legitimacy in the authority of technical and scientific knowledge rather than in “democratic” consent, and the economy will be able to count on a more tractable, less backward-looking population who will have the president’s amiable moralizing to distract them. But if the American political economy, like the economies of West Germany and France, has entered a period of low economic growth that will compel state authorities to impose even more severe wage restraints, cutbacks in services, anti-inflationary measures at the expense of employment, and a host of similar measures, it will find itself in a true crisis of legitimacy. In promoting rationalization and modernization, the state will have sacrificed the political and moral virtues and habits which prompt citizens to support and to sacrifice.

“We need to regroup and unify” was the response of the Democratic majority leader of the Senate to the Republican victory. This may not suffice if, as there is good reason to believe, the fundamental reason for Reagan’s triumph was the bankruptcy of contemporary liberal politics, both in practice and in theory. Jimmy Carter was the symptom, not the cause. Although he was an outsider to the various liberal traditions represented by the several Democratic candidates beginning with Al Smith, Jimmy Carter was made possible by the failure of liberalism to establish a critical distance from capitalism. The failure was primarily a failure of liberal political theory. Liberals didn’t understand until too late that the huge, centralized bureaucracy spawned by the New Deal was radically inconsistent with democratic values and practices; and that social legislation which perpetuated dependency on huge bureaucracies was dangerously anti-democratic.

Democracy is impossible if citizens cannot practice it by sharing in the deliberations and decisions about the forces and conditions affecting their daily lives. Liberalism proceeded to separate political democracy from social welfare, leaving the former to fend for itself while encouraging the citizen to think of politics strictly as a matter of group interests and then to press for the interests of his particular group. Having reduced politics to the play of “interests,” liberals could then substitute economics for political theory and produce a vision of politics based on cost-benefit analysis, with a conception of the good as the just distribution of an ever-expanding surplus, and a conception of evil that fixated on the problem of the “free rider.” Once the surplus began to diminish, liberals practiced “compassion”—not, to be sure, in the literal sense of suffering with others, but patronizingly, feeling for “the disadvantaged.”

As the history of the Democratic Party of the last fifty years shows, liberalism has taught corporate capitalism that social welfare is a cheap price to pay. It produces a scapegoat population whose feckless existence distracts attention from the fact that power, wealth, and knowledge are squarely in the hands of the few.

December 18, 1980