Two broad forces for change, one driven by the global demographic explosion, the second by new technologies, are affecting societies throughout the world, with particularly severe consequences for the poorer countries of the developing world.1 Even successful states like Switzerland and Japan, which have usually been better able than many to insulate themselves from international turbulence, will find it impossible to escape the impact of the demographic and technological revolutions bearing down on us.2

How will these various forces for change affect the United States over the coming decades? What are America’s strengths and weaknesses, and how well is it prepared to meet the newer global challenges? In the traditional domain of “hard” or military-based power, the United States is clearly unequaled by any other nation, including Russia and China. Both possess larger land forces, but there must be serious doubt about their overall quality. In any case, numbers are less important than morale and training, sophistication of equipment, and the capacity to project force to distant theaters; in all those respects, the United States devoted large resources during the 1980s to ensuring that the required standards were met.

Strategically, it retains a panoply of air-, land-, and sea-based missile systems to deter any other power from attacking the United States and its allies. Technologically, its armed services are equipped to fight so-called “smart” wars, using everything from Stealth bombers and fighters to AEGIS cruisers and sophisticated night-fighting battlefield weapons. Through satellites, early-warning aircraft, and an extensive oceanic acoustical detection system, its forces usually have the means to spot what potential rivals are up to.3

Finally, the US is the only country with a truly global “reach,” with fleets and air bases and ground forces in every strategically important part of the world, along with the capacity to reinforce those positions in an emergency. Its response to the 1990 invasion of Kuwait by Iraq demonstrated the flexibility and extent of those abilities. In dispatching over 1,500 aircraft and 500,000 men (including heavy armored units) to Saudi Arabia in a matter of months, and in filling the Mediterranean, Persian Gulf, and Indian Ocean with carrier task forces, the United States displayed military power unequaled in recent times. Perhaps the only modern historical equivalent was Britain’s “force projection” of over 300,000 soldiers, safely protected by the Royal Navy’s command of the seas, to fight in the South African war at the beginning of this century.

As the cold war fades away, the size and extent of United States deployments are being cut significantly; but it would be remarkable if America returned to its pre-1941 policy, when none of its military units was based outside the United States and its insular dependencies. As it is, the existence of regimes like those in Iraq and Libya, and of conflicts like those in Somalia and Bosnia, aids the Pentagon in arguing the need to retain considerable and flexible armed forces.4 Whatever reduction in American military power occurs, the United States is likely to possess far greater capacity than medium-sized countries like France and Britain, and to retain a technological edge over Chinese and Russian forces.

Yet while this military power boosts the United States’ place in world affairs, that may not necessarily be a blessing for the nation as a whole. The high defense burden has caused some economic damage, and given an advantage to commercial rivals like Japan and Germany. The cold war provided the political “cement” to bind a majority of Americans, Republicans and Democrats alike, to large defense budgets and entangling alliances. With the Soviet threat removed, this consensus may disintegrate; at the least, it may be difficult for American leaders to justify a worldwide military presence to their own public in a time of economic dislocations. While some strategic thinkers debate whether forces should be withdrawn from Europe and concentrated against “out-of-area” threats in the developing world,5 others wonder about the utility of military force in general, since the threats to America may now come not from nuclear weapons but from environmental hazards, drugs, and the loss of economic competitiveness.6

As a consequence, the sense of relief that the Soviet Union is no longer an “enemy”7 is overshadowed by uncertainties about the proper world role of the US. To traditionalists, it is important that America remain present in Europe, the Pacific, and elsewhere, in order to prevent any return to the anarchic conditions of the 1930s;8 to critics, the argument that the United States is “bound to lead” places burdens upon the American people, diverts resources from domestic needs, and takes American democracy further away from its original foreign-policy principles.9

Such a debate is easily recognizable to historians. In general, the leading power favors international stability in order to preserve the system in which it enjoys great influence and wealth; usually it has inherited a vast legacy of obligations and treaties, promissory notes to distant allies, and undertakings to keep open the world’s seaways. But acting as a world leader includes the danger of becoming the world’s policeman, combating threats to “law and order” wherever they arise, and finding ever more “frontiers of insecurity” across the globe that require protection.10 This suggests, therefore, that the debate over the future of American external policy will go on.


Such a debate cannot be separated from domestic concerns, simply because of the cost of maintaining such a global position. Three hundred billion dollars a year bought military security for the United States, but it also diverted resources—capital, the armed forces’ personnel, materials, skilled labor, engineers, and scientists—from nonmilitary production. In 1988, for example, over 65 percent of federal R&D money was allocated to defense, compared with 0.5 percent to environmental protection and 0.2 percent to industrial development. Moreover, while engaging Moscow in an expensive arms race, America has had to compete for world market shares against allies like Japan and Germany, which have allocated smaller percentages of their national resources to the military, thus freeing capital, personnel, and research and development for commercial manufacture. That competition has undermined parts of the American industrial base. Not surprisingly, this has provoked American demands that allies contribute more to the common defense, or that major retrenchments occur in American defense spending in favor of domestic needs.11

Although this controversy usually focuses on the question of whether high defense spending causes economic slowdown, the issue is not as simple as that. In some cases, defense spending can boost economic growth, as the United States discovered during World War II. Again, a reduction in defense expenditures may do little or nothing to assist a country’s economic growth if the amount “saved” is then returned to a society that spends it on imported automobiles, wines, and VCRs; whereas if the same amount were channeled toward productive investment, the economic results could be very different.

Much more important is the structure of an economy that bears large defense expenditures. If that economy is growing briskly, possesses a flourishing manufacturing base, is at the forefront of new technologies, enjoys a strong flow of skilled labor, scientists, and technologists, invests heavily in R&D, is in balance (or surplus) on its current accounts, and is not an international debtor, then it is far better structured to allocate 3 or 6 or even 9 percent of its GNP to defense than if it lacks those advantages.12


In fact, given the size and complexity of the American economy, it is impossible to categorize it as either hopelessly weak or immensely strong; it is a mixture of strengths and weaknesses.

The single most important fact is that rates of growth have slowed considerably in the final third of this century compared with the middle third (see chart above).


Whatever the explanation for this slowdown, the consequences are serious for the United States with its internal and external obligations. With a high, fairly evenly distributed standard of living, a favorable current-accounts balance, and no foreign commitments, a country like Switzerland, perhaps, or Luxembourg, might suffer a long period of sluggish economic growth, and the results, although depressing, might not be serious. But the United States is the world’s number one military power, with commitments all over the globe. Its wealth, while considerable, is unevenly distributed, resulting in immense social problems at home. And because it has large deficits, not just in the trade of goods and services with other countries but also in its overall accounts, it needs to borrow from foreigners.

Given those circumstances, a prolonged period of slow growth compounds existing problems, making it unlikely that the United States can continue to fund the same level of military security, while also attending to its social needs and repaying its debts. A country where real weekly incomes have fallen steadily since 1973—as in this case—is ever more inclined to restrict spending to immediate needs (debt interest payments, health-care payments) rather than invest in its long-term future.

Such a dilemma is intensified if other nations are growing faster, leading to changes in economic relationships. The leading great power simply cannot maintain its status indefinitely if its economy is in relative decline.13 Moreover, because this decline is relative and gradual, it is insidious, not dramatic; as one economic historian has noted,

A country whose productivity growth lags one percent behind other countries over one century can turn, as England did, from the world’s undisputed industrial leader into the mediocre economy it is today.14
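The force of that observation lies in simple compounding; as a rough illustration (the figures here are mine, not the economic historian’s), an economy whose productivity grows one percentage point more slowly than its rivals’ ends the century behind by a factor of roughly

```latex
(1.01)^{100} \;=\; e^{100\ln(1.01)} \;\approx\; e^{0.995} \;\approx\; 2.7
```

that is, its rivals end up producing nearly three times as much per worker-hour, which is quite enough to turn a first-class economy into a mediocre one.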

It also turned from a first-class to a second-class power. Presumably, that reasoning was behind Mrs. Thatcher’s declaration several years ago that it would be a “disaster” if the American economy—as some forecast—were to grow more slowly than Japan’s in the 1990s.15 Perhaps Japan’s current economic slowdown will be more than a temporary one, in which case this will not happen. But the basic point remains that, from a realist perspective, continued slow growth would erode America’s world position, causing a further shift in the balances from Washington to Tokyo.


This suggests that the fundamental strategic objective of the United States as it moves toward the twenty-first century ought to be to enhance its per capita productivity for the sake of long-term growth. It is not that economic expansion is good in itself—it can damage environments and societies, if it is pursued wantonly—but without growth many desirable aims cannot be met.

In recent years, however, American productivity has become a cause of concern. Since the nineteenth century, the United States has enjoyed the world’s highest labor productivity, which is why American national income and “war potential” were much larger than those of anybody else when it fought the two world wars.16 At present, its overall productivity is still higher than Japan’s and Germany’s, but other nations have increased productivity at a swifter pace since the 1960s, narrowing the American lead. The very latest reports on comparative productivity suggest, at first sight, that America is maintaining its lead better nowadays than it did in earlier decades; but it may also be that in some cases the extent of the decline is obscured by the practice of measuring output in terms of internal purchasing power rather than in terms of real international purchasing power as reflected in current exchange rates.17 Whichever is the proper measure, there is little ground for complacency, especially in view of the changes in the composition of the American work force, which will be discussed later.

America’s growing indebtedness, the frailty of its financial system, and its persistent deficits in trade and current accounts would also be relieved by increased productivity. Indebtedness occurs at various levels. Nationally, it results from the US government and Congress declining to pay the full cost of federal programs through additional taxes, a trend already evident in the 1960s, and perpetuated by both Democratic and Republican administrations; it was greatly accelerated by the Reagan administration’s decision to decrease taxation and increase defense spending during the 1980s. In 1980 the federal deficit totaled $59.6 billion and the national debt $914.3 billion.18 In 1991, despite pledges by the White House and Congress to get spending “under control,” additional expenditures—cleaning up nuclear facilities, S&L and bank bailouts—pushed the deficit well over $300 billion, while the national debt itself approached $4 trillion, a figure that does not include the federal government’s obligations of around $6 trillion under various programs (crop guarantees, loans to farmers and students, insurance programs).

Interest payments on the national debt run around $300 billion annually and represent 15 percent of government spending. As the economics editor of The Wall Street Journal has noted, interest payments now exceed “the combined amounts that government spends on health, science, space, agriculture, housing, the protection of the environment, and the administration of justice.” Not only are these charges likely to increase,19 at the expense of other government outlays, but a rising share of those interest payments goes to foreign owners of Treasury bonds, further reducing America’s wealth. Finally, if slow economic growth persists throughout the 1990s, the deficit may rise further, since federal receipts will not grow as fast as expenditures.20 Already the new Clinton administration, upset by fresh estimates that the annual federal deficit may rise, not fall, during the 1990s, is backing away from its preelection pledge to cut the deficit in half during the coming four years.21

It was not only the national debt that soared during the 1980s, but every other form of debt as well. State and local governments began to run deficits from 1986 onward—a trend exacerbated by cuts in federal grants. Consumer debt, fueled by “easy money” incentives, reached $4 trillion, with repayments eating into personal income. Corporate debt was even worse: “As the 1990s began, about 90 percent of the total after-tax income of US corporations went to pay interest on their debt.” Altogether, public and private debt equaled roughly 180 percent of GNP, a level not seen since the 1930s.22

Deficits in the balance of payments and current accounts represented a further change from the 1950s and 1960s, when America had large surpluses in merchandise trade and current accounts.23 Since 1971—when the United States recorded its first merchandise-trade deficit in over a century—it has consistently bought more than it has sold. By 1987 the trade deficit reached a staggering $171 billion and, although the decline in the value of the dollar brought the total down by the later 1980s, deficits of over $100 billion were still being recorded.

If the American economy were able to cover its “visible” trade deficit through earnings in “invisibles” such as services, investment income, and tourism, as Britain did before 1914, the situation would be less serious; but American invisible earnings are insufficient to close the gap. As a result, the US now pays its way by borrowing tens of billions of dollars from foreigners. (US borrowing from foreign countries in 1991 and 1992 was reduced by international payments to the US for the Gulf War—the first profitable war in its history—but the debt was still between $50 and $70 billion.) Once the world’s largest creditor, the United States has by some measures become the world’s largest debtor nation within less than a decade.24 The longer this continues, the more American assets—equities, land, industrial companies, Treasury bonds, media conglomerates, laboratories—are acquired by foreign investors.

The heart of the trade-deficit problem lies in the long-term erosion of America’s relative manufacturing position, which may seem a curious claim when almost three quarters of the economy nowadays is in services. Yet, by their nature, many service activities (landscape gardening, catering, public transport) cannot be exported, and even where earnings from services are considerable (consultancy, legal work, patents, banking fees), the total falls far short of paying for what is imported each year. In 1987, for example, the total value of goods and services imported into the United States was $550 billion, whereas the gross export of services was about $57 billion. Manufacturing is vital for other reasons as well: it accounts for virtually all of the research and development done by American industry, and a flourishing and competitive manufacturing base is still “fundamentally important to national security.”25

Any attempt to summarize the present condition and future prospects of American manufacturing, however, is confronted by its extraordinary diversity. Some of its largest companies are world leaders, and many smaller firms (in computer software, for example) are unequaled in what they do. Others, however, are reeling from foreign competition, and their plight is the subject of innumerable commissions, studies, working parties, and congressional hearings. An entire industry (alas, not very distinguished either in manufacturing productivity or in its contribution to the balance of payments) has now emerged, devoted to studying American “competitiveness.”26 The overall picture that emerges is of an industrial structure which, though it has many strengths, no longer occupies the unchallenged position it did in the first two postwar decades.

While this is not a picture of unrelieved gloom, the rise of foreign competition in industry after industry has obviously increased the American merchandise-trade deficit. As the chart on this page reveals, out of eight key manufacturing sectors only chemicals and commercial aircraft were producing an export surplus by the late 1980s.


These deficits occurred across a range of industries, from low per capita value-added products like textiles to high-technology goods such as computer-controlled machine tools and luxury automobiles. This makes the position of American manufacturing during the 1990s especially critical. Is there a high-tech “revival” under way, as recently reported in The Wall Street Journal, that would restore America’s position?27 Or does that signify only a partial and temporary recovery within a longer saga of relative decline?

Unsurprisingly, the debate over “competitiveness” has not produced unanimity, as was evident in the election campaign and in Clinton’s Little Rock economic “summit.” Appeals for protection from hard-hit industries are opposed by those who fear retaliation in export markets, and by laissez-faire economists. Attacks upon foreigners’ buying into America are countered by the argument that Japanese and European companies bring expertise, job opportunities, and much-needed capital. “Buy American” campaigns are resisted by those who feel that consumers should be free to purchase goods regardless of their country of origin. Calls for an industrial policy are denounced by groups who feel that government-led actions to support specific industries would be inefficient and contrary to American traditions. Some claim that the relative economic decline has a single cause, whereas others offer many reasons, from poor management to low levels of investment, from insufficient technical skills to excessive government regulation. The debate echoes one that took place a century ago in Britain, when a “national efficiency” movement emerged in response to growing evidence that Britain’s lead in manufacturing was being lost.28

The present concern about the condition of the United States’ economy is also fueled by a broader unease regarding the implications for national security, for American power, and for its position in world affairs. What if foreigners acquire a monopoly in industries that make strategic products for the Pentagon, or if an important military-related item is made only abroad? What if the country becomes ever more reliant upon foreign capital—will it one day pay a political price as well as a financial one for that dependency? What if its industrial base is further eroded, while it continues with defense expenditures six or ten times higher than those of other countries—will it, instead of “running the risk” of imperial overstretch, finally have reached that condition?29 Japan (hurt by a speculative “bubble” in land prices) and Germany (hurt by the cost of unification) may be facing economic difficulties. But what if a later recovery permits them to resume that advance upon America’s position which occurred in the 1970s and 1980s? Will not the productive power balances continue to shift, so that the United States will no longer be “Number One”? (See chart.)


These apprehensions may appear old-fashioned to certain economists—in their view, a sign of “residual thinking” in an age when the nation-state is no longer central and the key issues concern the quality of life rather than rank in the global pecking order30—but one suspects the apprehensions are serious ones, for all that.

What is one to make of this controversy? To the optimists, what has been happening is perfectly understandable. In the postwar decades the United States occupied an artificially high position in world affairs because other powers had been damaged by the conflict; as they recovered, the American share of world product, manufacturing, high technologies, financial assets, even military capacity, was bound to fall. Yet the United States remains the most important nation in the world, in economic and military power, diplomatic influence, and political culture, though certain domestic reforms are needed.31 American industry was unprepared for the intensity of foreign competition—and paid a price for that—but since the 1980s it has become leaner and fitter, its productivity has shot up, and it is moving into new technologies and products with unequaled strengths, especially in research personnel. The advantages of such competitors as Japan will not last for long. With the dollar’s reduced exchange rate and the continuous upgrading of American manufacturing, the economy will rebound into prolonged growth, turn the deficits into surpluses, and respond vigorously to what were merely temporary difficulties.32

To the pessimists, such reasoning is a sign that many Americans have failed to understand the seriousness of the problem. It is not the country’s relative economic decline in the two decades following 1945 that concerns them, since that was the “natural” result of the rebuilding of other economies; it is the evidence that the American position relative to other nations has continued to erode since the 1960s in new technologies and patents, key manufacturing industries, financial assets and current-account balances, and international purchasing power. Most pessimists would, no doubt, be delighted to be proven wrong, and dislike being called “defeatists” or “declinists.” But they remain skeptical of the vague argument that America’s “specialness” or “genius” or “capacity to respond to challenge” will somehow restore its position, seeing in such rhetoric the same ethnocentric pride that prevented earlier societies from admitting and responding to decline.33


While much of the controversy over American “decline and renewal” naturally focuses upon the economy, failures in the educational system, the social fabric, the people’s well-being, even their political culture, are also much debated, presumably for fear that the causes of noncompetitiveness may be more profound than, for example, an inadequate savings rate. Characteristic of this thinking is the assumption that somehow the American people have taken a wrong path; as the popular television commentator John Chancellor put it, anticipating Clinton and Perot’s electoral rhetoric:

The strength is there, but it is being sapped by a combination of weaknesses—a thousand wounds we find difficult to heal. We have weakened ourselves in the way we practice our politics, manage our businesses, teach our children, succor our poor, care for our elders, save our money, protect our environment, and run our government.34

To the daily readers of American newspapers, the list of ailments will be drearily familiar. There is, for example, a health-care industry that doubled the number of its employees in the 1980s—thereby worsening overall labor productivity in health care—and that now consumes over 14 percent of GNP, more than twice the share taken by defense, yet does not provide decent health care for many citizens. Some 37 million Americans lack health insurance, and suffer accordingly. By the end of the 1980s, the number of poor people with health problems—such as babies born with syphilis or AIDS—was steadily rising; among blacks, where half the children under six live below the poverty line, health problems are severe and compounded by poverty. Lacking a national health system, the US has the highest incidence of child mortality among the major industrialized countries, and it ranks lowest among those countries in life expectancy and visits to the doctor,35 although it probably leads the world in politicians who talk about “family values.” While life expectancy for older white men and women has increased (much of the rise in health-care spending has gone to those over seventy-five), that for black women and especially black men has fallen.36 Because of this widespread poverty, Oxfam America—famous for its aid to developing countries—announced in 1991 that, for the first time ever, it would also focus upon the United States itself.

This uneven health care reflects the structure of wealth and income in contemporary America, where on average managers earn over ninety times as much as industrial workers (up from forty times as much in 1980), but where 20 percent of African Americans and 20 percent of Hispanics have incomes below the official poverty line and live in slums. It is exacerbated by the amount of drugs Americans consume; according to one estimate, the United States—with 4 to 5 percent of the world’s population—consumes 50 percent of the world’s cocaine. Such addictions strain health-care services, and not simply in the treatment of adults: in 1989 alone, approximately 375,000 American babies were born addicted to drugs, mainly cocaine and heroin.37

Drugs in turn feed crime, which is significantly higher in the US than anywhere else in the developed world. Thanks to the political power of the National Rifle Association, Americans have access to deadly weapons, and use them, to a degree that astounds observers abroad. Americans possess an estimated 60 million handguns and 120 million long guns, and kill one another at a rate of around 19,000 each year, chiefly with handguns. Since 1960 the rate of violent crime per capita has grown by 355 percent, a truly horrifying statistic. Homicide rates per capita are four or five times higher than in Western Europe (while rape rates are seven times higher, and forcible-robbery rates some four to ten times higher).38 Experts suggest that this violence has cultural roots, and cannot simply be linked to poverty. New York’s homicide rate is far higher than that in the slums of Calcutta, for example, and in prosperous Seattle—recently rated the number one city in the United States for “liveability”—the murder rate is seven times that of Birmingham, England.39 Nor can the violence be attributed to a lack of policing and deterrence; at the last count, American prisons were holding more than a million convicted prisoners, a proportion of the population larger even than in South Africa or the former USSR.40 Three thousand out of every 100,000 black American males are in prison, whereas South Africa managed to preserve apartheid by imprisoning 729 black males per 100,000.41

Doubling the number of people behind bars during the 1980s has not been very effective, therefore, in dealing with the erosion of American society, partly because of the difficulty of attempting major social reforms in a politically decentralized, libertarian society.42 Any attempt to alleviate homelessness and poverty in the inner cities—and the rural South—would cost a great deal of money, and require a transfer of resources from the better-off (who vote) to the poor (who don’t). Since the Boston Tea Party, middle-class Americans have had a deep aversion to paying taxes—with some justification, since unlike Europeans they do not enjoy in return such middle-class benefits as free college tuition, health care, subsidized cultural events, efficient public-transport systems, and so on.43 Perhaps funds could be made available if productivity and real growth were bounding upward. When they are not, changes in spending priorities become part of a “zero-sum game,” blocked by groups who would lose out.44

But Americans have been willing to invest heavily in education. In 1989, for example, over $350 billion was spent on public and private education, supporting 45 million pupils in primary and secondary schools as well as nearly 13 million college and university students. In absolute terms, only Switzerland allocates more money per pupil; relatively, the United States devotes 6.8 percent of GNP to education, equal to the shares of Canada and the Netherlands and ahead of those of Japan, France, and Germany.45

In return the United States could claim, with some justification, to have one of the finest systems of higher education in the world. Apart from many superb liberal-arts colleges, it boasts state university systems that educate an impressive number of students. Above all, it possesses the world’s greatest array of research universities and scientific institutes, with faculty recruited from around the globe, achieving disproportionately high international recognition (e.g., Nobel Prizes), and attracting students from many lands. The resources of intellectual powerhouses like Harvard, Yale, and Stanford—with endowments of billions of dollars—are equaled by their high performance and their global reputation. From them emerge annual cohorts of scientific and creative personnel upon which the American economy depends.

Apart from higher education, however, the picture is less favorable. Many Americans are worried by the growing evidence that general levels of public education, from kindergarten through high school, are relatively mediocre. Since the early 1960s, scores on scholastic aptitude tests—for what they are worth—have fallen considerably. Despite the opportunities offered by the free mass public education system, pupils are abandoning it in record numbers; between 600,000 and 700,000 drop out of high school each year, one fifth of all high school pupils (and closer to one half at inner-city high schools).46 Moreover, although the US census ambitiously and perhaps misleadingly reports a literacy level of almost 100 percent, various studies claim that millions of Americans—the figures range from 23 to 84 (!) million—are functionally illiterate; according to one, 25 million adults cannot read well enough to understand a warning label on a medicine bottle, and 22 percent of adults are unable to address a letter correctly.47

How does this compare internationally? In a recent standardized science test administered to ninth graders in seventeen countries, American students finished behind those of Japan, South Korea, and every Western European country, and ahead only of those in Hong Kong and the Philippines. In a test of mathematical proficiency (1988), American eighth graders were close to the bottom. Other tests reveal that the American ranking worsens as children grow older—although, ironically, more than two thirds of high school pupils felt that they were “good” at mathematics, whereas fewer than one quarter of the South Koreans (who actually scored much higher) felt that way.48 Only 15 percent of high school students study a foreign language, and a minuscule 2 percent pursue one for more than two years.49 Surveys of high school pupils’ knowledge of basic history have also revealed great ignorance (for example, of what the Reformation meant), an ignorance eclipsed only by their geographical illiteracy. One in seven adult Americans tested recently could not locate his own country on a world map, and 75 percent could not place the Persian Gulf—even though many Americans had favored dispatching US forces to that region.50 As the National Commission on Excellence in Education noted in its landmark 1983 report, A Nation at Risk, “If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war.”51

Despite the many studies devoted to the problem, the root cause is still not clear. Certain experts caution against drawing overly severe conclusions from the declining SAT scores and international tests, recalling that the United States is educating a far larger proportion of its population for much longer than it was forty years ago. By the same token, it may be erroneous to compare the knowledge of the average American high school pupil with that of children in more selective systems overseas.52 It may also be misleading to compare educational standards of a “melting pot” society with those of demographically stable and ethnically homogeneous countries like Sweden and Japan.

More prosaically, one might note that although America spends large amounts on education as a whole, a disproportionate 40 percent goes to higher education (which may explain why American universities rank high internationally), whereas the share going to high schools and lower schools is smaller than in other countries.53 Again, American pupils attend school for considerably fewer days each year (175 to 180 is normal) than their equivalents in Western Europe (200 or more) and Japan (220). If, by the age of eighteen, the average Japanese or South Korean has had the equivalent of three or four years more school than the average American, is it surprising that they know much more algebra and physics?54 Finally, while the United States is one of the few developed societies lacking nationally mandated education standards assessed by uniform national exams—the chief reason a National Research Council investigation into mathematics education concluded that “the top-down systems have beaten us hands down”55—any suggestion of copying the other democracies is resisted by school boards, education authorities, teachers’ unions, and all who celebrate the country’s decentralized traditions.

To other experts, technical alterations are less significant than the social culture within which American education has to operate. Displaying nostalgia for what seemed better days—when high school pupils supposedly worked harder and achieved higher scores—some critics suggest that “the crisis is not in the schools but in us. The society we have constructed has given us the education we deserve.” 56 The “trivialization” of American culture, meaning the emphasis upon consumer gratification, pop culture, cartoons, noise, color, and entertainment over serious reflection, is portrayed as a self-inflicted wound.

Apparently, the average American child has watched 5,000 hours of television even before entering school, and by graduation that total will be nearly 20,000 hours. This anti-intellectual youth culture—continued later by the fascination with sports or soap operas—is not helped by the disintegration of the family, especially among African American families, where so many mothers have to cope on their own; or by the great rise in female employment, so that (unlike in East Asian societies) the “first educator,” the mother, is absent from the home for most of the day. Apart from certain groups—Jews, Asian Americans—who place strong emphasis on the value of education, the average American child is said to be picking up the value system of a shallow entertainment industry rather than the moral standards, discipline, and intellectual curiosity that equip a person to learn. To ask the schools, especially in the inner cities, to remedy this social-cum-cultural crisis is simply to demand too much.57

While a fair-minded reader of this gloomy literature may find it overdrawn—are the education systems in other countries free of problems?—the chief fact is that the literature exists, affecting the national debate about the future. If the average American is poorly educated, does that not also contribute to a trivialization of electoral campaigns, with Reagan/Bush slogans like “Read My Lips,” “Make My Day,” and “Morning in America”? Does an inadequate school system lead to an erosion of proper democratic debate in order to meet television’s demands for quick answers? Is this why little more than half of US citizens vote?58 If the average American has little interest in foreign cultures, and cannot locate the Persian Gulf on a map, how is he or she to comprehend issues of intervention abroad? Or to learn about globally driven changes? Is that sort of knowledge to be left to a minority (say, 15 percent of the population, as at the time of the Founding Fathers) consisting of professional-class families whose members did go to the right schools and colleges, and whose standards of living, frequency of foreign travel, and access to the international economy have increased nicely through the 1980s?59


Despite these worrying tendencies, many American commentators stress the positive features of their bustling, variegated society. The United States is still the largest economy in the world (unless one counts the European Community as a whole). It attracts millions of immigrants each year, and many more would come if they could get in. Its popular culture is visible around the world, its language dominant in business, science, and entertainment. Its commitment to liberty and democracy has inspired oppressed peoples everywhere from China to Czechoslovakia. It is the exemplar of the capitalist system, which its ideological foes challenged and lost. Because of its great military power and diplomatic influence, when an international crisis occurs, be it in Kuwait or Somalia, all eyes usually turn to Washington. Americans, according to the unabashed optimists, should be celebrating their triumphs, their culture, their ideology, their way of life, their “noble national experience…the most universally attractive of our era…”60

Even the more cautious “revivalists” may admit that domestic reforms are needed, but argue that the chief danger is that the American people think the country is poor and impotent, when it is in fact rich and powerful. If it can simply shake off its present mood and make certain adjustments, it will be the world’s leading nation in the twenty-first century, just as it has been for the past fifty years.61

All this presents the United States with a dilemma. Yet apart from a few unreconstructed optimists like George Gilder or Ben Wattenberg, who hold that the country is moving effortlessly upward, recent opinion polls by the early 1990s showed that most citizens felt things had worsened—in the social fabric, race relations, public education, economic performance, the conditions of the average American family—and would be worse for their children and grandchildren. This has led to a demand for changes: some want the tax system altered, others want the schools overhauled, or a transformation in health-care provision, or changes in industrial policy, or an all-out assault on poverty, or on crime. Many Americans would like all of the above. It was this anxiety, surely, that accounted for Perot’s popularity and that, more significantly, tempted voters into the Democratic camp when Clinton emphasized that he intended to take action in precisely such fields as health care, education, infrastructure, and so on.

Reforms that challenge existing arrangements are never easy in a democracy; but the American political structure in particular has offered the most marvelous opportunities to obstruct changes. The constitutional division of powers means that the president lacks the authority of, say, the British prime minister and cabinet to get legislation swiftly enacted. The relative absence of party discipline makes each member of Congress more independent, but the unrestrained costs of electoral campaigning also make that member reliant upon funding from supporters and interest groups (political action committees) and highly sensitive to the threat that a powerful lobby—the pro-Israel coalition, the National Rifle Association, pro- and anti-abortion movements, the groups representing retired Americans—will campaign against a congressman if they are offended by his or her policies. Consequently, efforts to slash the budget deficit, or trim Medicaid costs, or restrict gun sales, usually founder in the “gridlock” of Washington politics. Perhaps this will improve, now that a Democratic president confronts Democratic majorities in both House and Senate; but at the broader national level, a social culture which asserts “Let everyone do their own thing” is not an ideal one in which to push through reforms. The very notion of reforming or retooling American society to make it more competitive is itself a contradiction of the laissez-faire ethos.


How, then, will American society, obsessed about its present condition, deal with the broader forces for global change? How well prepared is the United States for the twenty-first century?

Clearly, the United States is going to be affected in many ways by demographic trends. There will, for example, be many more elderly people by the early twenty-first century. Whereas there were only 16.6 million Americans aged sixty-five and over in 1960, the figure had virtually doubled to approximately 31 million by 1990; after slow rises over the next decade or two, it was until recently forecast to increase to 52 million in 2020 and 65.5 million in 2030, with the numbers of people over seventy-five and even over eighty-five—where the health-care costs per person are disproportionately high—growing the fastest of all.62 This means not only that the political power of retirees’ organizations will be even greater, but also that there could be a further diversion of resources toward elderly care—resources that, economically at least, would be better employed in preventing child poverty or improving infrastructure.63

Over the longer term, however, the most serious consequence will be that the Social Security funds—at present still in surplus, and helping to disguise the true extent of the federal deficit—will simply run out sometime early in the next century, causing a crisis not only in health provision for the average elderly American, but also in the fiscal system. The politicians then in charge, facing a federal deficit worsened by Social Security losses, will confront several unpleasant choices: slash Social Security provisions or other forms of federal spending; or vastly increase taxes upon the relatively smaller proportion of “productive” Americans to pay for the swollen costs of caring for the fast-growing numbers of over-sixty-five-year-olds. The only remaining alternative would be to risk enormous federal deficits, and the prospect of financial instability.

Meanwhile, the ethnic composition of the United States is also changing. Although the forecasts are subject to amendment—many earlier predictions of the future population of the United States have been notoriously inaccurate64—demographers are reasonably confident that the white, Caucasian segment will continue to shrink. This is partly due to the expectation of further large-scale immigration, both legal and illegal, chiefly from Latin America and Asia; as “have-not” families stream to “have” societies, America is seen as the most desirable, and accessible, destination by many migrants. The second reason is the differential birthrate between white and most nonwhite ethnic groups, which has socioeconomic causes but is also affected by the different roles of men and women, women’s expectations, and access to higher education. In consequence, some demographers refer to the “browning” of America by 2050, as white Caucasians become a minority.65

Other experts forecast that this transformation will be less swift because over time immigrants and minorities will conform to white reproductive patterns.66 Nevertheless, these trends toward the simultaneous “graying” and “browning” of America are going to have lasting consequences. Some writers worry that an aging United States will stagnate economically, and call for increased immigration, reminding readers that successive waves of migrants have fueled the country’s rise in the past; this argument is often accompanied by gloomy prognoses about Europe’s and Japan’s long-term prospects as they grapple with demographic decline, yet seek to prevent an inflow of newcomers. Others point uneasily to the fact that most of the recent immigrants to America have relatively low educational and skill levels, congregate in the inner cities—few of them help to compensate for the declining populations of Great Plains townships—and impose additional demands upon the social and educational services of the poorest parts of the American administrative structure. Some demographers predict that perhaps 15 million immigrants will arrive each decade for the next thirty years, and calls are now being made to “bar the door.”67

Demographic change can also exacerbate ethnic tensions, as between African and Hispanic Americans (over jobs), or Asian and African Americans (over educational access), as well as stimulate the racial worries of poor whites. Over the longer term, the graying/browning tendency may be setting up an intense contest over welfare and entitlement priorities between predominantly Caucasian retirees and predominantly nonwhite children, mothers, and unemployed, each with its vocal advocacy organizations. California, whose population rose by 30 percent in the 1980s alone, is still the favored destination of millions from south of the border. As a consequence of higher birthrates and continued immigration, half of all children in the state are forecast to be Hispanic by 2030, when whites will comprise 60 percent of the elderly population—a troublesome mismatch.

Even before the triumphant Democrats moved into the White House, The Washington Post was reporting that deep-rooted antagonisms—between blacks and Latinos, education unions and education reformers—would cause fissures in the Clinton coalition.68 Perhaps predictably, some authors now call for a debate about the implications of “bright, well-educated American women” giving birth to fewer and fewer children.69

These outcomes are, at the moment, hypothetical, whereas the political and economic consequences of America’s demographic transformation are easier to estimate. Simply because the regional electoral balances (e.g., share of seats in the House of Representatives) do, over time, reflect population change, there is likely to be a further shift in voting power from the north and east to the south and west, from Caucasian to non-Caucasian districts, from Europe/Israel–centered issues to Hispanic/Pacific concerns. The executive, judicial, and legislative branches, which at present, despite recent appointments, contain only a sprinkling of female and nonwhite members, will find it difficult to halt their metamorphosis into bodies containing many more women and minorities. Schools and colleges, already grappling with the demands to teach both “multiculturalism” and “Western civilization,” may come under further social and cultural pressures as the demographic tide advances.70

Demographic change will also affect the American economy, both in the composition of its work force and in the larger issue of American competitiveness in a future which, forecasters assert, will be dominated by knowledge-based societies. According to a common economic theory, the United States rose to world preeminence because of its vast, easily accessible, raw materials (oil, iron, coal) and foodstuffs, giving it an advantage over resource-poor Japan and Europe. Now that ample supplies of raw materials and food are produced throughout the world, that advantage is shrinking; and it will shrink further with the “de-materialization” of production and the many other changes in the way things are manufactured. Moreover, the continued explosion of scientific knowledge will be best exploited by societies that are steadily raising overall educational standards, technical training, and work force skills, and this America is not doing.71

Since the 1970s, the composition of the work force has changed significantly. While manufacturing cut many skilled, blue-collar, and relatively high-paying jobs, the “boom” in services created ever more low-paid, low-skill jobs (cleaners, restaurant personnel, drivers, health-care assistants, and the like), most of which paid less than $15,000 a year.72 The other trend was the growth in white-collar, technical jobs, especially in information and research sectors of the economy, requiring advanced training and higher education. According to the Hudson Institute survey Workforce 2000, by the end of this century as many as 52 percent of new jobs may require at least some college education.73

Yet the supply of so many educated persons is in doubt. For years the number of American Ph.D.s in mathematics and engineering (and the faculty to instruct them) has been inadequate; but while that may be another sign of a declining manufacturing culture, the shortfall can be made up by recruiting foreign doctoral students and professors. On the other hand, American industry has found it difficult to recruit workers to fill jobs not requiring a college education. The chairman of Xerox Corporation has declared that the skill levels of American society have “the makings of a natural disaster,” while New York Telephone reports that it had to test a staggering 57,000 applicants to find 2,100 people qualified to fill entry-level jobs. As business spends ever more on training (the total may now be over $50 billion annually), there is increasing concern over the extent to which America’s educational deficits will reduce economic competitiveness.74

Demographic trends suggest the worst is yet to come. Of the new entrants into the work force, white males—currently the best educated sector of the population, especially in science, technology, and engineering—will comprise only 15 percent, and the rest will be women, minorities, and immigrants, with the latter two groups making up the two fastest-growing segments of the work force. The point here is not race per se, but educational access. Since minorities and immigrants have generally gone into low-paid, unskilled jobs, there exists a potentially enormous mismatch between educational levels and the forecast demand for jobs requiring advanced technical or higher education. Unlike Germany, Sweden, or Japan, however, the United States does not possess a systematic approach to remedial training or, the experts say, to vocational education as a whole, preferring instead to retain haphazard, laissez-faire methods.75

Demographic trends also influence the long-term American response to changes provoked by the introduction of robotics and automated manufacture. While intelligent robots are being designed for specialized circumstances (space exploration, undersea mining, hazardous-waste disposal), American manufacturing as a whole has less incentive to automate production than Japan’s. Although one might imagine that the “graying” of Caucasian males in the American population would stimulate automation, the simultaneous “browning” trend provides a cheap labor pool for many repetitive jobs. Just as the relative cost of manual versus automated production hurt America’s early lead in robotics, so it is likely that demography and work force composition will slow any overall move to automated manufacturing.

This generalization may not apply to certain sectors of industry, for example to American factories owned by a Japanese multinational, or companies under pressure from East Asian competition and also able to raise the capital to make large-scale investments in automation, or firms preferring robots to poorly trained workers. However, with the collapse of America’s indigenous robotics industry during the 1980s, around 75 to 80 percent of robots sold to US firms each year are imported, worsening the trade gap. Moreover, if the investment in robots is not accompanied by the retraining of redundant workers, whether by the company itself (as in Japan) or by the state (as in Sweden), then the decline in well-paid, blue-collar employment will intensify; just as, if the surviving workers are not trained to work with robots, the increases in output will be much less than expected.

American agriculture will also be challenged by the newer global forces for change, certainly by biotechnology, and possibly by global warming. The biotech revolution in farming and food processing appears to offer enticing prospects to the large pharmaceutical and agro-chemical firms that have invested heavily in both research and production in this field and are constructing large factory complexes or “refineries” which, in essence, replace the traditional farm. As they also increasingly link up with giant food distribution and chain stores, conglomerates are emerging that will control every part of the process of providing food, from the seeds and fertilizer (or the in vitro hormones and genes) to the canned and packaged goods in the supermarket.

For American farmers themselves—and their communities—these trends are disturbing. Abundant agricultural output made the United States the reserve “bread basket” of the world for the past century, earning large amounts of foreign currency. Because of improved technology, American farming is becoming more efficient each year; in fact, according to the US Office of Technology Assessment, the United States has the capacity “not only to meet domestic demand, but also to contribute significantly to meeting world demand in the next 20 years,” enough indeed to meet the expected 1.8 percent annual growth in world population.76 While that forecast would be contested by environmental groups which believe that US agriculture’s long-term prospects are being damaged by overgrazing, loss of topsoil, decline in water supplies, excessive use of fertilizers, and other unwise methods that aided the original expansion of output, there is no disputing that present productivity per hectare is impressive. But that itself is now a problem.

The challenges facing American farming are large scale and structural. Although only 3 percent of the total population is nowadays involved in farming, far more is produced than can be consumed at home. To avoid a crisis of agricultural overproduction—of which there have been several since the late nineteenth century—farmers have pressed US administrations to discover and open markets overseas. At present, however, such a solution is put in doubt by chronic imbalances in global supply and demand for farm products. Dozens of poor countries would welcome the continued flow of American food supplies, but have no funds to pay for them. Similarly, the erstwhile USSR and certain of its former East European satellites require food to make up for their own farming deficiencies; but unless international aid pays for this, there is no way that those societies can themselves provide the hard currency required. (In any case, if they do eventually manage to restructure agriculture, all or most of them could well become surplus food producers.) Efforts to lower or remove tariff barriers to American food imports into, say, Japan or Korea provoke violent reactions from local farmers.

Meanwhile the EC’s Common Agricultural Policy, which subsidizes and protects millions of farmers, has eroded American farm export shares in both European and third markets—compelling the US government to subsidize its own farmers in expensive ways. The recent hard-fought GATT negotiations, which provoked riots by alarmed French farmers, have only slightly improved America’s export opportunities. But if in fact agreements were reached to phase out all subsidies and price supports—which is highly unlikely—the greatest beneficiaries would probably be countries like Australia, New Zealand, and Argentina, whose farmers are efficient enough not to need agricultural tariffs. While consumers might rejoice at the drop in food prices, many American farming communities would wither away.

It is here that another new technology—the biotech revolution in agriculture and food processing—will make its impact. With artificial sweeteners having cut heavily into the American sugar market over the past decade, and with forecasts that the use of the bovine growth hormone to increase milk production could lead to a 50 percent reduction in the number of dairy farms by 2000, it is not surprising that some groups of farmers are campaigning against the new technologies. However, unless these innovations can be proven positively harmful to health or the environment—and therefore banned by federal agencies—the response is likely to be mixed. Many better capitalized farms could be attracted by the promise of greatly enhanced yields from “designer” herbicide-resistant seeds and the accompanying herbicides, or by the greater productivity that will flow from new information technology equipment.

What might this mean in overall numbers? One study by the US Office of Technology Assessment calculated that the new biotechnology and information technologies would be adopted by more than 70 percent of the largest farms in the United States, but by only 40 percent of the moderate-sized farms and about 10 percent of small farms. Many of the nation’s two million small farms are run by people with other income, so the impact there might be less. For the moderate-sized, full-time farms, traditionally the backbone of American agriculture, the results would be very serious as they struggled to compete. By the year 2000, the number of such farms might have shrunk to 75,000, compared with 180,000 in 1982. By contrast, the largest farms are expected to grow in size and efficiency, and by the end of the century a mere 50,000 of them could be producing around three quarters of all agricultural output.77 Whether they will still be regarded as farms, or simply as the upstream production facilities of food-processing companies, with wage laborers supervised by corporate-style managers, is an open question.78 In any case, the traditional style of farming, in middle America no less than in rural France, is little prepared for the next century.

Given these prospects, moreover, it is to be hoped that the “greenhouse effect” does not result in the temperature rises forecast by the gloomier studies on global warming; for that would increase the pressures upon farmers whose livelihoods are already endangered by the biotech revolution.


Environmental changes outside the national boundaries are also affecting American society. For example, while the recent flood of Haitian refugees to the United States was prompted by political turbulence, another more important cause is that peasant landowners have eliminated the forests (only 2 percent of the land is still forested) and the subsequent exploitation and loss of topsoil have worn some areas down to the bedrock. With farmable land (only 11 percent of the whole) continuing to shrink, total fertility rates still very high, and population control negligible, more and more people—already the poorest in the Western hemisphere—are left with fewer and fewer resources. Given mass unemployment rates of 30 percent, is it any surprise that many of them struggle to get to the United States and regard repatriation as close to a death sentence? And once they arrive in Florida or New York, is it a further surprise that—through no fault of their own—these immigrants are additional burdens upon the sorely pressed educational and social systems of the inner cities? Here, in microcosm, is an example of how entwined demographic growth, environmental damage, social and economic catastrophe, and mass migration have become.

It is while they confront these challenges that citizens of the United States are being urged to adjust to the borderless world of twenty-four-hour-a-day financial flows, electronic trading, and the globalization of business and communications. The general sense is that America enjoys enormous advantages in the form of giant multinationals and banks, traders, consultants, and service industries, the dominance of the English language and the US dollar, an entrepreneurial culture, and numerous highly educated scientists, engineers, designers, lawyers, and other “symbolic analysts” whose skills are in global demand.79 On the other hand, the relocation of industries abroad, the increasing redundancy of various occupations, and the inadequate educational levels of many workers for high-tech employment suggest that the lower four fifths (or more) of Americans may not enjoy the oft-proclaimed benefits of globalization. If demographic trends lead to a relative decline in the number of Americans with marketable skills, and if US multinationals find themselves increasingly pitted against foreign rivals with larger capital resources and better trained labor, those benefits may appear ever less obvious.

If the above analysis is generally accurate, the United States may not be a “loser” in the face of global changes, as many desperate societies in the developing world will be; but because of its social and economic structure—its altering demographic pattern, its educational and social deficits, its fiscal problems—it could be less than a clear “winner.” What emerges instead is a mixed picture: some industries rising as others fall, traditional farms losing as agribusinesses gain, consultants flourishing as blue-collar workers face fewer opportunities, the slow growth in overall per capita GNP barely concealing the widening gap between those whose skills are in demand and those whose skills are not.

Moreover, despite the proclaimed intentions of the Clinton administration and the possibility of some corrective measures being implemented here and there as the 1990s advance, the size of the federal deficit, together with the nature of American society and politics, makes it unlikely that a national “plan” for the twenty-first century will emerge such as may be formulated in France or Japan. Instead, there will be differentiated responses and local initiatives in the traditional American way: states and school districts will push ahead with their individual schemes; communities will grapple with local environmental problems; towns and cities will attack urban poverty in various ways; some regions will benefit from fresh foreign investment, others will suffer as American companies transfer production overseas.

There is a lot to be said for this sort of differentiated, decentralized, individualistic response to change: it is in the tradition of American free enterprise and its libertarian culture; and it is what the nation is used to. The United States is, after all, a demi-continent, not a small country like Japan, which finds it necessary to stress social harmony and organization in order for everyone to exist on its mountainous, crowded island-chain. America, by contrast, is the home of those fleeing from constraints elsewhere; it has offered an open frontier to dissatisfied people; and its sheer size, “escapist” culture, and lack of serious external threat have combined to foster dislike of organized, central government. This cultural heritage means that, as the United States turns to meet the broad forces for global change, its response is also likely to be differentiated, decentralized, and individualistic, “muddling through” rather than a coordinated, centralized attack upon the problems. After all, a country like Great Britain “muddled through” for a very long time.

But that returns us to historical analogy as well as to the core of the American dilemma. One hundred years ago, Britain, which was widely regarded then as Number One, was engaged in a similar debate about its future prospects. It was, of course, a very different society from America today, and occupied a different geographical position as the island center of a worldwide empire rather than a resource-rich continental land mass. Nevertheless, the dilemma Britain faced was like the one facing the United States now. Both were preeminent world powers whose economic competitiveness and general international position seemed less assured at the century’s end than they had five decades earlier. In both, alarmed citizens called for changes to improve national competitiveness and “prepare” for the next century. The difficulty was, however, that the proposed reforms would threaten many vested interests. Britain’s spending priorities, its public educational system, the efficiency of its industry, its treatment of poverty, its levels of investment, even the pattern of career choice (more engineers, fewer lawyers and bankers), might all have had to be altered to match the new global competition.

While reformers in turn-of-the-century Britain urged the need for tough solutions, and cultural pessimists bemoaned the evidence of “decline” and “decay,” many disliked the idea of change. It would mean the loss of institutions and work habits that were familiar, cozy, and reassuring. It implied that national traditions had to be amended in imitation of foreign ones. It upset powerful vested interests, and made for uncertainty. It involved costs, or a redistribution of national resources, at a time when economic growth was moderate. Besides, there were many academic “experts,” journalists, and economists who said that things were still fine, that the declinists were alarmist, and that Britain still had the energies and resourcefulness to remain ahead. All this made sense to a people taught that they occupied a unique historical place, and were an example to others. In sum, there was an understandable and deep-rooted antipathy, both psychological and cultural, to the idea that great changes were needed, especially if they involved pain or money. Rejecting the calls for change, the British people thought it better to “muddle through.”80 Why, then, cannot America today do the same?

The answer is that the long-term implication of muddling through is slow, steady, relative decline—in comparative living standards, educational levels, technical skills, social provisions, industrial leadership, and, ultimately, national power, just as in Britain. The British may have avoided hard choices by “muddling through” policies, but that evasion ultimately caused the loss of their place in the world.

This, then, is the great test facing President Clinton and his new team. While an impressive array of American companies, banks, investors, think tanks, entrepreneurs, educators, and others are aware of the global forces for change and individually scrambling to prepare for the twenty-first century, the United States as a whole is not; indeed, it probably cannot prepare itself without becoming a different kind of country, transforming its health care, public education, inner cities, job-training, social security, energy consumption, and fiscal system by measures that would make the New Deal look dilettante by comparison—and in the process offending large numbers of Americans who called for some of those reforms but would be hurt economically by others.

Can Mr. Clinton, already in danger of being preoccupied by immediate foreign policy issues (Somalia, Bosnia, Iraq), make a difference? Can any American government improve the country’s chances of meeting the enormous, transnational forces for change bearing down on us? One thing, at least, is clear. Whatever America’s prospects are, there can be no coherent response to those challenges unless the President himself recognizes them and has the political courage and ability to mobilize opinion to accept changes that many Americans will find uncomfortable. It will require, in other words, a leadership very different from that demonstrated by recent incumbents of the White House, whether the issue was domestic deficits or global population growth or education reform.

Perhaps America now has such a leader, one who can really institute significant if painful reforms. Or perhaps, instead, traditional “muddling through” methods will prevail despite Mr. Clinton’s pleas for “change”—leaving the American people to pay a high price in the future for having assumed that things can stay much the same at home while the world outside alters more swiftly than ever before. Truly, as the Chinese curse has it, do we live in interesting times.

—January 28, 1993
(This is the second of two articles.)

This Issue: March 4, 1993