
A Reader’s Guide to the Century

The Century

by Peter Jennings and Todd Brewster
Doubleday, 606 pp., $60.00

The American Century

by Harold Evans, with Gail Buckland and Kevin Baker
Knopf, 710 pp., $50.00

The Columbia History of the Twentieth Century

edited by Richard W. Bulliet
Columbia University Press, 651 pp., $49.95

Chronicle of the 20th Century

edited by Clifton Daniel and John W. Kirshon, with a foreword by Arthur M. Schlesinger Jr. An updated edition will be published in November.
Dorling Kindersley, 1540 pp., $49.95

If we removed all the page numbers from War and Peace, it would not take anything away from the meaning of the novel. Nor would restoring the numbers deepen the story. The numbers are there to help us return to a passage in an artifact to whose meaning they are irrelevant. When different editions of Tolstoy’s Russian novel, or of its translations, make Peter’s words show up on differently numbered pages, the words are unaffected. The numbering of years and centuries and millennia is as arbitrary a way of flagging reality as is pagination. The flow of life is not deeply altered by the fact that December 31 is assigned to one year, January 1 to another—or by saying that we are twentieth-century creatures now but will become twenty-first-centuryites in five (or in seventeen) months. Reality does not come to us in neatly labeled packages. We impose the labels. Even our talk of “this century” is a Eurocentric convention, ignoring the existence of other calendars in China or Thailand. It was comparatively recently that parts of Europe itself ceased having two calendars, the Julian and the Gregorian (Russia did not give up the former until 1918, and Greece not till 1923). Farther back in time, Europe began the new year in March, not January. What happened on either date was not altered by what was no more than a different “page number.”

One attempt to escape arbitrary units simply reifies in a more drastic way some stretch of time as an entity. I have been told, for instance, that the “real” 1960s, as opposed to the calendar 1960s, ran from (say) 1963, from Dr. King’s March on Washington and President Kennedy’s assassination, to 1974, to American withdrawal from Vietnam and the Watergate investigations. But for whom was this time unit the reality? Not, presumably, for a poor mother in Africa, who can hardly have cared much about America’s designs on Vietnam.

As some people search for the “real” Sixties, others now want to define the “real” twentieth century. The most famous of these is the respected historian Eric Hobsbawm, in The Age of Extremes, a book he published in 1994. He could analyze the period so early since his twentieth century runs only from 1914 to 1991, a “short century” to go with his “long nineteenth century,” described in an earlier trilogy—a century which ran from the 1780s to 1914. That long period, Hobsbawm claimed, was an age of “revolution, capital, and empire.” Our later, shorter time is just an age of “extremes,” concluded by the fall of the Soviet Empire. Naming these periods as if they were single things is a dubious exercise. The eighteenth century, called by many the age of reason, was as much a time of Pamela’s tears and Rousseau’s sentiment as of Newton’s Opticks. The Romantic Era, so called, was the time when science and the industrial revolution radically reshaped lives.

Generalizations can mislead us not only about the past but about the very time we are living in and experiencing—as when people accept the assurance that ours is a secular age, though it is subject to waves of mystical, fundamentalist, and plain fanatical belief, often in the closest union with the tools of modernization (as Alan Ryan notices in his shrewd contribution to The Oxford History of the Twentieth Century). Hobsbawm, believing in the triumph of secularism, just ignores or minimizes such aberrations.

Despite these problems with encompassing the concrete evidence of time’s passage, the modern mass media continue the game of name-that-century or name-that-decade, as if they were playing the old radio game “Name That Tune.” But when a contestant identified a Cole Porter song and gave its name—“Night and Day,” let us suppose—he or she was just recalling a thing already composed and entitled. Naming a decade means inventing a single label for a stretch of time not created by a single composer. The results, it should be no surprise, are more misleading than helpful. We talk of the Conformist Fifties, yet that was the era of beat poets and “existential” coffee shops, of folk singers and Elvis, of James Dean and Marlon Brando, of Brown v. Board of Education and the Montgomery bus boycott. And if the 1960s were so radical, how did the combined vote for Nixon-Agnew and Wallace-LeMay total 57 percent of the popular vote in 1968? Clearly most of the electorate was conservative, if not reactionary.

If it is so hard to impose a single ethos or pattern on a decade, what hope can we have of imposing a shape on all the events of a century? Rather than seek a separable “real” century as Hobsbawm does, it may be better just to let the very arbitrariness of the numbers serve—as page numbers do—to flag the most measurable differences between, say, page 1900 and page 2000. In this approach, it does not matter whether one is choosing page 1900 or 1914 (or 1890) to begin with. The aim is not to identify turning points or test what was “real” in some predetermined way, but to let the most tangible differences leap out—irrespective of where one begins, so long as the period treated is extensive enough for the scale of change to be obvious. For this purpose, a century is a conveniently large chunk of time.

What obvious differences are there between human life in 1900 and in 2000? The clearest difference is that there is more of human life, well over three times more, a rate of population growth unlike any that preceded it. Not only are more people here; they can expect to stay here longer. The average life span in industrial countries grew from forty-five to seventy-five in this century. The growth in poorer countries lagged, of course, but the rate of increase was even greater because the starting point was so low—from twenty-five years in 1900 to sixty-three in 1985. Infant mortality has declined, and a woman’s risk of dying in childbirth is now one fortieth of what it was in 1940.

The obvious reason for these changes is the impact, at many levels, of science and technology. Science has changed food production as radically as technology has improved its distribution. Disease control has benefited from medical research, from the technologies for applying the results of that research, from systematized sanitation, from the regulatory sophistication of the Food and Drug Administration, and from the organizational tools of groups like the World Health Organization and the Centers for Disease Control.

Science has not only increased the numbers of people and the years they can live. It has rearranged the patterns of that living. Agricultural advances have changed humankind from being primarily rural to being primarily urban in less than a hundred years. In 1900 only Great Britain had less than half of its people working the land. Now that is true of almost every country. At the beginning of the century approximately 90 percent of the world’s population lived outside cities, mainly on farms. Now less than half of it does; and the rate of migration from the land is greatest in the less-industrialized southern sector of the globe, which is playing catch-up to trends that have already remade the northern tier of nations.

City complexes, with their rapidly changing functions as nodes of technological sophistication and services, grow exponentially, not least in the third world, which now contains eight of the thirteen cities with populations over ten million. In black Africa, major cities are increasing their population by 10 percent a year. Kinshasa has added from five to eight million people (no one can keep count) in just two generations. The most rural countries are now creating vast cities. Cairo acquires a thousand new inhabitants a day. India, whose few cities were small in 1900, now has three of them (Calcutta, Delhi, and Bombay) with more than ten million inhabitants. Australia, thinly populated, has seen explosive growth in Melbourne, Sydney, Adelaide, Perth, and Brisbane. Kenneth T. Jackson, in The Columbia History of the Twentieth Century, calls Africa’s “the fastest rate of urbanization ever recorded” and urbanization “the most powerful of the world’s demographic trends.”

Population has shifted not only between rural and urban sectors within continents but in terms of the balance of people between continents. As recently as 1850, Europe had double the population (400 million) of every other major region on the earth. But by 2000, India and China had far surpassed Europe, with two billion people between them, a third of the people living on the globe. Even sub-Saharan Africa had a larger population than Europe, and Latin America and Southeast Asia would soon equal it. In 1975, for the first time, a majority of the world’s people lived in the nonindustrialized countries.

This population shift went along with an even more dramatic power shift. In the first half of the century, the world’s major political reality was European colonialism. Britain’s domain covered a quarter of the earth’s surface. India alone would have been roomy enough for any nation to control. But Britain also held—besides its strong influence in white Commonwealth powers like Canada, Australia, New Zealand, and South Africa—imperial supremacy in three widely distant areas. In the American hemisphere, its colonies were Jamaica, Trinidad, British Guiana, British Honduras, the Leewards, the Windwards, the Bahamas, and Bermuda. In the Mediterranean, Middle East, and Indian Ocean region, its holdings were Gibraltar, Malta, Cyprus, Palestine, Jordan, Aden, the Gulf protectorates, Ceylon, Mauritius, and the Seychelles. In Africa, its writ ran in Gambia, Sierra Leone, the Gold Coast, Nigeria, Cameroon, the Sudan, British Somaliland, Kenya, Uganda, Tanganyika, and Northern Rhodesia.

The French Empire included even more of Africa than Britain’s did—Algeria, Tunisia, Morocco, the Congo, Mauritania, Senegal, the Ivory Coast, Dahomey, French Sudan, French Guinea, Upper Volta, Niger, Chad, Gabon, the Middle Congo, Ubangui Chari, and French Somaliland. Its Caribbean empire included Guadeloupe, Martinique, and French Guiana. In Asia it held Indochina, in the Middle East Syria and Lebanon, in the Pacific Tahiti and New Caledonia, in the Indian Ocean Madagascar, and off Newfoundland the fishing bases of St. Pierre-et-Miquelon.

The Dutch Empire held the vast archipelago of Indonesia, with its 13,000 islands. Belgium had the Congo. Portugal had Angola and Mozambique in Africa, along with Goa, Macao, and Timor in the Far East. Italy’s African empire was composed of Libya, Eritrea, and Italian Somaliland. Earlier in the century, the German and Habsburg empires were broken up. Just to list the parts of the world formerly held by Europe shows how much history, and what diverse worldwide grievances, we carry around so casually in an adjective like “Eurocentric.” It also tells us what wrenching changes had to take place for us to refer to ours as a “postcolonial” world.

Nor were Europeans the only ones to lose empires in this century. After World War I, the Ottoman and Romanov empires were dismantled. After World War II, the Japanese lost Korea and Manchuria. The United States gave up the Philippines and the Canal Zone. More recently, the Soviet Union came apart. The world is not hospitable to colonialism. The concomitant to this breakup of empires was the astonishingly rapid proliferation of new nations. If population has tripled in this century, the number of separate countries has more than tripled—from about 50 in 1900 to about 180 at present, with fissiparous pressures still building in many places. The twenty years of decolonization after World War II produced 100 new nations to be accommodated in the UN (or, worse, to thrash about outside it).
