1.

Every middle-class American family with a college-age child knows how it goes: the meetings at which the high school counselor draws up a list of “reaches” and “safeties,” the bills for SAT prep courses (“But, Dad, everyone takes one; if you don’t let me, I’m screwed”), the drafts of the personal essay in which your child tries to strike just the right note between humility and self-promotion—and finally, on the day of decision, the search through the mail in dread of the thin envelope that would mean it’s all over and that, as a family, you have collectively failed.

The struggle to get into America’s leading colleges is, of course, the dark side of a bright historical development. Until about fifty years ago, our most prestigious academic institutions were pretty much the domain of well-born prep school boys. In 1912, Owen Johnson’s enduringly popular novel Stover at Yale (most recently reprinted in 2003) gave a picture of Ivy life as a gladiatorial contest among alpha males who, by beating out their rivals for a spot on the team or in the club, learned to achieve “victory…on the broken hopes of a comrade,” and went on to rule the nation. In 1920, Scott Fitzgerald (Princeton ’17) called Stover at Yale the “textbook” for his generation. Writing a few years later about Harvard in his novel Not to Eat, Not for Love, George Weller remarked on “how similar the faces always looked in the Varsity picture, except where there was an Irishman or a Jew, and even then they seemed somehow anglicized down toward alikeness.” In Weller’s novel, one Harvard bureaucrat runs a brisk business selling “the addresses of selected Anglo-saxon sophomores to the mothers of Boston débutantes” lest some “anglicized” Irishman or Jew pass for a WASP and, by means of an unwary Beacon Hill belle, contaminate the race. As late as the outbreak of World War II, these fictions had the plausibility of fact.

At the turn of the century, when Stover was prepping for Yale, fewer than a quarter-million Americans, or about 2 percent of the population between eighteen and twenty-four, attended college. By the end of World War II, that figure had risen to over two million. In 1975, it stood at nearly ten million, or one third of the young adult population. Today, the United States leads the world by a considerable margin in the percentage of citizens (27 percent or 79 million) who are college graduates.1

Advancing this immense social transformation required many means—notably the GI Bill, passed by Congress in 1944, which brought onto America’s campuses students whose fathers could have set foot there only as members of the janitorial class. Starting a decade later, the Ivies did their part by establishing “need-blind admissions” and “need-based financial aid”—by which they promised to accept qualified applicants regardless of their ability to pay, and to help support needy matriculants by assessing family assets and making up with scholarship aid whatever the family could not afford. A new system of standardized testing (the SAT) identified talented students, many of whom were subsidized by federal programs designed to train scientists and strategists for the postwar struggle against communism. It was during the cold war that the Old Boy with his Rudy Vallee (Yale ’27) intonation and “Gentleman’s C” became an anachronism.

Progress in the public universities was equally striking. By 1960, the University of California at Berkeley was challenging Harvard in accomplishment and prestige, and the “flagship” branches of other state universities such as Michigan, Ohio, Wisconsin, and, more recently, Texas and North Carolina joined the ranks of the world’s leading institutions. Later in that decade, in part because of competitive pressure from public rivals, Yale and most of its private peers opened their doors to women, and the push was on to recruit students from “underrepresented” (to use today’s stipulated bureaucratic language) minority groups. In short, by the late twentieth century the America that Tocqueville had described 150 years earlier as a nation where “primary education is within the reach of everyone” but “higher education is within the reach of virtually no one” had been turned upside down.2

This mostly happy story is well known. Less well known is the most recent chapter, which tells of a slowdown, if not reversal, of the trend toward inclusion. Over the last twenty-five years, as the tax revolts of the 1970s (starting in 1978 with California’s Proposition 13) became chronic tax resistance, state support for public universities has sharply fallen. Public funds now cover less than one third of expenses at public universities—with the result that tuitions are rising beyond the reach of families who once would have depended on these institutions as pathways of upward mobility.3 Tuition at the State University of New York jumped by 28 percent between 2002–2003 and 2003–2004—the price of attending is now over $14,000 (tuition, housing, and fees)—and other state systems saw similar increases. The average tuition increase at all public universities last year was 10.5 percent, four times the rate of inflation.

Naturally, private universities, taking advantage of the opportunities, have been raiding public universities for leading faculty, whose salaries and working conditions have declined relative to those of their private-sector counterparts. In The Future of the Public University in America: Beyond the Crossroads, former University of Michigan president James Duderstadt points out that private universities now enjoy what are, in effect, large public subsidies that, unlike the legislative appropriations on which public universities depend, are dispensed out of sight of the public eye. “When the investment corporations created by many private universities to manage their endowments make profits on a business venture,” Duderstadt writes, “that profit is tax-exempt, and, in effect, the forgone tax revenue must be replaced by tax dollars paid by other citizens.”

Calling these schools “predators” and “carnivores,” Duderstadt warns that public universities may have to “unleash the T word, tax policy, and question the wisdom of current tax policies that sustain vast wealth and irresponsible behavior at a cost to both taxpayers and to their public institutions.” As if to underline his point, The New York Times reported a few months ago that part of the fees paid by institutional investors to a management firm spun off from the Harvard Management Company (the nonprofit entity that oversees Harvard’s $22 billion endowment) has been going to Harvard as tax-exempt revenue.4

At the Ivy League colleges, where financial aid was previously awarded strictly on the basis of need, “merit aid” is increasingly used to recruit especially desired students who may not need the money—with the result that less money is available for students who do. And applicants are stampeding toward early admissions programs that offer, in exchange for a promise to attend if admitted, a better chance of getting in. These programs, which now account for roughly half of all enrolled students in the Ivy League, favor candidates from private or suburban schools who have well-connected counselors (sometimes privately hired) and the financial freedom to pick a college without waiting to compare financial aid offers—and the colleges know it.5

In academia, in short, no less than in other privileged corners of American life, money is being funneled into the hands of a relative few. Once-shabby college towns have become boom towns where old dives remembered fondly by alumni are now upscale restaurants to which today’s students bring their high-limit credit cards, and parking lots are crowded with student SUVs.

The recent history of elite higher education is usually told as a glorious story of democratization. But future historians may look back and see something different: a restrictive age of old money (1900–1950), followed by an interregnum of broadened access (from the 1950s into the 1980s) and then a period (circa 1990–?) in which new money poured in. America’s colleges continue to serve extraordinary students from many backgrounds, but Stover’s drinking song is being sung again:

Oh, father and mother pay all the bills,
And we have all the fun,
That’s the way we do in college life.
Hooray!6

2.

Amid these troubling developments, one hopeful sign is the growing public debate over who should go to college and how their education should be paid for.7 Yet one hears comparatively little discussion of what students ought to learn once they get there and why they are going at all. Over my own nearly quarter-century as a faculty member (four years at Harvard, nineteen years at Columbia), I have discovered that the question of what undergraduate education should be all about is almost taboo.

Most American colleges before the Civil War (more than five hundred were founded, but barely one hundred survived) were, in Richard Hofstadter’s words, “precarious little institutions, denomination-ridden, poverty-stricken…in fact not colleges at all, but glorified high schools or academies that presumed to offer degrees.” A bachelor’s degree did not have much practical value in the labor market or as a means of entering the still-small managerial class. The antebellum college was typically an arm of the local church—an academy for ministers, missionaries, and, more generally, literate Christians—that remained true to the purpose of the oldest American college, Harvard, which had been founded in dread “lest the churches of New England be left with an illiterate ministry…when our present ministers shall lie in the dust.”

As sectarian fervor cooled, the colleges became less closely tied to the churches, though most retained a strong religious tone through the mid-1800s. Whatever the particular method or creed, there was consensus, in “an age of moral pedagogy,” that the primary purpose of a college education was the development of sound moral character.8 A senior-year course in moral philosophy, usually taught by the college president, was almost universal. As the grip of religion loosened further over the course of the century, and the impact of Darwin transformed intellectual life, colleges changed fundamentally, becoming largely secular institutions devoted less to moral education than to the production and transmission of practical knowledge.

By the mid-nineteenth century, the need for expert training in up-to-date agricultural and industrial methods was becoming an urgent matter in the expanding nation, and, with the 1862 Morrill Act, Congress provided federal land grants to the loyal states (30,000 acres for each of its senators and representatives) for the purpose of establishing colleges “where the leading object shall be, without excluding other scientific or classical studies, to teach such branches of learning as are related to agriculture and the mechanic arts.” Eventually these “land-grant” colleges evolved into the system of state universities.

At the same time, as the apprenticeship system shrank and some professional careers began to require advanced degrees, the impetus grew for the development of private universities. Some took shape around the core of a colonial college (Harvard, Yale, Columbia), while others (Chicago, Northwestern) came into existence without any preexisting foundation. Still others (Clark, Johns Hopkins) at first had few or no undergraduate students. In 1895, Andrew Dickson White, the first president of Cornell—whose private endowment was augmented by land granted to New York State under the Morrill Act—looked back at the godly era and declared himself well rid of “a system of control which, in selecting a Professor of Mathematics or Language or Rhetoric or Physics or Chemistry, asked first and above all to what sect or even to what wing or branch of a sect he belonged.”

The idea of practical and progressive truth to which the new universities were committed was, of course, not entirely novel. It had already been advanced in the eighteenth century by Enlightenment ameliorists such as Benjamin Franklin, who anticipated a new kind of institution, to be realized in the University of Pennsylvania, that would produce “discoveries…to the benefit of mankind.” Roughly one hundred years later, Charles W. Eliot, the president who turned Harvard College into Harvard University (and who was himself descended from Puritan clergy), explained that a modern university must “store up the accumulated knowledge of the race” so that “each successive generation of youth shall start with all the advantages which their predecessors have won.”

By 1900, professors, no less than physicians or attorneys, had become certified professionals, complete with a peer review system and standards for earning credentials—which one of Eliot’s faculty members, William James, referred to as the “Ph.D. octopus.” Faculty began to benefit from competitive recruitments in what was becoming a national system of linked campuses; and when some rival university came wooing, the first thing to bargain for was, of course, a reduced teaching load. Seven years after Eliot’s inauguration speech in 1869, the Harvard philologist Francis James Child was exempted from grading undergraduate papers in response to an offer of a job from Johns Hopkins.

By the end of the nineteenth century, the professionalized university had absorbed schools of medicine and law that had typically begun independently, and was acquiring teacher-training schools, along with schools of engineering, business, and other professions. It was on its way to becoming the loose network of activities that Clark Kerr, president of the University of California, famously called the “multiversity.” When Kerr coined that term in 1963, in The Uses of the University, he remarked on the “cruel paradox” that a “superior faculty results in an inferior concern for undergraduate teaching,” and he called this paradox “one of our most pressing problems.”

Since Kerr wrote, the problem has gotten worse. Today, as David Kirp points out in Shakespeare, Einstein, and the Bottom Line, New York University, which has lately made a big (and largely successful) push to join the academic front rank, employs “adjunct” faculty—part-time teachers who are not candidates for tenure—to teach 70 percent of its undergraduate courses. The fact that these scandalously underpaid teachers must carry the teaching burden—not just at NYU, but at many other institutions—speaks not to their talent or dedication, but to the meagerness of these institutions’ commitment to the teaching mission. At exactly the time when the struggle to get into our leading universities has reached a point of “insane intensity” (James Fallows’s apt phrase), undergraduate education has been reduced to a distinctly subsidiary activity.9

3.

Under these circumstances, one might expect to see students fleeing to colleges whose sole mission is teaching undergraduates. Fine colleges such as Swarthmore, Amherst, and Williams, which have significant endowments and high academic standards, do indeed have considerable drawing power. Yet these are small and relatively fragile institutions, and even the best of them are perennial runners-up in the prestige game, while other impressive colleges—such as Centre College in Kentucky or Hendrix College in Arkansas—must struggle, out of the limelight, to compete for students outside their region.

The leading liberal arts colleges will doubtless survive, but they belong to an endangered species. Michael S. McPherson, president of the Spencer Foundation and former president of Macalester College, and Morton O. Schapiro, president of Williams, report that even now “the nation’s liberal arts college students would almost certainly fit easily inside a Big Ten football stadium: fewer than 100,000 students out of more than 14 million.”10 In today’s educational landscape, barely one sixth of all college students fit the traditional profile of full-time residential students between the ages of eighteen and twenty-two. One third of American undergraduates now work full-time, and more than half attend college part-time, typically majoring in subjects with immediate utility, such as accounting or computing. These students, and their anticipated successors, are targets of the so-called electronic universities that seek a share of the education market by selling Internet courses for profit. A few years ago, the president of Teachers College at Columbia University predicted that some wily entrepreneur would soon “hire well-known faculty at our most prestigious campuses and offer an all-star degree over the Internet…at a lower cost than we can.”11

As for the relatively few students who still attend a traditional liberal arts college—whether part of, or independent from, a university—what do they get when they get there? The short answer is freedom to choose among subjects and teachers, and freedom to work out their own lives on campus. Intellectual, social, and sexual freedom of the sort that today’s students assume as an inalienable right is never cheaply won, and requires vigilant defense in academia as everywhere else. Yet there is something less than ennobling in the unearned freedom of privileged students in an age when even the most powerful institutions are loath to prescribe anything—except, of course, in the “hard” sciences, where requirements and prerequisites remain stringent. One suspects that behind the commitment to student freedom is a certain institutional pusillanimity—a fear that to compel students to read, say, the major political and moral philosophers would be to risk a decline in applications, or a reduction in graduation rates (one of the statistics that counts in the US News and World Report college rankings closely watched by administrators). Nor, with a few exceptions, is there the slightest pressure from faculty, since there is no consensus among the teachers about what should be taught.

The history of American higher education amounts to a three-phase story: in the colonial period, colleges promoted belief at a time of established (or quasi-established) religion; in the nineteenth century, they retained something of their distinctive creeds while multiplying under the protection of an increasingly liberal, tolerationist state; in the twentieth century, they became essentially indistinguishable from one another (except in degrees of wealth and prestige), by turning into miniature liberal states themselves—prescribing nothing and allowing virtually everything.12 Anyone whose parents or grandparents were shut out from educational opportunity because of their race, ethnicity, or gender is thankful for the liberalizing trajectory of higher education—but as in every human story, there is loss as well as gain.

—This is the first of two articles.

March 10, 2005