
Peter Stackpole/Life Picture Collection/Getty Images

Members of the Yale Whiffenpoofs, the oldest collegiate a cappella group in the United States, early 1950s

Death may be the great equalizer, but Americans have long believed that during this life “the spread of education would do more than all things else to obliterate factitious distinctions in society.” These words come from Horace Mann, whose goal was to establish primary schooling for all children—no small ambition when he announced it in 1848. Others had already raised their sights higher. As early as 1791, exulting in the egalitarian mood of the new republic, one writer declared it “a scandal to civilized society that part only of the citizens should be sent to colleges and universities.”1

How that part has grown is a stirring story. It begins in the colonial period with church-funded scholarships for the sons of poor families. It continued after the Revolution with the founding of public universities such as those of North Carolina and Virginia. In the midst of the Civil War, it was advanced by the Morrill Act, by which Congress set aside federal land for establishing “land-grant” colleges, many of which became institutions of great distinction. By the later nineteenth century, when most colleges still admitted only white men, the cause was advanced again by the creation of new colleges for women and African-Americans.

In the twentieth century the pace quickened. The GI Bill (officially the Servicemen’s Readjustment Act of 1944) encouraged veterans to continue their education by giving them money for tuition and living expenses, and helped to drive up the number of American men graduating from college each year from 100,000 in 1940 to 300,000 in 1950. Amid cold war anxiety about a “brain race” with the Soviet Union, the National Defense Education Act of 1958 extended benefits such as graduate fellowships to women, and the Higher Education Act (HEA) of 1965 included a “Work-Study” program providing eligible students with campus jobs. When the HEA was reauthorized in 1972, grants for low-income students were added—known today as Pell grants in honor of their principal sponsor, Senator Claiborne Pell of Rhode Island.

Some of these measures were promoted by Republicans, others by Democrats, and all commanded a degree of bipartisan support that in retrospect seems remarkable. Moreover, they were effective. The number of Americans between ages twenty-five and twenty-nine holding a four-year college degree rose from one in twenty in 1940 to one in four by 1977. And if the federal government did much to make this happen, the states did more. Led by California, which virtually guaranteed college access to every high school graduate, many states committed themselves to providing high-quality public education at low cost.

All these strategies had in common one basic motive: to bring college within reach of those for whom it would be otherwise unattainable. In 1817, a North Carolina jurist called for “some just and particular mode of advancing…poor children…from the primary schools to the academies, and from the academies to the university.” A century and a half later, at the signing ceremony for the HEA, President Johnson promised that “a high school senior anywhere in this great land of ours can apply to any college or any university in any of the 50 states and not be turned away because his family is poor.”2 As Suzanne Mettler writes in her valuable book, Degrees of Inequality, “access to college” was becoming “a right of American citizenship.”

Today this story is stalled. At the top of the prestige pyramid, in highly selective colleges like those of the Ivy League, students from the bottom income quartile in our society make up around 5 percent of enrollments. This meager figure is often explained as the consequence of a regrettable reality: qualified students from disadvantaged backgrounds simply do not exist in significant numbers. But it’s not so. A recent study by Caroline Hoxby of Stanford and Christopher Avery of Harvard shows that the great majority of high-achieving low-income students (those scoring at or above the ninetieth percentile on standardized tests, and with high school grades of A- or higher) never apply to any selective college, much less to several, as their better-off peers typically do.3 Their numbers, which Hoxby and Avery estimate at between 25,000 and 35,000 of each year’s high school seniors, “are much greater than college admissions staff generally believe,” in part because most such students get little if any counseling in high school about the intricate process of applying to a selective college—and so they rarely apply.

As for the colleges themselves, seeking out larger numbers of qualified low-income students and providing financial aid for them as well as for middle-income students are likely to require compensatory reductions in budgets for competing priorities. Scholarships for needy students can be solicited from donors—but there are “opportunity costs” in the sense that asking alumni to give to the scholarship fund precludes or reduces their giving for other purposes such as faculty chairs or a new dorm or gym. At my own university, which has lately raised billions of dollars in large part to finance a campus expansion, the percentage of college students receiving financial aid has, inexcusably, declined.4


Meanwhile, at public institutions, which enroll many more students than private colleges—some 14 million of the roughly 18 million undergraduates who attend nonprofit colleges—subsidies to keep college affordable have also been dropping. Mettler points out that between 1980 and 2010, average spending on higher education slipped from 8 percent to 4 percent of state budgets. Some states have seen a modest recovery since the Great Recession, but recently the governors of Wisconsin, Louisiana, and Illinois have proposed new cuts.5

As a result, the cost of public higher education has shifted markedly from taxpayers to students and their families, in the form of rapidly rising tuition. Between 2000 and 2008, the proportion of family income required for families in the bottom income quintile to cover the average cost of attending a four-year public institution rose from 39 percent to 55 percent. For top-quintile families over that same period, the corresponding rise went from 7 percent to 9 percent.6

The story these numbers tell is of a higher education system—public and private—that is reflecting the stratification of our society more than resisting it. Those students who do get to college are distributed, like airline passengers, into distinct classes of service, but with incomparably larger and lingering effects. In 2010, private nonprofit universities, whose students tend to be relatively affluent, spent on average nearly $50,000 per student—with the wealthiest colleges spending nearly double that amount. At public four-year institutions expenditure per student was $36,000, while community colleges, where minority and first-generation students are concentrated and which stress vocational training and offer associate rather than bachelor’s degrees, could spend just $12,000 per student. Moreover, while the number of Pell grants for needy students has jumped over the last forty years from half a million to more than ten million, a Pell grant in the 1970s covered four fifths of total cost at the average four-year public university. Today it covers less than one third.

These numerical disparities have stark human consequences. Getting through college can be a challenge even for confident students with families on whom they can count to cushion the shock if a parent falls ill or loses a job, or if unexpected expenses arise. For students without such a buffer, college can be a very hard road indeed. Yet in our current system the relation between vulnerability and support is an inverse one. One result is that graduation rates are the same for low-income students with high test scores as for high-income students with low scores.7 In the United States today, three of every five children from families in the top income quartile earn a bachelor’s degree by age twenty-four, while for those in the bottom quartile the rate is one in four (see Figure 1).

[Figure 1]

These are indefensible realities in a nation that claims to believe in equal opportunity. Yet some people look at this picture and say that the whole idea of mass higher education was misguided from the start—that the United States should have emulated instead the European model of test-based tracking by which a select few are chosen early in life for university training that leads to public service or the professions, while the rest are channeled into vocational schools or the trades.

Former secretary of education William Bennett, for instance, believes that “students might be better off investing their tuition money in stocks and bonds rather than in a degree from one of our nation’s many colleges.”8 Bennett is well known for his love of gambling, but given the proven advantages for the college-educated in the form of higher wages and lower unemployment, anyone taking his advice (never mind that low-income students have no money to invest) would be making a very bad bet.

Critics like Bennett are right, however, to decry what’s happening—or not happening—to many students who do get to college. Too few are challenged or given guidance and encouragement. Cheating is common, including at elite private colleges and the so-called public flagships.9 In a widely noted 2011 book, Academically Adrift: Limited Learning on College Campuses, the sociologists Richard Arum and Josipa Roksa gave a grim account of college as a place where students are held to low standards in an atmosphere of wasteful frivolity. In their new book, Aspiring Adults Adrift: Tentative Transitions of College Graduates, they stress that the likeliest victims of “late adolescent meandering” are students from low-income backgrounds who come out of college aimless, demoralized, and with fewer chances than their more affluent peers to recoup lost opportunities. In Paying for the Party: How College Maintains Inequality, Elizabeth Armstrong and Laura Hamilton speak of “an implicit agreement between the university and students to demand little of each other.” And they, too, make the case that students with the fewest family resources have the lowest post-college prospects.10


Colleges and universities cannot be expected to solve America’s problems of inequity. They cannot repair broken families, or make up for learning deficits incurred early in childhood, or “level the playing field” for students with inadequate preparation. But they should be expected to try to mitigate these problems rather than worsen them—and one main reason they are failing to do so is their relentlessly rising cost.

Almost every year, when private colleges announce tuition increases greater than the rate of inflation, pundits cry foul and pin the blame on luxuries (the proverbial “climbing wall”) for pampered students, or on lavish compensation for senior administrators. In 2012, thirty-six private university presidents earned more than a million dollars—some a lot more—and many supplement their salaries with “service” on corporate boards. Especially in straitened times, these excesses are, to say the least, tasteless. They make presidential homilies urging students to put aside selfishness ring hollow. But they contribute only marginally to the college “cost disease.”11

A deeper cause is the dual purpose of universities—to create new knowledge while transmitting what is already known. New fields of inquiry arise (pursued in new departments by new faculty), while old fields are rarely relinquished—at least not at the same pace. Thus costs almost always grow faster than savings.

And then there is the nature of teaching itself. Unlike transactions such as banking or shopping, teaching cannot be made more efficient by substituting capital for labor, or, like certain kinds of manufacturing, by economies of scale—at least not without degrading its quality. Some people think that digital technology will soon make it possible to provide good teaching for large numbers of students without a commensurate rise in cost.12 This remains to be seen. In the meantime, the main strategy for keeping instructional budgets under control has been the replacement of full-time faculty with low-paid adjuncts, many of whom carry heavy course loads, sometimes at several campuses, with little time to devote to individual students. No one except budget managers likes the results.

The problem is compounded when leading members of the faculty are rewarded with reduced teaching obligations—which means that additional faculty must be hired to pick up the slack. And since the prestige of the institution, on which its ability to attract paying students depends, is largely a function of its reputation for “cutting-edge” research, colleges are under constant pressure to construct new laboratories as well as state-of-the-art athletic facilities and performance and social spaces. (The trustees of one university, Auburn, recently approved construction of a new football scoreboard at a cost of $14 million.) There is a recurring imperative, too, for adding administrative staff in order to provide career and psychological counseling for increasingly stressed students, and to cope with government regulation in every area, from financial compliance to the adjudication of allegations of sexual harassment or assault.13

Usually, when the subject of college cost comes up, outrage is focused on the spectacular “sticker price” of highly ranked colleges (at Columbia, where I teach, the published price of attendance is approaching $70,000 per year). But in fact with roughly half the students at the highest-priced colleges receiving some financial aid, only about 2 percent of all college students in the United States pay above $50,000—and they come from families who (though they may not like it) can generally afford it.14 Meanwhile, in order to attract a sufficient number of students, private colleges of lesser reputation—though not necessarily lesser quality—provide such high discounts on their published price that net tuition dollars (the amount received after financial aid has been awarded) have been virtually flat for years—a trend, in the face of rising costs, that can threaten an institution’s financial sustainability.15 Sweet Briar College, a small women’s college in rural Virginia with declining enrollments and a discount rate of nearly 60 percent, announced in early March that it would close down after more than a hundred years.

But if there are problems in the private sector, by far the biggest driver of rising tuition has been the withdrawal of state investment from public institutions. Between 1998 and 2008, tuition rose at four-year private colleges by 33 percent, while at four-year public universities it climbed by 54 percent—a divergence that widened with the Great Recession.16 And this has happened at a time when most Americans have experienced wage stagnation.

To make matters worse, the trend in dispensing financial aid—both by the states and by colleges themselves—has been moving away from calculations of need toward assessments of “merit” as demonstrated by test scores, extracurricular activities, and the like.17 Since low-income students are often saddled with family responsibilities and summer and term-time jobs, they tend to have fewer opportunities than their more affluent peers for the sort of entrepreneurship, volunteer work, or studies abroad that may impress admissions officers. Many colleges seek to elevate themselves in the US News and World Report rankings, in which test scores count, and there is close correlation between performance on standardized tests and family income. Moreover, a partial “merit” scholarship for an affluent student, which may be enough to persuade that student to enroll, of course costs less than a full scholarship for a needier student.

For these reasons and more, a growing portion of college subsidies is going to students from relatively prosperous families. Many of the beneficiaries are doubtless extremely worthy. But they are students for whom, as William Zumeta and his colleagues write in Financing American Higher Education in the Era of Globalization, a scholarship award may make a difference in the decision about which college to attend, but is “less likely to make a difference in whether or not a student attends college” at all.18

In short, the United States has a serious structural problem: the cost of college is rising faster than public, institutional, or, for most Americans, personal resources available to meet it. One ominous sign is that Hispanics and African-Americans, especially young men, are lagging badly behind whites in educational attainment (see Figure 2 below). If these problems are not addressed, we are likely to become, if we are not already, what Mettler calls “a society with caste-like characteristics.”

[Figure 2]

What is to be done? Some would say that “the market” is already doing a lot. A torrent of private loans is flowing into the gap between costs and subsidies. For most of our history, banks avoided lending to students, since, unlike home mortgages or car loans, student debt is unsecured by real property and borrowers can take years to repay, leaving lenders with low liquidity and high risk of delinquency or default. But in the second half of the twentieth century, government began to make student loans more attractive to private investors. The HEA included a provision by which government would pay the interest while students were still in school, then split the interest payments with borrowers after graduation—the first step toward what has grown into a complex system of government-guaranteed private loans.19

Loans really began to flow in the 1970s as enthusiasm waned for what conservatives called government “handouts,” and consensus grew, even among liberals, that students ought to have some “skin in the game.” In reauthorizing the Higher Education Act in 1972, Congress created the Student Loan Marketing Association (Sallie Mae), modeled on the Federal National Mortgage Association (Fannie Mae), with the purpose of buying loans from private lenders and “bundling” them for sale in the secondary securities market. The idea was to inject more capital into private banks in order to encourage more lending, including lending to students.

This idea worked very well or very badly, depending on one’s view of what Joel and Eric Best, in their informative history, call The Student Loan Mess. Today, some 70 percent of graduates leave college with debt whose aggregate size—over $1 trillion—exceeds the national total of credit card debt, with the average student borrower owing around $30,000. Some people think this is a bubble waiting to burst.20 Others are less apocalyptic but believe that student debt suppresses young people’s consumer spending (thereby hurting the general economy) and delays home-buying and family formation.

On the other hand, a recent report from the Brookings Institution argues that “the debt picture for the typical college graduate is not so dire,” and that in most cases monthly payments remain reasonably manageable.21 Whichever side one takes in this debate, there is little doubt that especially for low-income students, indebtedness has become another obstacle to college completion.

When it comes to inflicting debt on low-income students, among the worst offenders are the for-profit “universities,” to which Mettler devotes an excoriating chapter. In the last several decades these institutions have seen explosive growth. With the contraction of state budgets and the changing demographics of college students—many today are adult part-time students, often with families, seeking skills and credentials with which they hope to get or hold a job—the for-profit institutions, which specialize in online and night classes, proliferated and expanded. In 1990 there were 343 for-profit colleges. By 2009 there were 1,199, accounting for nearly 10 percent of all college students, many of them vulnerable and needy—military veterans, single parents, dropouts from traditional institutions who are un- or underemployed.

Mettler tells hair-raising stories about telemarketers who prey on prospective students with misleading promises, and are then rewarded with bonuses or free vacations if they reach enrollment quotas. She describes “an auto repair school in Ohio that was actually run out of a fruit stand” and a beautician school that tells students to expect annual earnings up to $250,000. These are small operations, but the largest for-profits such as the University of Phoenix, whose enrollment peaked at nearly 600,000 in 2010, are several times the size of the largest public universities such as Arizona State or Ohio State.

Large or small, fair or fraudulent, the for-profits have a smart business plan: keep costs down by forgoing a campus and offering courses taught by part-time employees, while pulling in maximum revenue in the form of government grants to students and government-guaranteed student loans. In 1992 Congress passed a rule requiring that, in order to remain eligible for federal student aid, a for-profit must show that at least 15 percent of its income comes from nongovernment sources, but in 1998 the rule was watered down to 10 percent. “Despite being regarded as part of the private sector,” as Mettler puts it, “the for-profits are financed almost entirely by American taxpayers.” For investors in these enterprises, it’s been a great plan. For many if not most of their students, whose debt and default rates are proportionally higher than those of their counterparts in traditional colleges, it’s been a less happy story.

But the heyday of the for-profits may be over. Under President George W. Bush, a former University of Phoenix lobbyist was appointed assistant secretary for postsecondary education, and regulations, already lax, became even more so. Under President Obama, however, and with more aggressive scrutiny by state attorneys general, government oversight has tightened. Enrollment at the University of Phoenix has dropped by close to two thirds. In 2014, Corinthian Colleges, which once enrolled more than 70,000 students and was taking in $500 million per year in Pell grants (more, as the higher-education analyst Kevin Carey points out, than the entire University of California system), was dissolved in a settlement with the federal government after being charged with falsifying graduates’ employment data, among other abuses.22

These responses to the college cost crisis—the loan flood and the for-profit boom—were never likely to ameliorate educational inequity in the United States. If anything, they have made things worse.

But there are some hopeful signs. During the first Obama administration, Congress disallowed government-subsidized private lending and expanded a program of income-based loan repayment that had begun under President Bush. Caps on monthly repayment obligations have been lowered to 10 percent of the borrower’s income. Loan forgiveness is now available after twenty years, and, for people working in government or nonprofit public service, after ten years. We may be at the start of what Carey calls a “quiet revolution in helping lift the burden of student debt.”23

Other promising ideas are in the air—though in the present political climate few seem likely to find bipartisan favor. In his 2015 State of the Union Address, President Obama called for making community colleges free—which conservatives reject as a bill too big for government to pay. Some liberals caution that waiving tuition (already covered for Pell-eligible students) could have the effect of shifting subsidies from truly needy students to others, less constrained, who may choose free community college for the first two years before transferring to a four-year institution.24

Many writers on policy, including Mettler, urge that Pell grants should be converted into entitlements, so that, like Social Security benefits, they would provide regular cost-of-living increases rather than depend on periodic congressional reauthorization. The Harvard economist Edward Glaeser has made the interesting suggestion that a portion of Pell funds should be held back from colleges until students who bring in the grants are actually graduated. His idea is to provide incentives for colleges to help low-income students finish their degrees—an approach something like the way construction loans are paid out in stages as the stipulated work is completed. And University of Michigan economist Susan Dynarski points out that small “nudges”—text messages or phone calls from counselors to at-risk students, reminding them of deadlines and offering practical advice—can make a big difference in college attendance and completion rates.25

If we ever return to a functional politics, these and doubtless many other sensible ideas could eventually make improvements on the revenue side of the college affordability problem. On the cost side, however, the burden falls squarely on institutions themselves, since, as Zumeta says, there will be no “expansion of college opportunity without significant improvements in cost effectiveness.”

With this imperative in mind, former presidents William Bowen and Eugene Tobin (Bowen of Princeton, Tobin of Hamilton College) have issued a call, in their book Locus of Authority, for reform in academic governance. They think that universities are held back from plausible reforms—experimenting with technologies that could improve efficiency, consolidating duplicative programs, cutting down on unproductive research—by a certain provincialism whereby faculty tend to think exclusively about what’s good for their department rather than for the whole institution. With a parenthesis that may strike some readers as an afterthought, Bowen and Tobin detect a “strong aversion, especially on the part of faculty (but sometimes on the part of administrators too) to any discussion of cost savings in the case of educational programs.”

This is often true. A related point has been made more bluntly by David Rosen, an English professor who writes that just when we “appear to be entering a new Gilded Age, with institutions of higher learning as willing or unwitting accomplices,” faculty—many of whom call themselves leftists—“seem ready to politicize everything but the immense changes occurring before their very noses.”26 Decrying inequality is commonplace on many college campuses these days, but the question seldom comes up whether the college itself is helping or hurting through its own admissions and aid practices, and if it’s the latter, what might be done about it.

All in all, despite an emerging recognition that we must change course, the story told in the books under review is a dispiriting one. Mettler attributes the decline of educational opportunity since the 1980s to a failure of “upkeep,” by which she means the failure of government to renew and adapt policies from the past in order to advance their original purposes in the present and future. This strikes me as a generous explanation. The truth may be uglier. Perhaps concern for the poor has shriveled not only among policymakers but in the broader public. Perhaps in our time of focus on the wealthy elite and the shrinking middle class, there is a diminished general will to regard poor Americans as worthy of what are sometimes called “the blessings of American life”—among which the right to education has always been high if not paramount.