Discussion of the various crises in American higher education seems like a mass of irreconcilable contradictions.
On the one hand, commentators inveigh against the soaring cost of tuition at elite universities, driven by wasteful luxury, with climbing walls as the invariably cited example. On the other hand, prospective students and their parents despair over the intensely competitive admissions process at those same colleges, beginning with securing entry to the right preschool, proceeding through an adolescence filled with résumé-burnishing activities, and culminating in the all-important personal essay.
Similar contradictions emerge for the system as a whole. Reports bemoan inadequate college-graduation rates and shortages in vital skills like science and engineering. At the same time, employment options for graduates are so bad that prospective students are told that investing in a college education may be the worst decision they could make.
But the whole picture becomes clearer if you keep in mind one simple fact. Colleges are a mirror of the society they serve. The growth of inequality and entrenched privilege in America (also evident, but less extreme, in other English-speaking countries) is mirrored by the higher-education sector.
At the top, things have never been better, either for the institutions or the students who attend them. But for the middle class, what was once a safe route to ensure that children did at least as well as their parents is now a big gamble, staking crippling student debt against the prospect of entry to the professional classes that make up the top 20 percent of income earners.
Even if going to college has become a much less attractive option, the alternative is far worse. Real wages for men with only a high-school diploma are lower now than they were in 1970, and the chance of holding a job has fallen even more sharply. Combining those factors, earnings for the median man with a high-school diploma and no further schooling fell by 41 percent from 1970 to 2010.
Women did a bit better in the early part of that period but are now experiencing similar trends. For white women with only a high-school education, life expectancy actually declined by as much as five years between 1990 and 2008.
Higher education reflects those trends.
Starting at the top, we have the 1 percent, in this case the Ivy League, along with Stanford, the University of Chicago, and the elite liberal-arts colleges. Those institutions educate 1 percent of the college-age population but account for almost 17 percent of all endowment income, and produce a similarly disproportionate share of the corporate and political elite. Legacy admissions, along with favorable treatment for the offspring of donors, ensure that the children of the 1 percent are massively overrepresented in the Ivy League. Legacy status gives fortunate applicants the equivalent of an additional 160 points on the former 1600-point SAT scale.
That’s a big advantage, but a bright kid whose parents are in the top 20 percent of the income distribution has a shot at getting into the Ivy League even without legacy status. And just about everyone in the upper-income bracket who has a chance goes for it. Not surprisingly, applications vastly outnumber admissions. Stanford now accepts only 5 percent of applicants, and rates are only marginally higher for the other elite schools.
Given the intensity of the competition, and the advantages of having educated and wealthy parents, it’s unsurprising that the 80/20 rule applies: 70 to 80 percent of the students at Harvard and other elite universities come from the top 20 percent of the income distribution. With a base like that, it’s easy to allocate the remaining places, at little or no tuition, to diversity and the spurious promise of a need-blind admissions policy.
And, given the clientele, there’s nothing surprising about the proliferation of luxury dormitories and athletic facilities. The incomes of the top quintile have risen steadily over recent decades, and those of the top 1 percent have risen spectacularly. They see no reason for their children to endure the relatively spartan living conditions that once characterized student life, even at elite colleges. And they have no real objection to universities’ paying big money to elite academics and senior managers. The resulting tuition costs help to tilt the balance against middle-class families not rich enough to pay but too well-off to qualify for full financial aid.
One step down the scale are the state flagships, epitomized by the University of California system. Under the famous 1960 Master Plan, the top 12.5 percent of California high-school graduates were guaranteed a place at one of the flagship UC-system campuses, the top third would be able to enter the California State University system, and the community colleges would accept all applicants. All were tuition-free, and it was possible, at least in principle, to move from the lower tiers to the higher ones.
At the time, the plan was revolutionary. The flagships offered enough places that students from almost any background, given ability and determination, could get in, and produced enough graduates to meet the demand for workers in professional and managerial occupations. Even more striking was the idea that everyone, no matter what their family background or how well they had done in high school, should have a shot at higher education and the social mobility it generated.
A half-century later, in California and elsewhere, the flagships have responded to decades of cuts in state funding by transforming themselves into quasi-private institutions, relying heavily on private philanthropy and tuition fees from out-of-state and international students. In-state students, who make up a declining proportion of a shrinking enrollment, have faced huge increases in tuition, despite declining expenditures per student.
Similar processes have worked themselves out in the second-tier state-university systems. Although barely mentioned in many discussions of higher education (since most of the participants in those discussions were educated at private institutions or state flagships), non-research-intensive state universities represent the core of American higher education. Where they perform well, they offer a high-quality, lower-cost alternative to the research-intensive flagships. Where they perform badly, they are disaster areas, with as few as 1 percent of students graduating in the standard four years.
Even in six years, some state universities graduate fewer than 25 percent of their students. Most students who attend those institutions take on high debt and get little or nothing in return. Steadily shrinking support has produced more disasters and has compromised quality even at the best of those institutions.
An alternative to the state system, seen by some observers until quite recently as the great hope for the future, is the for-profit sector, epitomized by the University of Phoenix. It has become increasingly apparent, though, that the sector, taken as a whole, is little more than a system for getting students to take on debt and extracting Pell Grants from the federal government. According to a report by the Education Trust, only 22 percent of first-time, full-time bachelor’s-degree students who enrolled at for-profit colleges in 2008 graduated within six years, a rate comparable to the worst performers in the state sector.
At the bottom of the status hierarchy are community colleges. The two-year institutions have accounted for most of the growth in postsecondary education in recent years. They cater mainly to the lower middle class, for whom education was once the route to upward social mobility. Those institutions are failing badly. In Divided We Fail, Colleen Moore and Nancy Shulock found that six years after initial enrollment, only about a third of California community-college students have completed a degree, about half have dropped out, and around 15 percent are still enrolled. National studies paint a similar picture.
Finally, of course, there is the option of not going to college at all. In the mid-20th century, that wasn’t a bad choice. Unionized blue-collar workers could make a middle-class income and look forward to the same or better for their children. In the 21st century, entering the work force with only a high-school diploma (or worse, without one) is a virtual guarantee of poverty and unemployment, with the prospect that your children, and theirs, will be trapped at the bottom of an increasingly rigid social hierarchy.
To some extent, that gloomy picture is the result of policy choices specific to higher education. A reversal of the decades of expenditure cuts would do a lot to restore access to high-quality education for the children of the middle and working classes. And elite institutions could pay more attention to economic diversity among their students, particularly by scrapping legacy admissions.
But changes like those would have only marginal effects if the trend toward entrenched inequality continued. Whatever political efforts might be made within higher education, the 1 percent will find ways to push their children to the front of the queue, while the poor will, for the most part, never even get to apply.
John Quiggin is a fellow in economics at the University of Queensland, in Australia; a columnist for The Australian Financial Review; a blogger for Crooked Timber; and the author of Zombie Economics: How Dead Ideas Still Walk Among Us (Princeton University Press, 2010).