In 1829, as the North Carolina legislature debated an expansion of public schooling, a concerned citizen dispatched a letter to The Raleigh Register. “Gentlemen, I hope you do not conceive it at all necessary that everybody should be able to read, write, and cipher,” he wrote. Such luxuries might be defensible for future lawyers and doctors, “but if a man is to be a plain farmer, or a mechanic, they are of no manner of use, but a detriment.”
This question of who gets educated — who deserves the time and resources to learn — is not new. Nor is the desire to distinguish between practical knowledge and idle exploration. The University of North Carolina’s 1789 charter calls for “all useful Learning,” a caveat that stresses higher education’s need to serve worldly ends. But like those book-smart farmers and mechanics, we live in an age when yesterday’s abstract knowledge is becoming tomorrow’s practical necessity.
Manufacturing requires analytics and programming. Farming is a mechanized, globalized industry with environmental, economic, and marketing concerns that would stagger a 19th-century grower. The world changes, and changes again. College presidents are known for singing this song, but we’re not the only ones.
The McKinsey Global Institute predicts that advancing technology will drastically shift the nature of work over the next few decades. Deloitte warns that whole categories of employment — administrative support, retail sales, health aides — will require new skills as people work alongside smarter machines. The U.S. Chamber of Commerce puts it quite bluntly: “Millions of lower-skilled jobs are disappearing, and millions of higher-skilled jobs are being created in their place.”
These are the organizations paid to help the smart money plan for the future, and they are unanimous in calling for higher levels of education.
We’ve been here before. On the eve of the Second World War, about four in 10 Americans over the age of 25 had a high-school diploma. Now, it’s nine in 10. That shift was accompanied by one of the greatest economic booms and most dazzling expansions of knowledge since the dawn of human civilization. If that sounds like hyperbole, it’s only because we live in the world created by that achievement.
Mass education is an American innovation. We are the land of universal high school, of land-grant colleges, of the GI Bill. All along, that democratic instinct has been countered by a thread of skepticism.
Land-grant universities, with their focus on pragmatic research and professional training, brought fears of undermining “true” education. Robert Maynard Hutchins, president of the University of Chicago, warned that the GI Bill would turn colleges into “hobo jungles,” besieged by jobless veterans with no real interest in learning. John Rankin, the Mississippi congressman who chaired the Veterans Affairs Committee at the time, flatly declared that black veterans were incapable of higher learning.
They were all wrong. Education is not a commodity, and making it more accessible doesn’t make it less valuable. Our country thrives when more people get the chance to learn and contribute.
The George Mason economics professor Bryan Caplan takes up the pessimist’s cry in his new book, The Case Against Education: Why the Education System Is a Waste of Time and Money. “As a society, we continue to push ever larger numbers of students into ever higher levels of education,” he wrote in an Atlantic excerpt. “The main effect is not better jobs or greater skill levels, but a credentialist arms race.”
In decrying what he terms a “college-for-all mentality,” Caplan argues that higher education delivers little lasting value. For most students, it’s an inefficient way to signal vague productive capacity to employers. “The labor market doesn’t pay you for the useless subjects you master; it pays you for the pre-existing traits you signal by mastering them,” he writes.
No question, there are students who start college with a wealth of employable talent. At the highly selective institutions that tend to dominate our national discussion of college, many students could’ve “gone pro” right out of high school. It seems cruel to send an 18-year-old straight off to investment banking, especially without some business-ethics courses, but I suppose you could do it.
For millions of others — especially those attending the community colleges and less-selective public institutions that serve the vast majority of American students — college is the place that hones skills and knowledge, builds professional networks, and clarifies life goals. It’s the place where you learn to devote close attention to a hard task, to work alongside others on complex problems, to stick with a long-range challenge. Those “signaled” virtues are well-earned. Data from the Collegiate Learning Assessment show that some of the sharpest student gains happen at regional public universities — institutions that prize opportunity above exclusivity.
Caplan is also wrong about employer indifference toward content. The most popular majors in American colleges and universities are business, biomedical science, health professions, and visual arts. Not coincidentally, those are growing and diversifying sectors of our economy. The market isn’t destroying higher education; it’s endorsing us rather soundly. The rewards for degree completion have continued to grow, even as the percentage of Americans with a degree has risen.
One of the unfortunate side effects of our endless wrangle over who should go to college is that it distracts from more urgent reform. I don’t think every high-school graduate should proceed directly to the nearest university campus, and I don’t know of any college president who does. The conversations I hear among policy makers aren’t about “college for all” but about creating a much clearer set of options beyond high school — from apprenticeships to certifications to community college.
Better choices outside of a four-year degree would be good for everyone, including colleges. Caplan is right to point out that a degree holds life-altering power yet is an all-or-nothing proposition. You either make it to the finish line, or you get little benefit. Our institutions need to embrace alternative credentials, like those from company training programs, online coursework, and military service, and make it easier to pair them with formal coursework. Students need a range of options over a lifetime, not a single, high-stakes shot.
In curtailing the degree’s power as arbiter of someone’s economic future, we can restore the focus on what college is meant to do: push people to think and learn. By welcoming new alternatives for gaining skills beyond high school, we can begin to address the troubling perception of higher education as a source of division rather than a force for opportunity.
Higher education has thrived not because it is exclusive and elite, but because we’ve worked hard to be less so. Our moments of growth and transformation come when we embrace a changing landscape — when land-grant institutions reinvent teaching and research, when the GI Bill redraws our image of the American college student, when states see their public universities as long-range assets rather than near-term burdens.
Today, those ideals demand that we resist the narrow role of a credentialing agency and embrace the broad mission of useful learning in all its forms. That’s our calling.
Margaret Spellings is president of the University of North Carolina system and a former U.S. secretary of education.