"You know," then-candidate Donald Trump said during the 2016 campaign, "I’ve always wanted to say this: … The experts are terrible." Trump has often been caught at a loss over basic issues of public policy and has repeatedly bashed intellectuals who criticized his lack of substance. He famously exulted, "I love the poorly educated," demanded that President Obama prove his American citizenship, and cited the National Enquirer approvingly as a source in charging that the father of one of his opponents, Ted Cruz, was involved in the assassination of President John F. Kennedy.
Higher education is supposed to be a bulwark against these kinds of frontal assaults on established knowledge, empowering voters with the ability to know fact from fiction, and to fight fantasy with critical reasoning.
How’s that going? Not well. In the wake of the 2016 election, half of Republicans believe that Donald Trump won the popular vote (he didn’t), while half of Democrats believe that Russia hacked America’s voting machines (it didn’t).
Colleges are failing not only to educate young men and women but also to provide the foundations of civilizational knowledge on which informed voting and civic participation are built. These failures are threatening the stability and future of American democracy.
The pampering of students as customers, the proliferation of faux "universities," grade inflation, and the power reversal between instructor and student are well-documented, much-lamented academic phenomena. These parts, however, make up a far more dangerous whole: a citizenry unprepared for its duties in the public sphere and mired in the confusion that comes from the indifferent and lazy grazing of cable, talk radio, and the web. Worse, citizens are no longer approaching political participation as a civic duty, but instead are engaging in relentless conflict on social media, taking offense at everything while believing anything.
College, in an earlier time, was supposed to be an uncomfortable experience because growth is always a challenge. It was where a student left behind the rote learning of childhood and accepted the anxiety, discomfort, and challenge of complexity that leads to deeper knowledge — hopefully, for a lifetime.
That, sadly, is no longer how higher education is viewed, either by colleges or by students. College today is a client-centered experience. Rather than disabuse students of their intellectual solipsism, the modern university reinforces it. Students can leave the campus without fully accepting that they’ve met anyone more intelligent than they are, either among their peers or their professors (insofar as they even bother to make that distinction).
This client model arose from a competition for students that has led to institutions’ marketing a "college experience" rather than an education. Competition for tuition dollars — too often drawn thoughtlessly from an inexhaustible well of loans — means that students now shop for colleges the way adults shop for cars. Young people then sign up for college without a lot of thought given to how to graduate or what to do afterward. Four years turns into five and, increasingly, six or more. (A graduate of a well-known party school in California described his education as "those magical seven years between high school and your first warehouse job.")
A limited diet of study has turned into an expensive educational buffet, laden mostly with intellectual junk food, but with little adult supervision to ensure that the students choose nutrition over nonsense. Faculty members often act as retailers for their courses rather than educators. As a professor at an elite college once said to me, "Some days I feel less like a teacher and more like a clerk in an expensive boutique."
These changes weren’t sudden. They have happened over decades. When I arrived at Dartmouth, at the end of the 1980s, my colleagues told me a story about a well-known scientist there who gave a lecture to a group of undergrads on international-security affairs. During the question-and-answer, a student waved away the professor’s views, saying, "Well, your guess is as good as mine." "No, no, no," the professor said emphatically. "My guesses are much, much better than yours."
Today the same dismissal of expertise plays out far less politely. Not long ago, for example, a college student posted a request on social media for information about "sarin gas." Her request was quickly answered by the director of a security-consulting firm in London, an expert in the field of chemical weapons. He offered his help and corrected her by noting that sarin isn’t a gas. The student responded in a storm of outraged ego: "yes the [expletive] it is a gas you ignorant [expletive]. sarin is a liquid & can evaporate … shut the [expletive] up." The security professional, clearly stunned, tried one more time: "Google me. I’m an expert on sarin. Sorry for offering to help." Things did not improve before the exchange finally ended.
Faculty members both in the classroom and on social media report that incidents like that, in which students see themselves as faculty peers or take correction as an insult, are occurring more frequently. Unearned praise and hollow successes build a fragile arrogance in students that can lead them to lash out at the first teacher or employer who dispels that illusion, a habit that carries over into a resistance to believing anything inconvenient or challenging in adulthood.
Even as colleges cater to student tastes and living standards, they burnish their own brands and convince students that they are better educated than they actually are. This is why, for example, colleges since the 1990s have been elevating themselves to "universities." They try to appeal to students who want to believe they’re paying for something in a higher tier — a regional or national "university" rather than a local college. These new "universities" then enter a degree-granting arms race against both established and arriviste competitors, bloating their offerings with extra courses to create make-believe graduate degrees as a means of attracting new funding streams.
All of this borders on academic malpractice. The creation of graduate programs in colleges that can barely provide a reasonable undergraduate education cheats both graduates and undergrads. Small colleges do not have the resources of large universities, and repainting the signs at the front gates and changing "College" to "University" on the stationery cannot magically create that kind of academic infrastructure.
More to the point, this rebranding dilutes the worth of all postsecondary degrees. When everyone has attended a "university," it gets that much more difficult to sort out actual achievement and expertise among graduates. People, especially on social media, will misinform their fellow citizens while boasting that they have completed graduate education and that they are therefore to be taken seriously. The only thing more disheartening than finding out that internet know-it-alls are lying about having multiple degrees is to find out that they are telling the truth.
I am not calling here for slimming colleges down to a bunch of STEM departments with a smattering of English or history majors. I deplore those kinds of arguments, and I have long objected to what I see as an assault on the liberal arts. I don’t want to live in a civilization where there are no art-history, film-studies, philosophy, or sociology majors. The question remains, however, whether students in those majors are actually learning anything, or whether there need to be so many students taking these subjects. There is no way around the reality that students are too often wasting their money and obtaining the illusion of an education by gravitating toward courses or majors that either shouldn’t exist or whose enrollments should be restricted to the small number of students who intend to pursue them with rigor.
When rebranded universities offer courses and degree programs as though they are roughly equivalent to better-known counterparts, they are not only misleading students but are also undermining later learning by laying the foundation for social resentment. If I studied film at a local state college and you went to the film program at the University of Southern California, who are you to think you know more than I do?
There’s plenty of bad faith in these arguments, which are often little more than social one-upmanship. A lousy student who attended a good school is still a lousy student; a diligent student from a small institution is no less intelligent for the lack of an elite pedigree. Still, the faux egalitarianism that assumes all degrees are alike regardless of the quality of the school or the program that produced them not only contributes to an inability to recognize expertise but also undermines further learning by breeding the smug and false faith among citizens that a degree has taught them all they need to know.
The problem of inflated degrees is compounded by the problem of inflated grades. Academic standards have been lowered in an effort to ensure that coursework does not interfere with the enjoyable nature of the college experience. As a University of Chicago study found in 2011, "it does not take a great deal of effort to demonstrate satisfactory academic performance in today’s colleges and universities." Forty-five percent of students reported that in the prior semester not a single one of their courses had required more than 20 pages of writing; 32 percent had not had even one class that assigned more than 40 pages of reading per week.
"I was a straight-A student at a university" no longer means what it did in 1960 or even 1980. A study of 200 colleges and universities through 2009 found that A was the most commonly given grade, an increase of nearly 30 percent since 1960 and more than 10 percent just since 1988.
The impact of lighter workloads and easier grades on civic competence should be obvious. Students graduate with a high GPA that produces illusory confidence but doesn’t reflect a corresponding level of education or intellectual achievement.
Colleges further diminish expertise in the minds of students by encouraging them to evaluate the educators standing in front of them as though they were peers. I am a supporter of some limited use of student evaluations. I will immodestly say that mine have been pretty good since I began teaching, some 30 years ago, and I have no personal ax to grind here. But the whole idea has gone out of control, with students rating professional men and women as though reviewing a movie or commenting on a pair of shoes.
The cumulative effect of this service-oriented, consumer-tested approach to education is to empower cynicism and uninformed judgment over critical thinking, enabling the kind of glib attacks on established knowledge that defeat the very purpose of college. This, in turn, endangers the stability of a republican form of democracy that relies on trusting elected representatives and their advisers to make informed decisions on behalf of citizens.
Having surrendered intellectual authority within the classroom, colleges capitulate similarly outside the classroom doors. The insistence on traditions of free inquiry that took centuries to establish and that scholarly communities are sworn to defend has, in waves of attacks over the course of a half-century, been giving way to a greater imperative: to make the academic community’s young charges feel right at home.
At Yale in 2015, for example, a house master’s wife had the temerity to tell minority-group students to ignore Halloween costumes they thought offensive. That provoked a campuswide temper tantrum that included a professor’s being shouted down by screaming students. "It is your job to create a place of comfort and home!" one student howled in the professor’s face. "Do you understand that?"
Quietly, the professor said: "No, I don’t agree with that." The student then unloaded on him:
"Then why the [expletive] did you accept the position?! Who the [expletive] hired you?! You should step down! If that is what you think about being a master, you should step down! It is not about creating an intellectual space! It is not! Do you understand that? It’s about creating a home here. You are not doing that!"
The house master resigned his residential position, and Yale apologized to the students. The lesson, for students and faculty members alike, was obvious: The campus of a top university is not a place for intellectual exploration. It is a luxury home, rented for four to six years, nine months at a time, by young people who may shout at professors as if berating clumsy maids in a colonial mansion.
A month after the Yale fracas, protests at the University of Missouri at Columbia flared up after a juvenile incident in which a swastika was drawn with feces on a bathroom wall. Exactly what Missouri’s flagship public university was supposed to do, other than wash the wall, was unclear, but the campus erupted anyway. "What do you think systemic oppression is?" a woman yelled at the flustered Mizzou president. "You going to Google it?" she hollered. After a few more days of such theatrics, the president and the chancellor resigned.
This is no longer an issue of the political left or right; academics of all persuasions are distraught at the fragility of 21st-century students. The British scholar Richard Dawkins, for one, was perplexed by the whole idea of "safe spaces," which American students demand as a respite from any form of political expression they might find "triggering." "A university is not a ‘safe space,’ " he said on Twitter. "If you need a safe space, leave, go home, hug your teddy & suck your thumb until ready for university."
The swaddling environment of the modern university infantilizes students and dissolves their ability to conduct a logical and informed argument. Worse, when students learn that emotion trumps everything else, they take that lesson with them as a means of empowering themselves against dissent on any subject.
But colleges can band together to counter at least some of the trends that have eroded their mission and their legitimacy. And professors needn’t wait for any such organized action. They can — beginning today — hold students to clear standards and expect them to learn how to formulate coherent views and to argue them, calmly and logically. They can grade students on their responses to the questions asked and on the quality of their work, not on their political views. They can demand that students treat one another, as well as faculty and staff members, with respect, and that they engage the ideas and beliefs of others in the classroom without emotion or personal attacks.
I realize it’s not as simple as all that. I adhere to these standards as well as I can, and, like my colleagues, I sometimes fail. When students leave my classroom, I am haunted by the realization that I cannot moderate their arguments forever. I cannot prevent them from dismissing others, from rejecting facts, from denouncing well-intentioned advice, from demanding that their feelings be accepted in place of the truth. If they’ve spent four years casually judging their professors, showing disrespect to their institutions, and berating their classmates, they cannot be expected to respect or listen to their fellow citizens.
The battle to maintain sense and civility is especially daunting these days, when America’s new president himself has attacked learning and expertise and has profited politically from the most uncivil and vulgar presidential campaign in modern history. But if college graduates can no longer be counted on to lead reasoned debate and discussion in American life, and to know the difference between knowledge and feeling, then we’re in the kind of deep trouble that no scholar or expert can fix.
Tom Nichols is a professor of national-security affairs at the U.S. Naval War College and the Harvard Extension School, and the author of The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters, to be published by Oxford University Press in March. The views expressed are entirely his own.