In 1904, while touring the eastern half of the United States, the German sociologist Max Weber encountered an institution that would intrigue him for decades: the American college. Between delivering lectures and finishing the final drafts of what would become The Protestant Ethic and the Spirit of Capitalism (1905), Weber visited Columbia University, Harvard University, Haverford College, and Northwestern University, among others. On their pretty, well-kept campuses, he found, as he wrote to his wife, Marianne, a fascinating “wild muddle” (“ein wildes Durcheinander”) of education, religion, and socialization. The American college was neither a finishing school for the wealthy nor a training academy for professionals; it was a social institution that performed liturgies, cultivated character, and helped to sustain the culture of a uniquely American capitalism.
More than a century after Weber’s U.S. tour, the character of our colleges prompts polemical debate. A wave of student protests in recent years at places like Yale, Middlebury, and the University of Virginia received national media attention. Scrutiny of the political activism and moral lives of American undergraduates and faculty members is an established genre of cultural criticism. From campus novels such as F. Scott Fitzgerald’s This Side of Paradise (1920) and John Williams’s Stoner (1965), which lampoon the high-minded lasciviousness of fraternity brothers and the pretensions of English professors, to the periodic reports in The New York Times on the latest campus craze, America’s cultural elite keeps up with its alma mater. The four years spent at Middlebury, Yale, or UVa are, as Weber observed, the most “magical memories of youth” — compelling 40-year-olds to return for class reunions and to give money. This nostalgic attachment sustains the scrutiny of campus life. Reading about the latest outrage is like catching up on the family drama at Thanksgiving dinner — an experience of sometimes proud, sometimes repulsed, recognition.
More recently, some scholars have trained their own expertise on campus life. One of the most prominent, Jonathan Haidt, a social psychologist at New York University and an advocate of what he calls “viewpoint diversity,” recently warned that the string of protests in 2015 marked a turning point for higher education. Where previously they had pursued truth as their singular purpose, elite universities now embraced “social justice.” Vaguely appealing to Aristotle, Haidt asserts that universities can’t have two purposes, much less conflicting ones. They have to choose: truth or justice.
In fact, universities have never had just one purpose. They have always pursued multiple, often competing ends. For centuries, they have organized themselves around the maintenance of church doctrine and the education of the clergy and a confessing citizenry, the production of economic value, the formation of citizens (democratic and otherwise), the creation and transmission of knowledge, the maintenance of culture and class. In the 13th century, the University of Paris educated the clergy and sustained a scholarly community; in the 18th century, the University of Göttingen, in Germany, raised funds for the state coffers and created the first scholarly research library; in the late 19th century, the University of Chicago formed democratic citizens and produced scientific research for industry.
In 1862, Congress passed the Morrill Act, which granted federal lands to the states to endow colleges and universities. It led directly to the founding of new universities, such as Cornell and the University of California, and to the support of scientific schools at existing institutions, such as Yale’s Sheffield Scientific School. Although the act is most often remembered for its support of “agriculture and the mechanic arts,” it also singled out “liberal” education as a public good and mandated its support. The act declared the creation and transmission of knowledge a democratic and social project. By legislative decree and financial incentive, it tethered one of the university’s historical purposes, the pursuit of truth, to another, the pursuit of social goods.
In the following decades, higher-education leaders across the country cast this dual mission of truth and public good as the defining purpose of a distinctly American university. In 1874, Andrew D. White, a historian and founding president of Cornell University, told a national gathering of educators in Detroit that while individual states had assumed responsibility for primary and secondary education, they had left higher education to “various religious sects” that had done “lasting injury” to the well-being of the nation. The American public, White told his audience, needed universities that were “supported by the whole people and for the whole people” and dedicated to “public civic action.”
For Weber, the American college was a unique social institution. He often contrasted it to German universities. In 1911, Weber told an assembly of German academics in Dresden that the new U.S. institutions had adopted a German model but adapted it to a different culture and to different ends, and that their scope now rivaled that of even the grandest of German universities.
They had developed out of an older American institution — the college. Originally established by religious sects to train ministers, American colleges had organized themselves around the strict discipline of student life. They required students to attend chapel services (often multiple times per day), maintained a fixed curriculum, and sought to form good Christian gentlemen. This cultivation project culminated in a fourth-year course in moral philosophy, typically taught by the college president — usually an ordained minister — and designed to ground Protestant morality, character, and dogma in common reason and shared social commitment. Such capstone courses showed students how religious doctrine fit with modern knowledge, and how both related to life. Truth, in its scientific, moral, and religious dimensions, was one.
As American colleges and universities were transformed into research institutions and, as Weber put it, became more “metropolitan” — loosening or even severing their denominational affiliations, replacing the classical curriculum with an elective one, relaxing compulsory chapel attendance — they maintained their commitment to the ethical formation of their students.
At Northwestern University, Weber was astounded that students had to account for their attendance in a “chapel record.” In chapel, they heard not only sermons and Scripture readings but also the most recent theological scholarship, and were dismissed only after the day’s football and baseball schedules had been announced. Weber left Evanston impressed by the social life and ethos of undergraduate education, whose “output,” he wrote, was immediately obvious: “endless intellectual stimulation,” habituation to hard and serious work, lasting friendships, and enduring forms of sociability. The “college-bred man,” one American businessman assured Weber in Chicago, learned not just to reflect upon the world but also to act in it and for it.
German and American institutions of higher education formed two distinct types of people. German universities trained specialists according to the ideals of Wissenschaft, or scholarship, while American universities formed young men of “character” according to the ideals of an emerging culture of industrial capitalism. Whether public or private, university or college, American institutions of higher education socialized students. They taught young men how to hold their own among “social equals,” Weber said in Dresden, and formed a social disposition that was the “foundation” of American society. Whereas 18th- and early-19th-century Harvard, Princeton, and Yale had formed clergy, by the turn of the 20th century they cultivated capitalists of character — cool and efficient managers as well as captains of industry brimming with confidence that they could change the world.
By the mid-20th century, the Anglo-Protestant monopoly had largely eroded, and with it the ability of elite colleges to form character and robust social bonds on the basis of a homogeneous culture they had long taken for granted. American colleges and universities grew again, increasing their enrollments. They slowly admitted women, Jews, Catholics, and students from outside the elite. To replace the 19th-century courses in moral philosophy, and the character they aimed to instill, these institutions sought a curriculum that aspired to a unified account of knowledge and morality.
Scientists proved reluctant to offer their research as resources for moral instruction. Faculty members in literature, art, philosophy, and history were less so. Beginning with Columbia’s “Contemporary Civilization” course, in 1919, institutions introduced general-education programs in an effort, it seemed to many, to recover the university’s pursuit of truth as a matter of both knowledge and morality. From the 1930s to the ’50s, literary scholars, philosophers, and historians banded together to form a new institutional structure that claimed a monopoly over questions of morals and values. They called it “the humanities.”
In the 1960s, students demanded more engagement with social concerns, and universities responded by gradually introducing new humanities courses and programs: in black studies, women’s studies, and ethnic studies. Many conservative commentators, such as Allan Bloom, condemned such movements for supplanting the pursuit of truth with the performance of politics.
These curricular reforms, and the moral mission and democratic purpose that their proponents saw in them, have never been fully accepted. This contest underlay the canon wars of the 1980s and ’90s and, in the past few decades, has led to protracted debates about (post)modern theory in literary studies and to activist groups’ monitoring of “politicized” courses and alleged political indoctrination. Most recently, there have been worries that conservative donors have bought influence by financing university-based institutes and faculty positions. But those efforts will, thankfully, never succeed in reproducing the social homogeneity and exclusion upon which the moral culture of the 19th-century college depended.
Amid these feuds, another, in some ways bigger, change has escaped due attention: College curricula have become largely peripheral to moral education. Within almost all colleges and universities, with the exception of some religiously affiliated institutions (and even then unevenly), moral education has shifted from the curriculum — from classrooms and labs — to extracurricular student life.
Stories about colleges coddling students with lazy rivers, climbing walls, and buffets provoke ire but fail to convey the richness and scale of extracurricular life. Students at places such as Middlebury, Yale, and Virginia spend more time participating in clubs, fraternities or sororities, and athletics than they do in class or studying. At Virginia, students lead more than 800 clubs and groups, most with weekly meetings and events — from the Jefferson Literary and Debating Society to the quidditch team; they volunteer hundreds of thousands of hours every year for local, national, and international groups; they participate in collegiate sports, practicing and training as much as 20 hours per week; and they participate in arts programs, from small writing groups to internationally recognized orchestras.
Students are formed primarily outside of classrooms and apart from their professors. And it’s this institutional separation that is central to understanding contemporary college campuses.
As universities experimented with various curriculum-reform efforts in the first half of the 20th century, as the historian Julie Reuben has shown, they simultaneously reasserted their control over student life by introducing new institutional programs and structures almost entirely independent of the curriculum. Most of these extracurricular initiatives sought to instill moral character and a sense of common purpose. What happened in the classroom was about knowledge. What happened outside was about life and, however fragmented, morality.
Universities built dormitories, student unions, undergraduate libraries, and residential colleges. The famed residential-college systems at Harvard and Yale were not established until the early 1930s, with Yale’s first seven colleges opening in 1933. Universities also established offices of student life and advice, offered counseling, and invested heavily in sports. Although American college students had been playing football for decades, it wasn’t until the second decade of the 20th century that universities constructed big stadiums and hired professional coaching staffs and athletic directors. They transformed collegiate athletics into ritual. Loyalty and love for one’s alma mater could be cultivated through the traditions of football games. The liturgical habits of the Protestant college were reinvented. The cadences of compulsory chapel gave way to stadium chants; the college professor ceded his moral standing to the football coach.
When universities shifted moral education out of their curricula and into the realm of extracurriculars, they relieved faculty members of responsibility for the ethical formation of their students. And the faculty members consented, grateful for the time to focus on their research and their own administrative obligations. Professional administrative-staff members gradually took on responsibility for character and moral education.
Many of my students at Virginia, along with some of their peers at Yale, Middlebury, and elsewhere, are acting out of the complex tradition of American higher education. They are heeding and reinterpreting the dual purpose of the American college and university as Weber observed it, and as generations of university leaders have extolled it: to serve the public by forming citizens committed to civic action; to change the world. In its own way, the recent increase in student-led campus activism represents a return to these earlier ideals, an attempt to align knowledge and moral vision in a common institutional project. In a sense, it is the student activists, not the “viewpoint diversity” advocates, who are the conservatives, the ones making explicit moral judgments and defending ideals.
All of this is taking place against a backdrop of bigger change: Education has become an almost incidental endeavor. The University of Virginia, for example, is an entertainment-and-production company (UVa’s concerts and events), a health-care provider, a start-up incubator, a federally financed research unit, a philanthropic behemoth, a sports franchise, a police force — as well as a community that educates and creates knowledge. And these multifarious activities correspond to a range of distinct purposes. Contemporary universities are expected to educate, democratize, credentialize, socialize, and produce economic value. They have become all-purpose institutions bound together by a development office, a roster of sports teams, and a president who is not just a “captain of erudition” or a mediator of the multiversity but also CEO of a diversified, international corporate endeavor.
The transformation of American colleges and universities into corporate concerns is particularly evident in the maze of offices, departments, and agencies that manage the moral lives of students. When they appeal to administrators with demands that speakers not be invited, that particular policies be implemented, or that certain individuals be institutionally penalized, students are doing what our institutions have formed them to do. They are following procedure, appealing to the institution to manage moral problems, and insisting that the system’s overseers turn the cant of diversity and inclusion into real change. A student who experiences discrimination or harassment is taught to file a complaint with the appropriate office by submitting a written statement; the office determines whether the complaint has merit; if it does, the office conducts an investigation and produces a report; an executive accepts or rejects the report; and the office “notifies” the parties of the “outcome.”
These bureaucratic processes transmute moral injury, desire, and imagination into an object that flows through depersonalized, opaque procedures to produce an “outcome.” Questions of character, duty, moral insight, reconciliation, community, ethos, evil, or justice have at most a limited role. American colleges and universities speak the national argot of individual rights, institutional affiliation, and complaint that dominates American capitalism. They have few resources from which to draw an alternative moral language and imagination. My students have adapted the old Protestant college’s moral mission to the demands of the institutions in which they now find themselves.
The extracurricular system of moral management requires an ever-expanding array of “resources” — counseling centers, legal services, deans of student life. Teams of devoted professionals work to help students hold their lives together. The people who support and oversee these extracurricular systems of moral management save lives and inspire students, but they do so almost entirely apart from any coherent curricular project.
It is entirely reasonable, then, for students to conclude that questions of right and wrong, of ought and obligation, are not, in the first instance at least, matters to be debated, deliberated, researched, or discussed as part of their intellectual lives in classrooms and as essential elements of their studies. They are, instead, matters for their extracurricular lives in dorms, fraternities or sororities, and student-activity groups, most of which are managed by professional staff members who, for many faculty members, seem to work in a wholly separate institution. The rationalization of colleges and universities has led to the division not only of intellectual labor (through academic specialization) but also of basic educational functions.
More than a century ago, Weber glimpsed in American colleges the promise of a more convivial capitalism, the moral habits and social bonds that could sustain a form of life not fully defined by the cold, calculating rationality of modern capital. This culture had shaped students into capitalists of character and institutions into “industrial enterprises,” each led by a president who relentlessly competed for talent, resources, and money.
Weber’s hopes for a capitalism tempered by collegiate conviviality were never realized. Over the course of the 20th century, America’s elite colleges and universities continued to outcompete, outperform, and quite simply out-capital not only their German predecessors but almost all institutions of higher education around the world. On their way to unparalleled prestige and wealth, however, they also fostered campus cultures and moral environments often indistinguishable from the techno-utopian hypercapitalism of international corporations.
And yet the “wild muddle” that is the elite American college remains, however embattled. Amid the absurdity of some recent campus incidents, it provides a privileged elite with resources, space, and time not wholly beholden to markets. Almost despite itself, it continues to be a fugitive space, where a poem can capture the attention of an 18-year-old for days, an essay can steal the time of a 40-year-old faculty member for weeks, and the children of those who never belonged can bring bookish ideas — about democracy, duty, justice, evil, truth — to bear on how they live. There are few institutions of its kind left.
Chad Wellmon is a professor of German studies at the University of Virginia. An earlier version of this essay appeared in Aeon.