In 2017, the University of Virginia reported an operating budget of almost $3.2 billion, assets of $11.2 billion, and liabilities of more than $7.8 billion. The university includes UVA Global LLC, a wholly owned subsidiary based in Shanghai; an athletics enterprise with 25 programs and $24 million in revenues and expenses; a police force with 67 officers; an investment company that manages resources from 25 tax-exempt foundations, each with its own board; ownership of numerous art, historical, and scholarly collections, including more than five million printed volumes; capital assets in the form of academic buildings, dorms, and a Unesco-recognized World Heritage Site; a top-ranked medical center with several affiliated health companies, more than 12,000 employees, and its own budget of almost $1.5 billion; a concert-and-events venue for everything from monster-truck rallies to the Rolling Stones; a recycling business; a mental-healthcare provider; and a transportation system with a fleet of buses and cars. Incidentally, UVa also educates around 16,000 undergraduates and 6,500 graduate and professional students each year.
In 1963 the University of California’s president, Clark Kerr, famously predicted this state of affairs when he described the postwar American university as a “multiversity” — an institution serving varied, even conflicting, interests and oriented to a range of purposes.
Today, Kerr’s multiversity seems quaint. Universities both public and private contend with an ever-expanding range of demands and expectations: that they satisfy the health-care needs of local populations, that they redress manifold social inequalities, that they serve as engines of economic activity and growth — even as the most elite among them have historically exacerbated some of these very same problems.
The multiversity Kerr described was not the result of any considered plan or coherent philosophy. Rather, it emerged inadvertently as a congeries of historical conceptions of the university. Kerr identified three salient traditions. The first was represented by Cardinal Newman, founding rector of the Catholic University of Ireland, which eventually became University College Dublin. Newman regarded the purpose of the university as the pursuit of knowledge for its own sake, cultivating gentlemen suited to lives of erudition, taste, and intellectual refinement. The second was embodied in Abraham Flexner, an American educational reformer who, in 1930, founded the Institute for Advanced Study, in Princeton, N.J. He invoked a German model that defined the university as an institution devoted to specialized research.
Finally, Kerr described the “American model,” which he saw most strongly reflected in the land-grant movement of the latter half of the 19th century. This distinctly American idea of the university was born of an explicit twinning of higher education and the democratic project, opening the doors of the academy to a broader public and emphasizing such “practical” fields of study as engineering and agriculture. If Newman’s university served the generalist and Flexner’s the specialist, the American model was to serve the demos.
Kerr saw all three models as coexisting in the multiversity. The balance among them varied by institution, but, under the watchful stewardship of presidents, they remained in a general state of homeostasis. In the 55 years since Kerr’s treatise, however, the “American model” has increasingly eclipsed the other two. Regardless of what they do or how they fund and organize themselves, American universities understand themselves as institutions in service to the public.
Yet this shift has proceeded with no clear sense of who that public is or why universities ought to serve it — and, perhaps most important, no clear sense of how they should serve it. While the challenge of Kerr’s multiversity was to balance the interests of students, faculty members, administrators, and governing boards, its contemporary derivative — call it the “omniversity” — is beholden to a public whose often conflicting interests are far more complex and intractable than suggested by the deceptively monolithic ideal of “the public.”
American universities’ democratic commitment has been both empowering and imperiling. It has compelled them to open themselves up to previously excluded publics, but it has also encouraged them to accrue a range of functions they were never designed for — and are often ill-equipped to take on. The university’s appetite for always doing more could prove to be its undoing.
If the university is to flourish and continue to play a vital role in American life, it needs to reinterpret its democratic legacy. And it needs to do so with a frank acknowledgment of the fragility of the public it purports to serve. The university is what it is today, in part, because of the atrophy of other public institutions, which has left universities to fill a widening void. Higher education is in a precarious position; so too is the American republic. In order not just to save themselves but to fulfill their social role, universities need a more refined understanding of their responsibilities to the public — and of how to meet them in ways that are consistent with their own animating purpose. They also need an honest appreciation of their limits.
We issue this charge from the University of Virginia, an institution that, since its founding 200 years ago, has cast itself as a guardian, as Thomas Jefferson put it in 1818, of “the public prosperity,” even while, for much of its history, consistently limiting that “public” to wealthy white men. One of us is a member of the faculty; the other is a career administrator. If the chatter about higher education is to be believed — faculty members bemoaning “administrative bloat” and parasitical “BS jobs,” and administrators sighing about the vanity and cluelessness of the faculty — surely we represent warring sides in the struggle for the soul of the university. But we see our respective roles as expressions of the university’s admirable if often misdirected interpretation of its public purpose. And on this we agree: In an era of public disinvestment and disintegration, universities need to reconsider how they can most effectively serve “the public” whose flourishing they have long been charged with sustaining.
Until the last third of the 19th century, American higher education was a predominantly private and ecclesiastical endeavor. But the federal government intervened on a grand scale with the passage of the Morrill Act in 1862, granting states federal land for the establishment of public universities to offer training in agriculture, engineering, and the liberal arts.
Until then, most colleges educated a largely East Coast, Protestant, white-male elite, molding Christian gentlemen who would go on to lead the nation. If universities existed “to form the statesmen, legislators and judges, on whom public prosperity and individual happiness are so much to depend,” as Jefferson first described the purpose of UVa, then the newly imagined institutions of the Morrill Act sought to serve that public more directly.
By the end of the century, American universities had begun not only to rival those in Berlin and Göttingen but also to represent a new and distinctly American vision of higher education. The University of Chicago and its founding president, William Rainey Harper, are exemplary. In “The University and Democracy,” a speech delivered at Berkeley in 1899, Harper lent the legislative intent of the Morrill Act a moral force. The university, he declared, found its legitimacy in a democratic public; it was “of the people, and for the people, whether considered individually or collectively.”
But Harper raised the rhetorical stakes: The university was also the “agency established by heaven itself to proclaim the principles of democracy. ... It is the university that, as the center of thought, is to maintain for democracy the unity so essential for its success.” The university was a divine agent entrusted with a historical task. It was the prophet who proclaimed the promise of democracy, the philosopher who reflected on its problems; it was the priest who preserved the practices of human communion and maintained the “religious cultus” of democratic traditions. American higher education did not just educate democratic citizens. It sustained democracy itself.
Harper’s vision became a guiding ideal across America. In 1905, Charles Van Hise, president of the University of Wisconsin, first articulated what would come to be known as the Wisconsin Idea: he would not rest until the “beneficent influence of the University reaches every family of the state.” Addressing Princeton’s faculty and students as early as 1896, Woodrow Wilson declared that “when all is said, it is not learning but the spirit of service that will give a college place in the public annals of the nation.”
Over the course of the 20th century, this democratic self-understanding begot wondrous goods. Universities gradually expanded access, admitting women, Jews, and people of color. From 1890 to 1940, enrollment increased roughly fivefold. After World War II, enrollment rates soared, then stabilized in the 1970s, only to continue their upward advance in the 1980s and 1990s. Universities could no longer legitimately presume that their student bodies of years past — elite, white, male, generally Protestant — represented the public beyond their campuses.
As universities enrolled more students, they also assumed more and more complex social functions. Between 1900 and 1950, they absorbed professional training, establishing schools for business, medicine, dentistry, pharmacy, law, and theology, among others. They built dormitories, hospitals, football stadiums, laboratory schools and child-care facilities, public museums and performance halls, and, as the footprint of the campus expanded, transportation systems and off-campus housing.
During the Cold War, serving the public also turned out to be in universities’ material interests. They took advantage of enormous federal resources. Beginning with the Servicemen’s Readjustment Act of 1944 (the GI Bill) and continuing with the National Defense Education Act of 1958 and the Higher Education Act of 1965, universities were transformed. Enrollments, physical and administrative infrastructure, student financial aid, and federally funded research all expanded. Increasingly, universities turned to their alumni as a significant source of revenue. To invest in higher education was to invest in the democratic project. This was a social compact.
The irony of the success of the “American” model was that it authorized a proliferation of purposes. Harper had described the university as a prophet of democracy. Prophets are chosen instruments, divinely elected to share a god’s clarion message. But whose voice, amid the myriad voices of “the people,” was the university to relay?
The reimagination of the university as fundamentally democratic was glorious and good, but also, as it turned out, perilous. It licensed universities to pursue, relatively unconstrained, activities and functions for which they were often ill-equipped; it also distracted them from their basic strengths in education and research.
There is another reason, just as crucial, that the American university developed into the corporate hydra it is today.
In neighborhoods and communities across the country, universities are often not just the largest employers but among the few remaining public institutions. Over the past half-century, and with accelerating speed since the 1980s, universities have attempted to fill a growing vacuum in American public life. They’re not growing (simply) to satisfy the egos of presidents or mollify crazed sports fans, but to address real and pressing social needs such as health care, child care, economic development, and social mobility. The plight of contemporary universities is that they may be the last public institutions left standing.
They can’t do it alone. And the more they try to, the more their authority and potential for public good are eroded — internally by increasingly fractious debates among faculty members and administrators about “mission creep,” resource allocation, “corporatization,” and so on; and externally by government and public perceptions that (ironically) universities are pursuing private interests and should therefore forfeit the rights and privileges, such as tax-exempt status, afforded to entities serving the public good.
In this twilight of American public life, universities have assumed the responsibility not just to address — to study, theorize, research — social problems but also to redress them. They have increasingly assumed the burden of attempting to repair and sustain a society that extends well beyond their campus gates. Consider the large-scale projects that Harvard and the University of Pennsylvania have undertaken in the past several years to expand their campuses and “revitalize” and “renew” local neighborhoods — projects motivated by a broader failure of social institutions. Such efforts at “urban renewal” have hardly been uniformly salutary, often resulting in the destruction of functioning communities — in some cases, ironically, to accommodate new schools of public health or social work. Universities have always served distinct functions and purposes. But how much should we, that ill-defined demos, hold them responsible for repairing a society whose public institutions are unraveling?
The fate of American universities over the course of the 20th and now 21st centuries has been inextricable from the fate of American society more broadly. How can they fulfill their democratic responsibilities but avoid the endless accretion of functions that risks undermining them? How can universities adjudicate among their proliferating purposes?
Scholars such as Christopher Newfield have consistently called for universities to recover a “public good conception” to overcome their capture by private interests. But it is precisely such a vague public commitment that makes the contemporary university’s situation untenable. The conflicting interests of the public, the systematic and long-term disinvestment in public institutions more broadly, the amalgamation of public and private interests — all of these make any return to an unalloyed commitment to an idealized “public” difficult and ill-advised. The university’s democratic commitments have become too centrifugal, pulling apart its interests, energies, and purposes. To save itself and to better serve its democratic purpose, the university needs to be not more but less reactive to public demands.
One consequence of the ascendance of the “American” model is that it forced universities to justify themselves in public terms. But today the only widely shared moral language, the only commonly accepted way to talk and think about ideals and purposes, is the rubric of economic utility. So universities describe themselves in terms of economic value — their contributions to economic development, to technological innovation, to work-force training. The collapse of our public institutions might be matched only by the poverty of our moral imaginations.
What we are calling for is a university whose democratic responsibilities are revivified by its animating purpose: what Daniel Coit Gilman, founding president of the Johns Hopkins University, described as the “acquisition, conservation, refinement, and distribution of knowledge.” Universities cannot sustain the aspirations of a democratic society on their own. But they can — and ought to — serve democratic ideals by more intently focusing on their unique role: to create and share knowledge.
The university’s role in a thriving democracy, however, makes sense only within a coherent and functioning social whole. Its ability to educate and create knowledge depends upon local preschools to care for the young children of faculty and staff members and students; a reliable public transportation system to deliver people to and from campus; a health-care system to tend to our bodies and minds; and local, state, and federal governments to pass relevant legislation and fund civic infrastructure, including the work of universities themselves.
Democracy does not need a prophet; it needs a public. And universities can help sustain, nurture, and establish that public by bringing knowledge out into the world and defending it as a common good. The history of American universities and that of the American republic are interwoven, and so too are their futures. It is not enough to save the university; we must redeem American public life.
Correction (8/8/2018, 12:56 p.m.): This article originally stated that John Henry Newman was the founder of the University of Dublin in the mid-19th century. Newman was in fact founding rector of the Catholic University of Ireland, which eventually became University College, Dublin (not the University of Dublin, which was founded in the 16th century). The article has been updated to reflect this correction.
Adam Daniel is senior associate dean for administration and planning at the University of Virginia, where Chad Wellmon is a professor of German language and literature.