In the weeks following the U.S. presidential election in November, Twitter was aflutter with the suggestion that a Biden-Harris administration could issue an executive order canceling student-loan debt. The responses ranged from the moralizing — “Why should I pay for other people’s poor choices?” — to the hortatory — “Higher ed is a right!” — to the pedantic — “Historian of higher ed, here.” And then there was the sociologist Tressie McMillan Cottom, quote-tweeting those who couched their opposition to student-debt forgiveness as a concern about the majority of Americans without the “luxury” of a college degree.
McMillan Cottom’s tweet distilled the argument she had first made in her book Lower Ed (2017): The explosive growth of for-profit colleges has been fueled in part by federally backed student loans, a wildly disproportionate share of which are owed by Black men and women. Elite higher ed, in her words, “legitimizes the education gospel while” Lower Ed “absorbs all manner of vulnerable groups who believe in it.”
What must one believe in to be willing to borrow tens of thousands of dollars in order to pursue a certification of completion — a B.A.? What would a college have to promise in order to compel someone to do that? What would a bank have to believe to extend this person credit? Or the U.S. government, to guarantee such loans en masse — now roughly $2 trillion? And what would a society have to believe to sustain the system that keeps it all going?
The word credit comes from French and Italian words meaning “belief” or “trust,” and it is related to the Latin noun for “loan” or “a thing entrusted to another” (crēditum) and the verb “to trust” or “to believe” (crēdere). Credit is a form of trust that one person or group has in another, and that serves as the basis for the former to provide the latter with something (typically goods or money), with the expectation that the person so entrusted will return it within a certain period of time. In a relationship based on credit, belief and trust become practice. In the United States, it’s just this type of relationship that underpins the financing of higher education.
Colleges have for centuries benefited from the belief that they could provide prospective students, as well as institutions (the Roman Catholic Church, the state, the military, aristocratic classes), with particular goods (social recognition, status or class membership, discrete skills or knowledge, money, prestige). But it wasn’t until the middle of the 20th century in the United States that a belief arose in the capacity of colleges to transform not just the lives of social elites but the lives of all people — and also to directly change society.
Where did this belief come from? How did it become, in the United States at least, nearly universal, almost assumed? Who made use of it, and to what ends?
This belief was bolstered by a particular Cold War mixture of liberal humanism, a progressive theory of history, global finance, and technocratic governance. Together, these four facets made up an ideology meant to modernize not just the United States, Canada, and Britain but also the postcolonial world emerging from crumbling empires. Few spread the gospel of higher ed more authoritatively than Clark Kerr, who served as chancellor of the University of California at Berkeley (1952-57), president of the University of California (1958-67), and chairman of the Carnegie Commission on Higher Education (1967-73).
But that gospel is no longer news. It’s an ossified dogma that obscures the pact structuring American society since the middle of the 20th century and continues to misshape our lives.
In an essay based on his inaugural address as chancellor of the University of California at Berkeley, in 1953, Kerr explained to the faculty, of which he had been a member since 1945, that it was party to an implicit contract that set the terms of the university’s relation to society. In return for the provision of laboratories, libraries, and the freedom to practice the “calling of the scholar,” universities provided society with ideas and a labor force trained for the industrial age — the “raw materials of progress.” By accepting industrial society’s material and moral infrastructure, Berkeley’s faculty members obligated themselves to produce knowledge for it.
The implicit compact between university and society drew its force from history itself. Society supported faculty members and the university, contended Kerr, “in the belief that they are part of a process by which men are able to discover the truth and, through this truth, control their destiny.” Kerr acknowledged how untimely such a belief was. Speaking in the early years of the Cold War, when fears of fascism and totalitarianism constrained political hopes, and fatalisms of both the Christian and the existentialist sort cultivated a debilitating hopelessness, Kerr described the present “age of doubt” as a capitulation to a theory of history that dressed despair up as mature realism. In fact, he said, such despair betrayed a fear that human beings could not control their own futures. This disposition was especially detrimental to the university because it eroded the confidence necessary to fulfill its function: to serve as the intellectual agent of development and modernization.
Kerr exhorted Berkeley’s faculty members to recognize the existential challenge that confronted them and to accept their historic responsibility, a call they could heed only if they believed in progress and the university’s singular role in realizing it. “The university of today,” announced Kerr, “is founded on the faith” that people can consciously direct human progress and control the future. Without this faith, there would be no role for the university in industrial modernity.
Kerr delivered his address on “The University in a Progressive Society” several years after the publication of such liberal classics as Reinhold Niebuhr’s The Nature and Destiny of Man, Lionel Trilling’s The Liberal Imagination, and Arthur Schlesinger Jr.’s The Vital Center. Presenting the future as a dangerous course between the Scylla of communist utopias and the Charybdis of fascist totalitarianism, these works shaped the “bleak liberalism” that the scholar Amanda Anderson has shown was common among American intellectuals between 1930 and 1950. Writing in 1957, Judith N. Shklar, then a young political theorist at Harvard and now best known for her decidedly bleak “liberalism of fear,” identified, like Kerr, a melancholia among American liberals. They had concluded that, as Schlesinger put it, “man was, indeed, imperfect.” Shklar diagnosed American liberals as lacking political faith, a faith in the “power of human reason expressing itself in political action.”
Shklar sought a new politics — one propelled by radical hope and uncowed by liberal anxieties about totalitarian threats and the dangers of wanting too much. Kerr, who trained as a labor economist, took a different path. He hoped that higher education might facilitate a competent managerial and scientific elite, a technocracy, powered, as he put it in 1969, by “the largely hidden hand of the experts in the offices of government agencies, corporations, trade unions, and nonprofit institutions working with cost-benefit analysis, with planning, programming, and budgeting.” Such a system would coordinate minds and matter, and bring about the end of ideology and political conflict. Kerr’s address was an early statement of the belief in the individually and socially transformative power of colleges and universities to reduce inequality and ensure unending economic growth. It was also an articulation of a Cold War liberalism whose imprint lives on in the institutional norms, ideals, and infrastructures of contemporary higher education.
For Kerr, all of these benefits were couched in terms of an explicit theory of history, one that guided his academic leadership: what he called, in his 1953 address, the “continuing upward movement of our Western civilization.” The university was not only a trove of accumulated knowledge but also a source of confidence, a consoling sign of the historical necessity of the path of progress as it had developed over the past “three centuries of Western thought.” With this forward-looking narrative, which rendered “the West” synonymous with “science” and the “scientific revolution,” Kerr countered Cold War worries about cultural decline.
This path was moral as well as intellectual. Kerr propounded a liberal perfectionism — a modern faith that absolved human beings of their finitude (or sin) and legitimated a belief that the future was ours to master. In the modern system of higher education, the function of “ethics” was to make moral sense of the work that scientists and technologists had already done, and the function of the humanities was to provide “leisure” for a largely satisfied society of the university-educated middle class.
Like other liberal institutions — competitive markets, democratic governance, and a free press — the Cold War American university was part of a system that sustained, as Kerr put it, a “permanent revolution” meant to drive humankind to ever greater heights of well-being, as measured by economists and other social scientists. Over the course of his career, Kerr attempted to induce trust not so much in individual humans’ capacity to reason and deliberate, but rather in competitive markets and the expert systems that protected those markets from outside interference. Systems, not human beings, make reason rational; science, not scientists, creates knowledge; markets, not people, choose. Individual ideas are of “no value at all” until they have been processed. Reason only becomes real in the “hidden hand” of processes and systems.
Over the next two decades, in what he would retrospectively dub the “golden age” of American higher education, Kerr developed his progressive theory of history into a detailed, normative account of modernization. In Industrialism and Industrial Man (written with three other labor economists), Marshall, Marx and Modern Times, myriad lectures and short essays, and the more than 160 reports and publications he oversaw as chairman of the Carnegie Commission, Kerr and his colleagues described, measured, and defended what they considered to be the “necessary” path of social development, not just for the United States but for all societies and people as they progressed toward the highest stage of human development: industrial modernity.
In so doing, Kerr and his fellow mandarins of modernity (the phrase is Nils Gilman’s) articulated the fixed function colleges and universities played in this historical and social process of global development. They cast the university as the central institution in a system of collective, “evolutionary” rationality, a system whose functions included not just the transmission of knowledge and the production of ideas but also “the instruments whereby men control their environment.” The university was the instrument of post-ideological social management.
Reflecting decades later on his wartime work as a labor economist — during World War II, he had served as the West Coast director of wage stabilization for the National War Labor Board — Kerr wrote that war had compelled him to eschew academic and political theories and to focus instead on pragmatic, even experimental, approaches to increasing the nation’s production. This purported rejection of “ideology” led economists such as himself to jettison economic dogmas from classical economics and Marxism to Friedrich Hayek’s early, philosophically inflected neoliberalism, and to become, as Kerr put it, “more unified in outlook and more neutrally professional.” Neither pro-labor nor pro-management, those economists sought “workable policies” for immediate problems, not ideas for “the best of all possible worlds,” not “Procrustean beds for facts from theories and ideologies.” These were the dispositions that would shape his tenure in higher education, too.
The postwar university created ideas and knowledge by producing particular types of people — “experts,” as Kerr explained in a 1968 lecture, who “help settle the inevitable conflicts of interest on the basis of facts and analysis.” These experts were the guardians of knowledge and the agents of progress, and, regardless of their particular industry or economic sector, they were purveyors of the belief that American universities were essential to both. This elite was distinguished not only by its technical skills but more basically by its induction into the all-enveloping “web of rules” — conventions, norms, and moral values — that structured industrial modernity and safeguarded its global markets. Universities educate the kinds of people required by the industrial age, inculcating the requisite ethos and moral aspirations: choice, consent, adaptability, and consumerism.
As Kerr envisioned it, this system would eventually give rise to a global meritocracy, one that amplified the homogenization of elites and experts, and increased their financial and social distance from everyone else. “The elites,” wrote Kerr and his co-authors in Industrialism and Industrial Man (1960), “become less differentiated, the ideologies become more pragmatic; the old culture becomes dimmer in the memory. The elites all wear gray flannel suits; the ideological controversies become more barren; the cultural patterns of the world intermingle and merge.”
Kerr claimed to be observing a rearrangement of the global social order and a new stage in the history of capitalism. Whereas the feudal lord had faced the peasants and the industrial capitalist had confronted the working class, now the professional manager — whether managing a bit of the federal or state bureaucracy, a division of GM, or a university — faced “knowledge workers.” This made college campuses, not the manor or the factory, the contemporary locus of social and political conflict. Technological advances and increased access to higher education, observed Kerr, had created a new “intellectual class,” one dissatisfied with the distribution of authority of earlier forms of capitalism. (Kerr encountered such “dissatisfaction” directly on his own campus when, in 1964, the Free Speech Movement denounced him and the “machine” he managed.)
Kerr believed that the administrative and managerial methods he had helped to devise and enact in postwar American universities could be applied at much greater scales and across the globe. Decentralization, human-capital development, and administrative and managerial reforms borrowed from General Motors and Shell Oil (and adapted by Cresap, McCormick, & Paget, one of the first higher-ed consulting firms, during Kerr’s tenure as UC president) weren’t only efficient mechanisms for economic growth — they were forces capable of propelling all properly modern social structures and of producing, as Schlesinger put it, “a wide amount of basic satisfaction and … a substantial degree of individual freedom.” These processes would necessarily spread across the globe, especially the decolonizing world.
Between 1950 and 1970, Kerr’s belief in higher education was buoyed by research on the correlation between educational attainment and life “outcomes” as measured by prospective employment and earnings. The belief was institutionalized by unprecedented state and federal investment in public research universities and the creation of a vast tier of four-year state and community colleges — all subsidized by student grant and loan programs.
These state and federal investments were premised on the idea that colleges raised the individual student’s employment prospects and future earnings, drove economic growth, and would therefore lead to social equality. The Servicemen’s Readjustment Act of 1944 (the GI Bill) introduced a decades-long series of federal-university joint ventures that eclipsed such previous efforts as the Morrill Act of 1862 and the New Deal’s more limited 1930s investments in colleges. The 1950s brought the National Science Foundation, several new institutes within the National Institutes of Health, and federal research dollars, along with large state investments in colleges in California and New York. The year 1965 saw the Higher Education Act, which, when reauthorized in 1972, introduced means-tested Pell Grants (then called Basic Educational Opportunity Grants) and helped to reduce direct tuition expenses for low-income students at the lowest-cost institutions.
The faith that universities could generate rising incomes and social equality was hard won. For several decades, Kerr and his allies worked to identify the university with society generally, thereby obscuring the interests of the university itself. For a brief time, this was rhetorically effective because it transformed the internal functions and problems of the university — everything from its finances and governance to curricular battles and student protests — into general cultural concerns. What happened on college campuses was a microcosm of the larger culture and society. This was the basis for the claim that higher education could solve economic and social inequality. But when this faith in the link between the university and society began to erode as student protests raged and the oil shocks and economic turmoil of the early 1970s hit, the question of who benefits, who pays, and who should pay became a cultural and political fault line.
After Ronald Reagan’s election as governor, in 1966, and, to his delight, the UC Board of Regents’ subsequent dismissal of Kerr (Reagan had cast Berkeley as the epitome of irresponsible university radicalism), the belief in universities as central agents of modernization persisted, but in new and different forms. In 1972, Kerr, now chair of the Carnegie Commission on Higher Education, traveled to the University of Nairobi, where he delivered a lecture titled “Education and National Development.” With colleges in the United States facing, as one Carnegie Commission publication was titled, a “new depression,” Kerr asked if higher education, first in the United States and now possibly in Kenya, had proven to be “another god that had failed.”
As in the United States, across Africa a “revolution” in higher education, noted Kerr, had brought unprecedented enrollments; between 1950 and 1970, the number of colleges between the Sahara and the Limpopo grew from four to 30. In both the United States and East African nations such as Kenya and Tanzania, however, doubts about the contribution of higher education to economic growth and “development” had increased as underemployed college graduates “flooded” cities and “shunned agricultural and manual labor.” Standing squarely in the future he had once so confidently predicted, Kerr counseled his audience in Nairobi and across the decolonizing world to engage in a “more realistic” appraisal of the role higher education played in shaping society.
Kerr’s postcolonial revision of higher education’s historical function included a major correction of one of Cold War liberalism’s basic tenets: the presumed close correlation between higher education, economic growth, and individual earnings. Now, in 1972, he described those connections as “loose.” Yet he professed the same belief he had for two decades, namely that higher education was a “necessary condition” for national and global progress. Economic data would not be allowed to weaken the faith. What mattered more than measures like average annual earnings or GDP, argued Kerr in Nairobi, was people’s “attitudes toward modernization,” their trust in “political, scientific elites,” and their own confidence as consumers. The belief in higher education mattered not because it nurtured intellectual desire or bound scholarly communities together but rather because it remade people in the image of Cold War liberalism and consumer capitalism.
With American universities facing apparent financial ruin and postcolonial universities figuring out what their social role ought to be, Kerr suggested a change in the system: The private benefits of higher education, he explained, had “so outrun” the social benefits that it was now necessary to reduce “social costs” by shifting the financial burden away from society as a whole and onto private interests — students and their families. He even approvingly cited a proposal in Kenya’s 1964 Development Plan to move toward loans as “a means of financing higher education.”
Kerr’s revision of the contract between the university and society at the dusk of the short-lived “golden age” of American higher education maintained the liberal belief in higher education but adapted it to the political and economic conditions of the late 1960s and early 1970s — and to newly ascendant concepts about competitive markets and the moral values they prioritized: private self-interest and choice. In so doing, Kerr helped to transform this belief into a justification for student debt.
In January 1967, Governor Reagan had proposed that California’s higher-education budget be cut by 10 percent and, to make up for it, that California’s public colleges charge tuition for the first time since 1900. His plan echoed the views of some of UC’s own faculty members, such as the UCLA economists Armen A. Alchian and William R. Allen, who argued that advocates of zero tuition overlooked the “bonanza” enjoyed by the rich in such a system, in which the “residents of Watts subsidize the residents of Beverly Hills.” As Alchian and Allen explained:
College-calibre persons are, in fact, rich in their inherited mental talents. Such “human capital” is wealth, and for the talented, this wealth is of great magnitude. Further, slighting such human wealth is to ignore the difference between wealth and current earnings. A man with a pool of untapped oil is rich — although he is not now marketing his resource. Similarly, the current earnings of an intelligent youth may be small, but his wealth — present value of his future earnings — is large. College students, even those with little present income, are not poor. Subsidized higher education gives the student a second windfall — a subsidy to exploit his inherited windfall of talent. This is like subsidizing drilling costs for owners of oil-bearing lands.
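The arithmetic behind the oil-well analogy is simple to sketch. As a back-of-the-envelope illustration (the figures below are hypothetical, not Alchian and Allen’s), a student’s “wealth” is the discounted present value of the additional earnings a degree is expected to generate over a working life:

$$
PV = \sum_{t=1}^{T} \frac{E_t}{(1+r)^t}
$$

If a degree is assumed to add E_t = $10,000 a year to earnings over a T = 40-year career, and that stream is discounted at r = 5 percent, the present value comes to roughly $170,000. By this reasoning, even a student with no current income is already “rich,” and zero tuition amounts to a further subsidy on top of that inherited wealth.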
Access and equal opportunity could be provided by encouraging students to borrow against their future income, which “will presumably be enlarged by their present training.” It’s here that belief in higher education is transformed into financial speculation. Trust and belief become creditworthiness.
Alchian and Allen’s argument marked a shift in how a belief in the potential economic advantages of a college education could be used. (And they make no mention of anything having to do with intellectual ideals and virtues, or the value of truth-seeking.) It shifted the burden of debt away from public responsibility in the form of state and federal tax dollars, and toward private responsibility and personal choice. Three years later, in 1971, a founding father of law and economics, Henry G. Manne — a self-described acolyte of Alchian — argued that if colleges were forced to compete in a free market and rely on market solutions, students would finally be liberated to become who they were always meant to be: “sovereign consumers.”
The idea of the “sovereign student consumer” was not entirely incongruous with ideas Kerr and like-minded Cold War liberals had championed for the previous 20 years: Higher education was not just efficient at expanding the national GDP, but also constituted a profitable investment in individuals. When President Lyndon B. Johnson signed the Higher Education Act of 1965, he emphasized the “personal value of college” and the “increased personal reward,” and then clarified the nature of that value and reward: “The nation couldn’t make a wiser or more profitable investment.” The HEA doubled the federal government’s annual budget for colleges and universities, and the almost $2 billion in student aid it provided helped, among other things, to triple the number of Black students enrolled in colleges and universities between 1968 and 1978.
It also adopted the economic and moral logic that Kerr had sought to institutionalize for over a decade in California. Taking on debt to finance the surest path to a middle-class American life didn’t require the faith of a mustard seed, but it did demand a desire to become middle class. Becoming (or remaining) middle class also meant trusting experts like Kerr and his colleagues on the Carnegie Commission, who in 1971 declared that American students ought to become “accustomed to the idea of borrowing … against future earnings” to finance their college educations.
Kerr’s explicit embrace of a previously implicit market logic around 1970 marked the beginning of a two-decades-long transformation of the university into a system that leveraged faith in higher education to create markets for student credit and debt. By coupling students’ aspirations for financial success to the financial interests of colleges, the system sustains an asymmetric alliance. One party has dreams and hopes; the other has decades of data, professional know-how, and, as a middleman in the federally underwritten student-aid financial complex, plausible deniability about its own role in the system.
In 1960, Kerr was on the cover of Time magazine, where an article dubbed him the “master planner,” a phrase Kerr would later use in “The Uses of the University” lectures to describe the leader of the modern “multiversity.” But he later said that the more accurate title for a university president was “image maker.” Fundamental to the presidential persona was an ability to enchant, to make potential students, and society at large, believe not only in a particular university but in the liberatory promise of higher education as such. The irony of the image of Kerr as the “master planner” and technocratic manager of the multiversity was the degree to which he understood that technocracy, like any system of authority, required faith. This faith motivated individuals seeking a better life and shaped institutions that would come to constitute the American system of higher education.
Without this faith — or, rather, credulity — the gradual shift from public to largely private financing of U.S. higher education would not have been possible. Taking higher education’s liberatory promise as their premise, university leaders compared human capacities to untapped oil reserves and urged aspiring students to “invest” in themselves, to put themselves and their families up as collateral for cash to pay for college. The original Higher Education Act of 1965 had established, among other things, the Guaranteed Student Loan Program, which committed the federal government to repaying a loan if a student defaulted. It also cemented credit, debt, and finance as foundational features of the American system of higher education. With every reauthorization of the HEA — from the establishment of Sallie Mae as an independent financial corporation for incentivizing private loans (1972) and the Middle Income Student Assistance Act (1978) to the PLUS parent loans (1980) and unsubsidized Stafford loans (1992) — Congress expanded the market for student debt.
By the time Sallie Mae devised the first securitized pools of student loans, in the mid-1990s, that market was booming. From 1989 to 2020, total federal loans for undergraduate and graduate students increased from just over $20 billion to over $87 billion annually (in 2019 dollars), an increase of 328 percent. The quantity of loans produced by the system, however, can obscure how differentiated and stratified its burden is across lines of class and race. For example, in 2019, the share of student loans whose current balance exceeded the loan’s original balance was 74.2 percent in Black-plurality communities, compared with 47.5 percent in white-plurality communities, according to statistics gathered by the Jain Family Institute.
Many factors have contributed to the growth and differentiated effects of student debt: the nearly constant decline in state appropriations, the steady increase in tuition at public and private colleges, and the proliferation of for-profit higher education, which began in the mid-1990s. Kerr, Alchian, and Manne may not have envisioned such a staggering market for student debt, but they leveraged public belief in the progressive promises of higher education into a debt-fueled, acquisitive, speculative system whose primary purpose is to maintain itself.
By the early 1970s, Kerr had recognized these trends, and by the end of the decade had begun to argue that the future of higher education ran “through the marketplace,” where some colleges would compete for students who could pay full tuition, and others would bring in federal dollars, whether as guaranteed loans or direct payments. Turning competition, debt, and private returns into widely recognized norms, making competitive market values university values, was, wrote Kerr in 1979 in the final Carnegie Commission report, the industry’s only “road to survival.”
He also understood that in order for that future to be widely accepted, the morals of the marketplace had to be fused with the felt legitimacy of a meritocratic system most fully realized in the highly stratified system of American higher education. Were that belief ever to falter, the entire system risked collapse.
This system now consists of over 4,000 different institutions, but each in its own way relies on a faith in the individually and socially transformative power of college. These institutions are compelled to participate in social policies and institutional norms that every year induce students and families to believe that college is worth it — no matter the financial costs or the actual goods of the education itself.
Atop this pyramid scheme sit institutions like my own, the University of Virginia, which masks its constant competition for more — more money, more status, more prestige — as a belief in higher learning. Given the goals they set for themselves, UVA and other wealthy institutions need the system of higher education to continue just as it is. They profess to do so out of a faith that meritocracy’s hidden hand will watch over their graduates, ensuring the liberal, progressive order. And they hire professionals to manage that faith, such as UVA’s recently appointed vice provost for enrollment, who will ensure the most efficient use of students’ hopes in higher education to maximize revenues.
Kerr didn’t create the American system of higher education. But he was its prophet. The banality of his rhetoric, his training as an economist, his proud professionalism and unwavering commitment to expertise, and his matter-of-fact liberalism belied the audacity of his basic premise: that the Cold War university was the necessary path for individual, national, and global flourishing. The liberal belief that colleges can change lives for the better was not simply a delusion or an ideology; it came from a desire to imagine and build a better future. Yet this desire was from its first formulations bound to a belief that the path to such a future was fixed. It just had to be rationally managed by the experts who deserved not only trust but deference.
McMillan Cottom has shown the effects of the higher faith on the lives of those preyed on by for-profit colleges. Caitlin Zaloom has shown its effects on individual students and their families. What Zaloom calls the “student finance complex” shapes the lives of students, families, and communities across the country. It subjects students and their families to the ideals, norms, and values of credit, tying their worth to the determinations of the higher-ed financial complex and its judgment about who deserves to be trusted.
It also shapes our colleges and universities. That the “golden age” of U.S. higher education coincided with the “golden age” of U.S. capitalism should give us pause about elegies for a now-lost democratic institution. It is true that in those mythic decades, Kerr and company largely realized their vision of a system of higher education that was the engine of economic and technological production. But the sheen of success has blinded us to the political and spiritual costs of the system: a corrupt meritocracy and the systematic rejection of the liberatory promise of education. American higher education has produced many goods. But it also launders privilege, luck of birth and circumstance, and financial and social greed into socially acceptable status under the rubric of merit. And it now exacerbates persistent and worsening financial and social inequalities.
Its greatest failure is moral and political. It manufactures the illusions of merit that make individual mettle a marker of worth and dignity. It transforms political conflicts over truth, values, and visions of different futures into unassailable moral differences, matters not of collective action but of individual choice and preference. Yes, the radical expansion of public universities and growth of new tiers of higher education were instrumental to U.S. technological advances, economic growth, and “upward mobility,” but by reconceiving of higher learning as human capital development and universities as competing interests in an economic system called higher education, Kerr and his allies transformed them into acquisitive market actors seeking new revenue sources and the fleeting consolations of prestige — vices legitimated by the global import of the higher faith. Kerr et al. converted intellectual desire into a market for student debt.