When Allan Bloom’s infamous book, The Closing of the American Mind, was published in 1987, it was, as Camille Paglia later declared, the “first shot of the culture wars.” Bloom’s book—by any account an odd amalgam of polemical denunciation of academe, philosophical argument, and memoir—quickly generated a spirited debate about college life, the place of the liberal arts, and, as Bloom himself put it, “the state of our souls.”
At the time, I took Bloom’s attack personally, since it singled out in particular the kind of disciplinary portfolio I was inhabiting as an anthropologist and historian. Bloom equated those fields with the doctrine of cultural relativism, which he charged “succeeds in destroying the West’s universal or intellectually imperialistic claims, leaving it to be just another culture.” Yet while he claimed an openness—to other histories and cultures—he used partial readings and understandings to dismiss them in comparison with his ideal image of Socratic Greece. Moreover, his notion of the “college” was so narrow that even the shared classical education that he defended—hedged by his own ironic awareness of the possible misuse of canons—seemed an impossible dream.
There is, however, considerable pathos in going back to this text. The culture wars provided fuel for a growing public antipathy to the humanities. But if Bloom became the poster child for the neoconservative onslaught on the university, his book would hardly be embraced by most critics who attack the university today, whatever their political perspective. It is hard to avoid a feeling of nostalgia for the virulent days of the “culture wars,” when participants on different sides of the debate shared a common conviction about the importance of a humanist education.
When students are abandoning the liberal arts in favor of majors and programs that promise—or at least seem to them more likely to produce—immediate financial rewards, we desperately need to move beyond the kind of polemics that surrounded Bloom. At the same time, we need to provide a coherent defense of the liberal arts, both for their role in the larger university and for the broader society. For me, the defense is both deeply personal and based on intellectual and institutional arguments.
I grew up with the kind of religious faith that Bloom would appear to have admired. My father was raised on a farm in Iowa, the descendant of a family of German immigrants who had two generations earlier taken a boat to America to escape the Prussian draft. When it became clear that my father’s congenitally bad heart would make it impossible for him to live the physically demanding life of a farmer, he went off to a local college with the idea that he would become a Presbyterian minister in an Iowa parish. Although he followed the general plan, he moved from a seminary in Chicago to New York City, where he began to study with the liberal theologians Paul Tillich and Reinhold Niebuhr, ultimately earning a Ph.D. in philosophy at Columbia University.
When he returned to the Holland (Iowa) German Presbyterian Church for his ordination, he was cross-examined by the church elders to discern whether the “taint of the East” had made him unsuitable. He passed the test and was ordained but never followed the pastoral route, joining the professoriate instead.
My mother was a schoolteacher and a dietician; when the family moved east, she completed a master’s degree at Columbia’s Teachers College and worked for a time at Good Housekeeping magazine. A few years later, after my father took up a teaching position at the Yale Divinity School, my mother strayed from Presbyterianism and joined a local assembly of Plymouth Brethren. I grew up cognizant of the differences between, and occasionally caught between, two disparate (and even opposing) strands of Protestant belief and practice. I was baptized twice, once as an infant in a Presbyterian church, later as a young adult, fully immersed not just in the water of a small pool under the pulpit but also in the heady atmosphere of fundamentalist Christianity.
In other words, I grew up preoccupied by questions of faith. Long before I understood the full implications of doctrinal debates, I had a palpable sense of the importance of theological distinctions. On rare though memorable occasions, my parents would betray their passions—usually leading to a period of silence on the part of my mother and then profuse apologies on the part of my father.
I was drawn by the immediacy of the fundamentalist message, convinced that my personal relationship with God was of greater importance than any other relationship, but also troubled by the fact that I had no special conversion narrative to which I could give witness. My “failure” led to a peculiar form of self-doubt, one that exacerbated my discontent with certain doctrines, not to mention with the lectures at summer Bible camp about the folly of scientific assertions concerning evolution, which I found at first unconvincing and later embarrassing.
After I went to India as a 12-year-old, when my father had a Fulbright grant to teach at a Christian college in southern India, I could no longer believe that my own brand of Christian belief was the only path to God. Even as I set out on our year abroad aboard a Greek ocean liner in June of 1963, I sensed that the world I had known was opening up.
As I returned to the United States a year later, I could not yet know that my experience in India would be crucial to my own coming of age during the years of growing controversy over U.S. involvement in Vietnam. I did, however, soon become aware of the influence of Mahatma Gandhi on Martin Luther King Jr., and when I asked my father to take me to the Yale chapel to hear the sermons of the peace and civil-rights activist William Sloane Coffin Jr., I was struck by the openness of an ecumenical church to social and political engagement and advocacy. I joined many others of my generation in being swept up by the civil-rights movement and the growing concern about Vietnam (not insignificantly inflected by the role of the draft in the lives of young men), while seeking both to get into a good college and to figure out the relationship between those turbulent times and my own hopes and dreams (and fears).
As befit the times, I went to a college, Wesleyan University, that had abandoned all but the most rudimentary of distribution requirements, the only real obstructions to full freedom of choice being one “Freshman Humanities” course and one in “behavioral psychology” that fulfilled a science requirement. I took the psychology course because of the instructor, who had taught my freshman seminar on “free will and necessity” jointly with a professor of religion, in a riveting display of intellectual debate between a Skinnerian and a Hegelian. The questions raised stayed with me, reverberating in the interdisciplinary major I adopted in politics, economics, and history.
The interdisciplinary major was modeled on the famous Oxford PPE course (philosophy, politics, and economics) in both content and form (with very small classes called tutorials), but my own program was fundamentally different, since it concerned Asia and Africa and experimented with a form of area studies that was embedded within a program in “Western” studies rather than isolated from it.
My move to history and anthropology was governed by my continued preoccupation with India, as well as by my sense of the need to steep myself in the history and cultural context of a specific place that defied easy characterization. And yet my sense of “cultural” difference, and my instinctive mode of comparative analysis, were inevitably also dictated by my coming of age in late 20th-century America.
As the pace of globalization has only intensified in the years since, cultural lineages and identities once taken as certainties have inevitably and increasingly come to be seen as hybrid, contingent, and connected across a world stage. Unfortunately, at the very time when the kind of moral and political education I had in college seems more compelling than ever before, the liberal arts are under siege, dismissed as irrelevant, wasteful, and unnecessary.
It is frequently said that the politicization of the humanities, and the growing tendency to historicize works of culture and emphasize cultural context, created the conditions for the loss of standing and support for the liberal arts. But some of the great works of cultural criticism and analysis in the past decades have come precisely from linking historical and sociological study to cultural texts, which inevitably has meant examining the politics of those involved in the production, consumption, and use of culture. While some assumptions about politics have been too unreflective about multiple political perspectives and possibilities, and often too reductive, those extremes have been no more egregious than occasions when historical and sociological considerations have been deliberately banished. Nor should we forget that what Bloom and his followers saw as today’s core was yesterday’s avant garde.
During my years at Columbia, first in anthropology and then as vice president of the Faculty of Arts and Sciences, I strongly advocated for the importance of a core curriculum. It constituted the major part of the liberal-arts component of the university’s undergraduate education and, most important, provided a vehicle for the articulation of questions, traditions, and debates that became, through the pedagogy of a faculty engaged with myriad disciplinary questions well outside the scope of the core itself, the basis for collective discussion of issues of contemporary significance.
I realize, however, that such goals can be achieved through any number of means. What is important is to find ways to draw faculty members out of their own research specializations and to commit part of their pedagogy to the teaching and exploration of the large questions that are fundamental to any good liberal-arts curriculum. It is time to put aside the sharp acrimony of debates about the core and reclaim the passion for a humanistic education that was so much a part of the culture wars. A starting point might be to clarify the role of the liberal arts within the university.
I do not subscribe to the usual stark divide between teaching and research, between general and professional education, between the goals of a college and the aspirations of a university. The American university system has become so successful precisely because it combines those features rather than insulating them from each other. That is why the kinds of intellectual challenges that are part of undergraduate education in the liberal arts and sciences are critical to the formulation and defense of the idea of the research university. Without in any way detracting from the crucial importance we place on the university for fostering research that will produce better knowledge, policies, medicines, products, and ideas, I would also suggest that it is the core commitment to undergraduate education that provides the most secure ground on which to predicate a response to the prevailing loss of public confidence in the basic value of education itself.
In a series of lectures she gave at the University of California in 2009, Hanna Holborn Gray, a historian and former president of the University of Chicago, underscored the fundamental value of the liberal arts for the idea of the university. She used the competing—if more often than not also complementary—ideas of Clark Kerr and Robert Maynard Hutchins to make her argument.
Despite his signal role in inventing the modern American university system, Kerr, who created the California Master Plan for Higher Education in 1960, was troubled by the institutional juggernaut American education was becoming. The plan created the three different systems of colleges and universities in the state, which were classified along a grid from educating all high-school graduates to producing the highest levels of research. While Kerr was an unapologetic advocate of mass education as part of a genuinely meritocratic system, of advanced research, and of much needed specialization, he did worry about the decline of the prestige of the humanities, the exclusive focus of some professors on research and graduate rather than undergraduate education, and the loss of common knowledge and values.
He had been as shaped by his own liberal-arts education in a small and distinguished Eastern college as he was by his experience leading the pre-eminent system of public higher education in the nation. His writings (and speeches), both while in office and afterward, reveal the steady refrain of one committed to the same kind of utopia imagined by Hutchins, the legendary president and then chancellor of the University of Chicago from 1929 to 1951.
Hutchins had become famous for advocating the importance of undergraduate education, promoting a focus on “great books,” and stressing the “collegiate” idea for a university that was also known for its interdisciplinary graduate programs. As Gray showed, both Kerr and Hutchins played powerful roles in making the university what it is today, with Hutchins maintaining its core values concerning general education—educating young people to become genuine citizens—while Kerr crafted the institutional conditions for its embrace of the American ideal of educating as many of its citizens as possible. Both also believed that the university was the one institution that could aspire toward utopian ideals and at the same time find actual ways to put them into effect.
As we inherit those debates—and in some respects the debates have changed little over the past century—we confront the demands of a new century and the growing public concern about not just the goals of higher education but, even more crucially, its cost. Kerr’s Master Plan is no longer adequately financed by the state, and it is unfortunate but likely that it never will be again. As the burden for financing education shifts from governments to students (and former students, as alumni philanthropy has been a major source of funding for elite private colleges and increasingly for public colleges, too), the idea of the university has come under greater public scrutiny than ever before.
The liberal arts now often seem to be a luxury that only the elite can afford, and even those people—to the dismay of many educators in top colleges—seem increasingly skeptical. The decline of the liberal arts has been attributed to the economic downturn, the ever-growing need for technological and entrepreneurial innovation, and the pervasive anti-intellectualism of American life, a version of American pragmatism that has no interest in the liberal arts in any form.
But the real problem may be the ambivalence of the intelligentsia about playing its full public role. It is urgent that the professoriate come forward to make coherent and powerful arguments on behalf of the kind of liberal education that has now been recognized as a model around the world.
The liberal arts have been fundamental not just to the constitution of the American middle class, but also to the creativity of mind and spirit that flowers in areas as different as the arts and the sciences, engineering and entrepreneurship. Business educators and scientists—as well as employers in many professions—often find themselves in agreement that their best students need not just math and statistics but also philosophy and literature. The professoriate should be able to argue for the importance of a rigorous moral and political education in the most general of terms—without relying solely on such measures as the continuing percentage of English majors or the financial value of a liberal-arts degree.
The most successful site for rethinking these issues is likely to be undergraduate teaching, a utopian space of the college once imagined by Hutchins and also lodged at the heart of all genuinely great research universities, as envisioned by Kerr.
Utopia will always be more an aspiration than a reality for the university. We must ensure that our passion for the power of the classroom is still palpable, even when using new technological means to help us teach (and learn), or when pursuing ever more specialized research. At the same time, we must recognize now that the very specialization so distrusted by critics has been fundamental to the advances in knowledge and discovery; that managing the realities of the university research infrastructure is more complicated and costly than ever before—in large part because of our abiding belief in the transformative necessity of knowledge for its (and our) own sake.
Allan Bloom notwithstanding, it has become increasingly clear to educators that any liberal-arts education in the 21st century must include significant attention to the global contexts that not just inform but now also constitute the world in which we live. My own passage to India was opened up by the lines of a world already set by the aftermath of World War II in America, with Fulbright’s major program to facilitate cultural exchange as the catalyst, and the development of university area studies as the conduit that linked an interest in India to a scholarly career. Since then, walls fell, markets opened, people moved, so-called developing economies began to grow at a record pace—and the world changed.
Like all great transitions, the shifts have been more gradual, and continuous, than we often think. Yet there is no doubt that the rate of change has accelerated in significant respects. Universities began to introduce general-education courses in the postwar period in fields like “Asian civilization,” as part of a recognition of the need for students to be aware of the existence of great and vibrant civilizations alongside the West. But it still seemed clear that only “Western civilization” had been able to propel a full embrace of change, from the Renaissance to the onset and advance of modernity. Max Weber was read to show not only the inextricable, if in part counterintuitive, relationship between Protestantism and the rise of capitalism, but also, in concert with assertions from Hegel, Marx, and modernization theory, that only the West was dynamic and progressive in political, cultural, social, and economic registers. All this has to be rethought, not least because contemporary world conditions have shown so many of the old theories to be wrong.
In my own scholarly work, I was never persuaded that the reigning social and political theoretical frameworks of “West” and “East” were independent of the historical disparities in world political and economic power. And over the years, I came to be much more interested in the history of imperialism than in the essential differences of civilizations. As a “cultural,” or “anthropological,” historian, I was less intrigued by the warfare and trade that led to imperial outcomes than by the intellectual and cultural consequences of shifts in world history.
In the end, the crux of the issue for me was always seeing the role history played in shaping what we see as natural in the present, without losing a sense of the complexity and contradiction that are part of any given historical narrative. No single rewriting of the past will be sufficient; no account will ever settle all the debates about truth, meaning, culture, and power. For that reason, neither history nor anthropology should be seen as antagonistic to the large moral questions that each civilization has engaged in its own ways.
Those of us who reflect back on the archives of our own lives share a belief in the possibility of knowledge about the world, and about ourselves, that may indeed seem naïve and mistakenly utopian. But that is what keeps us returning to those archives of old, to tell us something new that we never could have known before.
Nicholas B. Dirks is chancellor of the University of California at Berkeley. This essay is adapted from Autobiography of an Archive: A Scholar’s Passage to India, published this month by Columbia University Press.