If there’s one thing about which Americans agree these days, it’s that we can’t agree. Gridlock is the name of our game. We have no common ground.
There seems, however, to be at least one area of cordial consensus—and I don’t mean bipartisan approval of the killing of Osama bin Laden or admiration for former Rep. Gabrielle Giffords’s courage and grace.
I mean the public discourse on education. On that subject, Republicans and Democrats speak the same language—and so, with striking uniformity, do more and more college and university leaders. “Education is how to make sure we’ve got a work force that’s productive and competitive,” said President Bush in 2004. “Countries that outteach us today,” as President Obama put it in 2009, “will outcompete us tomorrow.”
What those statements have in common—and there is truth in both—is an instrumental view of education. Such a view has urgent pertinence today as the global “knowledge economy” demands marketable skills that even the best secondary schools no longer adequately provide. Recent books, such as Academically Adrift: Limited Learning on College Campuses, by Richard Arum and Josipa Roksa, and We’re Losing Our Minds: Rethinking American Higher Education, by Richard P. Keeling and Richard H.H. Hersh, marshal disturbing evidence that our colleges and universities are not providing those skills, either—at least not well or widely enough. But that view of teaching and learning as an economic driver is also a limited one, which puts at risk America’s most distinctive contribution to the history and, we should hope, to the future of higher education. That distinctiveness is embodied, above all, in the American college, whose mission goes far beyond creating a competent work force through training brains for this or that functional task.
College, of course, is hardly an American invention. In ancient Greece and Rome, young men attended lectures that resembled our notion of a college course, and gatherings of students instructed by settled teachers took on some of the attributes we associate with modern colleges (libraries, fraternities, organized sports). By the Middle Ages, efforts were under way to regulate the right to teach by issuing licenses, presaging the modern idea of a faculty with exclusive authority to grant degrees. In that broad sense, college as a place where young people encounter ideas and ideals from teachers, and debate them with peers, has a history that exceeds two millennia.
But in several important respects, the American college is a unique institution. In most of the world, students who continue their education beyond secondary school are expected to choose their field of specialization before they arrive at university. In America there has been an impulse to slow things down, to extend the time for second chances and defer the day when determinative choices must be made. When, in 1851, Herman Melville wrote in his great American novel Moby-Dick that “a whaleship was my Yale College and my Harvard,” he used the word “college” as a metaphor for the place where, as we would say today, he “found himself.” In our own time, a former president of Amherst College writes of a young man experiencing in college the “stirring and shaping, perhaps for the first time in his life, [of] actual convictions—not just gut feelings—among his friends and, more important, further down, in his own soul.”
In principle, if not always in practice, this transformative ideal has entailed the hope of reaching as many citizens as possible. In ancient Greece and Rome, where women were considered inferior and slavery was an accepted feature of society, the study of artes liberales was reserved for free men with leisure and means. Conserved by medieval scholastics, renewed in the scholarly resurgence we call the Renaissance and again in the Enlightenment, the tradition of liberal learning survived in the Old World but remained largely the possession of ruling elites.
But in the New World, beginning in the Colonial era with church-sponsored scholarships for promising schoolboys, the story of higher education has been one of increasing inclusion. That story continued in the early national period through the founding of state colleges, and later through the land-grant colleges created by the federal government during the Civil War. In the 20th century, it accelerated with the GI Bill, the “California plan” (a tiered system designed to provide virtually universal postsecondary education), the inclusion of women and minorities in previously all-male or all-white institutions, the growth of community colleges, and the adoption of “need-based” financial-aid policies. American higher education has been built on the premise that human capital is widely distributed among social classes and does not correlate with conditions of birth or social status.
Seen in that long view, the distinctive contribution of the United States to the history of liberal education has been to deploy it on behalf of the cardinal American principle that all persons have the right to pursue happiness, and that “getting to know,” in Matthew Arnold’s much-quoted phrase, “the best which has been thought and said in the world” is helpful to that pursuit. That understanding of what it means to be educated is sometimes caricatured as elite or effete, but in fact it is neither, as Arnold makes clear by the (seldom-quoted) phrase with which he completes his point: “and through this knowledge, turning a stream of fresh and free thought upon our stock notions and habits.” Knowledge of the past, in other words, helps citizens develop the capacity to think critically about the present—an indispensable attribute of a healthy democracy.
These ideals and achievements are among the glories of our civilization, and all Americans should be alarmed as those ideals come to be regarded as luxuries unaffordable for all but the wealthy few. A former director of the for-profit University of Phoenix put it this way in an interview on Frontline: “I’m happy that there are places in the world where people sit down and think. We need that. But that’s very expensive. And not everybody can do that.” Meanwhile, too many selective nonprofit colleges are failing to enroll significant numbers of students from low-income families, thereby reinforcing rather than ameliorating the discrepancies of wealth and opportunity in American society. Yet even at selective nonprofit colleges, where students come overwhelmingly from affluent families and are still invited to “sit down and think,” they are more and more likely to choose fields of study for their preprofessional utility—on the assumption that immersing themselves in learning for the sheer joy of it, with the aim of deepening their understanding of culture, nature, and, ultimately, themselves, is a vain indulgence.
One of the difficulties in making the case for liberal education against the rising tide of skepticism is that it is almost impossible to persuade doubters who have not experienced it for themselves. The Puritan founders of our oldest colleges would have called it “such a mystery as none can read but they that know it.”
Testimony by converts can help. One student, born and educated in China, who recently came to the United States to attend Bowdoin College, encountered the modern version of the Puritan principle that no communicants should “take any ancient doctrine for truth till they have examined it” for themselves. “Coming from a culture in which a ‘standard answer’ is provided for every question, I did not argue with others even when I disagreed. However, Bowdoin forced me to reconsider ‘the answer’ and reach beyond my comfort zone. In my first-year seminar, ‘East Asian Politics,’ I was required to debate with others and develop a habit of class engagement,” he said in an interview with the Web site Inside Higher Ed about a book he and two other Chinese students wrote for an audience in China describing their liberal-arts educations in America.
“One day we debated what roles Confucianism played in the development of Chinese democracy. Of the 16 students in the classroom, 15 agreed that Confucianism impeded China’s development; but I disagreed. I challenged my classmates. Bowdoin made me consistently question the ‘prescribed answer.’”
That kind of education does not lack for eloquent exponents. A current roster would include, among many others, Martha C. Nussbaum (in her books Not For Profit: Why Democracy Needs the Humanities, 2010, and Cultivating Humanity: A Classical Defense of Reform in Liberal Education, 1997, as well as in an essay in The Chronicle, “The Liberal Arts Are Not Elitist”), Anthony T. Kronman (Education’s End: Why Our Colleges and Universities Have Given Up on the Meaning of Life, 2007), Mark William Roche (Why Choose the Liberal Arts, 2010), and, most recently, in The Chronicle, Nannerl O. Keohane, “The Liberal Arts as Guideposts in the 21st Century.” But in our time of economic retrenchment, defenders of the faith are sounding beleaguered. Everyone who is honest about academe knows that colleges and universities tend to be wasteful and plagued by expensive redundancies. The demand for greater efficiency is reasonable and, in some respects, belated. The cost of college must be reined in, and its “productivity”—in the multiple senses of student proficiency, graduation rates, and job attainment—must be improved. The trouble is that many reforms, and most efficiencies, whether achieved through rational planning or imposed by the ineluctable process of technological change, are at odds with practices that are essential if liberal education is to survive and thrive.
High on the list of such practices is the small-class experience that opened the mind of the Chinese student at Bowdoin. One of the distinctive features of the American college has always been the idea that students have something to learn not only from their teachers but also from each other. That idea of lateral learning originates from the Puritan conception of the gathered church, in which the criterion for membership was the candidate’s “aptness to edifie another.” The idea persists to this day in the question that every admissions officer in every selective college is supposed to ask of every applicant: “What would this candidate bring to the class?” It underlies the opinion by Justice Lewis Powell in the landmark case of Regents of the University of California v. Bakke (1978), in which the Supreme Court ruled that considering a candidate’s race is constitutional for the purpose of ensuring “the interplay of ideas and the exchange of views” among students from different backgrounds. Those are modern reformulations of the ancient (by American standards) view that a college, no less than a church, exists fundamentally as what one scholar of Puritanism calls the “interaction of consciences.”
A well-managed discussion among peers of diverse interests and talents can help students learn the difference between informed insights and mere opinionating. It can provide the pleasurable chastisement of discovering that others see the world differently, and that their experience is not replicable by, or even reconcilable with, one’s own. It is a rehearsal for deliberative democracy.
Unfortunately, at many colleges, as fiscal imperatives overwhelm educational values, this kind of experience is becoming the exception rather than the rule. The educational imperative is clear: A class should be small enough to permit every student to participate in the give-and-take of discussion under the guidance of an informed, skilled, and engaged teacher. But the economic imperative is also clear: The lower the ratio of students to faculty, the higher the cost. One obvious way to mitigate the cost is to put fewer full-time tenured or tenure-track faculty in the classroom, and to replace them with underpaid, overworked part-timers—something that is happening at a frightening pace across the nation.
An even more promising strategy for cost containment is to install one or another technological “delivery system” in place of the cumbersome old system of teachers mentoring students. On that matter, the academic community is divided among true believers, diehard opponents, and those trying to find some middle ground in the form of “hybrid” or “blended” learning, whereby students are instructed and assessed through electronic means but do not entirely lose face-to-face human contact with their teachers and with one another.
Those of us who have trouble imagining how technology can advance liberal learning are liable to be charged with mindless obedience to what the English classicist F.M. Cornford famously called the first law of academe: “Nothing should ever be done for the first time.” No doubt there is some truth to that charge. But as a more recent English scholar, Alison Wolf, puts it in her book Does Education Matter? Myths About Education and Economic Growth, “We have not found any low-cost, high-technology alternatives to expert human teachers.” At least not yet.
Meanwhile, American academic leaders, long accustomed to assuming that their institutions are without peer abroad, are looking nervously over their collective shoulder at the rising universities of Asia, as well as at “the Bologna process” in Europe—the movement to make degree requirements compatible across national borders, so that, for example, a baccalaureate in chemistry earned in a French university will qualify the holder for further study or skilled employment in, say, Belgium. They are watching, too, those countries—notably China and Germany—that have a long tradition of standardized national examinations by which students are evaluated quite apart from whatever academic credentials they hold.
The standardized-testing regime (along with the mania for institutional rankings) is spreading throughout the world and making inroads in the historically decentralized education system of the United States. With it arises the specter that our colleges will be subject to some version of what, in our elementary and secondary schools, has come to be known as the No Child Left Behind (NCLB) assessment program. There is no reason to doubt President Bush’s good intentions when, on behalf of minority children in weak schools, he called for the imposition of enforceable standards to put an end to “the soft bigotry of low expectations.” But there is mounting evidence that the law has had little positive effect, while driving “soft” subjects such as art and music to the margins or out of the curriculum altogether.
There is also no reason to doubt President Obama’s deep understanding—as anyone will recognize who has read his prepresidential writings—of the immense and immeasurable value of a liberal education. But as the distinguished psychologist Robert J. Sternberg, provost of Oklahoma State University, wrote recently in an open letter to the president published in Inside Higher Ed, there is reason to worry that blunt “metrics for progress” of the NCLB type would “undermine liberal education in this country.” President Obama’s plans are not yet sharply defined. His initial emphasis has been on the cost of education, the promise of technology, and the establishment of standards for the transition from school to college. As a more detailed strategy emerges for holding colleges accountable for cost and quality, we need to keep in mind that standardized tests—at least those that exist today—are simply incapable of measuring the qualities that should be the fruits of a true liberal education: creativity, wisdom, humility, and insight into ethical as well as empirical questions.
As we proceed into the future, fantasies of retrieving an irretrievable past won’t help. College is our American pastoral. We imagine it as a verdant world where the harshest sounds are the reciprocal thump of tennis balls or the clatter of cleats as young bodies trot up and down the field-house steps. Perhaps our brains are programmed to edit out the failures and disappointments—the botched exams, missed free throws, unrequited loves—that can make college a difficult time for young people struggling to grow up.
In fact, most college students today have nothing like the experience preserved in myth and selective memory. For a relatively few, college remains the sort of place that Kronman, a former dean of Yale Law School, recalls from his days at Williams College, where his favorite class took place at the home of a philosophy professor whose two golden retrievers slept on either side of the fireplace “like bookends beside the hearth” while the sunset lit the Berkshire hills “in scarlet and gold.” But for many more students, college means the anxious pursuit of marketable skills in overcrowded, underresourced institutions, where little attention is paid to that elusive entity sometimes called the “whole person.” For still others, it means traveling by night to a fluorescent-lit office building or to a classroom that exists only in cyberspace.
It is a pipe dream to imagine that every student can have the sort of experience that our richest colleges, at their best, still provide. But it is a nightmare society that affords the chance to learn and grow only to the wealthy, brilliant, or lucky few. Many remarkable teachers in America’s community colleges, unsung private colleges, and underfinanced public colleges live this truth every day, working to keep the ideal of liberal education for all citizens alive.
It seems beyond doubt that the American college is going through a period of truly radical, perhaps unprecedented, change. It is buffeted by forces—globalization; economic instability; the continuing revolution in information technology; the increasingly evident inadequacy of elementary and secondary education; the elongation of adolescence; the breakdown of faculty tenure as an academic norm; and, perhaps most important, the collapse of consensus about what students should know—that make its task more difficult and contentious than ever before.
Moreover, students tend to arrive in college already largely formed in their habits and attitudes, or, in the case of the increasing number of “nontraditional” (that is, older) students, preoccupied with the struggles of adulthood: finding or keeping a job, making or saving a marriage, doing right by their children. Many college women, who now outnumber men, are already mothers, often single. And regardless of age or gender or social class, students experience college—in the limited sense of attending lectures, writing papers, taking exams—as a smaller part of daily life than did my generation, which came of age in the 1960s and 70s. They live in an ocean of digital noise, logged on, online, booted up, as the phrase goes, 24/7, linked to one another through an arsenal of gadgets that are never powered down.
As we try to meet those challenges, it would be folly to dismiss as naïveté or nostalgia an abiding attachment to the college ideal—however much or little it ever conforms to reality. The power of this ideal is evident at every college commencement in the eyes of parents who watch their children advance into life. What parents want for their children is not just prosperity but happiness. And though it is foolish to deny the linkage between the two, they are not the same thing.
As the literary scholar Norman Foerster once put it, the American college has always sought to prepare students for more than “pecuniary advantage over the unprepared.” To succeed in sustaining college as a place where liberal learning still takes place will be very costly. But in the long run, it will be much more costly if we fail.
A few years ago, when I was beginning to work on my book about the American college, I came across a manuscript diary kept in the early 1850s by a student at a small Methodist college in southwest Virginia. One spring evening, after attending a sermon by the college president that left him troubled and apprehensive, he made the following entry: “Oh that the Lord would show me how to think and how to choose.” That sentence, poised somewhere between a wish and a plea, sounds archaic today. But even if the religious note is dissonant to some of us, it seems hard to come up with a better formulation of what a college should strive to be: an aid to reflection, a place and process whereby young people take stock of their talents and passions and begin to sort out their lives in a way that is true to themselves and responsible to others. “Show me how to think and how to choose.”