Imagine a distant grocery store that advertises that it can meet all of your cooking and eating needs. You make the trip, and when you get there you discover that you can only buy lemon grass, pomelos, and Sriracha sauce. You ask about the limited selection, and the manager tells you to wait till next week, when they’ll be selling pimentos, artichoke hearts, and brandied cherries.
That’s what it’s like to pick your courses when you’re a beginning graduate student in the humanities. Term by term, year by year, the graduate course offerings in humanities departments don’t make sense together. They’re a hodgepodge of specialized inquiries: snapshots of books and articles in progress by professors who know what they’re teaching, but aren’t much aware of what’s being taught in colleagues’ courses alongside their own.
The fault doesn’t lie in any particular course. Besides, if I singled out this or that one, my complaint would look too much like a snarky right-wing attack on the "irrelevant" humanities that have lost their way. The problem is the cumulative effect of the practice.
Let’s look at it first from the student’s point of view. Humanities graduate students pick through the eccentric course offerings on the buffet table and try to make a balanced meal out of them. They know that they have to nourish themselves for the comprehensive exam that’s ahead. But how do you gather together a bunch of specialized inquiries into preparation for a general and comprehensive one? Graduate students in the humanities usually solve that problem — that is, they pass their comps — but it takes time, and they have little of that to spare.
Now consider the problem from a faculty point of view. Most professors who teach graduate courses want to create a learning community, with lively seminars, workshops on professional issues like teaching, and a common department space where graduate students (and faculty members) can meet and talk.
But when do we talk to one another about what we’re doing when we teach graduate courses? We teach the same students in a department, so it ill befits us to be working at cross purposes, even accidentally. Yet we rarely check with one another to see just what those purposes are. What courses do we want to offer doctoral students, and why?
I don’t mean to suggest that this kind of discussion never happens. But you’re far more likely to see professors at liberal-arts colleges (where teaching is woven tightly into the institutional culture) exchanging ideas about their common undergraduate teaching goals than to see faculty members in master’s and doctoral programs talking together — not just in subfield groups — about what they want their graduate students to learn.
In other words, as graduate-school teachers in the humanities we pay scant attention to our own faculty communities — and it’s the students who stand to suffer.
During his 2008 term as president of the Modern Language Association, Gerald Graff, a professor of English and education at the University of Illinois at Chicago, warned against "courseocentrism" — the problem that "teaching in isolated classrooms leaves us knowing little about one another’s courses." This practice bewilders undergraduates because it makes our work "opaque" to them. The isolation problem — the belief that teaching is a "solo act" — has different but equally disastrous effects on the graduate level.
Instead of engaging one another on curricular goals, we hide behind a market-based model in which we invest rather too much faith. The rationale behind uncoordinated and unsequenced course offerings is based on two assumptions. The first is that professors do their best work if they teach something they’re interested in. The second is that professors should teach what they want to teach, and if they do, the diversity of their specialized inquiries will give graduate students a broad choice from which to meet their own educational needs.
That second assumption deserves a hard look. The underlying concept goes back to Harvard president Charles W. Eliot’s introduction of undergraduate electives to replace a fixed curriculum in the late 19th century. But the graduate-level precedents of that idea go back even further: to the German value of Lehrfreiheit, the "freedom to teach" that prevailed (and mostly still does prevail) at German research universities, and that underpinned the earliest conceptions of academic freedom in the United States.
But the rationale for a random graduate "curriculum" arises from something more than academic freedom. It also rests on the assumption that if professors act as free agents in the graduate classroom, they will create a marketplace where students can shop to meet their learning needs.
Professors have long relied on that marketplace model, and our faith in it replicates the belief in the invisible hand that dominates the American public sphere. But the invisible hand works perfectly only in theory. Anyone who saw what happened in 2008 understands that marketplaces need regulation — the arguments are only about how much and what kind.
Some disciplines do depart from the market-based model of graduate-course selection. Most fields in the social sciences, for example, build their graduate curriculum vertically: Everyone has to begin by learning the same foundational concepts, even on the graduate level. Josiah Ober, a professor of political science at Stanford University, says that course offerings must be dictated by the need for students to learn the techniques that they’ll need to perform their own research. "In the contemporary world of the social sciences," he said, "you have to get the methodological skills down, and get a sense of what the literature is, and has been, in the field."
Many humanities departments throw a methods course into their annual eclectic mix. You can tell who’s teaching it because that professor’s arm is in a sling from the twisting required to get him to teach it. But there’s no consensus about which methods need to be taught in the humanities in the first place. Literary theory? Archival savvy? Book history? In the end, idiosyncrasy prevails there, too.
Educators have a name for this idiosyncrasy: teacher-centered curriculum. The term means doing what professors want to do instead of trying to figure out what students need. The eccentric results of this approach ill serve students and professors alike. For one thing, the potpourri of classes doesn’t do enough to prepare students for the general examinations that we demand after their classwork is done. We ask them to display foundational knowledge on those tests. How can we legitimately demand such preparation if we don’t teach it?
Teacher-centered curriculum consequently adds years to graduate students’ time in school. Graduate students in the humanities are a bright and mostly enterprising bunch, and they do figure out what they need to know. They sift it out of their specialized courses like miners panning for gold, and then they dig for the rest in the library as they prepare for their comprehensive exams. As a result, it takes them extra time to study for those tests — and that adds to the unconscionable nine years that it takes them to earn a Ph.D. "If you really want the students to take their general exams, and soon," said Russell Berman, professor of German and comparative literature at Stanford, "then offer the courses that prepare them."
There’s a value to professors teaching their current research, to be sure. Graff has written of how a course based on a professor’s book can draw students and faculty into a kind of collaborative partnership. But courses based on specialized books in progress are not the only component of a balanced educational diet. Students might refine their palates by eating complicated dishes of anchovies and capers, but they also need a bowl of whole-grain cereal sometimes, even if it’s not as fascinating to cook.
Basic skills matter. André Soltner, the legendary owner and chef at Lutèce, is said to have auditioned his sous-chefs by asking them to prepare a plain omelet.
Andrew Delbanco, a professor of American studies at Columbia University, has written about how basic skills need to be taught again and again for each generation of humanities students. The "progressive power of science" allows us to require medical students to know more "about the genetic basis of disease or the management of organ transplantation than physicians knew 20 or even 10 years ago." The humanities, on the other hand, he said, "remain concerned with preserving truth by rearticulating it." Delbanco is talking about undergraduate teaching (his excellent book, from which I’m drawing here, is College: What It Was, Is, and Should Be). But his point holds equally true on the graduate level: We have a duty to teach our students what we expect them to know.
What students need to know is, on the graduate level, a disciplinary matter. I have little idea what an anthropologist considers essential for a beginning graduate student, but that’s not my bailiwick. Instead, members of anthropology departments need to get together to have that discussion.
Shared goals are essential, and not just for faculty collegiality, though that’s worthy in itself. If we could come to some agreement on our educational goals, we could create a more coherent educational plan for our graduate students and a more coherent set of course offerings.
Berman calls on professors to become better "curators of the graduate programs we offer." Scholars of education would describe it as a move to "student-centered curriculum," in which the plan is designed with the students’ needs in mind.
But there’s no reason to get technical. We can describe it in simpler terms as the flip side of academic freedom: Doing right by our students is a form of academic responsibility.