When I was an advanced graduate student preparing to take my chances on the academic job market, I approached the head of the freshman-writing program for a recommendation. “What do you want me to say about you?” he asked.
The question caught me off guard. No professor had ever asked me that before. Without thinking, I told him to describe me as a “teacher-scholar.” It made sense at the time, and decades later, I still see myself as some combination of teacher and scholar. So do most of us in academe, I believe — although scientists might prefer a term like “teacher-investigator.” (“Investigation” was the all-purpose word used in 1891 by William Rainey Harper, the president of the newly established University of Chicago, to describe what professors would do there once the place opened.)
My point is that professors — and graduate students — are socialized to view their jobs as some kind of combination of scholarship (or investigation) and teaching. It’s who we are — an identity we acquire in graduate school.
How do graduate programs forge those building blocks of professional identity? On one level, the answer seems pretty obvious — through role modeling. My advisers were scholars, and I wanted to emulate them.
But the creation of the “teacher” part of a graduate student’s identity turns out to be less direct, less secure, and more troubling.
Far from a building block, our institutional commitment to training graduate students as teachers crumples under the slightest pressure — because it’s tissue-thin. The implications of this tenuous commitment go far beyond the students’ careers, and reach to the heart of who we believe we are as professionals, and what we think we’re doing.
Let’s start with a long-running and disturbing national practice: Many doctoral programs throw their graduate students into the undergraduate classroom, sometimes in their very first semester, without any real training at all.
That deplorable practice mostly prevails at large public institutions. Usually there’s a day or two of perfunctory orientation, and then into the machine march the new graduate-student teachers. It’s a sloppy approach that leads to some understandably poor teaching, and for the students affected (on both sides of the lectern), there’s little recourse.
Other doctoral programs do train their graduate students to teach. Some of those programs do so very well. The training often comes in the form of a seminar or teaching practicum that rookie teachers take. I had to take one such course while I was in graduate school; it was required to teach freshman composition. Later, as a professor, I taught such a graduate course myself. I valued the experience from both sides.
In the humanities and some social sciences, just about every graduate student teaches. So, in departments with a training seminar, just about everyone in those fields completes it. The idea, after all, is to certify students to do teaching that they and the department both need, albeit for different reasons. (The students need the credential and the salary, while the departments need the value on the bottom line. It’s cheaper to have graduate students teach than it is to hire full-time faculty. Many deans at state universities wouldn’t be able to balance their instructional budgets without low-cost graduate-student teachers.)
But what happens if a student flunks the teaching practicum?
I had never thought about that possibility until recently. Then Eva Badowska, dean of the graduate school at Fordham University, where I work, told me about dealing with just such an experience.
It’s admittedly rare for a graduate student to prove an unfit teacher. Badowska doesn’t see it often, but occasional cases do make it to her office. And when they do … nothing consequential happens.
That’s right: There are no lurid stories to tell about students who do poorly in their teacher training. No scandals break out. The students’ failure at being taught to teach proves no deal-breaker, and it never threatens their progress toward their degrees. Maybe that’s the scandal.
Students who fail their pedagogical training usually don’t end up teaching. Instead, administrators come up with something to substitute for their teaching. One humanities professor at a private university explained the usual practice of finding “workarounds” — by which he means “different kinds of training” for those students, at different kinds of work.
I emailed a handful of graduate deans about how their programs dealt with graduate students who performed poorly in the classroom. The survey wasn’t comprehensive, but I would venture to call the results representative. One former dean at a large state university said that he “always suspected that there were a lot of bad TA’s but had no real mechanism to deal with it.” A current dean at a state university said, “I have seen faculty desire to expel students on account of lousy teaching, but it never gets anywhere.”
So we in the arts and sciences may say that we’re training teacher-scholars (or scholar-teachers), but that’s not really true. If someone fails to acquire the “teacher” part of that identity, we confer the degree just the same. Ph.D.s aren’t required to be teachers. It’s just desired, that’s all.
Viewed historically, that’s not surprising. The United States borrowed the idea of the Ph.D. from European universities, which train Ph.D.s as scholars. But this isn’t Europe. Utility has always mattered in American higher education. Today’s roiling debates over the “use value” of a college degree didn’t come out of nowhere. They’ve flared on and off for many generations.
The consistent push for utility helped create the expectation that in the United States, professors don’t just do research. They teach. That’s part of how society at large defines who we are, and it’s also how we view our own jobs. So if we confer a doctorate in the arts and sciences on a graduate student who can’t teach, we invoke the European past and deny the American present.
At this point, you may think: So what? All professors teach anyway, so what does it matter when they learn how? And if a student leaves academe, why should the ability to teach matter?
In fact, it matters a great deal. Most graduate students won’t become professors, and we’re paying more and more attention these days to training them for diverse careers. That attention is realistic and most welcome — but it’s impossible to carry it out without teaching graduate students to teach at the same time.
The nonacademic workplace expects — and often demands — that graduate students know how to teach. Employers value Ph.D.s for their ability to work with complex information — to analyze it deeply, summarize it, redact it, synthesize it — and then to teach it to others.
“Only those graduate students with strong preparation as teachers will succeed” in today’s workplace, say the authors of the Modern Language Association’s 2014 report on doctoral study in modern language and literature. “The tendency to devalue teacher preparation in parts of doctoral education is at odds with the ever-growing national pursuit of effective teaching.”
Put simply, if graduate students can’t teach, they won’t do well in any job market.
Most graduate students are inspired to go to graduate school by their college teachers, and most of them enter with an idea that they want to become teachers themselves. Given the superficiality of our own commitment to teacher training in our graduate programs, that seems ironic at the very least.
Leonard Cassuto, a professor of English at Fordham University, writes regularly about graduate education in this space. His latest book is The Graduate School Mess: What Caused It and How We Can Fix It, published by Harvard University Press. He welcomes comments, suggestions, and stories at lcassuto@erols.com. Twitter handle: @LCassuto.