Ever since the release of ChatGPT last November, higher education has been engulfed in a deluge of commentary about artificial intelligence. ChatGPT’s ability to converse with its users has provoked debates on the future of educational research, and especially of teaching. But what does AI mean for graduate students? That question has received scant attention.
As teachers of introductory courses, graduate students are in the same boat with AI tools as their professors, but they are seated in the front and getting drenched by the spray. As researchers, doctoral students are positioned more like undergraduates, except that their use of AI tools in research and writing raises even more red flags.
AI and graduate-student teaching. The question of how to teach — and especially how to assess undergraduate work — in the age of AI is being batted about like a piñata in the educational public square. The problem is real, so the anxious pursuit of solutions figures to go on awhile.
Even in these early days of ChatGPT, many undergraduates have proved eager to let mediocre, AI-authored writing substitute for their own work. In empirical subjects, machines can easily outperform humans. As Bryan Caplan, a professor of economics at George Mason University, said in a recent Chronicle forum (“How Will Artificial Intelligence Change Higher Ed?”), it will be all too easy for students to use chatbots to write papers outside of class. And, he added, “unless the exams are in person, they’ll be a farce, too. I’m known for giving difficult tests, yet GPT-4 already gets A’s on them.”
The challenge of detecting cheating has already loosed a torrent of hopes, suggestions, hand-wringing, and apocalyptic warnings. That storm will settle at some point, and generative AI will probably find its place in educational practice. In the short term, here are some steps for graduate-student instructors to consider:
- You don’t need to be Nostradamus to predict that you will need more in-person exams and in-class writing assignments. This summer, think about how you can adjust your syllabus to confront these realities.
- Departments and institutions are still developing policies on AI and cheating. If your university is still working on its policy, consider devising one of your own in the interim to post on your syllabus.
- Read as much as you can on ChatGPT and teaching. Play around with AI tools enough to know how they work. After all, your generation of faculty is on the front lines of this issue, and it’s your teaching that will be affected most.
AI and doctoral research. To what degree ChatGPT and other tools will upend the classroom status quo remains to be seen. But AI can play a positive role in a graduate student’s research and writing — if we let it. There’s already a tendency among faculty members to criminalize the use of AI. That’s appropriate if students are using it to cheat. But not every use of AI is cheating.
A note to professors: Your graduate students probably use AI already. But that doesn’t mean that they’re getting away with something — and we shouldn’t act as though they are. AI isn’t new. The internet has long relied on it. You use it when you do a Google search, for instance. Predictive text — when your phone completes the word you’re typing or offers you the next one — is another instance of how we’ve long been splashing in the shallow end of the AI pool.
As such examples suggest, there is no bright line between “my intelligence” and “other intelligence,” artificial or otherwise. It’s an academic truism that no idea exists in an intellectual vacuum. We use other people’s ideas whenever we quote or paraphrase. The important thing is how.
Writing is a process that often involves collaboration. Writers benefit from feedback, whether from peers or teachers. AI models (there are more on the way) can be collaborators of sorts — provided that you recognize their limitations and work within them, as you would with any collaborator. (Some scholarly organizations explicitly allow for the use of AI now.) You might show your work to one colleague because you know she’s great on the sentence level but not at assessing your whole argument. With another colleague, it might be the other way around.
Likewise, AI tools have strengths and weaknesses. Large language models like ChatGPT are good at generating a lot of basic information about well-known subjects very quickly. They’re also adept at summarizing. Those can be useful advantages for graduate students, especially in the early stages of research. A key weakness is that conversational AI presents its findings in generic, mediocre prose.
Ethan Mollick, an associate professor at the Wharton School of the University of Pennsylvania, recently compared AI to “a high-end intern,” while Merve Tekgürler, a graduate student studying history and symbolic systems at Stanford University, described it in an email as “a thesaurus with context.” Given that AI tools are powerful and potentially useful, the question is: How can graduate students best use them in their scholarship? Among the many options:
- After you write a draft of a chapter or an essay, feed your draft into a chatbot and compare the ideas it spits back with your own.
- Use an AI tool to brainstorm, as did Darryll J. Pines, president of the University of Maryland at College Park, when he was preparing a speech recently.
- To save time, use ChatGPT to write a rough draft of your syllabus or other such documents that adhere to widely accepted templates. Then shape the result to make it your own.
- Allow AI to help you with the details. I recently asked ChatGPT for the proper bibliographic format to cite a particularly eccentric source. It supplied a useful answer right away.
Those are just a few of the possibilities. In each case, you are using AI to spark your thinking or to shortcut some time-consuming busywork. And that’s fine, so long as you keep these cautions in mind:
ChatGPT works best on subjects that are widely written about. The reason is simple: tools like ChatGPT are trained on vast amounts of text pulled from the internet. (The model isn’t browsing the live internet when it answers you; it draws on what it absorbed during training.) The more that has been written on the web about a given subject, the more knowledgeable the AI will be.
If you ask for an AI boost on an obscure subject, one of two things will happen: Either the AI will come up empty, or it will make stuff up. (Yes, it really does that. How very human.)
Don’t rely on AI to know things instead of knowing them yourself. AI can lend a helping hand, but it’s an artificial intelligence that isn’t the same as yours. One scientist described to me how younger colleagues often “cobble together a solution” to a problem by using AI. But if the solution doesn’t work, “they don’t have anywhere to turn because they don’t understand the crux of the problem” that they’re trying to solve.
The educational world is rapidly filling with stories of students who submit AI-written papers containing errors that the students don’t catch because they never bothered to learn the material themselves. Those transgressions will receive their just deserts from teachers, from supervisors, or at the Final Judgment. My point is simply that, as a writer, you have to know the stuff you’re writing about in order to do a good job. If you rely on AI to do the thinking, you become the curator, not the author, of the writing that results.
And without an author, the writing will be bloodless. “These large language models [like ChatGPT] will never have anything related to human emotions,” said a Colorado geoscientist I interviewed. “Emotions, including just the standard motivations that cause us to do anything at all, are completely lacking.”
Emotionless writing might be OK for a user’s manual that tells you how to work your new air conditioner. But scholarly writing — not just in the humanities but across the disciplines — needs sensibility. Luckily, sensibility is something that humans, both in and out of graduate school, have plenty of. So keep these cautions in mind, and go ahead and add AI to the research tools at your disposal. Just remember: Use it to help you, not be you.