You can see where this is headed. A writing assignment asks students to compare and contrast feminist themes in Jane Eyre and Wuthering Heights. Yup, it can do that. A political science exam requires short-essay responses to questions around the rise and fall of the Soviet Union. Check.
Is the writing captivating? No. Is it coherent? Mostly. So what does this all mean for teaching? That’s one question I set out to explore when I wrote about ChatGPT last month. I’d like to dig deeper here into some of the ideas I heard from digital-literacy experts, writing instructors, and teaching and learning specialists.
First, making your assignments AI-proof is likely impossible. These tools can be used in large and small ways. Maybe you won’t receive a paper written by a bot, but a bot-written essay may still inform your students’ writing. Yes, you can shift all writing to in-class assignments, or you can have students write by hand. But as Anna Mills, an English instructor at the College of Marin, pointed out to me, these strategies introduce new problems. For example, you may have students with learning disabilities who struggle under such conditions.
Second, you may want to shift to different types of assignments and assessments. Maybe you allow students to produce a podcast instead of writing a paper. Or you create fewer writing assignments but build more feedback and revision into the ones you keep. Or you try prompts whose answers are less likely to be found on the internet. Of course, there is always another option, which is to invest in detection software. Already, several tools on the market promise to do just that. But many digital-learning experts say that’s a losing game: tech will keep advancing, and students will find ways around detection tools. Nor do most instructors want to become writing police.
The approach that most intrigued me involves engaging students in a conversation about why and how they write, sometimes with the help of these AI tools.
John Warner, a writing expert and author, notes that writing is a form of thinking. Writing requires you to process and synthesize a range of facts and ideas, and to come up with a coherent and hopefully insightful take on what you have learned. Students, though, may have been trained in high school to see writing as a form of regurgitation based on a set of formulas (compare and contrast!).
If you can explain to students the value of writing, and convince them that you are genuinely interested in their ideas, they are less likely to reach for the workaround, Warner told me.
There are also a host of people excited about using this technology in their classrooms. Why? Well, for one, it’s not going away. Ignoring the fact that students will use it is seen by some instructors as an abdication of professional responsibility. These are powerful tools, and it’s better to help students learn how to use them judiciously, and to understand their limitations and benefits. The other reason to use them is that they can help spark the creative process, professors say, and enhance learning.
Marc Watkins, an instructor at the University of Mississippi, wrote a thoughtful essay about this recently. He and his colleagues in the Department of Writing and Rhetoric started a working group last summer to figure out how to incorporate AI research, writing, and brainstorming tools into their classes. They used a counterargument generator to encourage students to explore different perspectives on a topic, and a research tool to help them brainstorm.
“What message would we send our students,” Watkins writes, “by using AI-powered detectors to curb their suspected use of an AI writing assistant, when future employers will likely want them to have a range of AI-related skills and competencies?
“What we should instead focus on is teaching our students data literacy so that they can use this technology to engage human creativity and thought.”
The significance of that argument was brought home to me by Mills and others when they compared these writing tools to calculators. There was plenty of hand-wringing when hand-held calculators first appeared, too, but now they’re integral to teaching math and other STEM disciplines. Mills says she can even envision a day when ChatGPT is embedded in programs like Word and Google Docs.
Of course, not all AI tools are the same. Those that complete your sentences in an email sit at one end of the spectrum; large language models, like ChatGPT, sit at the other. As Watkins and others point out, the most sophisticated tools are also potentially dangerous. Some have been shut down after turning out biased or nonsensical research papers that could still confuse a layperson. All the more reason, they say, to engage your students in a discussion of what these tools are, how they work, and how they can be used as an aid in learning.
These teaching experiments are coming fast and furious, but here is one that caught my eye. In this paper, Paul Fyfe, an associate professor of English at N.C. State University, described an experiment in which he asked students to “cheat” on a final essay using GPT-2, an earlier version of the AI that underlies ChatGPT, and then discuss how the AI influenced them, as well as its potential uses and abuses.
Have you experimented with any of these large language model tools in your teaching? What questions, concerns and hopes do you have about this technology? Write to me at beth.mcmurtrie@chronicle.com and your story may appear in a future newsletter.
Want More on AI?
Want to get involved in the conversation around AI writing tools? Here are a few resources you might find helpful.
Anna Mills has put together several documents:
AI Text Generators and Teaching Writing: Starting Points for Inquiry
How do we prevent learning loss due to AI text generators?
AI Text Generators: Sources to Stimulate Discussion among Teachers
Elsewhere, a group of professors is compiling examples of how instructors are using text generation technologies in their assignments. The results will be published in an open-access collection. You can find out more about the project on this site, Teaching with Text Generation Technologies.
If you’d rather listen to a discussion, here are a couple of webinars:
AI and the Future of the Essay
What might ChatGPT mean for higher education?
Finally, if you don’t think the AI is sophisticated enough yet to be worth consideration, check these out:
- An emeritus professor of educational technology created a 2,000-word academic paper in 10 minutes.
- An admissions expert created a Common App essay that several admissions counselors thought was real.
- A professor at the Wharton School created a syllabus, an assignment and lecture notes for an MBA-level introductory course on entrepreneurship.
Keep on Teaching
New year, new webinar series for Teaching readers! This one, which we call Keep on Teaching, delves into two challenges we know many of you are contending with: how to make class time meaningful, and how to support students without overextending yourselves.
The first session, on January 20, is open to anyone. The second, on February 10, is just for newsletter subscribers. (If you haven’t signed up yet, remember, it’s free!)
In both sessions, you can expect an interactive discussion that brings together practical advice, evidence-based insights, encouragement, and a reminder that you’re not alone.
Read more about the series and sign up here. We hope you’ll join us!
Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us at beckie.supiano@chronicle.com or beth.mcmurtrie@chronicle.com.
— Beth
Learn more about our Teaching newsletter, including how to contact us, at the Teaching newsletter archive page.