In my latest story, “Professors Ask: Are We Just Grading Robots?”, I look at what faculty members have experienced this past semester when it comes to teaching and AI. As I noted, some are riding the wave while others feel like they’re drowning in fake writing. No matter where they stand, though, instructors are clear they need more guidance and more training. Many of those who are incorporating AI still feel like they’re barely hanging on to their surfboards.
That’s because even if you’ve redesigned your assignments and assessments, the proliferation of new tools will require ever more changes. Many professors and administrators are unaware of how aggressively these tools are being marketed to students.
Marc Watkins, a lecturer in the department of writing and rhetoric at the University of Mississippi, who has been reporting on these changes in his Substack, notes that TikTok is filled with videos in which students extol the benefits of apps that promise to listen to and distill lectures, or summarize readings. As Watkins put it, “Now we have to start thinking about more than just assessments in AI. We have to think about learning itself.”
So what do instructors say they need? Often it’s practical, discipline-specific advice on how to adjust their teaching, to either incorporate AI meaningfully or create assignments in which it’s unlikely that AI will be used inappropriately.
Recent data shows what many instructors are facing. Tyton Partners’ annual survey, “Time for Class 2024: Unlocking Access to Effective Digital Teaching and Learning,” found that 39 percent of faculty members surveyed said their institution had not yet offered training on generative AI. Of the rest, only 11 percent said their college offered training in course redesign in response to these tools.
Meanwhile, the survey found that 34 percent of instructors said their workload has increased because of AI. The most common reasons were monitoring for academic integrity and redesigning assessments to counteract AI usage. Coming in third and fourth were time spent learning AI tools and redesigning assessments to incorporate AI.
I’m going to explore some resources and teaching strategies in the coming weeks. And if you have others you’d like to share, please write to me at beth.mcmurtrie@chronicle.com.
Adding friction
One approach that discourages AI misuse — and encourages careful reading — is to ask students to annotate what they read, either with handwritten comments on printed pages or through tools such as Hypothes.is and Perusall. Watkins describes this as one way to add “friction” to the process of learning.
You can read more of his thoughts in his Substack piece, “We Need to Reclaim Slowness.” (If you’re interested in reading more on the concept of friction, Watkins recommends Katie Conrad’s “Friction v. ‘Magic’” and Jane Rosenzweig’s Boston Globe op-ed “ChatGPT Is at Odds With What Education Is for.”)
James Lang, another teaching expert who has written Chronicle Advice pieces on how to wrestle with the existence of AI, describes annotation as one of many ways in which you can slow-walk AI usage. Another might be sequencing assignments so that students work on an outline in class, tech-free and collaboratively with other students, before considering how AI could help them in their process. In another Advice piece, he describes some compelling writing assignments that are also less likely to be hacked with AI.
Derek Bruff, visiting associate director of the Center for Excellence in Teaching and Learning at the University of Mississippi, has put together some resources to help instructors create assignments that are harder for students to complete using AI. He, too, offers ways to use annotation in teaching and learning. Another set of resources he compiled focuses on alternatives to traditional essays. Finally, he wrote an essay on his blog last year on assignment makeovers, which suggests ways to think about different forms of writing with and without AI.
Incorporating AI
If you want to explore using AI constructively in the classroom, the University of Central Florida has published a free guide, “ChatGPT Assignments to Use in Your Classroom Today.”
As for discipline-specific teaching strategies, Wendy Howard, director of the Pegasus Innovation Lab at UCF, says the university will be launching a “Teaching with AI” repository during its annual Teaching and Learning with AI conference next month. So stay tuned.
The AI Pedagogy Project, run by the metaLAB (at) Harvard, has created a three-part guide, including a tutorial, resources, and assignments.
Finally, for those of you interested in what’s being done at an institution, or at the state and national level, here are several notable initiatives:
If you have come across resources that helped you adjust your teaching, please tell me about them at beth.mcmurtrie@chronicle.com so I can share them with readers.
Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us at beth.mcmurtrie@chronicle.com or beckie.supiano@chronicle.com.
—Beth
As always, nonsubscribers who register for a free Chronicle account can read two articles a month. Your readership supports our journalism.
Learn more about our Teaching newsletter, including how to contact us, at the Teaching newsletter archive page.