I’ll share a few examples here, from a variety of courses:
Gregg L. Michel, an associate professor of history at the University of Texas at San Antonio, asked students in an upper-level course on the history of the civil-rights movement to work in groups to annotate a ChatGPT-generated essay using Hypothes.is. Then each group rewrote the essay, using what they wanted of the original in the final draft.
“The students were divided on their assessment of the AI essay. Some thought, as I did, that the argument was perfunctory and the writing flat and not engaging. Some students, though, found the essay to be ‘college level’ work but did think it still needed refinement (e.g., better prose, more evidence). Several thought the essay amounted to a good outline for building a more formal essay.”
As with the experiment I wrote about last month, Michel found that his students had little previous experience using AI tools. And he feels that if academics want students to be able to use such tools productively, they should help students with the prompting process. Michel sees potential, for example, in AI helping students organize their thoughts, brainstorm, and revise their essays.
**
Kevin McCullen, an associate professor of computer science at the State University of New York at Plattsburgh, teaches a freshman seminar about AI and robotics. As part of the course, students read Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots, by John Markoff. McCullen had the students work in groups to outline and summarize the first three chapters. Then he showed them the outline ChatGPT had produced.
“Their version and ChatGPT’s version seemed to be from two different books,” McCullen wrote. “ChatGPT’s version was essentially a ‘laundry list’ of events. Their version was narratives of what they found interesting. The students had focused on what the story was telling them, while ChatGPT focused on who did what in what year.” The chatbot also introduced false information, such as wrong chapter names.
The students, he wrote, found the writing “soulless.”
**
Grace Heneks, a lecturer in the English department at Texas A&M University, asked students to work with ChatGPT in a technical professional-writing course and a writing-about-literature course. In both courses, she says, students were underwhelmed by the product.
In the technical-writing course, students used ChatGPT to write job-application materials, such as cover letters and résumés. Most of them, she wrote, found that the chatbot actually created more work for them because, even with multiple prompts, the writing did not have much personality and was repetitive. The same held true in her literature course, both when she used ChatGPT in class and when students reviewed a ChatGPT-generated essay.
“Over all,” Heneks wrote, “it’s been fun to play with ChatGPT in class, and I think the more professors do so, the more skeptical students will be. My students definitely seem to be more critical of it now.”
**
Dan Sarofian-Butin, a professor of education at Merrimack College, offered a more positive take on working with ChatGPT.
In an introductory education course and an introductory diversity course this semester, he wanted students to understand ChatGPT’s ability to summarize and synthesize complex issues.
“I require students to use ChatGPT in class and in every minor and major assignment (formative and summative; low and high stakes),” he wrote. “I show them how to ask it better and better questions each class, starting from the basics (‘can you explain [X] to me…’) to the complex (‘concisely, can you summarize [X] for me and provide me with a few suggestions of how this issue intersects with [Y] and [Z]’).
“I also require students to have at least two back-and-forth questions after the initial response on the topic in order to help them understand how to develop a question (and thus an argument). Moreover, I am now teaching them how to use ChatGPT to better understand, focus, and develop a topic that they will be researching for their midterm projects.”
He writes that students have been almost universally positive about using ChatGPT. “Students greatly value the ability to have immediate responses 24/7, the ability to have an issue explained clearly and concisely, and the opportunity to brainstorm something in real time.”
I asked Sarofian-Butin how he tries to prevent inappropriate use by students, such as passing ChatGPT’s work off as their own.
He takes several steps: one is to start the semester with some low-stakes reflective writing so he has a baseline against which to compare their later writing. He also tries to link assessments and assignments to students’ interests, which helps encourage authenticity.
“And finally,” he writes, “on a deep psychological level, I hope that by showing them explicitly how to use ChatGPT in all aspects of the course, they know that I know that they know that I know that (etc., etc.) how ChatGPT sounds and works, and so they shouldn’t use it to pretend in their own essays.”
I’ll share more examples in the coming weeks, including an interesting case study in chemistry. If you’d like to share how you’re using generative AI in your teaching, write to me at beth.mcmurtrie@chronicle.com. Your story may appear in a future newsletter.
Helping international students
Do you teach international students? Do they sometimes come to you for advice on academics or finances? Over at Latitudes, my colleague Karin Fischer describes a training program for faculty members and others who want to become better informed about the unique challenges international students face. Members of the Office of International Students and Scholars at the University of Arkansas at Fayetteville explain why some standard advice that’s appropriate for U.S. students, like suggesting they lighten their course load or seek a part-time job, might run afoul of student-visa regulations.
You can read more about the nearly two-decade-old program, In Their Shoes, here. Karin is looking to share more innovative approaches to common problems in international-student support, study abroad, and other aspects of international education. You can write to her at karin.fischer@chronicle.com.
Meaningful engagement
Many of the reasons students might feel disengaged from college are beyond an instructor’s control. Still, professors can take action to increase the chances that students will meaningfully engage in coursework, for instance: better connecting lessons to the real world and to career aspirations; being more transparent about learning outcomes; and introducing students to emerging technologies, like generative AI. On Thursday, October 19, Beckie will moderate an expert panel in a Chronicle virtual event on this topic. Register to attend here.
Have questions or thoughts about this topic? You can share them with Beckie ahead of the event: beckie.supiano@chronicle.com.
Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us at beckie.supiano@chronicle.com or beth.mcmurtrie@chronicle.com.
— Beth
Learn more about our Teaching newsletter, including how to contact us, at the Teaching newsletter archive page.