Maier, who recently retired from teaching economics at Glendale Community College in California but continues to run teaching workshops, had struggled with how best to help students write meaningful undergraduate research papers. Too often, he notes, students see the process as a continuation of what they learned in high school: Find five sources on the topic, and synthesize what they have to say. Instead, he wanted to see how students’ thinking evolved based on what they were reading.
So in a team-taught course in American economic history, Maier tried out an approach called the I-Search Paper, in which the paper’s subject becomes the search itself: the process of looking for information, what the student learned, and what questions arose from that.
“It really is like a dissertation proposal,” he says, “on a much more informal level”: This is what I learned. This is a question that stemmed from that. This is why I think it’s really important and interesting. These are the kinds of sources I plan to use to find answers.
Most important, says Maier, students must explain how their research changed their thinking. While ChatGPT could be used as a research tool, the final product would be an original work.
For example, he says, a student’s initial question might be: Why did the Great Depression last so long? The student might then describe in her paper how she went to the library and what research she did on the topic. She then might conclude that the more interesting questions were: Why did President Roosevelt abandon fiscal policy in 1937, and why didn’t the Federal Reserve increase the money supply in 1931?
“So now suddenly the student has a whole different understanding and focus, and the paper would describe that transition,” says Maier.
The student could then write that her plan would be to find a country whose central bank behaved differently from the Federal Reserve. She wouldn’t actually do the research, but would lay out that next step.
Maier says the process worked well with his students, especially because he scaffolded it into the course, one step at a time. He intends to talk about I-Search in the workshops on writing across the curriculum that he continues to lead on campus. Professors are definitely worried about the potential for misuse of AI, he says. But “when people raise the issue of cheating, maybe first we need to think about our assignment design.”
While it would be hard to make an assignment ChatGPT-proof, Maier says, I-Search helps by focusing on the process of research. “Yes, students could [take a] shortcut and have ChatGPT generate new questions, or evaluate new resources it suggests,” he says. “But, at least in principle, student writing will explain how they used AI and how it affected their thinking.”
For others interested in this approach, Maier recommends two books, including one by John Bean, who, he says, has shaped his thinking on good writing and on writing across the curriculum:
The I-Search Paper: Revised Edition of Searching Writing, by Ken Macrorie
Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom, by John C. Bean and Dan Melzer
Have you revised your writing assignments in ways that you think are particularly creative? Drop me a line, at beth.mcmurtrie@chronicle.com, and your idea may appear in a future newsletter.
AI and Coding
Most of the public conversation about AI in education has revolved around writing. But David Nelson, associate director of the Center for Instructional Excellence at Purdue University, points out another task that AI can do well — writing code. He shared this article on Codex, a deep-learning model from OpenAI, the lab that also created ChatGPT. The authors compared Codex’s answers to questions on introductory programming exams with those of students who took the same exams, and found that Codex scored higher than most students did.
“On our campus,” Nelson writes, “discussions understandably concern students’ using ChatGPT to replace self-authoring papers, but most of the violations of academic integrity involve authoring and sharing code.”
Nelson would be interested to hear from instructors who are struggling with AI code-writing bots in their coding courses, or instructors who are using bots or AI in interesting ways. “There is real potential for a larger conversation about intellectual property and code authoring,” he writes, “as well as [an] immediate need for practical suggestions for instructors used to sending students home with code-writing assignments.”
Do you teach coding? And have you thought about how you plan to deal with this challenge? If so, write to me, at beth.mcmurtrie@chronicle.com, and your ideas may appear in a future newsletter.
What Your Peers Are Doing
We’re continuing to see a lot of stories and social-media posts about professors who either have decided to incorporate ChatGPT into their teaching or have written policies and guides for their students. Here are a few examples of what they’re doing:
Chris Marsicano, an assistant professor of educational studies at Davidson College, shared his class policy on Twitter. His take: ChatGPT falls in the “gray area” of the North Carolina college’s honor code, and AI in general is not yet trustworthy enough to be used as a learning tool. But if students want to use it, they should list the program under works cited.
In this NPR interview, Ethan Mollick, an associate professor at the University of Pennsylvania’s Wharton School, says he requires students in a course on entrepreneurship and innovation to use ChatGPT. For example, they used it to generate ideas for a class project. Like Marsicano, he warns students that ChatGPT can be wrong, that they should check its output against other sources, and that they should cite when and how they use it.
Andrew Piper, a professor in the department of languages, literature, and culture at McGill University, in Montreal, explains in this Twitter thread why and how he uses ChatGPT in class to show its strengths and weaknesses as a research tool.
Finally, the University of Florida’s College of Liberal Arts and Sciences is holding a discussion on this issue with a panel of faculty members on February 7 at 6 p.m. ET. If you’re interested in listening to “Big Bot on Campus,” follow this link.
As always, I’m interested in hearing your ideas, concerns, and plans for using — or avoiding — ChatGPT and other AI tools in your teaching. You can reach me at beth.mcmurtrie@chronicle.com.
Tackling Burnout
The next session of our two-part virtual forum, Keep on Teaching, runs on Friday, February 10, at 2 p.m. Eastern time. This one will focus on how to motivate students without burning yourself out.
This online session is exclusively for newsletter subscribers, so now’s a good time to make sure that’s you (you can check here) and to recommend Teaching to colleagues you think might like it, too. When you register for the event, you’ll have a chance to let us know what topics you most want to hear our panelists weigh in on.
We hope you’ll join us. If you can’t attend live, registering will still allow you to watch a recording at your convenience.
Do you have questions or feedback on Keep on Teaching? Ideas of how else we can help newsletter readers find the teaching ideas and community they seek? We love hearing from you.
Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us, at beckie.supiano@chronicle.com or beth.mcmurtrie@chronicle.com.
— Beth
Learn more about our Teaching newsletter, including how to contact us, at the Teaching newsletter archive page.