“A New Muse: How Guided AI Use Impacts Creativity in Online Creative Writing Courses,” by J.T. Bushnell and Wayne Harrison, who teach in the School of Writing, Literature & Film at Oregon State University, describes a small experiment they did in 2024.
Students were drawn from online sections of an introductory course in creative writing and asked to write a scene about a person at work on a day when something unusual happens.
In the experiment, students were asked to complete the assignment three different times. First, they wrote a scene entirely on their own. Next, they were told to use AI however they wanted to assist them. Finally, they watched a series of short video lectures by Bushnell on how to use AI in the writing process, then completed the assignment a third time.
The authors found that, on average, the creativity of the students’ writing decreased with unstructured AI use but increased with guided use. The least skilled writers showed the most marked gains after watching the videos. In fact, the creativity of their writing caught up to that of the most skilled writers.
Meanwhile, the most skilled writers did not show a drastic difference in their writing after watching the videos, but their writing suffered notably with unguided AI use.
I talked with Bushnell to learn more about the study and its significance.
He noted that he has the same concerns about AI that a lot of instructors do, but he has been discussing it with students for a while and has even experimented with it in class. In order to design this experiment, though, he said he needed to do two things: think through the components of the creative process and better understand how AI works.
To write well, he says, “you have to learn to master the individual components of a story, like, how do you provide robust characterization? How do you open up a significant conflict and then escalate the conflict?”
So the idea of looking at how AI works as a series of “micro processes” seemed natural to him. The goal was to determine where it can help people “engage further and think harder, rather than replacing our thinking and engagement.”
In the paper, the authors describe the phases of generating creative ideas: problem finding, divergent thinking, selectivity, and convergent thinking.
In the first phase, you come up with opportunities to innovate, in the second you generate ideas, in the third you pick from those ideas, and in the fourth you synthesize distinct ideas and yield new insights. (The process is more nuanced than how I’m describing it here — but that’s the gist.)
AI, the authors write, is particularly good at helping people with certain steps within this process. But it must be used carefully and deliberately.
AI programs, they write, “need significant human judgment both to initiate these phases with ‘problem-finding’ and to make good use of them afterward through selectivity, adaptation, and convergent thinking. That is the human element of the collaboration, in which writers, whether student or professional, are able to exercise the creativity that comes from their interests, experiences, intuitions, and good judgment.”
Bushnell’s video instruction lays all of this out in a conversational tone. He shows students how he might choose from, or ignore, the ideas AI generates when he gives it a prompt. Then he writes his own narrative and asks AI to store it for later review. When he’s ready to review what he has written, he asks for suggestions — not alterations — that will help him deepen or refine his writing. Throughout the process he makes clear that he remains in charge of what he writes.
Bushnell said the fact that many students did worse with unstructured AI use is an important finding. What that means for him is that telling students “you can use AI” is probably not a good idea if it doesn’t come with targeted instruction. Of course, that means instructors would have to understand how to use AI in order to give good advice.
He doesn’t worry that students might borrow ideas from AI during the brainstorming phase, either. Creative writing may be a largely solitary enterprise, he says, “but in other disciplines the way you brainstorm is you collaborate with other people. And that doesn’t mean you go into a meeting and you say, ‘What’s the answer to this?’ And whatever the first person says, you take it. People start throwing out ideas, and you ricochet off that idea towards something else.” Students can use AI in that same way.
Of course, he notes, if all you’re doing is letting AI create the “menu” and then selecting from the options presented, you are outsourcing your thinking. But he hopes that faculty members will work with students to help them avoid that approach.
In fact, he has started to use AI in his own writing. Recently when he finished a short story, before he sent it out to human readers, he discussed it with the chatbot Claude. What are the weaknesses here? Where could I add some description without interrupting the narrative flow? “Having that extra eye, just like from another human,” he says, “helped me see my piece in a new way.”
Bushnell notes that the study has plenty of limitations. It is small, for one (just 31 students), and it focuses on a single exercise. He wonders: If creative people start using AI more, will that fuel their creativity later on? He doesn’t know, but he thinks it’s worth exploring.
Bushnell says that while his inclination is to remain skeptical of AI, the experiment offers some hope. The improvements some students showed after using AI in an informed way “give me a lot of heart and faith in our ability to guide students, perhaps imperfectly,” he says, “toward a more productive use of this tool.”
If you have created an assignment, or a course, in which you have figured out how to harness AI to improve students’ outcomes, write to me at beth.mcmurtrie@chronicle.com and your story may appear in a future newsletter.
Assessing learning with AI
One of the best things about writing this newsletter is that it gives me and Beckie a direct line to readers. We ask for your ideas and experiences, you share them with us, and then we share them with others. It feels like an ongoing conversation at a time when faculty members are tackling some of the most foundational challenges to teaching.
We also receive great questions. Here’s one I got recently from a reader who asked to remain anonymous. It expressed a concern I hear a lot, and wonder about myself, when it comes to AI.
Here’s her question:
“One thing struck me after I returned to your newsletter late this evening: HOW did these professors obtain all these great, deep, thoughtful ‘results’ from their students? Did they run experiments with different controls and variables for different sections? Did they have some students using AI for assignments, etc., and some not using AI?
“The professors expound on how well the students did, but the mechanisms for assessment and assignments are unclear. AI will ‘ask us to think very differently,’ one said. About…? How? How will and do we get students to ‘think differently’ while using sophisticated Google, basically, to answer questions?
“It would be helpful if professors/devotees of AI-linked teaching/assignments could explain more, step-by-step, what they ask students to do; how they assess student work (does everyone get an A/A+ if they prod AI effectively?); how they are so certain that these AI-directed prompts or assignments create better learning outcomes, skills, etc., (than NOT using AI — were we failing students when we didn’t have them querying Big Tech’s new tools?).”
If you teach with AI and are able to explain how you assess students in a way that answers this reader’s questions, write to me at beth.mcmurtrie@chronicle.com and your example may appear in a future newsletter.
Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us at beth.mcmurtrie@chronicle.com or beckie.supiano@chronicle.com.
Learn more at our Teaching newsletter archive page.