A medieval literature course next term at the University of California at Los Angeles will lean heavily on the use of artificial intelligence, including to generate a single custom textbook, the university announced last week.
The news has triggered skepticism from academics who question whether outsourcing the core functions of a course to AI will cheapen the classroom experience and devalue human expertise.
But the course’s professor said using AI will allow her to teach the material more deeply. “I feel more empowered by this process,” said Zrinka Stahuljak, a professor of comparative literature and French. “I am more of a teacher than I’ve ever been.”
In past versions of the course, Stahuljak assigned readings from two Norton Anthologies and several supplementary materials, like maps and photos, to help students understand a reading’s historical context. She would lecture on the material, then her teaching assistants would help students connect the primary sources to the historical context covered in those lectures.
This time, she fed PowerPoint slides, self-produced YouTube videos, and course notes from previous iterations of the class into an AI platform called Kudu, which consolidated them into one text that she reviewed. Nothing in the book is actually written by AI. “It’s my words, my writing,” she said. The images in the book, like maps and paintings, are not AI-generated either.
The resulting book is a compilation of Stahuljak’s expertise that she says would otherwise be bound by the classroom’s walls. “The knowledge that I have been able to convey as a professor is really ephemeral. It’s in me, it’s in my notes, it’s in my persona, and in my presence. When I go away, I go away,” she said.
Synthesizing the materials in this way will free up class time to be used on analysis, rather than absorbing information, Stahuljak said. It will also make her teaching more accessible and replicable, she added.
AI’s involvement doesn’t stop with the text on the page. Students can chat with the e-book’s large language model, which can answer questions but only by using information from Stahuljak’s uploaded resources. Course assignments that her teaching assistants give will also be AI-generated. While she is still determining the details, she expects the AI to create an “array of possible writing exercises or discussion points.” This way, no matter which TA leads a section, students receive “a more standard, a more coherent, and a more even training.”
Online observers have criticized the announcement, and some have fixated on the textbook’s cover — an AI-generated painting of a medieval landscape with nonexistent Latin words. “UCLA’s new AI-designed literature course has the worst-looking textbook cover I’ve ever seen,” read a Literary Hub headline.
Stahuljak said she was glad to see the mockery because, after all, the cover is a “clever joke.” She said the image illustrates the book’s title of History & Fiction because it combines historically based images with fictional words.
“They’re not mistakes,” she said. “I would be very disturbed if people took it literally and thought that the expert in this class would have accidentally posted an image like that.”
To the broader critique — that the use of AI diminishes human expertise in the humanities — Stahuljak pointed to the intensive human labor that went into the textbook’s creation. “Humans are guiding artificial intelligence in how to create what we want,” she said. “And when it’s not good enough, we do it again, and we do it again. This is an iterative, human-labor-intensive process that is interactive. It’s human-driven.”
Some may see the use of AI as incompatible with a course that studies the written word. Whether to include AI in one’s curriculum is a matter of academic freedom, said C. Edward Watson, vice president for digital innovation at the American Association of Colleges and Universities. “If it doesn’t make sense to be using AI in your course because it’s antithetical to your perceptions of the course, then it’s probably a good idea not to use it within that course.”
Meredith Martin, a professor of English at Princeton University and the director of its Center for Digital Humanities, said discussion of the specifics of the course may miss the broader implications. “It’s just a textbook,” she said. “Whether or not this textbook is good or not is less the issue than what it stands for.”
What it stands for, she says, is the commercialization of education. “We are the products, and we are also the consumers,” she said. “If we want to turn education more into the university-as-a-service model, then absolutely give your intellectual property to a start-up. I think that’s the danger of a model like this.”
Kudu, a start-up company founded by a UCLA professor, compensated Stahuljak for the hours she spent reviewing and editing the material. Stahuljak said that she doesn’t think Kudu will “hijack these materials from me and reuse them in unethical ways.” But who owns the intellectual property of course materials after they are uploaded to Kudu? “That information changes from professor to professor and is privately negotiated,” a UCLA spokesperson said, adding that Kudu is “a corporation with the goal of funding graduate students when and if year-end revenues result in profits.”
Kudu’s founders, Warren Essey and Alexander Kusenko, a UCLA professor of physics and astronomy, said in an email that the professor who creates the learning material owns the intellectual property, and that the company “does not retain copyright” of the material but holds a nonexclusive license to distribute it. “In some cases,” they added, “a university department may wish to have the copyright assigned to the university instead of the professor. In such cases, Kudu signs a publishing agreement with the university.”
If AI-generated, customized textbooks catch on, publishing companies could be affected, said Alexa Joubin, a professor of English at George Washington University and co-director of its Digital Humanities Institute. She doesn’t expect the impact to be swift, however. “Most of the professors in the humanities are conservative,” she said. “They’re more used to traditional print textbooks. So there’s still a market.”
Joubin also still sees the need for primary texts, like Norton Anthologies. “I can see professors maybe experimenting with feeding lecture notes and readings into their specific AI for use in a specific class, but the starting point is still some kind of textbook. You still need a foundation, and I continue to see the necessity of the Norton Anthology. It has to be there.”
Stahuljak said she chose to use Kudu because writing a traditional textbook would take two to three years, compared to the three to four months it takes Kudu to produce a book. Traditional publishers also want a textbook that appeals to a wide market, and Stahuljak didn’t want to abandon the focus of her course. At $25, a Kudu book is also less expensive than a traditional textbook. And unlike other digital course packs, Stahuljak’s textbook is still available to students after they complete the course.
Stahuljak’s class is the first humanities course at UCLA to use Kudu technology, and she plans to use it in other courses as well. Despite the spate of criticism, she sees the textbook as “highly pedagogical and highly responsible.” And to those who disagree, she would like to understand where their opinions originate. “Is it just fear?”