“Alright psych adventurers, buckle up,” says a male voice, “we’re diving deep into Dr. Gurung’s Intro to Psych syllabus.” “Yeah,” a female voice replies, “we’re going to equip you with the knowledge to not just survive, but crush this class.”
Neither is human.
The voices appear in the opening minutes of a “podcast” generated by Google’s NotebookLM, an experimental AI tool that promises to help students succeed by turning their notes, readings, and class materials into easily digestible snippets.
The tool is free to use and has been available for more than a year, but new features added in September have caught the attention of students and faculty members. NotebookLM (referred to internally simply as Notebook) now advertises its ability to automatically generate “podcasts” — summaries of documents in the form of a conversation between two synthesized voices.
The new feature is an extension of Notebook’s core offerings, which focus on giving users control over the model’s purview. Rather than training the chatbot’s output on reams of data from across the internet, Notebook invites users to upload a handful of their own documents — be they text, audio, or video.
Such user control has already prompted attempts to create increasingly surreal “episodes” — in which Notebook’s synthesized hosts issue commentary on an alien invasion, the rapture, and, in one particularly noteworthy segment, their own artificial nature and fear of death.
Yet the tool is marketed primarily to students, and Google’s demo site shows a user uploading what appear to be notes and documents from a physics class. In posts on X, Steven Johnson, a developer of Notebook, has encouraged students to upload their own handwritten notes, in addition to recording audio of class sessions from which to generate summaries later.
By limiting its knowledge base, Google claims, Notebook minimizes the fabrications common to other AI tools. Notebook “instantly becomes an expert in the information that matters most to you,” says Google — a promise that some educators have found encouraging.
“To me, that’s a game changer,” said Kevin Steeves, a faculty instructor and co-chair of the AI task force at Lane Community College. “A lot of faculty, they want to bring in the students, but then how do we trust that [AI is] accurate when we’re not there with students to kind of guide them, facilitate how they’re interacting with it?”
Some educators have their doubts, however. For José Antonio Bowen — co-author of Teaching With AI: A Practical Guide to a New Era of Human Learning — Notebook’s biggest risk lies in making it too easy to skip the hard work of learning. “I certainly think it has the potential to short-circuit reading,” said Bowen, noting that a “lively” podcast might seem more appealing than a lengthy reading assignment. That’s a problem, Bowen said, since he’s long reminded his students that “when the discomfort starts, the benefits start.”
Nevertheless, Bowen remains optimistic that Notebook could take its place as a powerful learning tool, as long as faculty members continue to center learning in the student experience. “We’ve got to make sure that we’re talking about learning as well as product,” Bowen said. “I need to make sure the students understand what the learning value is of everything we assign. That is not new technology. That’s just good teaching.”
Enthusiastic Adoption — With Caveats
At the University of California at Riverside, students and faculty members have been making use of Notebook for more than a year at the behest of the institution’s XCITE Center for Teaching and Learning.
Three faculty members who spoke with The Chronicle expressed general satisfaction with the experience, although they had some concerns about the tool’s limitations. Rich Yueh, an assistant professor of teaching in information systems, said that Notebook’s performance declines significantly when it comes to writing or assessing computer code. Carole-Anne Tyler, an associate professor in Riverside’s English department, noted that in rare cases the model can stray beyond its source material and make factual mistakes.
For Yueh, however, Notebook remains exciting for its personal touch. “It’s cool teaching students that you can work with AI,” said Yueh. “I think up to that point, students had been using it just to get an answer. And then now we have not only an answer, but their own notes or notes from the class, and they can start to interact with this content rather than saying ‘this comes from some magical space on the internet.’”
That kind of interaction isn’t limited to students. Regan A.R. Gurung, a professor of psychological science at Oregon State University, has experimented with alternative syllabi, and in 2023 distributed an audiobook-style “syllabus read by the author,” recorded in his own voice. When he first began experimenting with Notebook’s podcast feature, he immediately saw an opportunity to take his approach a step further. “[It] blew me away,” Gurung said. “I don’t know if I could have scripted a podcast on my syllabus that much better.”
“I was really looking for something novel and different that would catch students’ attention,” Gurung continued. In Notebook, he found it. “They’re amused,” he said, after sharing his syllabus podcast with his general-psychology class. “Will they pay more attention to it? We’ll see.”