So he has been incorporating AI carefully into his classes. In a course called “Becoming Human,” he shows students, step by step, how to develop effective prompts that define what they want AI to do, supplying it with the right background and examples to get the most from it. He encourages students to use it in a variety of ways, including, if they want, to assist with a final project in which they create a game about human evolution. They might use AI, for example, to brainstorm the rules of the game.
“All tools can be exploited in a good or a negative way,” he says. “Education and learning how to use it is our major tool to [help students] use it in a good way.”
Some of his departmental colleagues see AI only as a cheating machine. So another one of Pozzi’s roles is to help counter limited or negative views of these tools by talking with other professors and showing them how he uses it. In that, he is supported by a group of peers doing something similar in their own departments and disciplines across campus.
Many colleges are struggling with the challenge of creating an AI-literate campus. How do you teach students about AI if many of your faculty members are also novices? For UTSA, one approach has been through peer-to-peer engagement.
The university’s Generative AI Peer Learning Network, created in early 2023, brings together people across campus to help adapt higher education to AI. It includes 60 faculty members and 28 others from the divisions of academic innovation, student success, student affairs, technology, institutional research, and the library. Pozzi is one of its members.
The goal has been to create a hub where faculty can address different aspects of AI, says Claudia Arcolin, executive director for teaching and learning experiences in the university’s department of teaching, learning, and digital transformation. For some that might mean experimenting with AI in teaching with the help of instructional designers. For others it could mean working with someone in the student-conduct office to address academic-integrity concerns.
Another member of the peer-learning network is Sue Hum, a professor of English. She has been teaching her students how to combine primary archival research and AI prompting to create multimedia projects focused on overlooked parts of history. Through the process, she instructs students about how AI might have its own biases toward dominant narratives. “Largely what they learned is that they had to be in the driver’s seat,” she says.
The projects are designed to be informative and accessible. Students might use AI to help identify scholarly sources, generate counterarguments to their ideas, organize information, and edit their writing for clarity and audience, among other things.
“The projects that I got are far more complex,” she says, than they would have been without AI use. “I’m able to ask them to do more. They’re not just doing research, they’re engaging in multimodal literacy.”
One student, Sydney Grona, focused her project, “HemisFair ’68: The Forgotten Rubble,” on how a fair designed to celebrate San Antonio’s 250th anniversary led to the displacement of more than 1,600 residents. Grona used AI to research why HemisFair was harmful for certain communities and to rework parts of her written narrative, among other things.
Hum sees another one of her roles as broadening the campus conversation around AI. “This is very much a grassroots endeavor, where multiple people are having input and participation and we’re learning together,” she says.
To that end, she is working with the academic-innovation division on creating modules to teach students how to develop effective prompting techniques, similar to what Pozzi does with his students. Hum believes that professional expertise matters more than ever when it comes to teaching students to use AI appropriately.
“That’s what we’ve been learning together, my students and I, is how expertise matters, and how specific prompting — really good specific prompting — matters,” she says. “I don’t think it’s going to do away with creative thinking or critical thinking. I think it’s going to ask us to think very differently. But ultimately, the takeaway is you need to know more rather than less.”
Has your campus created peer networks to foster AI literacy? Or has your campus found other ways to support professors in their efforts to investigate how AI connects to their work in the classroom? If so, write to me at beth.mcmurtrie@chronicle.com, and your approach may be featured in a future newsletter.
Teaching AI literacy
In my most recent story, I raise a pressing question: Should colleges teach students to be AI literate? Whether your immediate reaction is “of course” or “no way!” I encourage you to read it and learn what your peers on other campuses are thinking. While many professors continue to see AI as harmful to the development of students’ critical-thinking and communication skills, even some skeptics believe that AI literacy is valuable.
The reason is twofold. One is that AI is now everywhere: in our search engines, social-media feeds, and digital writing tools, to name a few. ChatGPT is quickly becoming a go-to tool for many people, including students. Whether they are thinking critically about what AI is and what it produces, though, is not a given.
The other reason is that AI literacy requires being able to understand and evaluate AI, including its many limitations. That critical lens is what has drawn some academics to the concept, on the grounds that AI literacy is simply an extension of the communication, information-literacy, and critical-thinking skills that colleges are well equipped to teach.
I’ll be reporting more this year on how AI is shaping higher education, with a focus on how it’s affecting students and professors in their day-to-day lives. How is AI use influencing our uniquely human traits, like reasoning and empathy? In what ways are students using AI as a study tool, and how does that inform their learning? Are professors using AI much in their teaching, whether to lighten the burden of tedious tasks or engage students creatively?
If you have ideas on what you’d like me to explore, or want to share some burning questions you have had about AI, drop me a line at beth.mcmurtrie@chronicle.com. I welcome your thoughts.
Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us at beth.mcmurtrie@chronicle.com or beckie.supiano@chronicle.com.
Learn more at our Teaching newsletter archive page.