Here’s how five colleges are building their AI expertise
It doesn’t take a big budget or a deep bench of computer-science researchers for organizations to begin equipping themselves to understand the potential impact of powerful new artificial-intelligence tools. That was the clearest takeaway — and in many ways, the most comforting one — from the many responses to my question last month on how organizations are preparing for an AI-powered future. I appreciate the detailed responses so many of you sent my way.
Notably, many of the interesting ideas I received came from institutions that aren’t giant, and the people heading up the efforts aren’t necessarily techies but rather folks like librarians and professors of writing. I’m impressed by the initiative so many people are taking in developing resources for their colleagues, and I’ll note that many of these resources are available for sharing. I also want to highlight the anxiety over AI that ran through many of the comments — concerns about its use in cheating and about its potential to diminish the value of teaching writing. Clearly that concern runs deep and deserves continued attention.
Following are highlights from five colleges:
Metropolitan State University of Denver runs monthly workshops on AI Empowerment in Higher Education led by Samuel M. Jay, a professor of communications studies and director of faculty affairs. It is also building an AI for All site that is rich with readings, how-to videos, and policy guidance to help professors and staff members. Jay said he plans to spend the summer developing additional videos explaining the best approaches to “prompt engineering” — that’s the skill of crafting questions so AI tools return the most useful responses. He’ll also be working with the English department on policies for the best use of ChatGPT and other generative AI tools. Last year MSU Denver created a generative-AI task force that advises the provost, deans, and others in academic affairs on matters like whether to deploy AI detectors. Having faculty members well represented on committees like this, Jay wrote, is “a great way to lower the anxiety level (just a bit).”
Randolph College, in Virginia, has been engaging on AI issues through one of its committees, the Writing Board, which has been soliciting its faculty members for opinions on and strategies for generative AI. This is more than informational, said one member, Jennifer Gauthier, a professor of media and culture. The board is hoping to use the information to propose changes to the Honor Code, which is entirely managed by students. Also, she said, most professors now realize they need statements on AI use in all their syllabi, and probably for each assignment, too: “It has become clear that AI might be useful and could be allowed as a resource in some circumstances.” (As an instructor teaching students to analyze media messages, however, Gauthier is still wary. The speed at which the tools provide answers, she said, is “truly terrifying to me.”)
Hudson County Community College, in New Jersey, created a GenAI Professional Learning Community as a vehicle for faculty and staff members to explore AI hands-on. Over the past year the community has organized a series of individual and group exercises using the tools to develop lesson plans, prepare meeting agendas, write grant proposals, and the like. The idea is for college personnel to get “a feel for how these tools can be used ethically to increase creativity and productivity,” wrote Matthew LaBrake, executive director of the college’s Center for Online Learning. The hands-on exercises are key. “Experimentation with gen AI,” LaBrake said, “is really a prerequisite to understanding them and their potential impact on both education and industry.”
Marshall University brought together professors, administrators, and technologists for a Presidential Task Force on AI, which has in turn developed templates for language that faculty members might use on their syllabi and some practical guidance for teaching with AI. (The first lesson in that latter document: “AI is here to stay.”) The West Virginia university is also experimenting on the business side. It’s working with AI tools from Microsoft and Amazon Web Services to explore whether the technology can be used to increase efficiency. Some examples, per Jodie Penrod, the university’s chief information officer: Can they spot abnormal end-of-the-year spending or simplify the query systems used for budgeting? Even more exciting — to me, at least: Marshall is also evaluating ways that AI might be able to scan the content of courses offered by other universities, to pave the way for smoother transfer for students.
At Camden County College, in New Jersey, library personnel are leading the charge. Information-literacy librarian Lori Lenox has been teaching both students and instructors the power of using AI tools for research — and the ways they can come up with some pretty bad responses to queries. Andy Woodworth, a systems and web-services librarian, has developed an admirably comprehensive guide to a wide range of topics related to AI and higher ed. Yet in his note to me, Woodworth said he’s now finding it useful to look beyond higher ed — at things like data-center investments, chip-design announcements, and geopolitics — to better understand the AI developments likely to affect colleges in the future. “The AI chatbots that are now being pushed by vendors come from somewhere,” Woodworth wrote, “and understanding the underlying players helps inform me as to what has merit and what has hype.”
Of course, not all the responses described campuses enthusiastically leaning into AI.
Lew Ludwig, a math professor at Denison University and director of its Center for Learning and Teaching, wrote that he’s encountered “mounting avoidance of this new technology” at his liberal-arts college, in Ohio. At the sessions he’s held for professors at the center, he said, “three or four of my 220 faculty show up, even with a stipend.” (He’s also penned a Harry Potter-themed analogy that laments higher ed’s reluctance to wrestle with “yet another potentially disruptive enchantment.”)
In that same blog post, Ludwig channels the message of Ethan Mollick, a Wharton School professor and author of a new book, Co-Intelligence: Living and Working With AI, who argues that professors and others in higher education need to directly explore AI tools. To do otherwise, Ludwig wrote, is “effectively outsourcing our role in shaping the future of this transformative technology to a few tech executives.”
The Supreme Court affirms the CFPB’s future
Throughout its 14-year life, the Consumer Financial Protection Bureau has played a key role in policing the lending practices of for-profit colleges, the credit-card deals colleges offer to their students, and the operating procedures of organizations that service student loans.
Its right to exist has also been frequently under attack. But earlier this month the U.S. Supreme Court ruled, 7 to 2, to uphold the constitutionality of the way it’s funded, a decision that ensures the bureau will stay on the scene for the foreseeable future.
Join our discussion this week on the trust gap
Sometimes it seems that the Mars-Venus analogy applies all too well to administrators and faculty members. But with financial pressures growing and enrollments falling at many colleges, the need for greater understanding — and collaboration — has never been more vital. What does it take to find common ground? Can new models of shared governance bridge these divides? Please join me on Thursday, May 23, at 2 p.m. Eastern, as I explore those questions and dive into related survey data on a Chronicle virtual panel. Sign up here to tune in and pose questions live, or watch on demand.
Got a tip you’d like to share or a question you’d like me to answer? Let me know, at goldie@chronicle.com. If you have been forwarded this newsletter and would like to see past issues, find them here. To receive your own copy, free, register here. If you want to follow me on X, @GoldieStandard is my handle. Or find me on Bluesky Social, which I just joined with the same handle.