Chances are, faculty members on your campus have been collectively stressed out over the easy availability of AI tools and what that means for college teaching and learning. And it’s similarly probable that you’re wondering whether, and how, you should deal with this on your fall syllabi.
When it comes to course and syllabus design, few subjects are as fraught as ChatGPT and other large-language-model tools. In recent months, there’s been no shortage of wild claims: ChatGPT will be the “end” of writing instruction; students will use the tool to plagiarize at every opportunity; and we as instructors need to completely rethink our approach to “cheating detection.”
With the fall semester fast approaching, many institutions have yet to adopt formal policies on generative AI tools. It might be tempting to ignore the elephant in the classroom (“my course doesn’t lend itself to AI tools,” you might claim, or you may believe that your assignments can’t be completed by nonhuman labor). But from what students are telling us, the use of ChatGPT and other AI tools is ubiquitous — and the days of brushing them aside in our course and assessment planning are over.
So, yes, you probably do need to add an AI policy to your syllabi. Here are some next steps.
First, do the reading. Before you start drafting, make sure you have an adequate handle on AI tools — what they are, what they can do, and, just as important, what they aren’t and can’t do. Without some baseline knowledge, any course policy you devise may have unintended effects, prove inadequate, or turn out to be unnecessary.
Start with a crash course on the basics, and then delve into the hype and doomsaying around these tools (without falling too far down that rabbit hole). A critical examination of their limitations and (many) flaws often gets buried under all the noise. To gain a sense of what it all means for higher ed:
- In The Chronicle’s pages, you can find a primer on AI and teaching, advice on how to prepare for ChatGPT in your classroom, and tips on how to devise “compelling” writing assignments in an AI era.
- This explainer from Ted Chiang argues that ChatGPT is, as he put it, “a blurry JPEG of the web.”
- John Warner assesses its impact on teaching and learning here, and Zak Cohen offers a slightly more optimistic take. Warner also appeared on a recent episode of the “Teaching in Higher Ed” podcast, where he further explored “teaching writing in an age of AI.”
- An important corrective to the drama can be found in Ian Bogost’s examination in The Atlantic, “ChatGPT Is Dumber Than You Think.”
- In New York Magazine, John Herrman analyzed the shifting nature of ChatGPT, which some have interpreted as a decline in the tool’s effectiveness. That impression was perhaps confirmed by a Stanford research team’s finding that ChatGPT went from producing correct responses to math problems at a rate of 90 percent to a rate of less than 3 percent in recent months.
- Higher-ed futurist Bryan Alexander has compiled the most comprehensive online list I’ve seen of resources (both text and multimedia) on ChatGPT and AI tools.
- Finally, it’s clear that purported “AI cheating detectors” don’t work, and are an insufficient answer to the ChatGPT “problem.”
With an understanding of ChatGPT’s functionality, affordances, and limitations, it’s easier to articulate what its place might be in your course and syllabus — or determine that it doesn’t have a place.
What to factor into your AI policy. As you write, keep one key consideration in mind: Professors will not win an arms race with AI tools, and attempting to wage one is both unrealistic and unsustainable. Your course policy should be clear-cut, but be mindful of its length compared with other policy language on your syllabus. Be specific enough that students understand the rules, but not so in the weeds that they stop reading.
Among the minority of instructors who have already integrated ChatGPT into their teaching, there is no one-size-fits-all approach. Some ask students to engage critically with AI tools, using them in particular tasks with the aim of assessing their strengths and limits. Other instructors seek to pre-empt cheating by explicitly incorporating these tools in assignments. Still others eschew AI entirely, and devise assignments that students must complete without the aid of external content generation.
Whatever approach you choose needs to be conveyed to students transparently. Here are some effective strategies for figuring out how you want to deal with AI on your syllabus.
Revisit your institution’s policies on academic integrity. You may find language there that you can adapt. Or you might not: A common problem for many colleges is that their antiplagiarism policies specifically proscribe the unattributed use of material produced by other people — and ChatGPT is not a person.
If you seek to place limits on the use of generative AI tools for coursework, just be sure to use accurate language in your syllabus policy.
Remember: This landscape is changing quickly. Avoid handcuffing yourself and your students to inflexible policies that can’t be easily modified as circumstances change.
Simply producing a list of “thou shalt nots” will be ineffective and will set the wrong tone for your course. Explain the thinking behind your AI policy. You might use this section of your syllabus to spark a class discussion, not just about ChatGPT but also about knowledge creation, attribution, and citation ethics.
Consider including some language about the limitations and/or inherent biases built into generative AI. Many students (and faculty members) believe these tools are more “trustworthy” dispensers of content than they actually are. Here, too, use your syllabus to open a larger conversation with students.
Be detailed in your dos and don’ts. If it’s OK for students to use these tools for some things (generating ideas or creating outlines) but not for others (writing drafts or preparing bibliographies), say so directly on the syllabus. Make sure it explicitly describes, with examples, which types of coursework fall into which category.
For instance, for assignments in which you allow use of an AI tool, you might indicate on the syllabus something like: “If you use these platforms in [assignment title], please add a note that describes where in your process you used AI and which platform(s) you used.”
Do your assignments, projects, and tests need a makeover? Do the topics, tasks, and criteria lend themselves to easy completion via ChatGPT? To put it bluntly: If an AI tool can earn an A on a particular assignment, that’s likely a problem with the assignment, not the AI tool. Whether you decide to allow or prohibit the use of these tools in a course, Derek Bruff, an educator and writer, offers a useful set of questions for what he calls “assignment makeovers in the AI age”:
- Why does this assignment make sense for this course?
- What are specific learning objectives for this assignment?
- How might students use AI tools while working on this assignment?
- How might AI undercut the goals of this assignment? How could you mitigate this?
- How might AI enhance the assignment? Where would students need help figuring that out?
- Focusing on the process, how could you make the assignment more meaningful for students or support them more in the work?
Workshop your syllabus language with colleagues. A collective approach can help ensure that students aren’t encountering wildly conflicting policies across their courses. And if you work with peers and colleagues beyond your own campus, you’ll discover not only that you aren’t alone in struggling to figure this “AI thing” out, but also that many teachers and scholars have already put a lot of thought and care into how they communicate with students about AI tools, coursework, and academic integrity.
You can find some guidance in this crowdsourced document where academics are sharing syllabus language: “Classroom Policies for AI Generative Tools,” curated by Lance Eaton, a doctoral student in higher education at the University of Massachusetts at Boston. It’s one of the most helpful and comprehensive such documents I’ve come across as of this writing. (Feel free to add your own syllabus language if it adds to the variety of examples available there.)
This is not the end of us. Finally, it’s worth repeating that, despite many loud proclamations to the contrary, ChatGPT and AI tools are not the end of higher education. Yes, the scale and ease with which they do their work are new, but the strategies for dealing constructively with this issue are the same ones that underlie effective syllabi in general:
- Use student-centered language in your AI policy (as opposed to an impersonal, punitive tone).
- Be as transparent and precise as possible.
- Provide tangible examples.
- And use the syllabus as a launching pad for the important conversations you want to have throughout the course.