One Year In

ChatGPT Has Changed Teaching. Our Readers Tell Us How.

By Beth McMurtrie and Beckie Supiano December 11, 2023

If you’re wondering how generative AI has changed teaching, consider the classroom of Jeanne Law. An English professor at Kennesaw State University, Law redesigned her first-year writing course to include weekly “AI-infused” discussion threads and assignments. She shows students how to engineer prompts so that AI can help them understand components of rhetoric, such as audience, purpose, context, and genre. She grades their results as if they had done the writing entirely on their own.

In her syllabus, Law, who is also director of composition, has added a section on AI use. It reads, in part: “You will be expected to use AI generative tools in this class, following the instructor’s permissions and directions,” and use them only “on assignments where AI tools are allowed.”

Stephanie Masson, an English instructor at Northwestern State University, in Louisiana, also redesigned her first-year composition courses. Her goal was to discourage AI use. To do that, she breaks essay assignments into smaller steps and has students work together as they brainstorm ideas, participate in group discussions about essay development, and do other peer-review activities.

Masson, too, added an AI policy to her syllabus. “Since writing, analytical, and critical-thinking skills are part of the learning outcomes of this course, all writing assignments should be prepared by the student,” it reads, in part. “Developing strong competencies in this area will prepare you for the competitive work force. Therefore, AI-generated submissions (using ChatGPT, iA Writer, Midjourney, DALL-E, etc.) are not permitted and will be treated as plagiarism.”

Those vastly different approaches to college writing pretty much sum up the responses to generative AI: They’re all over the map.

One year after its release, ChatGPT has pushed higher education into a liminal place. Colleges are still hammering out large-scale plans and policies for how generative AI will be handled in operations, research, and academic programming. But professors have had to adapt their classrooms to its presence more immediately. Those adaptations vary significantly, depending on whether they see the technology as a tool that can aid learning or as a threat that inhibits it.

Some instructors have concluded that the only way to ensure that students are doing their own work, and not outsourcing their thinking, is to require them to submit notes and other artifacts of their work process, and run their assignments through AI-detection software. Other professors are training their students to become skilled AI users, on the premise that they will need to navigate such technologies in the workplace and that these tools can, if used judiciously, enhance their critical-thinking skills. Despair and enthusiasm operate side by side, sometimes within the same department.

To get a detailed look at what is happening AI-wise in classrooms across the country, The Chronicle asked readers to describe the fall semester. We wanted to know if professors had studied up on generative AI, altered what or how they taught, and talked with their students about these new technologies.

We were also curious how students responded. Cheating has been a top concern since ChatGPT appeared in November 2022. We wondered how prevalent it turned out to be this academic year. And, finally, we asked professors what they wanted to happen next. Were they hoping for better AI-detection software to root out cheating? Did they want their campus colleagues to embrace AI? Or did the path forward lie somewhere in between?

Nearly 100 faculty members shared their stories. While not a representative sample, they teach at a wide range of institutions: 15 community colleges, 32 public and 24 private four-year colleges or universities, seven international institutions, and one for-profit college. They teach a variety of subjects, including animal science, statistics, computer science, history, accounting, and composition. Many spent hours learning about AI: enrolling in workshops and webinars, experimenting with the tools, and reading articles, so that they could enter the fall semester informed and prepared.

Changing Course

Fewer than 10 respondents indicated that they had kept their assignments and policies the same. That small number may simply reflect that professors who had experimented with AI — even if they concluded it is a danger to learning — probably had more reason to write to us. Many instructors, for example, suggested that they were out in front of their colleagues when it came to dealing with AI. They were often disappointed with how little progress their colleges had made in coming up with guidance on appropriate AI use or how to handle incidents of suspected misuse.

“My department has no policy. My college has had a variety of workshops, but not all that much,” wrote Steven Greene, a political-science professor at North Carolina State University. “Most everything I have learned on my own and tried to be a resource for my colleagues. Most of whom, honestly, are just sticking their heads in the sand on this.”

Many instructors added language to their syllabus outlining what they considered appropriate AI use (for some it was a full ban) or talked to their students about AI. Frequently, they did both. Even the most enthusiastic advocates strongly encouraged their students to approach generative AI with caution. And some felt it was important to have an ongoing conversation with students so they could talk through the technology’s ethical implications, problems with bias, or how to appropriately cite its use.

In her syllabus, Law, for example, notes that generative AI makes up facts, including false citations; that code generators can be inaccurate; and that image generators can copy original works or produce offensive products.

Ultimately, she tells students, “you will be responsible for any inaccurate, biased, offensive, or otherwise unethical content you submit, regardless of whether it originally comes from you or an AI tool. If you use an AI tool, its contribution must be credited in your submission.” If students try to pass off AI-produced work as their own, in short, she considers it cheating.

Failing to properly cite work produced or shaped by AI is a serious violation in the eyes of most of the respondents. Even when they allow AI use, many want their students to explain how, exactly, they used it. Some, for example, require a screenshot or link to the original text produced by the AI program, so they can see how the student altered it.

A number of faculty members, even if they support AI use in some ways, decided that they had to significantly alter or eliminate certain types of assignments or assessments.

“I used to have take-home exams for my public-policy class that, I discovered, AI was great at writing answers for. Those are no longer take-home, unproctored exams,” wrote Greene. “I assume that students are using AI to help them write better, and heck, maybe even generate some text, but I have tried to design my assignments to minimize the opportunity to genuinely ‘cheat.’”

“I am under no illusions that I can consistently distinguish AI from student writing,” he wrote, adding, “(my colleagues are under this illusion — it is an illusion).”

Students aren’t the only ones using AI. Faculty members have begun using it to help them design their courses, viewing it as a tool that can make instruction more effective and engaging.

In his business-communication class, Carl Follmer, director of the Frank Business Communication Center in the University of Iowa’s Tippie College of Business, created an AI chatbot he calls Impy. It’s “designed to play devil’s advocate to generate counterpoints and different perspectives,” which can be useful, he wrote, when teaching a relatively homogenous group of students. “The chatbot was programmed to politely disagree with everything a student says in 10 sentences or less.”

While the chatbot isn’t a substitute for a real person, he notes, the use of Impy might help students think critically about, and develop empathy toward, people not like themselves. He has also run his lesson plans through AI tools to surface any weaknesses, such as areas that may hold bias or that students might find confusing.
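
The article doesn’t describe how Impy is built, but a devil’s-advocate bot of the kind Follmer describes can be set up with little more than a system prompt. The sketch below is a minimal, hypothetical illustration, assuming the OpenAI Python SDK; the model name and function names are placeholders, and the “politely disagree, in 10 sentences or less” behavior lives entirely in the prompt rather than in any custom training.

```python
# Minimal sketch of a devil's-advocate chatbot in the spirit of Follmer's
# "Impy" (the actual implementation isn't described in the article).
# Assumes the OpenAI Python SDK; the model name is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a classroom devil's advocate. Whatever position the student "
    "takes, politely disagree and offer a counterpoint or a different "
    "perspective. Keep every reply to 10 sentences or fewer, and stay "
    "respectful and constructive."
)

def devils_advocate_reply(history: list[dict], student_message: str) -> str:
    """Append the student's message to the running chat and return a counterpoint."""
    history.append({"role": "user", "content": student_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any chat model would work
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    chat: list[dict] = []
    print(devils_advocate_reply(chat, "Remote work is clearly better for everyone."))
```

The point of the sketch is that the pedagogical behavior — always disagree, stay brief, stay polite — is a prompt-design decision, which is what makes a tool like this feasible for an individual instructor to build.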

Many professors who responded to The Chronicle’s query said they allowed AI in some assignments but not others. A common assignment, for example, is to ask students to use ChatGPT to produce an essay or other material, then to critique the work. Instructors found this useful in a number of ways.

First, they said, they are teaching students how the AI works, which they felt was necessary given the ubiquity of these tools. Several also remarked that they were surprised to learn how little some of their students knew about AI.

The assignments also helped students understand what these tools can and can’t do well. A number of professors, for example, said their students were unimpressed by the results. The writing might be stilted, off topic, or just plain wrong.

“I think some were hoping it would be a Get Out of Jail Free card — no more writing!” wrote Follmer. “Now it turns out that some students actually have to be cajoled into using it because they think it’s easier to write it themselves.”

A third benefit was that students were learning the content of the class by working to improve the AI output. Briana Morrison, an associate professor of computer science at the University of Virginia, used ChatGPT to develop a series of take-home quizzes. She provided students with a specification for a problem along with incorrect code generated by the AI, and asked them to find and fix the errors. They are likely to need such skills in their work, she wrote.
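
Morrison’s actual quiz items aren’t reproduced in the article, but a hypothetical item in the format she describes — a short specification paired with plausible-but-wrong AI-generated code for students to find and fix — might look like this (the function names and spec are invented for illustration):

```python
# Hypothetical quiz item in the format Morrison describes: a specification
# plus incorrect AI-generated code; students must find and fix the bugs.

# Specification: second_largest(nums) returns the second-largest *distinct*
# value in a non-empty list of integers, or None if all values are equal.

# Buggy "AI-generated" version handed to students:
def second_largest_buggy(nums):
    nums.sort()        # bug 1: mutates the caller's list in place
    return nums[-2]    # bug 2: ignores duplicates; crashes on a 1-item list

# Corrected version a student might submit:
def second_largest(nums):
    distinct = sorted(set(nums), reverse=True)  # dedupe without mutating
    return distinct[1] if len(distinct) > 1 else None

assert second_largest([3, 1, 4, 4, 2]) == 3   # buggy version returns 4
assert second_largest([7, 7, 7]) is None      # buggy version crashes conceptually
```

The exercise works because the buggy code is superficially plausible: students have to read the specification carefully and reason about edge cases, which is exactly the skill Morrison says they will need on the job.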

Like many of the respondents, Morrison is fine with students using AI as a study guide. It is good for explaining concepts they may not understand, she noted, and generating practice quiz questions. Others said that they felt it was OK, too, for students to use it to help brainstorm ideas or polish their writing.

But what about the professors who want to minimize — if not eliminate — the use of AI because they believe it interferes with the learning process, inhibits creativity, and undermines authenticity?

Those instructors wrote in to say they are doing more in-class work, which also helps them understand their students’ writing styles and capabilities, in case they need to compare that work with writing they suspect might be AI-generated. Some modified assignments to include personal reflection or references to specific things that ChatGPT would have no knowledge of.

They often broke large projects down into smaller parts so they could go over them step by step, figuring that smaller-stakes assignments lessen the temptation to use AI. Similarly, they lowered the weight they put on out-of-class activities like discussion posts. And some instructors asked students to include their raw materials, like notes or photos of their drafts, when submitting writing assignments.

Only a few people wrote to say that they had shifted to multimedia assignments or other creative options that put less emphasis on traditional essay writing.

An Uneven Landscape

By and large, responses to The Chronicle’s questions suggest that professors’ worries about the scope and severity of students cheating with generative AI have dissipated. Asked what has surprised them, they were more likely to point to how little students use generative AI than to how many of them use it to cheat.

It’s possible, as some respondents noted, that they’re just not aware of cheating when it happens. But many instructors indicated that the measures they’ve added to reduce the likelihood of cheating seem to be working.

“I have not once this semester suspected a student of passing off AI-generated material as their own work or otherwise using AI inappropriately,” wrote Emily Pitts Donahoe, a lecturer in writing and rhetoric at the University of Mississippi. When Donahoe surveyed her students, most indicated, like Follmer’s, that generative AI “was more trouble than it’s worth,” though a handful did use it to brainstorm or to clarify assignments.

“This is, of course, a credit to the students themselves,” wrote Donahoe, who is also associate director of instructional support at the university’s Center for Excellence in Teaching and Learning. “I also think it’s due in part to the careful design decisions I made for the class, the ways we spoke about generative AI in the first weeks, and the culture of trust we’ve created together.” Donahoe is also using ungrading — a practice that de-emphasizes grades in favor of detailed feedback and repeated revisions — she noted, which probably reduces the temptation for students to cheat.

To be sure, some professors did indicate that cheating is a significant concern.

“I feel my role has changed from assistant professor to assistant plagiarism detective,” wrote Lana Seibert, who teaches English as a second language at Missouri Valley College and English for academic purposes at Johnson County Community College, in Kansas. It’s pretty easy for her to tell when students haven’t done their own work, she said: “I can’t say for certain when they have used AI, but I can always tell when they haven’t written something themselves, because English learners make particular types of mistakes which generative AI will not reproduce, even if they are able to teach it their writing style.”

We also heard from several professors who pointed out particular challenges in online courses, where many of the strategies instructors rely on to mitigate cheating don’t work as easily — or at all.

Holly DeGrow’s approach included redesigning several assignments and adding more in-class writing when she teaches in person “to provide a baseline of student work,” she wrote. “However, that doesn’t happen online, which is where more of the AI work seems to be being submitted,” added DeGrow, an instructor of composition and literature at Mt. Hood Community College.

Building the kind of classroom culture that promotes trust, in the way Donahoe described, is also harder online, DeGrow notes. The students in her online classes, she said, feel less connected to her, making it easier for them to justify using AI. “They don’t have the experience of putting their name on something and handing it in to the instructor,” she wrote. “It’s one step removed from that human interaction, and they can simply copy, paste, submit.”

Still, DeGrow said, some of her other efforts do seem to be making a dent. “Revising my policies and assignments (more possibility for extensions, more personal-experience assignments, a reduction in weekly work) has made it easier for students to do the right thing.”

Quite a number of the faculty members who wrote in expressed a desire for more guidance on these and other difficult issues they face in their teaching. That suggests many colleges have yet to make any official policy on student use of AI, and those that have done so have kept it pretty bare bones, offering a simple statement on academic integrity and perhaps some syllabus language for professors to choose from.

“My institution needs to develop a policy,” wrote Daniel Walther, professor and chair of the history department at Wartburg College. “This will give not only faculty clarity, but also students because we will have consistency across the campus.” The college, he adds, plans to have such a policy by the start of the next academic year.

What that policy should be is another question. Some faculty members said they feared colleges failed to recognize the potential dangers of AI, and argued for a complete ban, better detection tools, and a return to in-class, pen-and-paper test-taking. Others worried that higher education is failing to tap into AI’s revolutionary potential. “We need to urgently think about how we will teach students to use the tool effectively,” wrote one, “but it is challenging if we as staff are not sure how to use it either.”

Perhaps one reason colleges aren’t stepping up is that there’s no way to make everyone happy.

About the Author
Beth McMurtrie
Beth McMurtrie is a senior writer for The Chronicle of Higher Education, where she focuses on the future of learning and technology’s influence on teaching. In addition to her reported stories, she is a co-author of the weekly Teaching newsletter about what works in and around the classroom. Email her at beth.mcmurtrie@chronicle.com and follow her on LinkedIn.
About the Author
Beckie Supiano
Beckie Supiano is a senior writer for The Chronicle of Higher Education, where she covers teaching, learning, and the human interactions that shape them. She is also a co-author of The Chronicle’s free, weekly Teaching newsletter that focuses on what works in and around the classroom. Email her at beckie.supiano@chronicle.com.