Advice

Why You Should Rethink Your Resistance to ChatGPT

How to teach with AI tools in ways that meet faculty concerns about ethics and equity.

By Flower Darby November 13, 2023
Illustration by The Chronicle, iStock

The resistance began immediately. After I wrote an essay last summer on preparing to teach with AI tools, the very first comment I received was from an instructional designer casting doubt. Many faculty members, she said, had valid ethical concerns about AI and had no plans to use ChatGPT in their courses any time soon.

We’re well into the fall semester, and I am still seeing faculty members divide into three main camps over the ethics of bringing these problematic tools into our teaching:

  • There are, of course, plenty of AI enthusiasts. They are embracing the technology and designing assignments to help students understand how to use it.
  • Another group (and I include myself here) is what I would call the AI realists: We see legitimate ethical concerns, but given that ChatGPT is here to stay, we favor figuring out how to use it and how to equip students (and ourselves) for a rapidly changing workplace.
  • Finally, there is a significant pool of AI resisters. I hear from many of them when I give talks on the subject. In early fall, for example, I gave a virtual presentation to a group of community colleges, and two strong naysayers insisted that it was unethical of me even to encourage faculty members to bring this “biased” tool into our courses.

No doubt that mix of reactions is as much here to stay as AI. I understand the resistance. But I also share the growing sentiment in college-teaching circles that “if you’re not using AI, you’re falling behind.” We do our students a disservice — and we do not advance equitable outcomes in education or society at large — if we refuse to incorporate ChatGPT and other AI tools in the college classroom.

I got to thinking more about the ethical and equity aspects of teaching with AI (or refusing to) after I read a blog series on the topic by Leon Furze, a writer and educational consultant. Some of the faculty reluctance is political: Instructors worry about the human-labor and environmental costs of AI, and argue that the new tools seem to reflect and reinforce existing online biases. Another cost: A lot of instructors, especially those in contingent positions, may not have the time to invest in becoming teaching-with-AI experts, adding another layer of inequity to an already imbalanced system of faculty haves and have-nots.

Some of the concerns are about privacy. When professors ask students to use a particular AI tool for class, students often have to create a login. Do they realize how much personal data they may have to surrender to use tools that are external to the campus IT systems? That data may later be used for unsavory purposes, wrote Justin Reich, director of the Teaching Systems Lab at the Massachusetts Institute of Technology, in Failure to Disrupt: Why Technology Alone Can’t Transform Education.

At the same time, recent college graduates are anxious about how AI will affect them as job seekers, according to a July survey. Yet an analysis of the results suggests that some faculty members are not stepping up to meet that need: “About 54 percent of students said their instructors didn’t openly discuss the use of AI tools, and 60 percent of students said their instructors or schools didn’t specify how to use AI tools ethically or responsibly.”

A more recent study found that nearly half (49 percent) of college students are using generative AI tools, compared with only 22 percent of faculty members. That disconnect reflects faculty hesitation as much as outright resistance. The study also shows a continuing lack of institutional guidance or policies on AI use in teaching and learning, and underscores the need for such policies to address ethical and equity considerations.

You may never be an AI enthusiast, but banning the technology from your courses will never work. Our students are resourceful and, in using ChatGPT on assignments, are simply doing what humans have always done: taking advantage of available tools to reduce their workload, especially when the work is perceived to be difficult, time-consuming, unimaginative, and unrewarding.

What I advocate here is choosing the middle ground. If we aim to prepare today’s students to tackle tomorrow’s problems, we would do well to teach them how to think about and use AI tools to enhance their work. Neglecting to do so disadvantages students and may exacerbate existing inequities, many of which fall along racial and socioeconomic lines. We can and should teach them (and ourselves) to streamline mundane tasks in order to free up more time and cognitive resources for the work that chatbots can’t do: genuine creativity and higher-order thinking, both of which are still unique to human intellect.

To that end, we should bring AI into our syllabi, class activities, and assignments. Here are five ideas to get you started.

Be explicit about the use of AI tools in your class. Invite students to help shape your course policy with comments and suggestions. As educational developer Maha Bali argues in a video on AI literacy, the key to an effective policy is transparency, regarding both (a) how students can, and can’t, use AI for classwork, and (b) how to disclose that use in an assignment.

Add your AI policy to the syllabus. Better yet, collaborate with colleagues to draft departmental and collegewide policies. Create a public draft of the proposed policy and invite students to comment on it, ask questions about it, and suggest changes to the wording.


Teach them how to use AI tools appropriately. In her video, Bali, who teaches digital literacies at the American University in Cairo, says she shows students how they might benefit from using AI in her class. For example, she encourages them to use AI to generate ideas, refine their first drafts, or even start an assignment with a ChatGPT-created draft and then make it their own. Writing instructors, and others for whom writing itself is a learning outcome, may not want to take that approach, but Bali’s example is instructive: It reminds faculty members to carefully analyze whether using AI will help or hurt students’ achievement of course outcomes and assignments.

Another example: Show students how to use generative AI to create visuals or logos for course projects. This strategy could be applied if you assign students to create websites, marketing campaigns, social-media posts, or promotional materials.

For more ideas on this front:

  • Daniel Stanford, an instructional-tech consultant and blogger, provides a helpful (and not too overwhelming) list of ways to incorporate AI in teaching.
  • Ethan Mollick, an associate professor of management at the University of Pennsylvania, wrote “Seven Ways of Using AI in Class” (plus a whole lot more on AI in various blog posts).
  • For those with time and energy to spend, peruse the open-access book 101 Creative Ideas to Use AI in Education.

Demonstrate in class how scholars in your discipline might use AI. In a recent episode of his Assess Without the Stress podcast, Caleb Curfman, a history instructor at Northland Community and Technical College, in Minnesota, told me how he modeled the effective use of AI in his history course in the spring of 2023. Initially, he said, he was anxious about how these tools would affect his courses. Then he landed on an in-class activity in which he and his students asked ChatGPT to design the perfect government and then suggested additional questions and prompts to refine the output. The approach helped students learn how to use an AI tool without requiring each of them to create a login.


Analyze AI results in class. Ask student teams to generate text, images, or code, and then evaluate the chatbot’s results. Like Curfman’s classwide exercise, this strategy avoids requiring every student to create a login, which matters for those who would rather not. Assigning such tasks in groups makes it likely that at least one student on each team already has access to one or more AI tools.

Still not convinced you should teach with AI? Then teach about AI. Assign activities designed to help students become more critical consumers of AI. Autumm Caines, an instructional designer at the University of Michigan at Dearborn, offered several examples of how to encourage students to weigh the ethical issues surrounding AI, such as discussing climate and labor concerns related to the development and use of AI, or conducting a “technoethical audit” of tech that students might be asked to use in other classes.

Well-meaning faculty members have argued that it’s better not to require the use of AI in coursework because of the many and varied ethical concerns, from the privacy of student data to unequal access to technology. But realistically, it seems very likely that most students are already using chatbots on their phones and other devices, perhaps to cut corners on tasks they perceive to be tedious, busywork, or otherwise meaningless. Why leave the other students in the dark?

We can better prepare all students for the future if we teach them the responsible use of AI in class.

A version of this article appeared in the December 8, 2023, issue.
About the Author
Flower Darby
Flower Darby is an associate director of the Teaching for Learning Center at the University of Missouri at Columbia and a co-author of The Norton Guide to Equity-Minded Teaching, published in March 2023. Find her on LinkedIn.