Chad Hagen for The Chronicle

It’s Not Just Our Students — ChatGPT Is Coming for Faculty Writing

And there’s little agreement on the rules that should govern it.

The Review | Opinion
By Ben Chrisinger February 22, 2023

Almost immediately after OpenAI released ChatGPT in late November, people began wondering what it would mean for teaching and learning. A widely read piece in The Atlantic that provided one of the first looks at the tool’s ability to put together high-quality writing concluded that it would kill the student essay. Since then, academics everywhere have done their own experimenting with the technology — and weighed in on what to do about it. Some have banned students from using it, while others have offered tips on how to create essay assignments that are AI-proof. Many have suggested that we embrace the technology and incorporate it into the classroom.


While we’ve been busy worrying about what ChatGPT could mean for students, we haven’t devoted nearly as much attention to what it could mean for academics themselves. And it could mean a lot. Critically, academics disagree on exactly how AI can and should be used. And with the rapidly improving technology at our doorstep, we have little time to deliberate.

Already, some researchers are using the technology. Among just the small sample of my work colleagues, I've learned that it is being used for daily tasks such as translating code from one programming language to another, potentially saving hours spent searching web forums for a solution; generating plain-language summaries of published research or identifying key arguments on a particular topic; and creating bullet points to pull into a presentation or lecture.

Even this limited use is complicated. Different audiences — journal editors, grant panels, conference attendees, students — will have different expectations about originality for particular tasks. For example, while peer reviewers might accept translated statistical code, students might balk at AI-generated lecture slides.

But it’s in the realm of academic writing and research where ethical debates about transparency and fairness really come into play.

Recently, several leading academic journals and publishers updated their submission guidelines to explicitly ban researchers from listing ChatGPT as a co-author, or using text copied from a ChatGPT response. Some professors have criticized these bans as shortsightedly resistant to an inevitable technological change. We shouldn’t be surprised at the disagreement. This is a new ethical space that only roughly follows the outlines of our existing agreements on plagiarism, authorship criteria, and fraud. Precisely where to draw red lines is not clear.

For example, the editors of Science have decided that authors should not use text generated by ChatGPT in a submitted manuscript. Fair enough. But can authors use ChatGPT to generate an early outline for a manuscript? Though not an exact copy-paste of text, is that not a copy-paste of AI-generated ideas? Academic research desperately needs a broader set of principles to inform future debates over rules and norms.

What feels most different about ChatGPT compared to other assistive technologies is the possible reduction of intellectual labor. For most professors, writing — even bad first drafts or outlines — requires our labor (and sometimes strain) to develop an original thought. If the goal is to write a paper that introduces boundary-breaking new ideas, AI tools might reduce some of the intellectual effort needed to make that happen.

Of course most papers are not breaking new ground. That’s because academe also features peculiar incentives that could strongly influence how researchers decide if and how to use AI assistance. Most obvious is the pressure to produce writing — and lots of it. This includes journal articles, books, and conference papers, but also proposals for grants and fellowships (which, in turn, lead to more academic writing). For many of those on the tenure track, the number of published works matters, even where “quality over quantity” is emphasized. While we might aspire to high-minded pursuit of new knowledge, in this pressurized environment, sometimes we settle for what’s good enough to satisfy peer reviewers, editors, or grant panels.

Some will see that as a smart use of time, not evidence of intellectual laziness. After all, if we can eliminate the struggle of staring at a blank page and a blinking cursor, won't that leave us much more time for the more creative and exciting parts of academic research? Yes, possibly. But there is real potential for inequality here, especially in departments and fields that value frequent publication. Researchers who adopt AI assistance may raise the bar, leaving behind those who choose not to use it, or who cannot. Notably, our current debates have been sparked by a free version of ChatGPT; pricing structures are likely to be forthcoming.

We can only monitor whether AI technologies are exacerbating existing inequalities in research (or creating new ones) if we know how they are being used. To do this, we can borrow from existing academic models around authorship, like author-contribution statements. One function of these statements is to shine a light on the often-unequal distribution of labor required to produce an academic journal article. Another is to ensure that authors with relatively large contributions are recognized fairly for those inputs.

That question of fairness is an especially difficult one. Each discipline and audience will need time to decide if and why red lines should be drawn, being careful not to stifle innovation while also examining questions of quality, rigor, and equity. Still, we should urgently adopt a principle of transparency for the use of ChatGPT and similar AI technologies.

Our academic systems rely on trust. As a peer reviewer for grants and journal articles, I’ve never used a plagiarism checker or directly questioned the accuracy of an author-contribution statement. Compare this to my students’ essays, which are automatically passed through plagiarism-checking software upon submission. Academics enjoy an environment where we might challenge claims and critique the novelty of ideas, but we rarely question the originality of each other’s written work.

For this system of trust to hold in academe, we must firmly and rapidly commit to transparency around the use of AI. Only then can we hope to have informed and reasoned discussions about what norms and rules should govern academic writing in the future.

A version of this article appeared in the March 3, 2023, issue.
About the Author
Ben Chrisinger
Ben Chrisinger is an associate professor of evidence-based policy evaluation in the department of social policy and intervention at the University of Oxford.