Chad Hagen for The Chronicle

It’s Not Just Our Students — ChatGPT Is Coming for Faculty Writing

And there’s little agreement on the rules that should govern it.

The Review | Opinion
By Ben Chrisinger February 22, 2023

Almost immediately after OpenAI released ChatGPT in late November, people began wondering what it would mean for teaching and learning. A widely read piece in The Atlantic that provided one of the first looks at the tool’s ability to put together high-quality writing concluded that it would kill the student essay. Since then, academics everywhere have done their own experimenting with the technology — and weighed in on what to do about it. Some have banned students from using it, while others have offered tips on how to create essay assignments that are AI-proof. Many have suggested that we embrace the technology and incorporate it into the classroom.

While we’ve been busy worrying about what ChatGPT could mean for students, we haven’t devoted nearly as much attention to what it could mean for academics themselves. And it could mean a lot. Critically, academics disagree on exactly how AI can and should be used. And with the rapidly improving technology at our doorstep, we have little time to deliberate.

Already some researchers are using the technology. Even among the small sample of my own colleagues, I’ve learned that it is being used for daily tasks such as translating code from one programming language to another (potentially saving hours spent searching web forums for a solution), generating plain-language summaries of published research, identifying key arguments on a particular topic, and creating bullet points to pull into a presentation or lecture.

Even this limited use is complicated. Different audiences — journal editors, grant panels, conference attendees, students — will have different expectations about originality for particular tasks. For example, while peer reviewers might accept translated statistical code, students might balk at AI-generated lecture slides.

But it’s in the realm of academic writing and research where ethical debates about transparency and fairness really come into play.

Recently, several leading academic journals and publishers updated their submission guidelines to explicitly ban researchers from listing ChatGPT as a co-author, or using text copied from a ChatGPT response. Some professors have criticized these bans as shortsightedly resistant to an inevitable technological change. We shouldn’t be surprised at the disagreement. This is a new ethical space that only roughly follows the outlines of our existing agreements on plagiarism, authorship criteria, and fraud. Precisely where to draw red lines is not clear.

For example, the editors of Science have decided that authors should not use text generated by ChatGPT in a submitted manuscript. Fair enough. But can authors use ChatGPT to generate an early outline for a manuscript? Though not an exact copy-paste of text, is that not a copy-paste of AI-generated ideas? Academic research desperately needs a broader set of principles to inform future debates over rules and norms.

What feels most different about ChatGPT compared to other assistive technologies is the possible reduction of intellectual labor. For most professors, writing — even bad first drafts or outlines — requires our labor (and sometimes strain) to develop an original thought. If the goal is to write a paper that introduces boundary-breaking new ideas, AI tools might reduce some of the intellectual effort needed to make that happen.

Of course most papers are not breaking new ground. That’s because academe also features peculiar incentives that could strongly influence how researchers decide if and how to use AI assistance. Most obvious is the pressure to produce writing — and lots of it. This includes journal articles, books, and conference papers, but also proposals for grants and fellowships (which, in turn, lead to more academic writing). For many of those on the tenure track, the number of published works matters, even where “quality over quantity” is emphasized. While we might aspire to high-minded pursuit of new knowledge, in this pressurized environment, sometimes we settle for what’s good enough to satisfy peer reviewers, editors, or grant panels.

Some will see that as a smart use of time, not evidence of intellectual laziness. After all, if we can eliminate the struggle of staring at a blank page and blinking cursor, won’t that leave us much more time for the more creative and exciting parts of academic research? Yes, possibly. But there is critical room for inequality here, especially in departments and fields that value frequent publication. Researchers who adopt AI assistance may raise the bar, leaving behind those who choose not to use it, or who cannot. Notably, our current debates have been sparked by a free version of ChatGPT; pricing structures are likely to be forthcoming.

We can only monitor whether AI technologies are exacerbating existing inequalities in research (or creating new ones) if we know how they are being used. To do this, we can borrow from existing academic models around authorship, like author-contribution statements. One function of these statements is to shine a light on the often-unequal distribution of labor required to produce an academic journal article. Another is to ensure that authors with relatively large contributions are recognized fairly for those inputs.

That question of fairness is an especially difficult one. Each discipline and audience will need time to decide if and why red lines should be drawn, being careful to not stifle innovation while also examining questions of quality, rigor, and equity. Still, we should urgently adopt a principle of transparency for the use of ChatGPT and similar AI technologies.

Our academic systems rely on trust. As a peer reviewer for grants and journal articles, I’ve never used a plagiarism checker or directly questioned the accuracy of an author-contribution statement. Compare this to my students’ essays, which are automatically passed through plagiarism-checking software upon submission. Academics enjoy an environment where we might challenge claims and critique the novelty of ideas, but we rarely question the originality of each other’s written work.

For this system of trust to hold in academe, we must firmly and rapidly commit to transparency around the use of AI. Only then can we hope to have informed and reasoned discussions about what norms and rules should govern academic writing in the future.

A version of this article appeared in the March 3, 2023, issue.
About the Author
Ben Chrisinger
Ben Chrisinger is an associate professor of evidence-based policy evaluation in the department of social policy and intervention at the University of Oxford.