Advice

Artificial Intelligence: A Graduate-Student User’s Guide

AI can play a positive role in a doctoral student’s research and writing — if we let it.

By Leonard Cassuto | July 25, 2023
Illustration by Harry Campbell for The Chronicle

Ever since the release of ChatGPT last November, higher education has been engulfed in a deluge of commentary about artificial intelligence. ChatGPT’s ability to converse with its users has provoked debates on the future of educational research, and especially of teaching. But what does AI mean for graduate students? That question has received scant attention.

As teachers of introductory courses, graduate students are in the same boat with AI tools as their professors, but they are seated in the front and getting drenched by the spray. As researchers, the position of doctoral students is more like that of undergraduates, except the use of AI tools in their research and writing raises even more red flags.


AI and graduate-student teaching. The question of how to teach — and especially how to assess undergraduate work — in the age of AI is being batted about like a piñata in the educational public square. The problem is real, so the anxious pursuit of solutions figures to go on awhile.

Even in these early days of ChatGPT, many undergraduates have proved eager to let mediocre, AI-authored writing substitute for their own work. In empirical subjects, machines can easily outperform humans. As Bryan Caplan, a professor of economics at George Mason University, said in a recent Chronicle forum (“How Will Artificial Intelligence Change Higher Ed?”), it will be all too easy for students to use chatbots to write papers outside of class. And, he added, “unless the exams are in person, they’ll be a farce, too. I’m known for giving difficult tests, yet GPT-4 already gets A’s on them.”

The challenge of cheating-detection has already loosed a torrent of hopes, suggestions, hand-wringing, and apocalyptic warnings. That storm will settle at some point, and generative AI will probably find its place in educational practice. In the short term, here are some steps for graduate-student instructors to consider:

  • You don’t need to be Nostradamus to predict that you will need more in-person exams and in-class writing assignments. This summer, think about how you can adjust your syllabus to confront these realities.
  • Departments and institutions are still developing policies on AI and cheating. If your university is still working on its policy, consider devising one of your own in the interim to post on your syllabus.
  • Read as much as you can on ChatGPT and teaching. Play around with AI tools enough to know how they work. After all, your generation of faculty is on the front lines of this issue, and it’s your teaching that will be affected most.

AI and doctoral research. To what degree ChatGPT and other tools will upend the classroom status quo remains to be seen. But AI can play a positive role in a graduate student’s research and writing — if we let it. There’s already a tendency among faculty members to criminalize the use of AI. That’s appropriate if students are using it to cheat. But not every use of AI is cheating.

A note to professors: Your graduate students probably use AI already. But that doesn’t mean that they’re getting away with something — and we shouldn’t act as though they are. AI isn’t new. The internet has long relied on it. You use it when you do a Google search, for instance. Predictive text — when your phone completes the word you’re typing or offers you the next one — is another instance of how we’ve long been splashing in the shallow end of the AI pool.

As such examples suggest, there is no bright line between “my intelligence” and “other intelligence,” artificial or otherwise. It’s an academic truism that no idea exists in an intellectual vacuum. We use other people’s ideas whenever we quote or paraphrase. The important thing is how.

Writing is a process that often involves collaboration. Writers benefit from feedback, whether from peers or teachers. AI models (there are more on the way) can be collaborators of sorts — provided that you recognize their limitations and work within them, as you would with any collaborator. (Some scholarly organizations explicitly allow for the use of AI now.) You might show your work to one colleague because you know she’s great on the sentence level but not at assessing your whole argument. With another colleague, it might be the other way around.

Likewise, AI tools have strengths and weaknesses. Large language models like ChatGPT are good at generating a lot of basic information about well-known subjects very quickly. They’re also adept at summarizing. Those can be useful advantages for graduate students, especially in the early stages of research. A key weakness is that conversational AI presents its findings in generic, mediocre prose.

Ethan Mollick, an associate professor at the University of Pennsylvania’s Wharton School, recently compared AI to “a high-end intern,” while Merve Tekgürler, a graduate student studying history and symbolic systems at Stanford University, described it in an email as “a thesaurus with context.” Given that AI tools are powerful and potentially useful, the question is: How can graduate students best use them in their scholarship? Among the many options:

  • After you write a draft of a chapter or an essay, feed the draft into a chatbot and compare the ideas it spits back with your own (a minimal sketch of this appears after the list).
  • Use an AI tool to brainstorm, as did Darryll J. Pines, president of the University of Maryland at College Park, when he was preparing a speech recently.
  • To save time, use ChatGPT to write a rough draft of your syllabus or other such documents that adhere to widely accepted templates. Then shape the result to make it your own.
  • Allow AI to help you with the details. I recently asked ChatGPT for the proper bibliographic format to cite a particular eccentric source. It supplied a useful answer right away.
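
To make the first of those options concrete, here is a minimal sketch of what “feed your draft to a chatbot” can look like in practice. It is illustrative only: it assumes the OpenAI Python client (version 1 or later), an API key stored in the OPENAI_API_KEY environment variable, and a draft saved as chapter_draft.txt; the model name and the prompt wording are placeholders, not recommendations from this column.

    # A sketch only: compare a chatbot's summary of your draft with the
    # argument you meant to make. Assumes the OpenAI Python client (v1+)
    # and an OPENAI_API_KEY in the environment; model and file names are
    # placeholders.
    from pathlib import Path

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    draft = Path("chapter_draft.txt").read_text(encoding="utf-8")

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use whatever model you have access to
        messages=[
            {
                "role": "user",
                "content": (
                    "Summarize the main claims of this draft chapter as a "
                    "numbered list, so the author can compare them with the "
                    "argument they intended to make:\n\n" + draft
                ),
            }
        ],
    )

    # Print the model's summary. The point is to see what the chatbot thinks
    # you argued and compare it with your own outline, not to paste its prose
    # into the chapter.
    print(response.choices[0].message.content)

If the summary misses what you thought was your central claim, that gap is the useful information: the machine’s reading is a check on your draft, not a replacement for it.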

Those are just a few of the possibilities. In each case, you are using AI to spark your thinking or to shortcut some time-consuming busy work. And that’s fine, so long as you keep these cautions in mind:

ChatGPT works best on subjects that are widely written about. The reason is simple: large language models are trained on vast amounts of text pulled from the internet. (ChatGPT isn’t searching the web live when it answers you; it draws on what it absorbed during training.) The more that has been written on the web about a given subject, the more the AI will seem to know about it.

If you ask for an AI boost on an obscure subject, one of two things will happen: Either the AI will come up empty, or it will make stuff up. (Yes, it really does that. How very human.)

Don’t rely on AI to know things instead of knowing them yourself. AI can lend a helping hand, but it’s an artificial intelligence that isn’t the same as yours. One scientist described to me how younger colleagues often “cobble together a solution” to a problem by using AI. But if the solution doesn’t work, “they don’t have anywhere to turn because they don’t understand the crux of the problem” that they’re trying to solve.

The educational world is rapidly filling with stories of students who submit AI-written papers containing errors that the students don’t catch because they never bothered to learn the material themselves. Those transgressions will receive their just deserts, whether from teachers, from supervisors, or at the Final Judgment. My point is simply that, as a writer, you have to know the stuff you’re writing about in order to do a good job. If you rely on AI to do the thinking, you become the curator, not the author, of the writing that results.

And without an author, the writing will be bloodless. “These large language models [like ChatGPT] will never have anything related to human emotions,” said a Colorado geoscientist I interviewed. “Emotions, including just the standard motivations that cause us to do anything at all, are completely lacking.”

Emotionless writing might be OK for a user’s manual that tells you how to work your new air conditioner. But scholarly writing — not just in the humanities but across the disciplines — needs sensibility. Luckily, sensibility is something that humans, both in and out of graduate school, have plenty of. So keep these cautions in mind, and go ahead and add AI to the research tools at your disposal. Just remember: Use it to help you, not be you.

Tags: Graduate Education, Teaching & Learning, Technology, Scholarship & Research
About the Author
Leonard Cassuto
Leonard Cassuto is a professor of English at Fordham University who writes regularly for The Chronicle about graduate education. His newest book is Academic Writing as if Readers Matter, from Princeton University Press. He co-wrote, with Robert Weisbuch, The New Ph.D.: How to Build a Better Graduate Education. He welcomes comments and suggestions at cassuto@fordham.edu. Find him on X @LCassuto.