Illustration: Jerome Corgier for The Chronicle

Will Artificial Intelligence Kill College Writing?

Online programs can churn out decent papers on the cheap. What now?

The Review | Opinion
By Jeff Schatten September 14, 2022

When I was a kid, my favorite poem was Shel Silverstein’s “The Homework Machine,” which summed up my childhood fantasy: a machine that could do my homework at the press of a button. Decades later that technology, the innocuously titled GPT-3, has arrived. It threatens many aspects of university education — above all, college writing.

The web-based GPT-3 software program, developed by OpenAI, an artificial-intelligence lab co-founded with early backing from Elon Musk, is a kind of omniscient Siri or Alexa that can turn any prompt into prose. You type in a query — say, a list of ingredients (what can I make with eggs, garlic, mushrooms, butter, and feta cheese?) or a genre and prompt (write an inspiring TED Talk on the ways in which authentic leaders can change the world) — and GPT-3 spits out a written response. These outputs can be astonishingly specific and tailored. When asked to write “a song protesting inhumane treatment of animals in the style of Bob Dylan,” the program clearly draws on themes from Dylan’s “Blowin’ in the Wind”:

How many more creatures must suffer?
How many more must die?
Before we open up our eyes
And see the harm we’re causing?

When asked to treat the same issue in the style of Shakespeare, it produces stanzas of iambic tetrameter in appropriately archaic English:

By all the gods that guide this Earth
By all the stars that fill the sky
I swear to end this wretched dearth
This blight of blood and butchery.

GPT-3 can write essays, op-eds, tweets, jokes (admittedly just dad jokes for now), dialogue, advertisements, text messages, and restaurant reviews, to give just a few examples. Each time you click the “submit” button, the model generates a unique output from the patterns it has learned across a vast swath of the internet’s text, so that no two end products are the same.
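
For readers curious about what sits behind the web form, the sketch below shows how the same kind of prompt could be sent programmatically. It is illustrative only: it assumes OpenAI’s Python client as it existed in 2022 and one of its GPT-3 models, and the API key, model name, and settings are assumptions rather than details reported in this essay.

    # Illustrative sketch only: querying a GPT-3 model through OpenAI's
    # 2022-era Python client. The model name, key, and parameters are
    # assumptions; the essay itself describes the web interface.
    import openai

    openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

    response = openai.Completion.create(
        model="text-davinci-002",  # one GPT-3 model available at the time (assumed)
        prompt=("Write a song protesting inhumane treatment of animals "
                "in the style of Bob Dylan."),
        max_tokens=256,
        temperature=0.7,  # nonzero temperature, so each run yields different text
    )

    print(response.choices[0].text.strip())

Because sampling at a nonzero temperature is nondeterministic, two identical prompts will generally return different completions, which is why no two end products are the same.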

The quality of GPT-3’s writing is often striking. I asked the AI to “discuss how free speech threatens a dictatorship, by drawing on free speech battles in China and Russia and how these relate to the First Amendment of the U.S. Constitution.” The resulting text begins, “Free speech is vital to the success of any democracy, but it can also be a thorn in the side of autocrats who seek to control the flow of information and quash dissent.” Impressive.

The current iteration of GPT-3 has its quirks and limitations, to be sure. Most notably, it will write absolutely anything. It will generate a full essay on “how George Washington invented the internet” or an eerily informed response to “10 steps a serial killer can take to get away with murder.” In addition, it stumbles over complex writing tasks. It cannot craft a novel or even a decent short story. Its attempts at scholarly writing — I asked it to generate an article on social-role theory and negotiation outcomes — are laughable. But how long before the capability is there? Six months ago, GPT-3 struggled with rudimentary queries, and today it can write a reasonable blog post discussing “ways an employee can get a promotion from a reluctant boss.”

Since the output of every inquiry is original, GPT-3’s products cannot be detected by anti-plagiarism software. Anyone can create an account for GPT-3. Each inquiry comes at a cost, but it’s usually less than a penny — and the turnaround is instantaneous. Hiring someone to write a college-level essay, in contrast, currently costs $15 to $35 per page. The near-free price point of GPT-3 is likely to entice many students who would otherwise be priced out of essay-writing services.

It won’t be long before GPT-3, and the inevitable copycats, infiltrate the university. The technology is just too good and too cheap not to make its way into the hands of students who would prefer not to spend an evening perfecting the essay I routinely assign on the leadership style of Elon Musk. Ironic that he has bankrolled the technology that makes this evasion possible.

To help me think through what the collision of AI and higher ed might entail, I naturally asked GPT-3 to write an op-ed “exploring the ramifications of GPT-3 threatening the integrity of college essays.” GPT-3 noted, with mechanical unself-consciousness, that it threatened to “undermine the value of a college education.” “If anyone can produce a high-quality essay using an AI system,” it continued, “then what’s the point of spending four years (and often a lot of money) getting a degree? College degrees would become little more than pieces of paper if they can be easily replicated by machines.”

The effects on college students themselves, the algorithm wrote, would be mixed: “On the positive side, students would be able to focus on other aspects of their studies and would not have to spend time worrying about writing essays. On the negative side, however, they will not be able to communicate effectively and will have trouble in their future careers.” Here GPT-3 may actually be understating the threat to writing: Given the rapid development of AI, what percent of college freshmen today will have jobs that require writing at all by the time they graduate? Some who would once have pursued writing-focused careers will find themselves instead managing the inputs and outputs of AI. And once AI can automate that, even those employees may become redundant. In this new world, the argument for writing as a practical necessity looks decidedly weaker. Even business schools may soon take a liberal-arts approach, framing writing not as career prep but as the foundation of a rich and meaningful life.

So what is a college professor to do? I put the question to GPT-3, which acknowledged that “there is no easy answer to this question.” Still, I think we can take some sensible measures to reduce the use of GPT-3 — or at least push back the clock on its adoption by students. Professors can require students to draw on in-class material in their essays, and to revise their work in response to instructor feedback. We can insist that students cite their sources fully and accurately (something that GPT-3 currently can’t do well). We can ask students to produce work in forms that AI cannot (yet) effectively create, such as podcasts, PowerPoints, and verbal presentations. And we can design writing prompts that GPT-3 won’t be able to effectively address, such as those that focus on local or university-specific challenges that are not widely discussed online. If necessary, we could even require students to write assignments in an offline, proctored computer lab.

Eventually, we might enter the “if you can’t beat ’em, join ’em” phase, in which professors ask students to use AI as a tool and assess their ability to analyze and improve the output. (I am currently experimenting with a minor assignment along these lines.) A recent effort to complete Beethoven’s 10th symphony suggests how such assignments might work. When he died, Beethoven had composed only 5 percent of the symphony. A handful of Beethoven scholars fed the short, completed section into an AI that generated thousands of potential versions of the rest. The scholars then sifted through the AI-generated material, identified the best parts, and pieced them together to create a complete symphony. To my somewhat limited ear, it sounds just like Beethoven.

A version of this article appeared in the September 30, 2022, issue.
About the Author
Jeff Schatten
Jeff Schatten is an associate professor of business administration at Washington and Lee University. Email him at schattenj@wlu.edu.