Not a single one of these emails has mentioned one of the most important recent developments in higher education: the November 2022 launch of ChatGPT, the AI text-generating chatbot. Its already-widespread use among undergraduates is presenting enormous challenges to faculty across the country.
“At any given time,” Owen Kichizo Terry recently wrote in these pages, “I can look around my classroom and find multiple people doing homework with the help of ChatGPT.” Or as a graduate student at a large state university told me, “We had students in our upper-level courses turn in essays written entirely by AI … but from what I’ve heard, even more of them will prompt the AI and then modify the answer themselves — I think that is already very widely used.”
Fellow graduate instructors in my own department have reported encountering student work this past semester that smelled strongly of AI generation. But with no dedicated institutional response to the issue, and an already-byzantine set of procedures for reporting derelictions of academic integrity, nobody in charge of a classroom had clear guidelines for what to do. Some abandoned traditional at-home essay-writing for in-class, handwritten tests and oral exams. But even then, much of the work produced in class had a vague, airy, Wikipedia-lite quality that raised suspicions that students were memorizing and regurgitating the inaccurate answers generated by ChatGPT.
And a humanities professor at a state university told me that he worries that adapting existing disability accommodations — extra time on tests, distraction-free environments, allowances for typing instead of writing longhand — to the new technological landscape while ensuring the integrity of student work will be forbiddingly difficult. Meanwhile, students are still experiencing a “stunning” level of post-pandemic disconnection from schoolwork and campus life. Faced with these challenges and frustrations, however, college administrations have largely remained silent on the issue, leaving teaching staff to fend for themselves.
On many campuses, high-course-load contingent faculty and graduate students bear much of the responsibility for the kinds of large-enrollment, introductory-level, general-education courses where cheating is rampant. According to the International Center for Academic Integrity, more than 60 percent of college students admit to participating in some kind of cheating. Add to this already-dismal situation the most easily accessible and lowest-cost cheating technology ever devised and watch the entire system of college education strain at its rivets. How can large or even mid-sized colleges withstand the flood of nonsense quasi-plagiarism when academic-integrity first responders are so overburdened and undercompensated?
A meaningful education demands doing work for oneself and owning the product of one’s labor, good or bad. The passing off of someone else’s work as one’s own has always been one of the greatest threats to the educational enterprise. The transformation of institutions of higher education into institutions of higher credentialism means that for many students, the only thing dissuading them from plagiarism or exam-copying is the threat of punishment. One obviously hopes that, eventually, students become motivated less by fear of punishment than by a sense of responsibility for their own education. But if those in charge of the institutions of learning — the ones who are supposed to set an example and lay out the rules — can’t bring themselves to even talk about a major issue, let alone establish clear and reasonable guidelines for those facing it, how can students be expected to know what to do?
So to any deans, presidents, department chairs, or other administrators who happen to be reading this, here are some humble, nonexhaustive, first-aid-style recommendations. First, talk to your faculty — especially junior faculty, contingent faculty, and graduate-student lecturers and teaching assistants — about what student writing has looked like this past semester. Try to find ways to get honest perspectives from students, too; the ones actually doing the work are surely frustrated at their classmates’ laziness and dishonesty. Any meaningful response is going to demand knowing the scale of the problem, and the paper-graders know best what’s going on. Ask teachers what they’ve seen, what they’ve done to try to mitigate the possibility of AI plagiarism, and how well they think their strategies worked. Some departments may choose to take a more optimistic approach to AI chatbots, insisting they can be helpful as a student research tool if used right. It is worth figuring out where everyone stands on this question, and how best to align different perspectives and make allowances for divergent opinions while holding a firm line on the question of plagiarism.
Second, meet with your institution’s existing honor board (or whatever similar office you might have for enforcing the strictures of academic integrity) and devise a set of standards for identifying and responding to AI plagiarism. Consider simplifying the procedure for reporting academic-integrity issues; research AI-detection services and software, find one that works best for your institution, and make sure all paper-grading faculty have access and know how to use it.
Lastly, and perhaps most importantly, make it very, very clear to your student body — perhaps via a firmly worded statement — that AI-generated work submitted as original effort will be punished to the fullest extent of what your institution allows. Post the statement on your institution’s website and make it highly visible on the home page. Consider using this challenge as an opportunity to reassert the basic purpose of education: to develop the skills, to cultivate the virtues and habits of mind, and to acquire the knowledge necessary for leading a rich and meaningful human life.
AI technology, I suspect, will pose a significant threat to many institutions of higher learning, and especially to the already-ailing enterprise of the academic humanities. Smaller schools with more-intimate learning environments — not only high-status, prestigious liberal-arts colleges, but also provincial satellite universities and community colleges where faculty have closer student relationships and more familiarity with students’ work — may very well emerge relatively unscathed. But at bigger universities with large classrooms and overburdened teachers, where struggling students can remain anonymous in a cavernous lecture hall, the easy option may prove too tempting. It will take an aggressive and comprehensive institutional response to dissuade students from plagiarism. The time to act was months ago, well before term-paper deadlines. The second-best time is now.