ChatGPT has been around for less than three months, and professors are already sharing stories about students who have used the language-generating tool to cheat on exams or assignments. Many faculty members are debating what ChatGPT might mean for the future of teaching and academic integrity.
The artificial-intelligence tool quickly gained momentum after OpenAI released it in November, amassing an estimated 100 million monthly active users within two months of its launch.
But this is not the first time new technology has kindled worries among faculty members, who have long feared that students will take shortcuts instead of doing their own work. Ultimately, one expert told The Chronicle, instructors will adapt and move forward, as they have before.
As hard as it is to believe today, the proliferation of calculators stoked similar panic decades ago.
Professors wondered whether students would lean on the technology as a crutch. “Just as some feared that pocket calculators would cause schoolchildren to forget their multiplication tables, some professors worry that students will learn how to use graphical calculators without learning the concepts of mathematics,” The Chronicle reported in 1992.
Some aspects of the anxiety over ChatGPT also mirror the conversation that gained steam 15 years ago about students bringing laptops and phones into the classroom — namely, that technology can distract students and facilitate cheating.
“[Students] know the information is a quick Google search away,” one professor wrote in a 2015 op-ed for The Chronicle encouraging professors to ban the use of calculators found on laptops and phones during exams. “What’s the point of memorizing it, they want to know.”
As students began to use laptops for note-taking during class, professors raised concerns that they were surfing the internet or playing games instead of paying attention.
“Are they not even in the same universe as I’m in because they’re looking at the internet?” another professor wondered in 2006.
Despite those fears, the use of calculators in math classrooms and the clatter of keyboards in lecture halls are now commonplace.
ChatGPT is not yet as pervasive as these tools, but AI will increasingly take hold in college classrooms. David Rettinger, president emeritus at the International Center for Academic Integrity, cited SPSS, a statistical-software suite first developed in the late 1960s and now owned by IBM, as a key example of this phenomenon.
“When it was invented, I think people just ignored it [as] this expensive, complicated thing that students are not really going to be using,” Rettinger said. “Fast forward 20 years, and there’s no student who can run a quick t-test.” (The t-test, traditionally calculated by hand, is a statistical method for determining whether data support a hypothesis.)
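To see how routine that computation has become, here is a minimal sketch of a two-sample t-test run in software rather than by hand; it uses Python’s SciPy library and made-up scores purely for illustration, though SPSS offers the same test through its menus.

    # Two-sample t-test: is the difference between two groups' means
    # likely to be meaningful, or just noise?
    from scipy import stats

    section_a = [78, 85, 90, 72, 88]   # hypothetical exam scores
    section_b = [81, 79, 95, 84, 91]

    t_stat, p_value = stats.ttest_ind(section_a, section_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

What once took a page of arithmetic is now a single line of code or a couple of menu clicks, which is precisely the shift Rettinger describes.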
New Territory
As academe adjusts to a world with ChatGPT, faculty will need to find fresh ways to assess students’ writing.
The same was true when calculators first began to appear in math classrooms, and professors adapted their exams.
“The calculator changes the kinds of questions that you can ask students,” one professor told The Chronicle in 1992. “A lot of problems we used to assign were very artificial, so the numbers would come out nicely. Today we don’t need to worry about that so much. The problems aren’t harder, but they’re not as neat.”
But ChatGPT differs in a few key ways, and those differences will require professors to respond accordingly.
For one, the chatbot produces coherent, if often dull, writing. In doing so, it stands in for a learned skill that is fundamentally different from solving equations or computing statistics.
Writing plays a key role in teaching reasoning to students, Rettinger said. And because professors did not grow up in classrooms with ChatGPT, they might find those concepts difficult to explain to students.
“Writing is absolutely fundamental to the way people think and build arguments in almost all disciplines, whereas statistics was a relatively niche kind of thing to have to learn to do,” Rettinger said.
When it comes to maintaining academic integrity on exams, Rettinger offered a simple guiding principle: “Academic integrity is about being honest about the way you did your work.” Spell checkers, he pointed out, are a prime example of an artificial-intelligence tool that may have been controversial at first but is now used routinely, without a second thought, to produce papers.
As assignments and assessments evolve to accommodate the presence of ChatGPT, it will become increasingly important that academic-integrity policies adapt, too, Rettinger said.
“Rules are going to change — they’re going to have to, just in the same way that students routinely use calculators in class now,” he said. “The world changes.”