Curricula

Scared of AI? Don’t Be, Computer-Science Instructors Say.

By Maggie Hicks | August 2, 2023
Aniket Bera, an associate professor of computer science, and Javier Gomez-Lavin, an assistant professor of philosophy, lead AI-focused courses at Purdue that blend technical skills and ethics. (Lee Klafczynski for The Chronicle)

Since generative-AI models like ChatGPT surfaced last November, they’ve caused a frenzy in college classrooms.

Professors report that students are using ChatGPT to write essays or complete math homework. Many instructors are leveraging anti-plagiarism software or tweaking assignments to try to prevent cheating.

To computer scientists, however, the rise of artificial intelligence is no different from the advent of the pocket calculator or the Google search engine: It’s a tool that, used correctly, can help people learn faster and think more deeply.


Computer scientists have been researching artificial intelligence since Alan Turing and John McCarthy helped create the field in the 1950s; McCarthy founded Stanford University’s Artificial Intelligence Laboratory in 1963. Several computer-science professors told The Chronicle that they plan to use ChatGPT in classroom assignments.

The field is also grappling with how to use the technology responsibly. Purdue University recently became one of the first institutions to offer AI majors that emphasize ethics and psychology.


Aniket Bera, an associate professor of computer science at Purdue, said he had been collaborating with psychology and psychiatry professionals to make AI work better for them.

“Instead of running away from it, I think we should strive to embrace it,” Bera said. “Once people start embracing, we’ll try to understand what the problems of these things are and then try to fix it.”

As colleges prepare for a year in which ChatGPT and similar programs will become increasingly pervasive, the field of computer science offers a model for how higher ed might integrate artificial intelligence into learning. At the same time, experts say, computer-science professors should collaborate across disciplines — and connect with their departmental colleagues — to understand and respond to the technology’s pitfalls.

Using AI in the Classroom

Computer-science faculty members like Peter Stone are incorporating AI into the curriculum by emphasizing the technology’s flaws.

While ChatGPT can generate basic code, it makes mistakes, especially as the code gets more complicated, said Stone, a professor of computer science at the University of Texas at Austin. Professors can teach basic coding in introductory classes and have higher-level students edit the code that AI generates.

Bruno Ribeiro, an associate professor of computer science at Purdue, gives students unique coding problems that seem simple on the surface but have slight variations that often trip AI up. He then has students identify where the program went wrong and fix the code.
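Ribeiro’s actual assignments aren’t public, but the pattern he describes can be sketched with a hypothetical exercise: a task whose one-word twist (“distinct”) a chatbot’s plausible first attempt tends to miss, which the student must then diagnose and repair. Both function names and the example inputs here are invented for illustration.

```python
# Task: return the second-largest *distinct* value in a list.
# A plausible AI-generated attempt overlooks the "distinct" requirement:
def second_largest_buggy(nums):
    return sorted(nums)[-2]  # breaks when the maximum value is duplicated

# The student's fix: deduplicate before sorting.
def second_largest_fixed(nums):
    return sorted(set(nums))[-2]

print(second_largest_buggy([5, 5, 3]))  # 5 -- wrong: the duplicate max slips through
print(second_largest_fixed([5, 5, 3]))  # 3 -- correct
```

The exercise works because the buggy version passes casual spot checks (it is correct whenever the maximum is unique), so verifying it forces the student to reason about edge cases rather than trust the generated code.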


“At the end of the day, what they really learn is how to think and how to check things and how to verify if something is right or something is wrong,” Ribeiro said. “In my classes I tell them, ‘Look, if ChatGPT gives you the answer, that’s great, but if it’s wrong, you are responsible for it.’”

Beyond computer science, a range of liberal-arts classes can use similar methods to promote AI literacy and encourage students to read critically, said Anna Mills, an English instructor at California’s College of Marin who researches AI in writing courses.


Students can, for instance, analyze a conversation with ChatGPT as an assignment and identify signs of fabrication, biases, inaccuracies, or shallow reasoning. Johanna Inman, director of the Teaching and Learning Center at Drexel University, in Philadelphia, suggests that faculty members have students use AI to write a first draft of an essay and show what they might change. Or instructors could include AI as a contributor to group discussions.

Identifying AI’s flaws motivates students and helps them build confidence, which can discourage cheating, Mills said.


“Pointing out where it still really messes up is very powerful, both for learning about what these systems are and for empowering students to see their own strengths as human thinkers,” Mills said. Certain courses can also teach students the best ways to communicate with AI tools, she added, by using critical thinking and rhetoric.

Many writing instructors still worry that incorporating AI will prevent students from developing the skills that come from learning how to read and write, Mills said. Some computer-science professors expressed similar concerns, saying that students need to learn the foundations of coding to be able to change and advance technology in the future.

Stone suggested instructors still emphasize the need to learn foundational skills in introductory classes by clearly explaining to students when AI is appropriate to use. Once students have learned basic coding, they could use AI to complete their assignments faster in advanced classes, he said.

“My job is to instruct students on what they need to do to learn the concepts that are covered in the syllabus of the course,” Stone said. “If I tell them, ‘Here’s an assignment, do this work without ChatGPT,’ and then they go and do it with ChatGPT, they’ve basically lost that opportunity to learn.”


Academic integrity is another pitfall. AI is accessible to anyone with an internet connection, and it carries less stigma than other methods of cheating, Mills said. Though professors can sometimes detect when a student uses AI, doing so can be difficult, she said, and many detection tools, such as Turnitin, can be unreliable.

“It’s a significantly increased level of students using it and feeling that temptation,” she said.

UT-Austin has started offering seminars on how professors can incorporate AI into their lessons as well as strategies to communicate when students are allowed to use it, Stone said. Drexel’s Teaching and Learning Center also offers teaching tips and panels on the best ways to use AI in a variety of classes, Inman said.

“There’s more of a danger in not teaching students how to use AI,” she said. “If they’re not being taught under the mentorship of scholars and experts, they may be using it in ways that are either inappropriate or not factual or unethical.”

Combining Forces

Computer science is also taking the lead in understanding AI’s complicated ethical dilemmas, such as unequal access, who controls the technology, and how it is used, said Chris Piech, an assistant professor of computer-science education at Stanford.


It’s not clear how many AI-specific majors or courses exist, but experts said that more colleges would probably begin offering AI degrees that connect with other departments through teaching and research. UT-Austin recently introduced an online master’s program in AI, and Carnegie Mellon University offers a bachelor of science in AI.

At Purdue, the two new AI majors — a bachelor of science and a bachelor of arts — blend technical skills with critical thinking.

The B.A., which is offered through the university’s philosophy department, requires several introductory courses in computer science that focus on the technical parts of AI, while primarily teaching the ethics and philosophy of the technology.

Javier Gomez-Lavin, an assistant professor of philosophy at Purdue, unpacks gaming equipment for his “Introduction to Philosophy Through Video Games” course, which is part of a new AI major. (Lee Klafczynski for The Chronicle)

One course explores how video games pose classic philosophical questions, said Javier Gomez-Lavin, an assistant professor of philosophy, who created the class. Students play games for a portion of the course, and eventually design new games that tackle those questions.

“Given that there’s going to be large language models that are going to have new and kind of unprecedented impacts on the way people work,” Gomez-Lavin said, “how can we actually prepare students to leverage the best of critical thinking and have some inside knowledge of these systems themselves?”


The B.S., which is housed in the computer-science department, offers more-advanced technical classes while requiring philosophy, psychology, and ethics courses. Through the major, students understand how human beings interpret and use information that comes from AI, said Chris Clifton, interim head of Purdue’s computer-science department.

“We’re not just looking at the AI system itself and what it says,” Clifton said. “We’re actually looking at the outcome in terms of the final effect on the person who’s impacted.”

Computer-science classes encourage students to analyze AI’s biases as well as problems it causes when it’s used in the real world. Since people created these systems, many of the problems, such as gender and cultural biases, are fundamentally human.

Some of Ribeiro’s students at Purdue are investigating AI’s failure to account for unexpected events. As a case study, they learned about Zillow Offers, a program created by the popular housing website Zillow that used an AI algorithm to determine how much to offer for a house. The algorithm worked well in tests but didn’t account for changes in the housing market, so many of its predictions were wrong when the company introduced it publicly. The result was a $300-million loss.


“As an educator, the best we can do is give them foundations they can build on because it’s very hard to determine in five years what the new method will be,” Ribeiro said. “What we can do is make sure they understand the advantages and drawbacks of these methods.”

Mills also emphasized the need for caution. While the technology offers many exciting ways to teach and learn, moving ahead with little regulation is dangerous, she said.

“People are exploring it, it is very exciting, and we can, to some extent, share that with students,” she said, “even as we’re strongly emphasizing that as soon as you start to learn about it, you learn about the fabrications and bias and ethical concerns.”

AI in the Work Force

Many computer-science professors believe incorporating AI into the classroom is the best way to prepare students for the future of their industry. Ignoring those tools would also be a disservice to students outside computer science who will probably need to use them in their careers as well, said Drexel’s Inman.


By learning the flaws of the technology and the basics of how it operates, computer-science students will be able to improve it once they begin their careers, said Stone, the UT-Austin professor.

“It’s not that people are going to lose jobs to AI. People who don’t know how to use AI are going to lose jobs to people who do know how to use AI,” Stone said. “We need to train our students to use the tools and to know what’s out there.”

But that doesn’t mean offering classes like ChatGPT 101.

The AI models that are popular now are likely to change in a few years, said Purdue’s Clifton. He believes that ChatGPT will create an incremental shift, not a fundamental one. In a few years, he said, the technology could be completely different, and students entering the industry need to learn how to adapt to a tool that may not even exist yet.

“One of the key things we teach people is to learn new things because throughout their careers, that’s what they will need to do,” Clifton said. “This is just another new thing.”

About the Author
Maggie Hicks
Maggie Hicks is a reporting fellow at The Chronicle of Higher Education. Follow her on Twitter @maggie_hickss, or email her at maggie.hicks@chronicle.com.