Advice

Are We Asking the Wrong Questions About ChatGPT?

Stop agonizing about your syllabus policy and start helping students use AI to extend, not replace, their thinking.

By J.T. Torres and Adam Nemeroff April 15, 2024
James Yang for The Chronicle


Since the emergence of ChatGPT, one of the most frequent questions we hear from faculty members who request instructional support is, “What should I say about AI on my syllabus?” Most of the time, what they’re really asking is: “How do I police the use of AI in my classes?”

For good reasons, many educators worry that misuse of one kind of intelligence (“artificial”) will diminish another kind (“human”). But that’s a false binary. Intelligence resists neat categories. After all, even Howard Gardner’s beloved theory of multiple intelligences could not be clearly defined in empirical studies. People routinely think with the aid of tools: Hearing aids, prescription glasses, and heart monitors all extend the capabilities of the humans who need and benefit from them. Not only can technology extend intelligence; it can do so in ways that level the playing field and create a more equitable society.

At this point, the question isn’t so much whether AI will replace other kinds of intelligence, but rather, how it will augment our thinking. Many students have described how generative AI has helped them with tasks such as brainstorming or outlining. Policing something that is already so entangled in students’ lives is an exercise in futility.

As educators, we face an extraordinarily authentic learning moment. Rather than worry about drafting the perfect syllabus policy, faculty members would be better served by asking a different question: How can we prepare students to thrive in a so-called artificially intelligent world? The college classroom is the perfect place to craft a collective answer. In what follows, we offer some strategies to do what educators do best: create an environment for transformative learning.

Prompt students and AI bots to take turns extending one another’s “limits.” During classroom discussions or activities, ChatGPT can come in handy when students run out of steam. Classroom “conversations” with AI can inspire new ideas or directions of thought. Two examples from our own teaching:

  • Torres: I ask students in my writing and capstone courses to develop essential questions that they explore throughout the semester. Formulating good questions requires familiarity with a topic, which many students do not yet have. So they use ChatGPT to summarize prior research on a topic, suggest possible lines of inquiry, and shape their research question. For example, a first-year student wanted to explore the relationship between music and emotion. ChatGPT provided multiple summaries based on disciplinary perspectives (e.g., psychology and neuroscience). It could not provide a summary from a visual-arts perspective, so the student landed on the question: “How do we experience music when consumed visually, such as through YouTube videos?”
  • Nemeroff: I ask my upper-level students in business-information management to use ChatGPT or other large-language models to understand abstract or tricky technical concepts, such as how to use a programming language like SQL. When my students attempt to compare data sets in SQL — say, information on customers and orders — they don’t always understand the process. The syntax in SQL is challenging for students to write, and it trips them up every time. If they can use AI to write the syntax, they are better able to practice comparing data sets.
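
The kind of SQL that trips up Nemeroff’s students can be sketched in a few lines. The tables and column names below are hypothetical, invented only to mirror the customers-and-orders example; this is a minimal illustration using Python’s built-in sqlite3 module, not the actual course data set:

```python
import sqlite3

# Hypothetical tables for illustration; the article does not specify the real schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Ada'), (2, 'Ben'), (3, 'Cara');
INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 40.0), (12, 2, 15.0);
""")

# A LEFT JOIN keeps every customer, even those with no matching orders --
# exactly the syntax students might ask an AI assistant to draft for them.
rows = conn.execute("""
SELECT c.name,
       COUNT(o.id) AS order_count,
       COALESCE(SUM(o.total), 0) AS spent
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.id
GROUP BY c.id
ORDER BY c.id;
""").fetchall()

for name, order_count, spent in rows:
    print(name, order_count, spent)
```

Once an AI has drafted a query like this, the student’s real work begins: checking whether the join behaves as intended, for instance whether a customer with no orders (Cara above) still appears in the comparison.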

On a 2023 episode of the Dead Ideas in Teaching and Learning podcast, Cynthia Alby, a professor of teacher education at Georgia College and State University, said AI can operate as training wheels during tricky developmental steps in a student’s learning. For instance, you could invite students to use ChatGPT to draft an outline for a literature review before assigning the actual literature review.

Integrating AI into difficult activities can help students break through key misconceptions more productively. Likewise, that approach can help faculty members break through conceptual challenges in course design and preparation, such as writing student-friendly learning objectives. In both cases, a human is in the loop, co-creating the content with the AI.

Encourage students to question how ChatGPT and other AI tools know anything. Ask students to do research on what is known and unknown about how these tools are built and how that influences their ability to develop a research topic or extend an argument. Discovering the opacity surrounding these closely guarded trade secrets leads to a healthy skepticism of AI’s perceived magic.

Students (and, frankly, all of us) need to develop an emerging skill set known as AI literacy to gauge when these tools are helpful. Students can begin recognizing the limits of AI’s usefulness or trustworthiness — such as when they need more recent information than the latest version of ChatGPT provides, when the AI produces an inaccurate response or source, or when meaningful personal reflection is required. AI literacy offers a framework for developing that awareness.

Require AI and students to take turns “fact checking” one another. Most concerns around AI focus on writing. One innovative strategy involves asking large-language models to help students with reading. For example, after annotating an entire text, students could feed sections of the same text into ChatGPT and compare its responses with their work.


Let’s say students read an excerpt of Michel Foucault’s Discipline and Punish: The Birth of the Prison on panopticism. After discussing the excerpt, they could ask ChatGPT to provide historical context (e.g., what happened during the Plague that inspired some of Foucault’s thinking). With that added background, they could revise their annotations or produce a summary that integrates ChatGPT’s notes with their own. Or students could copy and paste confusing portions of the excerpt into ChatGPT and ask for a translation suitable for a general audience, helping them refine their annotations.

Additionally, students could ask ChatGPT to produce a reference list of other scholarly views on panopticism. Or students might work with the campus library to cultivate a reference list, and then use AI to filter those references based on their relevance. Say a student wants to do research on surveillance in social media; ChatGPT, positioned as a research assistant, can save time by presenting references on the social-media angle and allow the student to focus on making the connection with panopticism.

Encourage students and AI to experiment and follow up. “Prompt engineering” — that is, “designing inputs for AI tools” — is emerging as a career path in its own right. These AI whisperers and safety guardians are key to the development of effective and safe models, and many of their design skills are helpful to any AI user. To that end, ask your students to:

  • Explore some publicly shared AI prompts and the unconventional, unexpected, and odd responses they elicit, such as when GPT-4 started getting “lazy” in November 2023, leading to OpenAI’s acknowledgment and clarification on X. Such public interactions can not only remind students that AI is fallible but also reinforce that human knowledge remains valid.
  • Practice giving the tool more — and less — information at the start of a prompt. Students will see how a sharper, more substantive prompt makes it possible for AI to give compelling responses. One of us (Nemeroff) found it transformative to realize that, while the initial prompt is important, the subsequent replies and refinements allow you to get more out of the model.
  • Learn how AI models are “red teamed” and tested for safety before their public release to prevent them from propagating offensive content, disinformation, and political campaign material. Students can do research on innovative AI policy ideas, like a proposal by the start-up company Anthropic to create “a constitution for an AI system” based on values found in places like the Universal Declaration of Human Rights. Ask students to suggest ethical ways to engage in productive inquiry with an AI model.

In general, avoid assignments that seek a stand-alone answer; favor those that invite extended critical dialogue. We need to move away from one-step interactions and toward a robust set of procedures and strategies that make AI work for us in smarter, more effective ways. In other words, the wisdom of scaffolding once again becomes a paramount instructional strategy.


Ask students and AI to formulate problems, not answers. It’s a very human reaction to seek a quick solution to a problem we barely understand. With AI, however, we might have an opportunity to slow down our problem-solving impulses for meaningful critical thinking.

AI can help students define and delineate historical and social contexts to particular problems. For example, students wanting to investigate climate change can learn via ChatGPT about the industrial revolution, the green revolution, and current environmental laws far more quickly than if they had to navigate pages and pages of Google results. When they run into an issue, students can prompt the AI tool to ask them questions that might help them refine and elaborate on the topic.

In a post on LinkedIn, Ethan Mollick, author of Co-Intelligence: Living and Working With AI and an associate professor of management at the University of Pennsylvania, described an assignment in which he asks students to use AI to “simulate three famous people from history to criticize your business idea,” come up with 10 weak points, and articulate a vision of success.

There are incredible opportunities and real limitations in such uses of AI, and the whole point is getting students to recognize both in their own practice. While AI can efficiently provide relevant facts, only students can bring lived experience to bear when formulating problems.


As much as faculty members might want to be the gateway into a controlled environment that upholds a particular kind of intelligence, we might be better off as cognitive curators, showcasing constructive ways to integrate emerging AI technology into teaching and learning. By acknowledging the strengths and weaknesses of these tools, educators can empower students to think critically, engage with primary sources, and reflect on their own social identities.

We have to embrace the evolving landscape of AI and educate students to use such tools discerningly. Rather than thinking about syllabus policies and policing our classrooms, we need to think about practicing intelligence plurality.

We welcome your thoughts and questions about this article. Please email the editors or submit a letter for publication.
Tags
Teaching & Learning Technology
Share
  • Twitter
  • LinkedIn
  • Facebook
  • Email
About the Author
J.T. Torres
J.T. Torres is director of the Center for Teaching and Learning and an assistant professor of English at Quinnipiac University.
About the Author
Adam Nemeroff
Adam Nemeroff is director of learning design and technology at Quinnipiac University and an adjunct instructor at Quinnipiac and the University of Connecticut.