Teaching

Find insights to improve teaching and learning across your campus. Delivered on Thursdays.

May 25, 2023

From: Beth McMurtrie

Subject: Teaching: Are professors ready for AI?

This week I:

  • Share a case study on using AI in grading.
  • Ask you about your experiences with students using AI.
  • Highlight some resources on AI.

AI and grading

The field of artificial intelligence — particularly generative AI — is evolving so rapidly that keeping up can feel like trying to drink from a firehose. Each iteration comes with its own set of ethical, pedagogical, and legal considerations.


How will this tech shape teaching and learning? What happens to the data that gets processed through these products? What are the best ways to use them? Add to that the roles and responsibilities of people at every level (students, instructors, department chairs, administrators) and you can see why so many people are unsure of their next move.

I was thinking about these questions as I dove into a case study on the use of an AI grading tool, written by Rahul Kumar in the International Journal for Educational Integrity. An assistant professor in the department of educational studies at Brock University in Canada, Kumar developed the story of a hypothetical adjunct instructor who — like most adjuncts — is crushed by his workload and fearful of how that is harming his professional future and his personal life.

What I appreciate about Kumar’s piece is that it creates a portrait of someone likely to turn to AI to alleviate the demands placed on them. Dr. A.I. Case, as he names his faculty member, stumbles across a company that uses AI to discreetly help professors grade papers. Should he use this tool?

In the “yes” corner, as Kumar describes it, is the fact that it could free up time to devote to research and publishing, which could eventually help him earn tenure. It would restore work-life balance. It could also be more timely and consistent than he might be, given that he teaches twice as many courses — across two universities — as his tenured colleagues.

In the “no” corner are questions of cost, privacy, quality, and ethics, among other things. Professor Case wonders what happens to the students’ work once it’s uploaded, how good the AI is, and whether using an AI tool for a core teaching responsibility is both the right thing to do and a good thing to do. The paper doesn’t end with his choice; instead it reminds the reader of the complexity of the situation.

I spoke to Kumar about what he hopes people will take from his paper. So much of the attention around AI, he said, has been focused on whether students will use these tools inappropriately or without attribution. But the same questions could be applied to faculty members. Given how tenuous and stressful the work environment is for non-tenure-track faculty, colleges need to be alert to these possibilities, and faculty members will inevitably wrestle with them.

“Oftentimes precarity leads to risk taking,” Kumar noted. And while all stakeholders should have a say in decisions about the AI tools that could affect them, that is often not the case. Many professors are, in fact, operating without guidelines. As a result, he said, “there’s no real mechanism to know what people are doing, save for self-disclosure.”

However, there is no simple answer to the question of whether to use AI tools, Kumar emphasized. It depends on the type of tool, its purpose, your goals, and even which discipline you work in. If you’re teaching someone to write, then allowing them to use a large language model like ChatGPT may end up short-circuiting the development of writing skills. But if you’re teaching a computer-science class, then maybe the use of an LLM to help with writing doesn’t matter as much. “We have to get more sophisticated,” Kumar said, “and say, Under what conditions, where and when is it OK as opposed to not OK?”

Give Kumar’s piece a read and let me know your thoughts on AI in teaching and the questions it raises around ethics, governance, pedagogy, and transparency. Your ideas could help inform future stories and newsletters on the topic. You can reach me at beth.mcmurtrie@chronicle.com.

More cautionary AI tales

The second story that got me thinking about the complex nature of our AI-infused world came from a grading controversy, which you may have seen circulating last week. A professor at Texas A&M University at Commerce ran students’ assignments through ChatGPT to see if they were AI-generated, determined that they were, and then threatened to fail students, potentially holding up their diplomas.

The news made the rounds for several reasons. One is that the professor misunderstood ChatGPT, which is a large language model capable of turning out AI-written essays. It is not itself an AI detector. The other, of course, is that the professor threatened such a dramatic step. (After the incident made national news, a university spokesman told The Washington Post that no student ended up flunking the course or having their diploma withheld and that the university is developing policies on AI use or misuse in the classroom.)

The story highlights the growing concern faculty members have about whether they can trust what their students write. Like many of you, I have read a number of posts on social media in which faculty members have discovered that some of their students used AI to produce papers and complete exams. You may also have seen this opinion piece in The Review by a graduate instructor who argues that administrators need to act quickly to determine the scale of the problem and devise some responses.

Detection tools are on the rise. So far none of them are highly reliable, according to academic-integrity experts, but that hasn’t stopped faculty members from using them. That feeds into the lack of clarity around proper AI usage. As the Washington Post story put it: “protocols on how and when to use chatbots in classwork are vague and unenforceable, with any effort to regulate use risking false accusations.”

This is an ongoing conversation, of course, and we would like to hear from you. We are hoping to do some short- and longer-term stories on how AI is affecting teaching and learning.

Did any of your students try to cheat with ChatGPT or other AI tools in assignments this past semester? Are you reworking your courses for the fall to address AI? Do you have other concerns or plans around using AI in teaching? Please fill out this Google form and let us know what your experiences have been.

AI insights


  • Looking for resources on how to teach about AI? Check out this evolving project, called CRAFT, being developed by Stanford University’s Graduate School of Education, Institute for Human-Centered AI, and Stanford Digital Education.
  • This essay, which appeared in EduResearch Matters, walks through some issues to consider when it comes to using AI effectively.
  • For a bleak assessment on what AI means for teaching, read Ian Bogost’s piece in The Atlantic.

Thanks for reading Teaching. If you have suggestions or ideas, please feel free to email us at beckie.supiano@chronicle.com or beth.mcmurtrie@chronicle.com.

— Beth

Learn more about our Teaching newsletter, including how to contact us, at the Teaching newsletter archive page.

Beth McMurtrie
Beth McMurtrie is a senior writer for The Chronicle of Higher Education, where she writes about the future of learning and technology’s influence on teaching. In addition to her reported stories, she helps write the weekly Teaching newsletter about what works in and around the classroom. Email her at beth.mcmurtrie@chronicle.com, and follow her on Twitter @bethmcmurtrie.
1255 23rd Street, N.W. Washington, D.C. 20037
© 2023 The Chronicle of Higher Education