
When the Teaching Assistant Is a Robot

Faculty members experiment with artificial intelligence in the classroom

By  Ben Gose
October 23, 2016
Ashok Goel, a computer-science professor at Georgia Tech, is surrounded by his teaching assistants, including “Jill Watson” (on the screen), a question-answering software program. (Dustin Chambers for The Chronicle)

Last spring, Ashok K. Goel pulled off one of the great pranks in the history of artificial intelligence. In an online course, Mr. Goel, a computer-science professor at the Georgia Institute of Technology, relied on nine teaching assistants, including one named Jill Watson, to handle questions from the 300 students. Near the end of the term, Mr. Goel revealed to students that Jill was in fact a computerized assistant, powered by IBM’s Watson technology, which is designed to answer questions. A few students had suspicions about Jill along the way, but one thought Mr. Goel might be a computer, too.

The successful experiment suggests that higher education — like many other industries — may be in for major change as artificial intelligence becomes more widespread. “Intelligent agents” — including well-known chatbots such as Amazon’s Alexa and Apple’s Siri — are becoming more sophisticated and are expected to eliminate 6 percent of jobs by 2021, primarily in areas like transportation and customer service, according to a new report by Forrester, a technology-research company.

Is higher education next?

Mr. Goel says his goal was simply to free himself and the human teaching assistants from answering the more mundane questions, such as when assignments were due. In the course — fittingly, it focuses on artificial intelligence — students ask about 10,000 questions per semester. Jill Watson was “trained” on a data set of 40,000 questions asked in previous classes.
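The article does not describe how Jill Watson actually works, and IBM's Watson technology is proprietary. But the idea of answering only routine questions by matching them against thousands of previously answered ones can be sketched in a few lines. The training data, the similarity measure, and the confidence threshold below are all invented for illustration:

```python
# A minimal sketch (not Jill Watson's actual implementation): match an
# incoming question against previously answered ones and reply only when
# the best match clears a confidence threshold.
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words vector: a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical archive of prior questions paired with canned answers.
TRAINING = [
    ("when is assignment 1 due", "Assignment 1 is due Sunday at 11:59 p.m."),
    ("where do i submit my project", "Submit projects through the course portal."),
    ("is there a final exam", "Yes, the final exam is during exam week."),
]

def answer(question, threshold=0.5):
    """Return a canned answer if confident; otherwise defer to a human TA."""
    q = vectorize(question)
    best_score, best_answer = 0.0, None
    for prior, canned in TRAINING:
        score = cosine(q, vectorize(prior))
        if score > best_score:
            best_score, best_answer = score, canned
    if best_score >= threshold:
        return best_answer
    return None  # route the question to a human teaching assistant

print(answer("when is assignment 1 due?"))
# → Assignment 1 is due Sunday at 11:59 p.m.
```

The threshold is the crucial design choice: set it high and the bot stays silent on anything unfamiliar, which is why routine logistics questions are the natural target while harder questions still reach the humans.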

“I don’t intend to put myself out of business,” Mr. Goel says. “I think of this as improving teaching quality. I don’t think of it as decreasing teaching quantity.”

Next: The Innovation Issue
Nervousness over the economy and questions about the value of a college degree have contributed to growing expectations that colleges must make career services a priority. This special report on innovation examines some of the career-counseling efforts underway — by colleges, start-ups, and collaborations between the two. See the entire issue here.
  • Reinventing the Career Center
  • Shadow Those Students, for Their Own Good
  • It’s Time to Change What We Mean by ‘Credential’
  • For Real Academic Disruption, Try Empathy

Some other professors agree. Amy Ogan, an assistant professor of educational technology at Carnegie Mellon University, envisions spending more time talking to students about computer-science careers, or challenging the top students in the class, if computerized teaching assistants become widespread.

“If we can offload the lower-level questions that these sort of agents are really good at answering right now, that would be a fantastic win for instructors and TAs,” Ms. Ogan says.

Jill Watson’s debut was widely covered by the media — including The Wall Street Journal and The Washington Post — but some researchers aren’t convinced that she is a technological breakthrough. Roger Schank, an independent scholar who once led Yale University’s Artificial Intelligence Project, suggested in a long blog post that Jill Watson illustrated the shrinking ambition of artificial-intelligence researchers. They have retreated from efforts in the 1960s and 1970s to create humanlike artificial intelligence, he wrote, settling instead for “machine learning” that relies on “massive matching capabilities to produce canned responses.”

“The real questions are the same as ever,” he wrote. “What does it mean to have a mind? … How can we get a computer to do what I am doing now — thinking, wondering, remembering, and composing?”

Oren Etzioni, chief executive of the Allen Institute for Artificial Intelligence and a computer-science professor at the University of Washington, says creating machines with humanlike intelligence is one of science’s most vexing challenges. “It may take us 100 years or more to get to where Roger wants to be,” Mr. Etzioni says.

Many scholars have come to believe that as long as artificial intelligence is useful and humanlike in presentation, it doesn’t matter if the computer thinks like a human, says Douglas H. Fisher, an associate professor of computer science at Vanderbilt University.

A research team at Georgia Tech spent more than 1,500 hours creating and refining Jill Watson so that she could answer questions autonomously, Mr. Goel says. Since the story of the computerized teaching assistant broke, Mr. Goel says he has received inquiries from more than 200 people, including educators and professionals in finance and real estate. “They’re asking, Can you build us a Jill Watson so that we can answer questions automatically?” he says.

Mr. Goel is, in fact, starting a company through Georgia Tech’s VentureLab to commercialize the technology. MOOCs are an obvious potential customer — students in those huge classes often complain about a lack of feedback and difficulty getting answers to questions. Yet Mr. Goel also hopes to eventually make the technology simple enough that middle-school teachers can use it.

He is teaching the artificial-intelligence class again this semester. This time he will employ multiple versions of “Jill Watson” — including one that has been programmed with information from the current schedule and syllabus. He has given pseudonyms to all of the teaching assistants, including the humans.

“Last time, the students were not looking,” he says. “This time they are.”

Intelligent agents are making an impact in other areas of the classroom, too. In an introductory computer-science course at the Massachusetts Institute of Technology, students work in teams to program a robot to navigate a maze. A “tutor” within the robot provides students with immediate feedback through short quizzes, and then helps them make progress on the robot based on the skill level they demonstrated on the quiz, says Isaac Chuang, a professor of physics and of electrical engineering and senior associate dean of digital learning. At any time, students can click a help button to get assistance from the professor or teaching assistant.

“We don’t seek to reduce the number of TAs with this,” Mr. Chuang says. “The use of this kind of platform allows you to shift from grading problem sets to spending quality one-on-one time with students.”

Vincent Aleven, associate professor of human-computer interaction at Carnegie Mellon, helped create a similar type of software, called Cognitive Tutor, that is used in more than 2,500 middle schools and high schools. The tutor can track how students are doing in real time and give hints to help them get to the right answer.

Mr. Aleven has tested the technology in a MOOC called “Big Data in Education,” offered by another professor on the EdX platform. He believes artificial intelligence has the potential to make it much easier for students in MOOCs to find help from their peers. A computerized tutor may one day analyze massive data sets involving thousands of students, and link students who are stumbling over a topic with others who have only recently mastered it.

“If I’m struggling, and you also struggled but you overcame it, you are the perfect person to help me,” Mr. Aleven says.
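Mr. Aleven's peer-matching idea is speculative, but the core rule — pair a struggling student with the peer who mastered that topic most recently — is simple to sketch. The record format and names below are invented for illustration:

```python
# A speculative sketch of the peer-matching idea Mr. Aleven describes,
# not any existing system: link a student stuck on a topic with the
# classmate who most recently got past it.
from datetime import date

# (student, topic, mastered_on) — None means the student is still struggling.
records = [
    ("ana",  "recursion", date(2016, 10, 1)),
    ("ben",  "recursion", None),
    ("cara", "recursion", date(2016, 9, 1)),
    ("dev",  "pointers",  None),
]

def find_helper(struggler, topic, records):
    """Return the peer who most recently mastered the topic, or None."""
    candidates = [
        (mastered, name)
        for name, t, mastered in records
        if t == topic and mastered is not None and name != struggler
    ]
    if not candidates:
        return None
    return max(candidates)[1]  # most recent mastery date wins

print(find_helper("ben", "recursion", records))  # → ana
```

Preferring the *most recent* mastery encodes Mr. Aleven's point: someone who just overcame the same hurdle still remembers what made it hard.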

He says it is probably only a matter of time before multiple bots, including tutoring bots and teaching assistants like Jill Watson, are offered in the same class.

Ms. Ogan, Mr. Aleven’s colleague at Carnegie Mellon’s Human-Computer Interaction Institute, says it’s important that chatbots be likable as they migrate into the classroom. Some research finds that students who have a good rapport with each other learn better when paired on an assignment than strangers working together do. So, in theory, a computerized tutor will become more effective as it grows more socially adept.

Ms. Ogan and others at Carnegie Mellon are studying how middle-school students interact when they tutor one another. The researchers have made audio and video recordings of the students’ interactions, looking for cues — such as smiles, eye gaze, speech patterns, and pitch — that help forge connections. Those nuances are incorporated into a virtual math tutor.

“That virtual character needs to look like somebody you would want to interact with,” Ms. Ogan says. “If an adult tried to script the text for kids, you could come across as patronizing or just plain ‘not fun.’”

Mr. Goel, meanwhile, wants to find out how learning in the presence of intelligent agents affects real students in a classroom. He is seeking grant funding to study how students change when computerized teaching assistants like Jill Watson join a class.

“How might human discourse change as artificial intelligence comes to life?” Mr. Goel says. “We have just begun research on that.”

A version of this article appeared in the October 28, 2016, issue.
Read other items in this Next: The Innovation Issue package.
Ben Gose
Ben Gose is a freelance journalist and a regular contributor to The Chronicle of Higher Education. He was a senior editor at The Chronicle from 1994 to 2002.