Last spring, Ashok K. Goel pulled off one of the great pranks in the history of artificial intelligence. In an online course, Mr. Goel, a computer-science professor at the Georgia Institute of Technology, relied on nine teaching assistants, including one named Jill Watson, to handle questions from the 300 students. Near the end of the term, Mr. Goel revealed to students that Jill was in fact a computerized assistant, powered by IBM’s Watson technology, which is designed to answer questions. A few students had suspicions about Jill along the way, but one thought Mr. Goel might be a computer, too.
The successful experiment suggests that higher education — like many other industries — may be in for major change as artificial intelligence becomes more widespread. “Intelligent agents” — including well-known voice assistants such as Amazon’s Alexa and Apple’s Siri — are becoming more sophisticated and are expected to eliminate 6 percent of jobs by 2021, primarily in areas like transportation and customer service, according to a new report by Forrester, a technology-research company.
Is higher education next?
Mr. Goel says his goal was simply to free himself and the human teaching assistants from answering the most mundane questions, such as when assignments were due. In the course — which, fittingly, focuses on artificial intelligence — students ask about 10,000 questions per semester. Jill Watson was “trained” on a data set of 40,000 questions asked in previous classes.
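The basic idea — matching a new student question against a bank of previously answered ones, and answering only when the match is confident — can be sketched in a few lines. Everything below is illustrative: the data, names, and similarity threshold are invented, and the actual Georgia Tech system was built on IBM’s Watson, not this toy word-overlap matcher.

```python
# Toy sketch of FAQ-style matching for an automated TA: compare a new
# question to past questions by word overlap (cosine similarity) and
# return the stored answer only above a confidence threshold.
import math
import re
from collections import Counter

# Hypothetical training data: past questions paired with TA answers.
past_qa = {
    "when is assignment 1 due": "Assignment 1 is due Sunday at 11:59 p.m.",
    "what format should the project report use": "Use the provided template.",
    "is there a final exam in this course": "No, grading is project-based.",
}

def _vec(text):
    # Bag-of-words vector over lowercase alphanumeric tokens.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def _cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer(question, threshold=0.5):
    """Return a canned answer when the match is confident; else defer."""
    q = _vec(question)
    best = max(past_qa, key=lambda past: _cosine(_vec(past), q))
    if _cosine(_vec(best), q) >= threshold:
        return past_qa[best]
    return None  # route the question to a human TA

print(answer("When is assignment 1 due?"))
print(answer("Can I get an extension on my thesis?"))
```

The threshold is the point Mr. Goel emphasized in interviews elsewhere: the agent answers only routine, high-confidence questions and leaves everything else to humans.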
“I don’t intend to put myself out of business,” Mr. Goel says. “I think of this as improving teaching quality. I don’t think of it as decreasing teaching quantity.”
Some other professors agree. Amy Ogan, an assistant professor of educational technology at Carnegie Mellon University, envisions spending more time talking to students about computer-science careers, or challenging the top students in the class, if computerized teaching assistants become widespread.
“If we can offload the lower-level questions that these sort of agents are really good at answering right now, that would be a fantastic win for instructors and TAs,” Ms. Ogan says.
Jill Watson’s debut was widely covered by the media — including The Wall Street Journal and The Washington Post — but some researchers aren’t convinced that she is a technological breakthrough. Roger Schank, an independent scholar who once led Yale University’s Artificial Intelligence Project, suggested in a long blog post that Jill Watson illustrated the shrinking ambition of artificial-intelligence researchers. They have retreated from efforts in the 1960s and 1970s to create humanlike artificial intelligence, he wrote, settling instead for “machine learning” that relies on “massive matching capabilities to produce canned responses.”
“The real questions are the same as ever,” he wrote. “What does it mean to have a mind? … How can we get a computer to do what I am doing now — thinking, wondering, remembering, and composing?”
Oren Etzioni, chief executive of the Allen Institute for Artificial Intelligence and a computer-science professor at the University of Washington, says creating machines with humanlike intelligence is one of science’s most vexing challenges. “It may take us 100 years or more to get to where Roger wants to be,” Mr. Etzioni says.
Many scholars have come to believe that as long as artificial intelligence is useful and humanlike in presentation, it doesn’t matter if the computer thinks like a human, says Douglas H. Fisher, an associate professor of computer science at Vanderbilt University.
A research team at Georgia Tech spent more than 1,500 hours creating and refining Jill Watson so that she could answer questions autonomously, Mr. Goel says. Since the story of the computerized teaching assistant broke, Mr. Goel says he has received inquiries from more than 200 people, including educators and professionals in finance and real estate. “They’re asking, Can you build us a Jill Watson so that we can answer questions automatically?” he says.
Mr. Goel is, in fact, starting a company through Georgia Tech’s VentureLab to commercialize the technology. MOOCs are an obvious potential customer — students in those huge classes often complain about a lack of feedback and difficulty getting answers to questions. Yet Mr. Goel also hopes to eventually make the technology simple enough that middle-school teachers can use it.
He is teaching the artificial-intelligence class again this semester. This time he will employ multiple versions of “Jill Watson” — including one that has been programmed with information from the current schedule and syllabus. He has given pseudonyms to all of the teaching assistants, including the humans.
“Last time, the students were not looking,” he says. “This time they are.”
Intelligent agents are making an impact in other areas of the classroom, too. In an introductory computer-science course at the Massachusetts Institute of Technology, students work in teams to program a robot to navigate a maze. A “tutor” within the robot gives students immediate feedback through short quizzes, then helps them make progress on the robot based on the skill level each quiz reveals, says Isaac Chuang, a professor of physics and of electrical engineering and senior associate dean of digital learning. At any time, students can click a help button to get assistance from the professor or a teaching assistant.
“We don’t seek to reduce the number of TAs with this,” Mr. Chuang says. “The use of this kind of platform allows you to shift from grading problem sets to spending quality one-on-one time with students.”
Vincent Aleven, associate professor of human-computer interaction at Carnegie Mellon, helped create a similar type of software, called Cognitive Tutor, that is used in more than 2,500 middle schools and high schools. The tutor can track how students are doing in real time and give hints to help them get to the right answer.
Mr. Aleven has tested the technology in a MOOC called “Big Data in Education,” offered by another professor on the EdX platform. He believes artificial intelligence has the potential to make it much easier for students in MOOCs to find help from their peers. A computerized tutor may one day analyze massive data sets involving thousands of students, and link students who are stumbling over a topic with others who have only recently mastered it.
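The peer-matching idea Mr. Aleven describes can be sketched from mastery records alone: pair a student who is stuck on a topic with a peer who struggled with it earlier but recently got past it. The data layout, names, and thresholds below are all invented for illustration; no real tutoring system is being quoted.

```python
# Hypothetical mastery records: student -> topic -> quiz scores over time.
records = {
    "ana":   {"recursion": [0.2, 0.3, 0.9]},  # struggled, then mastered
    "ben":   {"recursion": [0.9, 0.95]},      # never struggled
    "chris": {"recursion": [0.25, 0.3]},      # currently stuck
}

STUCK, MASTERED = 0.5, 0.8  # illustrative cutoffs

def find_helper(student, topic):
    """Match a stuck student with a peer who overcame the same struggle."""
    scores = records[student][topic]
    if scores[-1] >= STUCK:
        return None  # not currently stuck; no helper needed
    candidates = [
        peer for peer, topics in records.items()
        if peer != student
        and topic in topics
        and topics[topic][-1] >= MASTERED   # has mastered the topic now...
        and min(topics[topic]) < STUCK      # ...but once struggled with it
    ]
    return candidates[0] if candidates else None

print(find_helper("chris", "recursion"))  # ana, not ben: ana once struggled too
```

The key design choice matches Mr. Aleven’s quote: the helper is not simply the strongest student, but the one whose history shows the same struggle recently overcome.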
“If I’m struggling, and you also struggled but you overcame it, you are the perfect person to help me,” Mr. Aleven says.
He says it is probably only a matter of time before multiple bots, including tutoring bots and teaching assistants like Jill Watson, are offered in the same class.
Ms. Ogan, Mr. Aleven’s colleague at Carnegie Mellon’s Human-Computer Interaction Institute, says it’s important that chatbots be likable as they migrate into the classroom. Some research finds that students who have a good rapport with each other learn better when paired on an assignment than strangers working together do. So, in theory, a computerized tutor will become more effective as it becomes more socially adept.
Ms. Ogan and others at Carnegie Mellon are studying how middle-school students interact when they tutor one another. The researchers have made audio and video recordings of the students’ interactions, looking for cues — such as smiles, eye gaze, speech patterns, and pitch — that help forge connections. Those nuances are incorporated into a virtual math tutor.
“That virtual character needs to look like somebody you would want to interact with,” Ms. Ogan says. “If an adult tried to script the text for kids, you could come across as patronizing or just plain ‘not fun.’”
Mr. Goel, meanwhile, wants to find out how learning in the presence of intelligent agents affects real students in a classroom. He is seeking grant funding to study how students change when computerized teaching assistants like Jill Watson join a class.
“How might human discourse change as artificial intelligence comes to life?” Mr. Goel says. “We have just begun research on that.”