At Arizona State University, a high-tech teaching tool with roots in the pre-Internet 1950s has created a bit of a buzz. “I think it’s going to be quite good,” says Philip Regier, dean of ASU Online. “Looking forward to it,” says Arthur Blakemore, senior vice provost of the university. “I’m excited,” says Irene Bloom, a senior lecturer in mathematics at the downtown campus.
All are anticipating this summer’s debut of Knewton, a new computerized-learning program that features immediate feedback and adaptation to students’ learning curves. The concept can be traced back a half-century or so to a “teaching machine” invented by the psychologist B.F. Skinner, then a professor at Harvard University. Based on principles of learning he developed working with pigeons, Skinner came up with a boxlike mechanical device that fed questions to students, rewarding correct answers with fresh academic material; wrong answers simply got them a repeat of the old question. “The student quickly learns to be right,” Skinner said.
Fifty years later, that basic idea has evolved into a hot concept in education: adaptive learning. Programs like Knewton can pace an entire math course using sophisticated tracking of skill development, instant feedback, and help levels based on mastery of concepts, as well as something the Harvard students did not get: the enjoyment of a video-game-like interface.
Courses can be offered online or blended with face-to-face instruction. “We’re talking, with the best of these programs, about very personal computer tutors,” says Ira H. Fuchs, executive director of Next Generation Learning Challenges, a nonprofit project that recently gave $10.6-million in grants to 29 colleges and organizations to develop programs that frequently feature this kind of technology. Carnegie Mellon University’s Open Learning Initiative already offers adaptive-learning courses in 12 subjects, including statistics and French.
The approach is attractive because of some unattractive numbers. Just 22 percent of students in the United States complete an associate degree within three years of starting, and only 57 percent complete a bachelor’s degree within six years, according to the Education Department. Such statistics, along with the large numbers of students who need remedial courses and drop out, drive the appeal of software that offers individualized attention to get students through basic math and other courses that are essential to college success. “These are high-risk, low-socioeconomic-status students—exactly the kind we have to reach out to,” says Mr. Regier.
And the prices appear to be right: The Carnegie courses are open-source, available free or for a nominal fee. Jose Ferreira, Knewton’s chief executive, says the company is charging Arizona State about $150 per student to use the program in a course.
Critics, however, note one crucial problem: Few good studies have been done on the outcomes of these programs, particularly at community colleges, which serve the kind of vulnerable students the software is supposed to help most.
Still, advocates say an advantage of the software is that it gives these and other students more control in a course. Historically, “students have had to work in the order that the instructor and textbook author think is most relevant,” says A. Daniel Johnson, a senior lecturer in biology at Wake Forest University, who is developing adaptive-learning software called BioBook. “But that order won’t be relevant to a significant population, because they have different background experience.” And large classes prevent instructors from deviating much from the syllabus. “When I’m in a class with 50 students, I have to keep to a certain schedule. But that won’t be right for everyone,” says Ms. Bloom.
Programs like Knewton and the Open Learning Initiative adjust the course to the student. They present every topic as a series of skills and building-block concepts. Animation, videos, interactive diagrams, and other Web-based features pop up, to distinguish ratios from rates in a math course, for instance. Interactive tutors lead students through mastery of each skill, giving short quizzes, scoring them, and offering additional help, such as extra quizzes and more explanations, when requested.
And the software adapts. “Knewton keeps personal profiles. It does that across millions of variables per student,” Mr. Ferreira says. It tracks how long they take on each problem, whether they ask for extra help and what kind, whether they go back and repeat a lesson or rush through it, and what types of questions they answer correctly and incorrectly—all matched against data from other students. “Students don’t move on until they develop proficiency,” says Mr. Regier. “In a face-to-face course, we’re done with Chapter 1 at the end of Week 1 and we’re moving on to Chapter 2 whether you understand Week 1 or not.”
Instructors can decide what counts as “understanding Week 1.” In MyLabs, a series of learning programs from Pearson Education, “the instructors can set levels. They can ‘weight’ a concept up to 100 percent, and a student can’t progress until they get it 100 percent,” says Jason A. Jordan, the company’s director of digital strategy and distribution in the arts and sciences. If the software detects that a student is mastering a concept, it will move through the material quickly; if it detects difficulty, it will offer more help. “The instructor can also weight learning materials, so students won’t get video help as the first help choice all the time if he doesn’t want them to,” Mr. Jordan says.
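The mechanism described above, a proficiency estimate that rises and falls with each answer, gated by an instructor-set threshold, can be sketched in a few lines. This is purely illustrative, not Knewton's or Pearson's actual code; the update rule (a simple exponential moving average) and all names are assumptions:

```python
# Illustrative sketch (not any vendor's actual algorithm): a mastery gate
# in the style described above. The instructor "weights" a concept with a
# required proficiency, and the student cannot advance until the rolling
# estimate for that concept meets the threshold.

def update_proficiency(current, correct, rate=0.3):
    """Move the estimate toward 1.0 on a correct answer, toward 0.0 on an
    incorrect one (a simple exponential moving average, an assumption)."""
    target = 1.0 if correct else 0.0
    return current + rate * (target - current)

def can_advance(proficiency, threshold):
    """Instructor-set gate: threshold=1.0 would mean '100 percent mastery'."""
    return proficiency >= threshold

# A short simulated quiz history for one concept, weighted at 0.8.
p = 0.0
for answer_correct in [True, True, False, True, True, True, True]:
    p = update_proficiency(p, answer_correct)

print(round(p, 3), can_advance(p, threshold=0.8))  # → 0.846 True
```

The one wrong answer pulls the estimate down, so the student needs several more correct answers before the gate opens, which is the "don't move on until proficiency" behavior the programs' designers describe.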
All of these data go to the instructor, as well.
The Open Learning Initiative, like other programs, has a “dashboard” that gives a professor a view of the class or of a single student. “The dashboard tells the student he is having trouble with learning objective Number 4. It tells the instructor that a lot of students are having trouble with Number 4,” says Marsha C. Lovett, an associate teaching professor of psychology at Carnegie Mellon University who helped develop the program. “Then the instructor can take action. For instance, he can e-mail the student right away.”
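The class-wide view Ms. Lovett describes amounts to a simple aggregation: for each learning objective, count how many students fall below mastery and flag the ones where a large share of the class is struggling. A minimal sketch, with hypothetical names and thresholds:

```python
# Hypothetical sketch of the "dashboard" aggregation described above: given
# each student's per-objective proficiency, flag objectives where a large
# share of the class sits below mastery, so the instructor can intervene.
from collections import Counter

def struggling_objectives(scores, mastery=0.7, share=0.5):
    """Return objectives where at least `share` of students score below
    `mastery`. `scores` maps student -> {objective: proficiency}."""
    below = Counter()
    for per_objective in scores.values():
        for obj, p in per_objective.items():
            if p < mastery:
                below[obj] += 1
    n = len(scores)
    return sorted(obj for obj, k in below.items() if k >= share * n)

class_scores = {
    "ana":  {"obj3": 0.9, "obj4": 0.4},
    "ben":  {"obj3": 0.8, "obj4": 0.5},
    "cara": {"obj3": 0.6, "obj4": 0.9},
}
print(struggling_objectives(class_scores))  # → ['obj4']
```

Here two of three students are below mastery on objective 4, so it is flagged for the instructor, while objective 3, where only one student struggles, would show up on that one student's individual view instead.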
In contrast with a lecture course, advocates say, students using an adaptive-learning approach are strongly motivated to advance. That, says Mr. Blakemore of Arizona State, is because with each skill, students accumulate points and badges, as in a video game, and they know they have to earn a specific number of these to proceed to another level. “It gives them rewards in the gaming style, which all the literature says is what works best,” he says. “And it lets them know how much they have left to do, and that keeps them moving forward. They don’t stall out, which is what happened with some earlier programs that we tested.”
The rewards scheme sounds simplistic, but Neil H. Hultgren, a 19-year-old freshman at Arizona State who has been testing Knewton, says it works. “Getting badges as you advance makes you want to go further,” he says. “And you take things step by step rather than worrying about what grade you are going to get at the end.”
But does all this really help students? Carnegie Mellon’s open-learning program is the only one whose results have been studied extensively in actual courses—Knewton is too new—and the answer appears to be yes. Students taking an accelerated open-learning statistics course at Carnegie Mellon in blended form completed it in eight weeks; they learned as much material, and performed as well on tests, as students taking a traditional 15-week course. At a large public university, 99 percent of students taking the program’s formal-logic course online completed it, compared with 41 percent of students in the traditional course.
These results, however, have not convinced everyone. “The main problem is that tests of the effectiveness of these types of software are not done in isolation,” says Shanna Smith Jaggars, a senior research associate at the Community College Research Center of Columbia University who studies online learning. Other classroom changes that accompany the use of adaptive-learning programs might bias the outcome. For example, the acceleration in the first Carnegie Mellon course, rather than the software, could push students to achieve more than in the slower version, she says. (Ms. Lovett responds that the Carnegie Mellon researchers were trying to examine the effects of a multifeatured environment, and deliberately did not isolate one element.)
Only two studies have evaluated adaptive learning in community-college settings, Ms. Jaggars says, and the results were not good. “Both showed much higher withdrawal rates for the online-course students than for the students in the face-to-face version of the course,” she says. But that may not be an indictment of the software. “There are lots of reasons why students have difficulties in online courses—technology problems, a sense of isolation, poor time-management skills—so it’s difficult to know whether the software had anything to do with it.”
The lack of data doesn’t deter Arizona State. This summer, officials plan to roll out basic math classes online, and add blended courses in the fall. “After math, we’ll go on and do the English courses next,” says Mr. Regier. “It’s really these baseline courses where students come in not college-ready that we have to address.”
Ms. Bloom, who will be teaching the blended courses, is ready. “I don’t think I’ll be so stressed-out as a teacher,” she says. “I’ll see who is on track, and who is off track. And I’ll know when to step in.”