At the University of Iowa, “General Chemistry I” enrolls as many as 1,500 students a semester. They can go to the lectures in person — 8 a.m. Tuesdays and Thursdays in a Chemistry Building auditorium that seats 395 — or watch recordings through the university’s course-management system. In addition to attending a weekly discussion section led by a teaching assistant, students are told to plan on spending six hours a week online doing homework using Mastering Chemistry, a textbook publisher’s product.
The chemistry students are among the first students at Iowa to benefit from a home-grown predictive-analytics project that aims to help make sure they pass the course. The three-year-old project, called Elements of Success, combines data about students who have already taken the course with information about current students’ backgrounds, how long they’re spending on homework, and how well they’re understanding it. Then it offers each student a dashboard with visualizations that show how he or she is doing relative to others in the class, and also forecasts the student’s final grade. For students who aren’t doing well, it suggests what help is available from the Academic Support and Retention office.
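The article does not detail the model behind those grade forecasts. A minimal sketch of one plausible approach, assuming a simple linear regression fit to past students’ homework behavior and applied to the current class, might look like this; the features, numbers, and cutoff are invented for illustration and are not details of Elements of Success.

```python
# Illustrative sketch only: forecasting final grades from homework behavior,
# in the spirit of Elements of Success. All features and values are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Records from students who have already taken the course:
# [hours on homework per week, mean homework score] -> final grade
past_features = np.array([
    [6.0, 0.92],
    [3.5, 0.71],
    [5.0, 0.84],
    [2.0, 0.55],
])
past_final_grades = np.array([91, 72, 85, 58])

model = LinearRegression().fit(past_features, past_final_grades)

# Current students' homework behavior so far this semester (hypothetical)
current_features = np.array([
    [4.5, 0.78],   # student A
    [1.5, 0.60],   # student B
])
forecasts = model.predict(current_features)

for name, grade in zip(["A", "B"], forecasts):
    flag = " -> point to Academic Support and Retention" if grade < 70 else ""
    print(f"Student {name}: forecast final grade {grade:.0f}{flag}")
```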
For some years now, learning-technology visionaries have anticipated a time when analytical tools will harness streams of data about how students fare in courses, generating reliable predictive models that will help make sure they succeed in college. But so far comparatively few institutions have seen those visions become reality. Technical challenges, wariness among faculty and staff members, and commercial offerings that fail to satisfy mean that widespread adoption of learning analytics is yet to come.
Last year’s edition of the annual Campus Computing Survey found that less than a fifth of respondents rated the data-analytics investments their institutions had made as “very effective.” Kenneth C. Green, the survey’s director, says that’s not a surprise. “As with so many new technologies in the consumer, corporate, and campus sectors, initial performance often falls short of the actual, implied, and inferred promises.”
Even Ithaka, a nonprofit organization devoted to helping colleges take advantage of digital resources, said in a comprehensive 2016 report that “the potential of these new uses remains underdeveloped,” although the report also said that “research using large-scale learner data is progressing along a number of promising avenues.”
Some large institutions have moved more enthusiastically to deploy campuswide early warning systems for students’ academic performance, among them Arizona State, Georgia State, and Purdue Universities, Rio Salado College, and the University of Michigan. And Unizin, a consortium aimed at “enhancing learner success with digital technology and resources,” has grown to a dozen institutional members, including Iowa. But Iowa’s more cautious approach is by no means unusual.
“General Chemistry I” is a “very large course, a very critical course,” says Sam Van Horne, assessment director in the university’s Office of Teaching, Learning & Technology. In university surveys, a third of first-year students rank the course as their most difficult, he adds, so “this was a class where we wanted to do more to support learners.” Mr. Van Horne worked with several others, including one of the course instructors, Russell Larsen, who wanted to offer students better feedback than was available through commercial products. Elements of Success is now used in another chemistry course and in “Foundations of Biology,” and about 90 percent of students in those courses look at the dashboards, Mr. Van Horne says.
“We see a positive difference for users that are using it more frequently,” he says. It’s also cut down on the number of emails instructors get from students seeking to know how they’re doing. And it’s opened the eyes of some instructors who “believed that women or underrepresented students were doing as well as other students,” Mr. Van Horne says, because “we were able to show that they were not.”
Tammi J. Anderson, a first-year student in “General Chemistry I,” has found Elements of Success to be helpful. “It gives me an idea if I am at where I need to be, and if I’m not, it shows me how far off I am,” says Ms. Anderson, who plans to major in neurobiology and attend medical school.
But Elements of Success also represents some of the difficulties of using “big data” to help students succeed. Mr. Van Horne, a graduate student, and an undergraduate assistant manually pull data for the project from the university student-information system, the course-management system, and the homework site, although the university recently switched to a new course-management provider that will enable them to automate some of the work.
“That’s one of the critical roadblocks to learning analytics,” he says. “Data exists in different third-party systems.” To expand the program, “Either we find better ways to automate or we have to have more resources.”
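The join that this manual data-pulling amounts to is simple once the exports are in hand; the hard part is getting them out of three separate systems. A minimal sketch, assuming hypothetical tables and column names rather than Iowa’s actual systems, could look like this:

```python
# Illustrative sketch only: joining records from three separate systems on a
# shared student ID. Column names and the tiny inline tables are hypothetical.
import pandas as pd

# In practice these would be exports from the student-information system,
# the course-management system, and the homework site.
sis = pd.DataFrame({"student_id": [1, 2], "year": ["first", "first"]})
cms = pd.DataFrame({"student_id": [1, 2], "logins_per_week": [5, 2]})
homework = pd.DataFrame({"student_id": [1, 2],
                         "hw_hours": [6.0, 2.5],
                         "hw_score": [0.91, 0.60]})

merged = (
    sis.merge(cms, on="student_id", how="left")
       .merge(homework, on="student_id", how="left")
)
print(merged)
```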
Elements of Success is customized for each course, relying on whatever information best predicts how students will fare on the first exam. For “General Chemistry,” that means looking at students’ performance on the most recent three homework assignments, but for “Foundations of Biology,” the program instead picks up the results of clicker-based quizzes. There’s also a different “dashboard” for every course. “We don’t want our dashboard to have 50 points of data — you have to distill it down. You want them to get feedback to orient them to take the next action.”
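One plausible way to express that per-course customization, sketched here with assumed field names rather than anything from Iowa’s system, is a small configuration mapping each course to whichever measures best predict its first exam:

```python
# Illustrative sketch only: per-course predictor configuration. The course
# names come from the article; the field names and structure are assumptions.
PREDICTORS = {
    "General Chemistry I": ["hw_score_1", "hw_score_2", "hw_score_3"],  # last three homework sets
    "Foundations of Biology": ["clicker_quiz_avg"],                     # clicker-based quizzes
}

def features_for(course, record):
    """Return only the measures used to predict the first exam in this course."""
    return [record[field] for field in PREDICTORS[course]]

# Hypothetical student record
record = {"hw_score_1": 0.85, "hw_score_2": 0.78, "hw_score_3": 0.90,
          "clicker_quiz_avg": 0.66}
print(features_for("General Chemistry I", record))    # [0.85, 0.78, 0.9]
print(features_for("Foundations of Biology", record)) # [0.66]
```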
Pennsylvania State University hopes to roll out a data-driven early warning system this coming fall for its STEM-centered Millennium Scholars program, which enrolls a number of students from underrepresented minority groups. “We’re going at it slowly,” says David R. Smith, associate dean for advising and executive director of undergraduate studies.
“We could have all this data that says x, y, or z about students and their behaviors and their patterns, but have you really thought about what you want students to do once you’ve identified that there’s an issue?” he says. “You’ve got to have the resources lined up to help them. My real thing is, data is great, but it’s only as good as the people that are behind it.”
Mr. Smith is also concerned that data could be used to unfairly profile students.
“I think there’s any number of issues with that,” he says. “Doors could get inadvertently closed on the student because you’re reinforcing biases that whoever’s talking to that student may have.
“Does this add to that? Does it detract from it? Can we use data to better understand interventions to know whether or not those interventions help students succeed?”
Meanwhile, many smaller colleges say they don’t see a need for data-driven early-alert systems, although they are taking advantage of the communications and note-sharing capacity offered by commercial advising systems. “We are too small to have big data,” says Kerry E. Pannell, vice president for academic affairs and dean of Agnes Scott College, which has just under 950 students. “We don’t have 10,000 students and an algorithm running to show if a student gets a C in this class. The advisers already know all the students.”
But Agnes Scott and other small colleges are using data to improve admissions and retention by understanding what kinds of students do well at their institutions, and how to attract and retain them. Franklin & Marshall College’s vice president for planning, Alan S. Caniglia, says that a review of admissions data a dozen years ago showed that “financial aid that was not based on need was not increasing the likelihood that the kinds of students we were trying to attract would actually enroll.” So the college dropped so-called merit aid, and is now getting more applications and better students. “If we hadn’t been analyzing the data and been open to what the data would tell us, we would never have gotten to that point.”
And while many colleges are proceeding cautiously where learning analytics are concerned, there’s plenty of interest among faculty members, says Jennifer Sparrow, Penn State’s senior director of teaching and learning with technology.
Kyle Bowen, the university’s director of education-technology services, says faculty members in different departments are working on projects that would use textual-analysis programs, facial-recognition software, and even smartphones or Apple Watches to capture data about how students go about learning. Someday soon such information could be added to the early warning system the university will roll out for its Millennium Scholars next fall, and could even be augmented with data captured when students swipe into dining halls, dormitories, and recreation facilities, giving the university an ever-more-detailed understanding of how students learn.
“Once you have this really good model to work from, you can begin to use this kind of science to explore engagement questions, course-design questions,” says Mr. Bowen. “The more we layer in additional data to our model, the more accurate it gets.”