Neither of Callie S. Blakey’s parents has a bachelor’s degree, and when she set off to the University of Wisconsin at Oshkosh, a couple of hours from home, she worried about how she’d do. But a program there was set up to make sure she wouldn’t fail.
That first year, Ms. Blakey and her classmates took three surveys through EBI MAP-Works, a retention-and-success program that uses predictive analytics to identify at-risk students. A counselor in Student Support Services, a federally funded program for young people from disadvantaged backgrounds, reviewed Ms. Blakey’s results with her. One thing they found: “I wasn’t studying as much as is typically, you know, good for a freshman for the number of credits I had,” Ms. Blakey says. She worked to improve her study habits, and she is now a rising junior studying finance and active in three honor societies.
Electronic data systems can help colleges advise students, plan their academic schedules, and see—by way of “early alerts”—who might need some kind of intervention. Some systems use predictive analytics to identify patterns. Proponents of the programs, which have become known as Integrated Planning and Advising Services, or IPAS, hope they will raise retention rates without straining budgets.
Student retention rates from first to second year have remained stagnant for more than a century, according to research by the higher-education theorist Vincent Tinto and others. While the share of students who didn’t make it used to signal a college’s rigor, many colleges are now working to raise retention rates—for the sake of students, as well as accountability measures, performance-based funding, and tuition revenue.
Researchers and administrators point to the gap between research on retention and efforts on campuses as one reason the rates haven’t budged.
While some colleges have been mining data to predict and prevent dropouts for several years, only 5 percent of several hundred institutions surveyed last year by Educause used such analytics programs. Early-alert programs, by contrast, were more common, with nearly half of respondents reporting having them.
Many colleges resort to less-sophisticated retention efforts, higher-education experts say, that aren’t empirically based. “So many times there is programming to address retention that isn’t based off of data, it’s just copying similar schools,” says Willis A. Jones, an assistant professor of education at the University of Kentucky.
At Morgan State University, in Baltimore, Tiffany Beth Mfume, director of student success and retention, had been relying on Google documents and Microsoft Excel, among other resources, to keep track of students. Last year Morgan State started using a system called Starfish with a grant from the Bill & Melinda Gates Foundation. Gates gave 19 institutions grants of $100,000 to buy IPAS systems through vendors.
By creating the term “IPAS,” Greg Ratliff, a senior program officer at the Gates foundation, hoped to spark discussion of the technology. In the past year, Educause and the American Association of Collegiate Registrars and Admissions Officers have featured the services at conferences.
Advocates envision the next frontier of retention as combining advising, degree planning, alerts, and interventions in software programs that can draw on predictive analytics. Student-affairs officers and academic advisers, one vendor says, will become “student-success scientists.”
Unexpected Insights
Already, nascent electronic data systems are challenging commonly held assumptions about retention.
Colleges tend to think of—and measure—retention between students’ first and second years. But upperclassmen need attention, too, officials at Montclair State University discovered through a combination of institutional research data, student surveys, and software from IPAS vendors.
Allyson Straker-Banks, associate vice president for student academic services, found that if a student doesn’t declare a major in time or switches majors, his or her faculty adviser sometimes changes. Confusion about who is advising upperclassmen, she found, can leave some of them without advisers. To help prevent that, Montclair State created the Office of Academic Success and Retention Programs in 2011 to support upperclassmen. The university also identified distinct needs among older students, who are now advised by a single designated person in the office.
‘Murky Middle’
National data also call into question the framing of retention as a first-year issue. The Education Advisory Board, a research, technology, and consulting company, has found that of all students who drop out of college, only about half do so before their second year.
Many who drop out as juniors neither ask for help nor perform badly enough to attract the attention of professors or student-affairs officials, the firm has found. It calls those students the “murky middle,” arguing that with support, they might stay enrolled.
Predictive-analytics programs can help the “murky middle,” consultants say, by helping students pick the right courses. Good grades in calculus, for example, might predict success in economics better than “Introduction to Macroeconomics” does.
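The intuition behind that claim can be shown with a toy calculation. The sketch below uses invented grades for ten hypothetical students to compare how strongly grades in calculus, versus grades in an introductory macroeconomics course, track a student’s later grade in economics; real IPAS systems draw on far richer data and models than this.

```python
# Illustrative sketch only: all grade data below are made up, and the
# comparison is a simple Pearson correlation rather than a vendor's model.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of grades."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical course grades (4.0 scale) for ten students.
calculus    = [3.7, 2.1, 3.9, 2.8, 3.3, 1.9, 3.5, 2.5, 3.8, 2.2]
intro_macro = [3.0, 2.9, 3.2, 3.1, 3.0, 2.8, 3.1, 3.0, 3.2, 2.9]
economics   = [3.6, 2.3, 3.8, 2.9, 3.2, 2.0, 3.4, 2.6, 3.7, 2.4]

# In this invented data set, calculus is the stronger predictor.
print(round(pearson(calculus, economics), 2))
print(round(pearson(intro_macro, economics), 2))
```

In a real system the question would be answered over thousands of transcripts, but the design choice is the same: course advice is driven by which prior courses best predict the outcome, not by which course titles sound most related.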
Other components of predictive analytics systems examine students’ psychological states. Surveys like the one Ms. Blakey took at Oshkosh ask 100 questions. If a student is homesick but has high resilience to stress, maybe the risk of dropping out isn’t so high.
The University of San Diego, which, like Oshkosh, uses MAP-Works, reviews survey results to connect struggling students with peer leaders. A student who gets a very low survey score, noted as “red,” is referred to an administrator. Students who score “yellow” are referred to academic or residential advisers to discuss problems they might be having.
Referrals to student leaders like RAs help them “feel more empowered in their roles” and save staff members time, says Stephanie M. Bernasconi, associate director of San Diego’s Center for Student Success.
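The red/yellow routing San Diego describes amounts to thresholding on a composite survey score. The sketch below is only a guess at the general shape: the cutoffs, the score scale, and the function name are invented for illustration, and MAP-Works’ actual scoring is proprietary.

```python
# Hypothetical triage rule modeled on the referrals described above.
# Cutoffs and the 1-7 score scale are assumptions, not the vendor's values.
def triage(score, red_cutoff=2.0, yellow_cutoff=3.5):
    """Map a composite survey score (assumed 1-7 scale) to a referral."""
    if score < red_cutoff:
        return "red: refer to an administrator"
    if score < yellow_cutoff:
        return "yellow: refer to an academic or residential adviser"
    return "green: no referral needed"

print(triage(1.4))  # very low score
print(triage(3.0))  # middling score
print(triage(5.8))  # healthy score
```

The point of the tiered design is triage: scarce staff time goes to the lowest-scoring students, while peer leaders and advisers absorb the middle band.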
Integrated systems can also pinpoint groups of students who may be at risk. Western Kentucky University found that only 45 percent of freshmen who participated in intramural sports returned for their sophomore year. Iowa State University, after seeing low survey scores among minority men, created a Men of Color Collective.
‘Too Much to Check’
To harness the potential of integrated technology, advocates say, colleges must make sure faculty and staff members, as well as students, adapt to its use.
Inherent in any electronic data system are privacy concerns. Among campus officials who have made IPAS technology a priority, about half see individual privacy rights as at least a minor concern, according to an Educause survey conducted in the spring. Roughly three quarters of respondents were concerned about misuse of data and incorrect conclusions drawn from the data.
Some researchers worry that information gleaned from electronic data systems could create a self-fulfilling prophecy for students, especially those from historically marginalized groups.
“The whole field is growing,” Mr. Tinto, an emeritus professor at Syracuse University, says of predictive analytics. “The danger is it is still in the formative stages. Social class, gender, race, and performance research is based on aggregation, on the average, but the individual is not an average,” he says. “The individual is an individual.”
Another challenge with the systems is finding the resources to process so much information. James D. Mantooth, director of retention services at Murray State University, received 1,500 alerts through the MAP-Works program in the 2012-13 academic year. “We burned the midnight oil on more than one occasion,” he says.
Of course, privacy concerns, stereotypes, and data pileup become challenges only if a system takes hold on a campus. Without faculty buy-in, advocates say, even the best predictive analytics will fail. In the Educause survey, faculty resistance and lack of interest were the top concerns.
R. Trent Haines, an assistant professor of psychology at Morgan State University, calls the Starfish program a “godsend” in helping him keep track of the 50 students he advises. But not everyone likes it, he says. “Some of my colleagues think this is too much to check, especially folks who aren’t as technologically savvy.”
Designing software alone will hardly keep students in college, says Melinda Mechur Karp, assistant director for staff and institutional development at the Community College Research Center at Columbia University’s Teachers College.
“If you build it, they may not come,” says Ms. Karp. “You could build the best product, but if people don’t want to use the product, it doesn’t matter.”
Clarification (8/11/2014, 12:15 p.m.): Education Advisory Board, the company that found that of students who drop out of college, only about half do so before their second year, is a technology and research company as well as a consulting company. This article has been updated to reflect that.