As data on student progress become more robust and easier to gather, more and more colleges are “nudging” students, prompting them to make decisions that will help them stay on track and graduate. We — an academic administrator and a company leader — have worked at the forefront of nudging in higher education since 2012 and are thrilled to see its widespread embrace as a student-success strategy.
Yet we are concerned that colleges are equipped with plenty of data but not enough guidance about how to use that information effectively in reaching out to students.
The job is becoming more professionalized, holistic, and high tech. But colleges are just beginning to learn how to use new masses of data to help their students thrive.
Over the past several years, the University of Washington at Tacoma has used predictive analytics and early-alert and student-nudging tools that align with known student-success strategies. The university has put a lot of thought into how best to act on the real-time student data that those tools produce. Its partnership with Persistence Plus, which supports students with personalized text-message nudges throughout the academic year, has seen consistently strong results — including a study that found a six-percentage-point increase in graduation for the students most at risk of dropping out.
Our experience has led us to offer five key questions that colleges should ask themselves about their efforts to nudge students.
Does the intervention address the actual student challenge? Thoughtfully prompting students to complete important milestones to graduation can be helpful, but to be truly effective, interventions must target the underlying reasons that more students aren’t completing these activities on their own. For example, one college gently encouraged students who were close to completion to submit their graduation applications, only to discover that some students did not know whether they were actually eligible to graduate.
The fix was not a simple intervention but rather the more complex work of improving the institution’s degree-audit process so students understood their eligibility for graduation. In another case, here at Tacoma, a registration outreach revealed that students mistakenly thought that they were ineligible to file for graduation if they had taken out emergency loans.
Do the data that prompt an intervention make sense and tell the complete story? Even thoughtfully constructed prompts can go wrong. At Tacoma, outreach through our analytics system to students not registered for the next term didn’t take into account that some students graduate in the winter quarter. Because registration and graduation-process data were kept separately, the nudge mistakenly included students poised to graduate and caused a few of them to worriedly ask if their paperwork had been rejected. We learned that the team selecting the data that drive interventions should be familiar with the graduation process and should include people in a diversity of roles.
Does the intervention effectively use evidence-based practices such as behavioral science? Thoughtful use of behavioral science offers a number of strategies that — without being punitive, clueless, or coercive — can help students follow through on their intentions. For instance, a nudge encouraging students to seek tutoring can frame that behavior as an expected college norm rather than as something stigmatizing. An intervention that promotes a view of science as collaborative and engaging can help students persist in STEM majors.
If a nudging effort is to be effective, the design team should include experts in social psychology and student interventions. Don’t leave content solely to your IT staff or registrar.
Do interventions consider student challenges that may not show up in early-alert systems? Traditionally, early-alert systems rely on flags, often from the faculty, about a poor grade or missed class. While those are important, they don’t capture the more nuanced reasons that students can suddenly fall off track. Interactive nudging can respond to different kinds of data that may predict withdrawal from college, such as students’ motivation levels, a lack of social fit, or home and life challenges that threaten to derail their progress.
A study at the University of Washington revealed that only 22 percent of lower-division students who leave do so because of academic distress. They disappear for diverse reasons, often as a result of temporary crises that the university could have helped to mitigate. Nudges can help students stay in college by connecting them to services that support them — such as food pantries, housing aid, and child care — when they most need it.
Is the intervention nuanced to reflect the needs of different student populations? Nontraditional students — now the majority of U.S. college enrollment — can have a swirl of characteristics: first-generation, military, underrepresented minorities, part-time, first year, mothers, Pell Grant recipients. Often they feel alone or like outsiders. They can become the high-achieving students who don’t return, with lingering debt and without a degree. But studies have shown that brief interventions can reframe their perspective on college. Using nudges that target the specific needs and strengths of these different populations and the campus resources that are available to them can let students know they’re seen and can foster an internal sense of belonging that leads to degree completion.
Repeatedly, we’ve seen that learner analytics used to generate interventions, reminders, kudos, and nudges can have positive impacts on student success. Insights from the students themselves have helped us tailor our interventions to better meet their needs and be more efficient with our time and resources. The obligation that comes with knowing which students are at risk means that we cannot continue the status quo of failing to extend the right supports to a new population of college students. Consider this a nudge.