Getting to and through college requires navigating a tangled bureaucracy. Students must prepare for college-level courses, apply, and decide where to go. The financial-aid process is a maze in its own right. Once they’re on campus, students have to make sure their academic work adds up to a degree and contend with a host of administrative tasks. All of this is harder, of course, for students who are figuring it out without high-touch, expert guidance.
One increasingly popular strategy for helping these students is to “nudge” them. Nudging, which is grounded in behavioral economics, uses low-cost, low-touch interventions to encourage, but not require, people to take a particular action. Putting healthy food at eye level in the cafeteria is a nudge. So is having employees opt out of, rather than into, a retirement-savings plan.
The strategy was developed and popularized about a decade ago by Richard H. Thaler and Cass R. Sunstein, and helped make them famous. Governments around the world soon embraced nudging and applied it to policy areas like health and finance. The Obama administration created a Social and Behavioral Sciences Team under the National Science and Technology Council, and in 2015 the president signed an executive order directing federal agencies to apply behavioral insights to their programs by, for instance, streamlining processes and improving the way information is presented.
Higher education took note. Here was an approach that could make a dent in some of its biggest challenges, improving access for students from less-advantaged backgrounds and helping everyone who enrolled persist and graduate. A handful of high-profile experiments found evidence that it could work. Some of those studies became touchstones among higher-education scholars, the policy community, and administrators seeking to offer students better support.
Papers released in 2012 and 2013 by Caroline M. Hoxby, a professor of economics at Stanford University, and her co-authors, for example, quantified the number of high-achieving, low-income students who did not apply to the selective colleges their academic profiles suggested they could attend. The researchers went on to show that mailing this pool of students information and application-fee waivers — or what was essentially a nudge (though the researchers didn’t use that term) — changed where a significant number of students applied and enrolled. And it cost just $6 per student.
Research by Benjamin L. Castleman, now an associate professor of education and public policy at the University of Virginia, and Lindsay C. Page, now an associate professor of education at the University of Pittsburgh, showed that higher-tech nudges, in this case, text-message interventions, could make a difference at several key points in students’ journeys. One set of experiments, whose results came out in 2013, found that text reminders reduced the number of low-income students who “melt” over the summer, or never enroll in college despite having been admitted and intending to go. In another project, they found that reminders to reapply for aid increased the first- to second-year persistence of low-income students at community colleges, as described in a 2014 paper.
These studies were exciting. Here was evidence that a light-touch approach could make a measurable difference on some of higher ed’s thorniest problems — at a cost of just a few dollars per student. As often happens with promising ideas in higher education, the conversation quickly turned to expanding this work. Could nudging be effective on a broader scale? And, if it helped with enrollment and retention, how might it be extended to other areas, like student success?
A national organization scaled up the student mailings. Two research teams tried financial-aid nudges with larger populations. Another tested various nudges to improve academic performance.
Now the results of those expansions are in, and they are discouraging. In recent months, a report on the academic-success project said that it hasn’t identified anything that made a difference. Both financial-aid studies announced null results. So did the effort to nudge more students to apply to selective colleges. Why didn’t these nudges work for a larger population? And where does this leave students finding their way through college — and all of the people trying to help them?
At its core, nudging is about helping people make better decisions. The approach makes sense when they face choices “that are difficult and rare, for which they do not get prompt feedback, and when they have trouble translating aspects of the situation into terms that they can easily understand,” write Thaler and Sunstein in Nudge: Improving Decisions About Health, Wealth, and Happiness, the 2008 book that popularized the idea.
Nudging has some innate appeal. It offers a clever way to address something we all contend with: The difficulty of parsing complex options and, indeed, of following through on our own intentions. But in retrospect, it seems custom-built for the Obama age. Behavioral interventions were an elegant, evidence-based, novel way to effect change without passing legislation. They were cheap enough to appeal to policy makers operating under tight, postrecession budgets.
Colleges, too, experienced financial strain in the wake of the recession. Meanwhile, they were being criticized for exacerbating economic inequality and leaving students, especially those who dropped out, saddled with debt.
The researchers who built, tested, and talked up nudges geared toward students were generally careful not to suggest that they’d found a silver bullet. These efforts weren’t positioned as a perfect solution. They didn’t have to be. In this climate, the idea of putting a dent in some of higher ed’s big problems for a few dollars per student struck many as a no-brainer.
The year after Hoxby’s 2013 paper with Sarah E. Turner, a professor of economics and education at the University of Virginia, came out, one researcher told The Chronicle, “If you can pay six bucks and move the needle, you should do it tomorrow.” Another said the solution was so cheap it was “like magic.”
One of the earliest examples of a higher-ed nudge was the “H&R Block Fafsa Experiment,” the results of which were described in a 2009 working paper. The experiment tested two ways of encouraging low- to moderate-income families who were completing their taxes at H&R Block to fill out the financial-aid application. The tax professionals gave one group personalized information and encouragement to apply; they gave another group that information along with help completing the form. Only the group that received help completing the form saw increased college enrollment compared with a control group. After seeing other papers support the idea that nudging could help students complete an application, one of the study’s authors, Philip Oreopoulos, believed that question was settled. Oreopoulos, a professor of economics at the University of Toronto, wondered: Could this approach be transferred to a different domain, academic success?
It was a timely question. Higher education was shifting its focus, worrying not just about getting students in the door but also about seeing them through to graduation. As a scholar and a teacher, Oreopoulos says, he thought “how horrible it would be if everyone you got in dropped out along the way.”
Oreopoulos embarked on an ambitious, five-year project. He built up a lab. Introductory economics courses at Toronto gave him a large sample of students. He wanted to help them improve their grades and persistence, so he searched the literature for interventions that had been found to improve those things: setting goals, cultivating a growth mind-set, and making use of various forms of coaching. In each case, he adapted interventions that had shown promise in earlier experiments.
Researchers have used text messages to “nudge” students through some of the trickiest steps on their path to college, like applying for financial aid.
The goal-setting intervention, for instance, built on promising social-psychology research. Oreopoulos had one group of students complete an online exercise in which they reflected on their values and set specific, achievable goals. Half of the treatment group was given the chance to receive text messages with encouragement, study tips, and a reminder about their goals (those who declined receiving texts got the messages via email).
The idea, Oreopoulos says, was to test a bunch of different ways to improve students’ performance, expecting some to work better than others. Given the low cost of these efforts, he thought, even a small effect would be worthwhile. He’d continue to improve upon those. “The goal was not to test this past literature,” he says, “but to iterate on it and see what could work better.”
Oreopoulos also hoped his project would shed light on why some of these approaches were effective.
“When I started this project,” Oreopoulos said, “I wanted it to work.”
But it didn’t. This summer, Oreopoulos and a co-author summed up their work in a paper called “The Remarkable Unresponsiveness of College Students to Nudging and What We Can Learn From It.” They reported that none of the interventions they tested made a significant difference in academic outcomes. Some of the efforts did bring benefits: The coaching interventions improved students’ well-being and led them to study more. But the whole idea had been to help students earn higher grades.
It may be, Oreopoulos says, that nudges are best suited to binary tasks students face at a particular moment in time, like applying for financial aid. Habits, he says, are harder to change. Perhaps, he adds, the nudges that work best aren’t the ones that gently push people toward a particular option, but change the default setting, so that they don’t need to do anything at all to benefit.
Oreopoulos is collecting more data, trying to get a better sense of why the interventions didn’t work. And despite the paper’s catchy name, he says he doesn’t want his findings to be overinterpreted. “It’s not black and white if this nudge stuff works or not,” he says.
Still, Oreopoulos says, it makes sense to be cautious about basing policies on early findings.
When it comes to multifaceted challenges like increasing college completion, Oreopoulos says, “We have to acknowledge there may be only so much we can do.”
To the extent that completion can be improved, Oreopoulos now thinks, it may be a matter of providing more comprehensive student support — and perhaps requiring, not just encouraging, students to take advantage of it. “The kind of tweaking that we’re doing now,” he said, “is in my opinion a little fruitless.”
This hasn’t been the only recent bad news about nudging. Three efforts to scale up some of the most promising interventions from previous research released null findings this spring and summer.
After Hoxby and Turner released their 2013 paper on how mailing information packets encouraged high-achieving, low-income students to consider selective colleges, the College Board announced that it would try a version of their approach at scale. Indeed, this was something that both the paper on the experiment’s findings and a separate policy proposal written by the authors had urged the College Board in particular to do.
While the College Board experiment was directly inspired by Hoxby and Turner’s work, it was an expansion rather than a replication. Hoxby’s work focused on a very select group: the 35,000 low-income students her paper with Christopher Avery, a professor of public policy at Harvard University’s Kennedy School, estimates finish high school each year with an A- or higher grade-point average and test scores in the top 10 percent. The main sample in Hoxby and Turner’s paper included 12,000 students.
The College Board reached out to a much wider population, those with low and middle family incomes and in the top half of PSAT and SAT scores. Its sample included 785,000 students. But the authors also examined the subset of their population that was most similar to Hoxby’s. “Our best attempt at mimicking their sample still produces no statistically significant effects,” the research team wrote in its paper, “so cannot fully explain differences in outcomes.”
Hoxby has distanced her work from that of the College Board. “There are so many differences that the College Board’s initiative is not anything close to what we would have put forward,” Hoxby wrote in an email to a reporter for Chalkbeat. “Some of the ‘branding’ elements are highly problematic.” Hoxby did not respond to a request for comment from The Chronicle, and Turner declined to comment for this story.
Meanwhile, two recent efforts tested, on a broad scale, the text-message Fafsa nudges from Castleman and Page’s 2014 study. They failed to generate the same kind of encouraging results.
For one of the projects, Castleman and Page worked with two other scholars to test several variations of text-message nudges — one set of messages, for instance, was framed around the idea that students respond to information about the behavior of their peers — in a national sample of students. In a new paper released on Wednesday, they write that students who received the text messages reapplied for financial aid earlier, but they did not reapply at a higher rate. It’s possible that by filing earlier, students may have gotten additional state or institutional aid — that’s something the available data do not show. But there was no difference in the amount of the federal financial aid students received, and no improvement in their college persistence or completion.
The other project, which also counts Castleman as a co-author, nudged both high-school and college students to complete the Fafsa in a large state and at the national level in partnership with the Common Application.
It tested a number of variations on Fafsa-nudging: different kinds of framing language (designed, for instance, to tap into students’ motivation or sense of identity), different media (mail, email, or text), and a variation in which students could chat with an adviser via text. “We find no impact in any of these settings,” the researchers wrote in August, “despite having similar content and even similar researchers working on these projects.”
What’s going on here? Scholars are still trying to understand why their attempts to bring nudges to a broad scale didn’t work, and wrestling with the implications.
For the most part, efforts to nudge college students have taken the form of reminder messages. “We’ve been using a pretty narrow set of the tools in the broader kind of nudge tool kit,” Castleman says. “There are many additional tools that I think we could be leveraging more,” he says, things like prepopulating forms, and changing the default settings, or the status-quo option people are left with if they don’t actively pick something else.
As for reminder-style nudges, there’s emerging consensus that the context is really important. It seems to matter a lot, for instance, who’s doing the nudging. That raises real questions about how widely the approach can be applied.
“As we scale larger and larger,” Page says, “that necessarily means that the person or the entity sending the messages may be less well connected to the student.” Back when she and Castleman began their research, texting students en masse was a novel approach — the technology was mainly used for personal communication. But now, “as text-messaging gets used by your doctor’s office and your hair salon and people who want to recruit you to vote for a particular candidate,” Page says, the credibility of the message sender will only become more important.
One conclusion that seems to be emerging from recent studies is how important a previously existing relationship is to the effectiveness of nudging. Let’s look again at that College Board effort to steer students toward more-selective colleges. It was unsuccessful. But when Susan Dynarski, a professor of public policy, education, and economics at the University of Michigan at Ann Arbor, tried something similar with high-achieving high-schoolers in her state, it did work.
Dynarski’s project sent high-achieving, low-income students in Michigan information packets branded in the University of Michigan’s maize and blue with a letter from the president encouraging them to apply — and a promise of four years of free tuition and fees upon admission. It also reinforced this message in information sent to their parents and principals. The researchers tested this effort in two cohorts of about 2,000 students each.
That intervention, the authors wrote, “closed by half the income gaps in college choice among Michigan’s high-achieving students.” The sender in that case: The university, the state’s flagship and highest-ranked institution. It makes sense that students would feel a different emotional pull to their state’s dominant university than they would to a group best-known for bringing them the SAT.
Another project that’s nudging students at the single-university level is “Pounce,” a chatbot named for Georgia State University’s panther mascot. Pounce can send students reminders about tasks they must complete, and also respond to their questions by pulling answers from a pool and, when needed, having a staff member respond by hand. Messages sent by Pounce, which Page helped design and is studying, led to a reduction in Georgia State’s summer melt. Based on those findings, the university decided to use Pounce to help current students, too.
Students’ connection with their own institution is just one advantage of sending nudges from a university, Page said. Because Georgia State has lots of data on its own students, it’s able to send messages that are quite tailored. A message about a housing-deposit deadline, for instance, can be sent only to those who haven’t yet handed it in.
Still, there are limitations, Page is finding. Students are pretty responsive to Pounce when it’s helping them address an immediate need, like settling a campus parking ticket so they can register for classes. They’re less responsive about a reminder to, for instance, attend a career fair — an activity whose payoff is less clear.
Importantly, Page adds, Georgia State sees Pounce as a supplement to, not a replacement for, other forms of student services, which rely on a human touch. Its perspective, she said, is that “this is a tool that we can use to help students along, but it’s not the exclusive tool that we’re hanging our hats on.”
Even university-level efforts to nudge come with complications, says Sara Goldrick-Rab, a professor of higher education policy and sociology at Temple University and one of the co-authors of the new paper with Castleman and Page. Colleges don’t all possess the ability to send out text messages beyond their emergency-notification systems. Students don’t always trust their own colleges.
And there’s still an important distinction, she said, between nudges from an organization students trust and counseling from a person they know. Castleman and Page’s original Fafsa nudges, she notes, were sent from a college-access group to students it had a history of working with.
So where does this leave the field? The answer is probably different for college officials than for policy makers, Goldrick-Rab says. Colleges engaged in nudging, she said, should probably take a close look at who’s sending those messages. Goldrick-Rab says she’s tried to put herself in the shoes of a college president. Her take? “If all I had was $5 a head for students, literally $5 a person, fine.”
Still, Goldrick-Rab thinks there are other relatively low-cost ways that colleges can help their students. That includes by supporting their basic needs, an idea her research focuses on. “If it was a bit more money than that and I could buy them lunch every day, or I could make sure I had a campus food pantry, I might go there over the nudging,” Goldrick-Rab says.
But which approaches should be run on a broader scale is a different question, Goldrick-Rab says. “It’s what do we want to do nationally? What do we want our Department of Ed doing, what do we want our big companies, what do we want our systems doing? And I think that the answer is: We want them to do a lot more than this.”
Beckie Supiano is a senior writer for The Chronicle of Higher Education, where she covers teaching, learning, and the human interactions that shape them. She is also a co-author of The Chronicle’s free, weekly Teaching newsletter that focuses on what works in and around the classroom. Email her at beckie.supiano@chronicle.com.