Yelp for Colleges? An Economist Rates Its Usefulness

June 12, 2017

Jonathan Rothwell says his research suggests that a consumer-based ratings system for colleges is feasible.
What if there were a consumer-based, Yelp-like ratings system for colleges? A study by Jonathan T. Rothwell, a visiting scholar at the George Washington University Institute of Public Policy, has found that asking alumni about their college experiences can be an accurate indicator of the overall quality of an institution.

The study, published last week, was based on information gathered by the Gallup and Strada Education Network’s Education Consumer Pulse, a research platform that began collecting data in 2016. Over a three-year period, the platform is collecting data from 350 adults per day in the United States who are surveyed on their educational experiences.

In his study, Mr. Rothwell, a senior economist at the Gallup polling firm, compared people’s responses about their college experiences with data from their respective institutions.

Consumers who rated their colleges more highly tended to have higher levels of income and "subjective well-being," the study found. And higher ratings tended to be predictive of a respondent’s having attended a "better" college, measured by publicly available data like alumni income levels, faculty rankings, and percentage of graduates who earned doctorates.

Mr. Rothwell spoke with The Chronicle about how consumer data could change the higher-education market. The interview has been edited for length and clarity.

Q. What are the larger implications of your findings? How would this affect students or universities?

A. I think the most important implication is that the higher-education system needs to do a better job of collecting and acting on feedback from its consumers, from alumni. There have been so many efforts to try to evaluate colleges, rate them in various ways, usually based on objective qualities that are based on inputs like test scores of incoming students, the selectivity of the college, the ratio of admitted students to applicants.

All those things I think are valuable and important, but — as the Obama administration learned when it tried to roll out its College Scorecard — it’s very difficult to ascertain what qualities should matter most in evaluating colleges and then to measure those concepts accurately and comprehensively across the vast diversity of institutions that we have.

The nice thing about a consumer-oriented approach that you see in other industries dealing with complex services — like Yelp, or Fandango, which rates movies — is that if you see that a very small percentage of the audience recommends or likes the movie, chances are it’s not going to be a very good movie, and chances are critics, whose job it is professionally to evaluate the movies, aren’t going to like it, either.

If you can do it with something as subjective as someone’s taste for movies, I think we could do it for higher education, where people’s goals are varied.

Q. How do you foresee these consumer-data analyses being translated into a tool that students or universities could use?

A. This is part of a larger project that Strada Education Network, formerly USA Funds, has embarked upon with Gallup. The goal is to inform young people, as well as adults who may be looking to retrain for a new career, to make the smartest decisions that will benefit them in the long run.

At this point, we wouldn’t have enough data, necessarily, for every college to provide the kinds of rankings that I’m talking about. We probably have enough information for maybe 100 of the larger schools, where we have enough responses at the school level. But with that being said, Strada could certainly decide that this is something they want to do in a product they want to offer to the public in some form. Or another organization, like the existing ratings companies, may decide that this is the approach they want to take.

I also think colleges could use this information in a variety of ways. I could imagine trustees in some cases insisting that the college have this kind of survey so that they could evaluate where their school stands. Or they could go into a more in-depth analysis: if they see that their college ranks or rates in a way that is unsatisfactory, follow up with alumni and try to tease out what exactly the problem was.

Q. Most students probably don’t have other institutions of higher learning to compare their college experiences with. How would these data or a potential tool created from the data take that into account?

A. I can imagine that someone rating a restaurant has gone to 30 different restaurants, … [so] they have a lot of things to compare it with. Definitely in higher education, that’s not the case. A part of it comes down to how the questions are worded. We’ve got 14 questions here that get at aspects of quality. "Your education was worth the cost" is one example. "You received a high-quality education" is another. And it could be that people answer these questions without a basis of comparison and give information that doesn’t end up being useful.

But that isn’t what I found. What I think happens is that people are able to imagine alternative scenarios: "What if I had gone to a different college that had this factor different, where I studied a different major?" Or "What if I had gone to the same school but had different professors?"

Despite the complexity of evaluating colleges and services, whatever cognitive process people use seems to be generalizable enough to capture the experience of other people. The reason I come to that conclusion is that if you go to a school where other people rate it highly, you’re much more likely to rate it highly, and you’re also much more likely to have higher income and higher subjective well-being if other people rate your school higher. So that leads me to believe that people are experiencing real things that are happening at the school, whether it’s high-quality teaching, whether it’s access to courses that are particularly relevant to their career, or other sets of factors that just about anyone who had the same experience would recognize and remember.

Q. Given that there are several factors that go into how a student decides which colleges to apply to, which one they ultimately attend, and whether or not they succeed there, how realistic do you think it is for consumer-ratings data or a consumer-ratings tool to actually change student behavior?

A. We know that people often go to colleges that are in their city or in their state. In some cases, there are clear financial advantages to going to a school that’s in-state — you get a discount — so that part of it makes complete sense. But then there are also some students picking a school based on convenience, even if it’s not the best fit for them.

I am worried about the decision processes of nontraditional students or students who have families or work obligations that they’re trying to balance. My concern there is that they’re more likely to go to a school based solely on convenience, and not really consider the qualities of the school. I would argue that in many cases, they could get a better outcome if they went to a higher-quality school, and so I want them to be as informed and empowered as possible to make that decision. Giving them information, even if they have a fairly limited range of options, could still make a difference.

I don’t want to overstate the potential for radically transforming higher education. I think things need to happen other than just informing consumers, even if that could be done perfectly well. But I do think that a significant problem that students face is lack of information, lack of clear information, and difficulty of accessing it. So I do think a Yelp-like tool could still be effective if it were based on simple, transparent questions: Would you recommend this college to other people? That may be enough to change people’s behavior in a positive way.

Q. A lot of colleges might try to game the system. Comparing it with the Yelp example, there are sponsored reviews that businesses might pay for people to leave. Could that be a concern here?

A. I think one would have to be careful about something like that. The nice thing about our survey is that it’s a random survey of Americans, using phone numbers. The process is called random digit dialing, and there really isn’t any way to game that, because no one knows if they’re going to be called. Gallup, doing the survey, doesn’t know who’s going to be called. That’s probably the safest way of doing it.

I think companies would just have to be careful to watch out for that kind of thing. But the way our survey’s conducted should be a fairly safe check against it.