You know it, I know it, U.S. News & World Report knows it. The annual college rankings matter a lot. They are, as one data cruncher put it, the gorilla in the room: weighty, demanding attention, and often beside the point.
The relationship is a strange match. How does a consumer magazine — one that also ranks cars, diets, and nursing homes — maintain such dominant influence in the persistent chase for ivory-tower prestige? Why do colleges tolerate it?
Every year, the two parties hold a summit of sorts, at the Association for Institutional Research’s annual conference. U.S. News depends on colleges to self-report data for its rankings, and institutional researchers are usually responsible for assembling and submitting their colleges’ data in response to the magazine’s surveys. At the conference, Robert Morse, who has long presided over the influential rankings, flies in to advise them on how to do that.
I traveled to Phoenix this month to attend the conference in hopes of understanding that relationship better. Not all the talk was about rankings. On the second day of the conference, I met two attendees who never have to think about U.S. News. They are institutional researchers at Michigan community colleges, and the magazine ranks only institutions that grant bachelor’s degrees. Over tacos in a vast exhibit hall in the Phoenix Convention Center, the Michiganders talked about the challenges of submitting numbers to the federal government, which virtually every college must do, and the real-time data colleges need to help keep students on track to graduate.
Later, Susan Moreno, executive director of institutional research at the University of Houston, sat down at the table. Houston lists entering the U.S. News top-50 public universities as an aim in its strategic plan.
Moreno had just come from an invite-only session for subscribers to Academic Insights, U.S. News’s software product for colleges seeking to climb the rankings. She was describing how the subscriptions work when she nudged me suddenly. “Do you know what Bob Morse looks like?” she asked, and looked up toward the center of the room.
There he was, striding into the exhibit hall with two other men in suits and ties. They entered the taco line.
Several minutes later, Eric Brooks, principal data analyst for U.S. News’s college rankings; Kenneth Hines, director of data projects; and Morse sat down at our table.
Reporting data to U.S. News is far from the only thing institutional researchers do. It may not matter much to most prospective students, who attend community and regional public colleges close to home. But here, at the institutional researchers’ conference, it’s inescapable.
“Generally, if you talk to most IR people,” Moreno said, “if you go and ask them individually, not necessarily to represent their institution, but on their personal level, what they think about the rankings, I suspect the large majority would say they don’t like them. They don’t find them helpful.”
U.S. News has made some changes in response to a decade and a half of commentary. In the rankings published in 2018, editors added a measure of how many low-income students colleges graduate to their formula, and dropped acceptance rates from it.
Changes like those are closely watched by the college employees who submit the data. For that reason, Morse has for decades led a panel at the conference to announce changes in the surveys and take questions from institutional researchers. “Everybody goes,” said Justin Shepherd, assistant vice provost for institutional research at Emory University. “It’s standing room only.”
When Morse and his colleagues sat down at my lunch table, they were courteous but seemed ill at ease after I introduced myself as a Chronicle reporter. They declined to speak on the record until they could consult with the U.S. News PR shop. Nevertheless, we walked together to the ballroom where they would hold their session. The room was full, as usual. Latecomers clustered around the doorway and leaned against the beige, cubicle-style walls.
Morse and his colleagues spent more than 40 minutes going over fairly minor adjustments planned for next year’s survey. After that, there was a Q&A. Several attendees had technical questions. Then came the more pointed queries, most of which were also technical. The mood shifted, even as everyone remained polite.
“There’s always one guy who gets angry,” Heather Novak, director of institutional research, planning, and effectiveness at Colorado State University at Fort Collins, said later.
I talked to two of this year’s angry-sounding guys, Shepherd, of Emory, and Darrell Tyler, senior research analyst at the University of Richmond. Tyler asked, as part of a longer question, whom the rankings really served, students or institutions. (Morse didn’t answer.)
Tyler and Shepherd both conveyed respect to Morse. They had served on panels with him before. And Shepherd told me, “U.S. News is a wonderful tool for students.” The popular rankings spurred a movement toward data transparency in higher education. U.S. News and the reaction to it are part of why statistics such as colleges’ graduation rates, student-body makeup, and cost are free and easy to look up. “All of that information is a consumer good,” Shepherd said. “It is both for prospective students and to hold institutions accountable.”
It’s only the numerical rankings, he went on, that he objects to: the labeling of College A as better than College B. For any students who care to dig deeper than a college’s rank, U.S. News makes plenty of information available that could help them determine if a campus is a good personal fit, especially to people who pay for College Compass, a subscription service that gives students more data and lets them create their own ranking, based on the factors they care about. Many conference attendees praised it. It costs about $40 a year.
Emory benefits from U.S. News because it’s highly ranked, No. 21 among national universities, and because students at private institutions like Emory are the most likely to have relied heavily on rankings to decide where to go. At another session, Yang Zhang, director of institutional research at the University of Hawaii-Manoa, laid out the benefits Hawaii receives from rankings. U.S. News and other lists aid Hawaii in attracting international students and bolstering support for the institution at home. “Sometimes people don’t relate this name, ‘Hawaii,’ with great research and great teaching, great students,” she said. Popular rankings help.
Another common, major complaint about U.S. News is that it’s an outside force that didn’t come into higher education with noble intent. In writings and interviews, Alvin P. Sanoff, the founding managing editor of the rankings, was frank that the project started as a way to “garner attention and sell magazines.”
Yet if U.S. News started as an outsider, academe has since opened the door, let the gorilla into the room, and invited it to the table.
Brooks, Hines, Morse, and I talked at the back of the presentation area, while potential customers gathered at the front, asking salespeople questions.
Morse said he and his colleagues had come to the conference to learn about higher education, get feedback from the people who fill out their surveys, and explain to institutional researchers the support they can receive from U.S. News. “It is the most important higher-education event that we go to,” Morse said. “There’s no question.”
The U.S. News analysts defended their rankings against charges of elitism. They have moved toward more heavily weighting students’ outcomes over the characteristics of admitted students, such as their test scores. They learned how to measure colleges’ ability to raise students’ socioeconomic status at this conference. They continue to believe that “financial resources matter,” Morse said. “Students from schools with a lot of resources are getting more services, a broader range of course offerings, and likely richer financial aid than a school that is tuition-dependent.”
They also believe that SAT and ACT scores continue to matter. In their internal analyses, they’ve found test results are correlated with timely graduation, independent of students’ family income. How well test scores foretell students’ success in college is a hotly debated topic; several colleges have found the SAT doesn’t help them predict who will graduate.
Morse pointed out that Florida has written the U.S. News metrics into the funding formula for its public universities, a move that The Chronicle has found exacerbated inequities among the state’s colleges. “While there may be critics,” Morse said, “there’s also large educational systems that believe that what we’re doing is credible.”
Among conference attendees, Morse has a reputation as a rather dry speaker. Though the rankings involve high stakes, his annual presentation is technical and his delivery measured. He is long-winded but plainspoken. He defends himself and acknowledges others’ points with equanimity. When he was asked about arguments that U.S. News rankings are stacked against historically Black colleges and other minority-serving institutions, he said he thought some critics misunderstood what goes into the rankings formula. It does include some measures of equity. But, he added, “everybody has a right to their opinion.”
The only time he seemed offended was when asked if he planned to retire. “Wow,” he said. “Wow.”
Morse has been at U.S. News since 1976, according to his author page on the magazine’s website. If he decided to retire, it would shake up the conference. Several attendees said people come to the U.S. News session every year just to see Bob Morse, this placid, grandfatherly data-cruncher slouched in a wire-frame chair in the corner of an echoey conference hall.
“It’s a fair question,” Morse said, after a pause. “I’m not sure how many AIR Forums I’ve been to, but it may be around 30. I would be one of the few people to have gone to that many. But I have no immediate plans to retire.”
A later session, devoted to feedback on the rankings surveys, had a more collaborative format. Everybody sat at large circular tables. There was coffee and tea, a luxury not provided at other sessions.
The conversation plunged quickly into the weeds. Several attendees suggested U.S. News tighten its definitions of concepts such as “instructional spending” and “terminal degree” in the surveys it sends to institutional-research offices. What a college classifies as money spent on teaching, or the highest degree an instructor can hold, can affect its final rank. But deciding what counts can feel like an act of “scriptural exegesis,” said Braden Hosch, associate vice president for institutional research at Stony Brook University, part of the State University of New York, and one of the session’s moderators.
Morse countered that the survey definitions seemed detailed enough to him. Also: “U.S. News viewed our role as not being the ones who should set the definitional standards for higher education.”
The discussion was granular but touched on foundational questions: Whose responsibility is it to ensure U.S. News gets accurate data? Does U.S. News need to police its surveys better, or does the higher-education sector need to cooperate, to agree on best practices, to take on the responsibility of answering in good faith? Just this year, news broke that graduate schools at the University of Southern California and Rutgers University had allegedly cheated in their data submissions.
Outside of the room, commentators have posed even bigger questions, and pitched more radical steps. Should college ranking even exist? Should colleges refuse to cooperate with them on ethical grounds?
Within the room, however, tweaks seemed to be the preferred solution. Attendees concluded that they could put together a committee, with representatives from a diversity of college types, to stress-test rankings surveys, identifying which questions need clarification and where survey-takers might be tempted to cheat.
Morse liked the idea: “We would welcome it with open arms.”