Reed College took a tumble in this year’s U.S. News & World Report ranking of the top liberal-arts institutions in the country, but not because of any real change on its campus.
Like many college officials, Reed’s president, Steven S. Koblik, thinks it is simplistic to rank colleges the way consumer magazines rank cars or stereos. Unlike virtually everyone else, he said No when U.S. News asked the college for data on its students and financial resources.
As a result, because the magazine penalized colleges that provided no information, Reed plummeted from the respectable “second tier” of national liberal-arts colleges to near the bottom of the pile. “The issue,” says Mr. Koblik, “is whether cooperating with U.S. News at the intimate level we have been is good for higher education.”
Mel Elfin, the executive editor of the U.S. News rankings, doesn’t think Reed’s decision was based on principle. “Ask Reed about their graduation rate,” he says. He points out that Reed’s figure of 66 per cent last year fell far short of the nearly 90-per-cent figures posted by such institutions as Bucknell University and Colgate University. Yet the survey found that Reed enjoys a better academic reputation than theirs.
In other words, Mr. Elfin argues, Reed’s reputation surpasses its performance in serving students, and the college doesn’t like someone pointing that out.
The Reed debate is the latest flare-up in a long-running argument in academe over the U.S. News rankings, which have grown in complexity and influence since the magazine started them in 1983. The rankings have evolved from a simple reputation survey -- a “beauty pageant,” Mr. Elfin calls it now -- to a formula that processes reams of enrollment and financial statistics.
Admissions officials ignore the rankings at their peril. A few years ago, after Swarthmore College had been named the best liberal-arts college for two years running, the college had so many students accept its offers of admission that it had to house some in dormitory lounges.
As always, the magazine faces the complaint that its numbers miss such important things as how well a college teaches and how well it prepares students to find jobs.
But this year, U.S. News also had to confront the issue of what to do about bogus or incomplete data, after a Wall Street Journal article documented in devastating detail what admissions officials have known for years: Colleges often send guidebooks inaccurate numbers, for reasons ranging from sloppiness to unethical marketing strategy. In addition, about 50 of 1,400 colleges declined to take part in the survey this year, although Reed was the only one of those that usually ranks high.
Mr. Elfin, a veteran Washington journalist, seems to relish his new role as a higher-education arbiter. He scoffs that the academy, which he says is full of talk about assessing performance the way business does, can only complain when someone actually tries to measure quality. “The academic community stands in judgment of the entire world,” he says. “One little magazine standing in judgment of them is not an inequitable tradeoff.”
The magazine ranks colleges in several categories. Most prestigious are the national universities, a category that Harvard University has topped for the last six years, and the national liberal-arts colleges, which were led this year by Amherst College. There are also categories for regional liberal-arts colleges and regional universities, which have different winners in different parts of the country, and a few special categories, such as art schools.
The rankings are based on statistics measuring colleges’ reputation, selectivity, faculty resources, financial resources, retention of students, and alumni satisfaction.
Students and parents eat this stuff up. “I’d say I look at it almost every day,” says Eric O’Dell, a senior at Bay High School, in Bay Village, Ohio, who is considering Washington University, Cornell University, and possibly a few other Ivy League institutions. He says he values the magazine’s information not so much for the overall rankings but for the way he can compare such figures as the proportion of alumni who donate, or how much money a college spends per student.
The college issue is among the magazine’s best sellers, and a college guidebook based on the issue sold nearly a million copies last year. Colleges that fare well in the rankings churn out press releases bragging about the results, and administrators make pilgrimages to U.S. News’s Washington, D.C., office to argue about what factors should be considered.
Some observers think the survey’s influence is overestimated. Art & Science Group, a higher-education consulting firm in Baltimore, surveyed a random sample of 500 college-bound seniors. While just more than half had consulted various magazines’ rankings, they said the rankings’ influence on them had been merely a fraction of that exerted by college literature and the opinions of counselors.
Among colleges at the top, a common complaint is that the rankings imply a precision that is illusory. For Amherst to be crowned “No. 1,” when even by U.S. News standards it is almost indistinguishable from its rivals, is meaningless, they say. Some institutions would be more comfortable if colleges were ranked in groups.
A more common charge from colleges farther down the rankings is that the magazine is wedded to an Ivy League model of excellence, in which colleges get credit mostly for how wealthy they are and for the grades and test scores of the students they accept.
“They grade us down because we are designed to serve a kind of student they don’t seem to be fixated on,” says Rick Moore, a spokesman for San Diego State University. Most of the California State campuses, designed to fill the niche between the University of California and community colleges, languished last year in the second and third tiers of Western regional universities -- lists published only in the guidebook, not in the magazine.
“Outcomes” is the jargon for what some colleges think is more important: how students are changed by their education. Reed requires its students to pass a rigorous series of examinations in their junior year, and to write and defend a senior thesis. St. Lawrence University wants credit for a 94-per-cent job-placement rate.
U.S. News says that it is as interested in outcomes as anyone, but that some of them are tough to quantify. Most job-placement records, for example, don’t distinguish among an executive-training position, clerical work, and waiting on tables.
Even its critics give the magazine credit for listening to them and making changes in response. This year, U.S. News reduced the weight given to the percentage of applicants a college accepts. One reason is that campus officials can manipulate that figure by mailing recruitment material to students who are long shots for admission, swelling the pool of applicants who will be turned down. The magazine increased the role of graduation rates -- one of the few outcomes to which a number can be attached.
U.S. News defends the figures it uses, including those that measure the ability of incoming students. Colleges that serve weaker students are providing “a wonderful service,” Mr. Elfin says. “But you pay a price for that,” because relations with peers are an important part of education, he adds.
Claire L. Gaudiani, the president of Connecticut College, is one of the few academics who see the survey as something positive, not a necessary evil spawned by a competitive market. She says her college has used the rankings to set goals in such areas as faculty salaries and endowment investment. “Until we determine how we should evaluate ourselves, I think it’s a bit inappropriate for institutions to complain so bitterly about an outside evaluation,” she says.
There’s an embarrassed tone to many discussions of college surveys this year, because of the reports that some colleges were handing out bogus information to the people who do rankings.
U.S. News tried to reduce the fudge factor this year by cross-checking some enrollment information with the records of Moody’s Investors Service, a debt-rating agency that collects data from colleges that it rates, and with some other sources. In its tables this year, U.S. News footnotes information that it considers suspect. The magazine did not knowingly use any questionable data in calculating its rankings, it says.
If a college excluded some students from its Scholastic Assessment Test averages, for example, then the magazine increased the weight of students’ high-school class rank or the college’s selectivity. If those were unavailable, too, the college was penalized with a lower ranking.
There are good reasons to leave some students out of the averages, some colleges argue. Boston University omitted the S.A.T. scores of 680 freshmen -- out of a total of 4,450 -- who are enrolled in a general-studies program. They do not meet the admissions standards for the university’s regular academic divisions. “This is a program that should be considered a worthy endeavor, rather than a point of controversy that helps U.S. News in its marketing effort,” says Kevin R. Carleton, a spokesman for the university.
Boston University also left out the verbal scores, but not the math scores, of international students for whom English is not a first language. Verbal scores of those students are not considered in the admission process, Mr. Carleton says.
The University of Massachusetts left out the test scores of about 200 learning-disabled and foreign students.
These colleges “are doing this in a very calculated way,” says Robert J. Morse, the research director for the rankings. “The net result is their scores are higher.”
Like Reed, Boston University would have preferred to be removed from the survey. But it isn’t in the interest of U.S. News to let participation be optional. Reed’s Mr. Koblik says ranking those colleges that choose not to provide data “underscores the fundamental lack of credibility” of the lists.
Some college press releases about the survey strain credibility themselves. “Susquehanna University Named #1 in College Rankings,” reads a typical one, referring to that university’s placement among Northern regional liberal-arts colleges.
The regional listings of less-selective institutions -- which are based on classifications by the Carnegie Foundation for the Advancement of Teaching -- give a boost to colleges that otherwise might be back in the pack. Illinois Wesleyan University, for example, was No. 1 in the category of Midwest regional universities five years in a row. Its applications rose 25 per cent over the period, says James R. Ruoti, the admissions director.
The college was reclassified as a national university for last year’s rankings, and Mr. Ruoti says he’s almost relieved. He had put the No. 1 ranking on the college’s brochures, but says he felt a little guilty about visitors to the campus who seemed overly impressed with the distinction.
“Eight out of 10 people, believe me, were saying, ‘You’re number one!’ They didn’t really know what we were number one in.”