A college guidebook arriving in stores this week states, with what appears to be scientific precision, that the best undergraduate history program in the United States can be found at Yale University, which received a score of 4.93 on a scale of 2.01 to 4.99. The guide, published by the Princeton Review and Random House, gives second place to the University of California at Berkeley (4.92), and third to Princeton University (4.91).
The close scores are hardly unique to the history-program rankings in the book. Most of the scores for the 140 disciplines and 1,273 undergraduate institutions ranked in the 394-page guide differ by only 1/100th of a point, with no wider gaps and no ties -- an outcome that researchers call a statistical near-impossibility.
What’s more, college officials say they have no idea how the rankings were determined, because no one ever contacted the institutions for the information.
The book itself -- The Gourman Report -- hailed on its cover as containing “the most authoritative and accurate assessments of higher education,” provides no information on its methodology.
In some college circles, though, the book is no mystery. Self-published by a retired professor, Jack Gourman, since 1967, it has built a small following, although it has never had the impact of the guides produced by U.S. News & World Report. Colleges generally have ignored it. But now that it is backed by the Princeton Review, a popular test-preparation company, and is distributed by Random House, many educators are worried and angry about a book they consider to be of dubious value.
“The rankings are wretched,” says David S. Webster, an associate professor of education at Oklahoma State University, who met Dr. Gourman in 1980 while studying such rankings for his doctorate. Since then, Dr. Webster has criticized Dr. Gourman for refusing to reveal his methodology and for failing to separate large and small colleges in the rankings.
“Gourman tried his best to do a good rankings book the first time around, and after he was seriously criticized for it, he figured it wasn’t worth the expense or effort to put much work into it in the future,” Dr. Webster says.
Editors at the Princeton Review say the criticism is unfounded and comes mostly from institutions that score poorly. The editors point to dozens of colleges that have cited it in their public-relations materials, and to other institutions that send letters of support.
The new edition grades not only undergraduate institutions and undergraduate majors, but also administrative areas, such as libraries, alumni associations, and boards of trustees.
The scoring system seems to favor research universities. Amherst College, typically considered a top liberal-arts college, received the best score among such institutions and is ranked No. 61 in the report, behind Georgetown University at No. 60 and the Colorado School of Mines at No. 59.
Williams College and Swarthmore College, two other prestigious liberal-arts institutions, are ranked 97th and 100th, respectively, behind institutions with less rigorous admissions standards, such as Wayne State University (64th) and the Ohio State University (33rd).
“It just doesn’t make sense, especially when you don’t know the methodology,” says Tom Krattenmaker, a spokesman for Swarthmore. The colleges being ranked “are so different in obvious ways, it’s like comparing apples and chickens.”
Most troubling to many college officials is the apparent stealth campaign that Dr. Gourman conducts to collect the data. “No one here has ever received a call or a survey from Gourman, so it’s really hard to comment,” says Lisa Baker, a spokeswoman for the University of Michigan, which ranked third in the guide.
“While we may disagree with U.S. News & World Report, at least we know on what basis they measure,” says Michal Regunberg, vice-president for public affairs at Brandeis University.
Even the Princeton Review has difficulty explaining Dr. Gourman’s research. A press release about The Gourman Report from the company says the numerical scores are “based on a [sic] complex criteria developed by Dr. Jack Gourman.” Asked to explain the criteria, Evan Schnittman, editor in chief of the Princeton Review, says he knows them only “in general terms.”
“If I attempted to explain, I’d be a bit reductive,” he says. “I’m comfortable with his methodology. Dr. Gourman told us we could own the information, but he needs to own his methodology.”
A retired professor of political science at California State University at Northridge, Dr. Gourman says he arrives at his conclusions by averaging scores -- ranging from zero to five -- on 10 factors that influence the quality of an academic department. Those factors, he says, are facilities, administration policies, the relationship between professors and administrators, support of faculty members, cooperation among professors, methods of communication between professors and the administration, the openness of the administration, the use of consultants and committees to solve problems, attitudes about scholarly research, and the overall cooperation of the administration.
He refuses to elaborate on his criteria and on how those factors can be quantified for a numerical scale. When asked, for example, if support of faculty members means financial support, he answers, “That could be.”
Mr. Schnittman, the Princeton Review editor, says no explanation of the methodology is needed in the book. Only reporters, not consumers, he says, typically read the methodology section in rankings reports. “I don’t understand the methodology behind the Consumer Price Index, and yet I believe it,” he says.
Alexander W. Astin, director of the Higher Education Research Institute at the University of California at Los Angeles, says he worries that the supposed precision with which the Gourman rankings are reported will lead students and their families to have faith in them.
“An intelligent U.S. senior in high school could sit down and make up rankings that are similar to Gourman,” Dr. Astin says. “As long as your favorite institution ranks where you expect it to rank in any of these guides, you tend not to question the method.”
Dr. Astin is highly skeptical of The Gourman Report, noting that the scores of so many departments and colleges differ by just 1/100th of a point. “If you’re computing scores from other sets of data, it’s impossible to have no gaps and no ties,” he says.
Dr. Gourman says he computes a college’s overall score by averaging those he assigns to each of the institution’s majors. But that arithmetic cannot be reproduced from the scores listed in the book. For example, Brandeis has an overall score of 4.44, but the average of the 14 majors ranked there comes out to 4.21. Top-ranked Princeton has an overall score of 4.95, but the average of its 43 ranked disciplines is 4.72.
Dr. Gourman explains these and other discrepancies by saying the overall score includes majors rated at the institution but not printed in the book.
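The discrepancy described above is simple arithmetic to check. A minimal sketch, using hypothetical per-major scores (the book’s individual figures are not reproduced here) for an institution with a published overall score of 4.44, the figure cited for Brandeis:

```python
def average_score(major_scores):
    """Arithmetic mean, rounded to two decimals as in the guide's scale."""
    return round(sum(major_scores) / len(major_scores), 2)

# Hypothetical per-major scores -- placeholders, not figures from the book.
published_overall = 4.44
major_scores = [4.31, 4.18, 4.05, 4.27, 4.12, 4.33, 4.20,
                4.15, 4.29, 4.10, 4.24, 4.38, 4.08, 4.22]

computed = average_score(major_scores)          # 4.21
discrepancy = round(published_overall - computed, 2)  # 0.23
print(computed, discrepancy)
```

Any reader with a copy of the guide and a calculator can run the same check against the printed scores; the gap persists unless, as Dr. Gourman suggests, unlisted majors are included in the average.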
Even if the scores are accurate, C. Anthony Broh, Princeton’s registrar, says extending the scores two places past the decimal point “adds a level of precision that doesn’t exist in the concept being measured.” He asks, for example, how Dr. Gourman could conclude that Princeton is first and Harvard second over all when the two scores differ by only 1/100th of a point.
Similar complaints about the scoring system in the U.S. News rankings prompted the magazine to round to the nearest whole number in this year’s edition, resulting in many ties.
During his 13 years at Princeton, Dr. Broh says, he has never received a call or a survey from Dr. Gourman. During that time, Princeton has ranked first in every edition of The Gourman Report.
Dr. Gourman says he does not receive information from institutional research officials, admissions offices, or public-relations offices.
Instead, he says, faculty members, department chairmen, deans, and even presidents write letters to him about their universities and departments. With that information, he says, “50 trained people working for me around the country” evaluate the data and help assign scores. The letters used for the rankings are destroyed to protect the individuals who provide him with information, he adds. “We’re not hiding anything,” he says. “What is there to hide, anyway?”
At first, Dr. Gourman refused to share the names of his employees or his correspondents on college campuses. Two days later, he called to provide two names.
Raymond E. Fielding, one of the two correspondents, is a professor of motion-picture, television, and recording arts at Florida State University.
He confirms that he has included Dr. Gourman on a mailing list for the department’s newsletter, but he refuses to comment on the rankings.
Jerry Zoffer, the other person named, is a former dean of the Katz Graduate School of Business at the University of Pittsburgh. He says he began sending Dr. Gourman a yearly update on the business school out of concern over where Dr. Gourman was getting his information.
But Dr. Zoffer says that he never learned how Dr. Gourman used the data, and that judging by the number of institutions and academic programs ranked in the book, Dr. Gourman “only has a tenth of a minute to review and grade each one.”
Despite the complaints about the rankings, some colleges do publicize their standing in the book. An Internet search for The Gourman Report turned up more than 690 hits, many of them from colleges’ World-Wide Web sites. Although Mr. Schnittman, at the Princeton Review, says he has received two letters calling him “an idiot” for publishing the book, he says other educators write in support of Dr. Gourman.
Dr. Webster, who wrote his dissertation on rankings and who has tracked The Gourman Report ever since, says the author deserves some credit.
“Gourman seems to have done a lot of research, probably from college catalogues, just to determine which colleges offer which programs,” Dr. Webster says. “But he has done nothing to deserve his work being published by the Princeton Review.”