Universities just love comparing themselves. And one of the most widely recognized measures of the strength of an academic research department is the amount of grant money it takes in.
But nearly a decade ago, the National Institutes of Health, the largest federal provider of basic research money to universities, stopped publishing grant-based rankings, after concluding that it wasn’t worth the trouble and didn’t fit its mission.
Into the breach stepped Robert Roskoski Jr., a retired professor of biochemistry and molecular biology at Louisiana State University.
Working from the basement of his house in the foothills of North Carolina’s Blue Ridge Mountains, Dr. Roskoski now synthesizes raw NIH data tables into an annual set of reports that are closely watched by universities across the country as a key measure of their competitive value. His tables list university departments as well as individual researchers by the amounts of their annual NIH grant support.
“It’s a service for the academic community,” Dr. Roskoski says.
And it’s certainly appreciated, at least in some quarters. His reports, posted with little fanfare to the no-frills Web site of Dr. Roskoski’s one-man Blue Ridge Institute for Medical Research, generate thousands of citations. Many involve universities that see the numbers as black-and-white proof of their dominance in particular fields of medical research.
The University of Minnesota has offered Blue Ridge Institute data as evidence that its medical school ranks in the top 25 percent nationally. The University of California at San Francisco has relied on Blue Ridge information to praise its department of obstetrics, gynecology and reproductive sciences as ranking first in the country.
Duke University has used the numbers to tout its medical school; the University of Pittsburgh has highlighted the ranking of its department of physical medicine and rehabilitation; and Emory University has boasted of the $16-million in NIH grants awarded to the head of its cancer center. That much money, Emory said in a news release last month, places the executive director of the Winship Cancer Institute, Walter J. Curran Jr., “number 22 among 35,000 researchers nationwide.”
“I’m flabbergasted at how serious so many schools make this,” Dr. Roskoski says.
The NIH doesn’t have any problem with Dr. Roskoski or anyone else deriving rankings from its publicly available grant statistics. “We just put it out there and let people use it as they see fit,” said Sally J. Rockey, the NIH deputy director in charge of extramural research.
The agency, however, stopped compiling such ranking tables because it became increasingly clear that the results wouldn’t be meaningful. Comparative tables can be affected by “the way you interpret it and how you divide the departments, and in any given time where we are in the fiscal year, that kind of thing can make a difference,” Ms. Rockey said. “So we concluded that it was just best for us not to do it.”
Some see that problem as inherent in Dr. Roskoski’s work. Richard H. Goodman, a professor of cell and developmental biology and biochemistry and molecular biology at Oregon Health and Science University, said he was surprised recently to see an internal promotion of the university’s microbiology department as top-ranked nationally.
By most measures of reputation, the microbiology department is considered about average, said Dr. Goodman, who also directs the university’s privately endowed Vollum Institute. But microbiology departments can fare well in the Blue Ridge tally when universities choose to include related research in other areas, such as infectious diseases and immunology, in the classifications they provide to the NIH, Dr. Goodman said. By comparison, an “extraordinary” microbiology department, such as that at Yale University, ranks poorly in Blue Ridge reports because Yale intentionally keeps its program small, he said.
Dr. Goodman sees it as a problem of more than just publicity and bragging rights. Deans are constantly looking for ways to measure the value of their departments and their faculty, and statistics such as the Blue Ridge rankings can prove irresistible, he said.
“No dean would say that they use these things,” Dr. Goodman said, “but they in fact do use them.”
Dr. Roskoski doesn’t disagree. In retirement, he spends most of the year writing reviews of research on targeted cancer therapies, much as he did at LSU. Then, for about two weeks each year, when the NIH issues its annual grant data, he sets about converting it all into tables.
He does it, he said, simply because he was grateful for the ranking tables when the NIH produced them. “And since I found it so valuable as a faculty member I just kind of picked it up.”
As head of the biochemistry department at Louisiana State, Dr. Roskoski often used the NIH rankings to “bolster my case” when asking the dean for more money to make the department more competitive. “It never worked,” he said, “but that’s what I did.”