Intellectual heft may seem like a tricky thing to measure, but Academic Analytics says it can be done.
The for-profit company, owned in part by the State University of New York at Stony Brook, recently compiled its third annual Faculty Scholarly Productivity Index — a ranking of graduate programs at research universities based on what purports to be the first objective measurement of per-capita scholarly accomplishment. The measurement this year considers several new factors, causing some surprising fluctuations in the rankings.
The concept behind the index is alluringly simple: Take the number of professors in a given program, the number of books and journal articles they have written, the number of times other scholars have cited them, and the awards, honors, and grant dollars they have received, and plug all those into a neat algorithm.
The number that comes out indicates how productive, on average, your department’s faculty members are. Compare that with the numbers for other departments at other research institutions and you should have a good idea of how your doctoral program stacks up against peers, says Lawrence B. Martin, dean of the graduate school at Stony Brook, who is the company’s founder and chief number cruncher.
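Academic Analytics does not publish its algorithm, but the per-capita idea Mr. Martin describes can be sketched in a few lines of Python. The function, the variable names, and the weight values below are illustrative assumptions, not the company’s actual formula:

# Illustrative sketch only: Academic Analytics' actual algorithm and weights
# are proprietary. This shows the per-capita idea the article describes,
# with made-up names and weight values.

def productivity_index(faculty_count, books, articles, citations,
                       awards, grant_dollars, weights):
    """Return a per-capita scholarly productivity score for one program."""
    if faculty_count == 0:
        raise ValueError("program must have at least one faculty member")

    raw = (
        weights["books"] * books
        + weights["articles"] * articles
        + weights["citations"] * citations
        + weights["awards"] * awards
        + weights["grants"] * grant_dollars
    )
    # Divide by faculty size so large and small programs are comparable.
    return raw / faculty_count


# Hypothetical numbers for a single department.
example_weights = {"books": 5.0, "articles": 1.0, "citations": 0.1,
                   "awards": 3.0, "grants": 0.000001}
score = productivity_index(faculty_count=25, books=12, articles=140,
                           citations=2300, awards=4,
                           grant_dollars=6_500_000, weights=example_weights)
print(f"Per-capita productivity score: {score:.2f}")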
His magic numbers can be had for a price. Universities buy reports comparing their departments with those of 10 peer institutions of their choosing. While Academic Analytics has yet to turn a profit, says Mr. Martin, it has about 50 subscribers who pay between $10,000 and $40,000 a year, depending on the amount of data they request.
In addition to journal and citation data taken from the Scopus publication database, the index now includes information on 64,000 books purchased by university libraries. It also incorporates grant data from federal agencies such as the National Institutes of Health, the National Science Foundation, and the National Endowment for the Humanities, as well as 22,000 scholarly honors and awards. Those variables are weighted differently according to discipline. (In civil engineering, for example, journal citations receive more weight than they do in, say, art history, where books count more.)
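The discipline-specific weighting can be pictured the same way. In this hypothetical sketch, the weight tables merely echo the article’s example that citations count more in civil engineering and books count more in art history; the numbers themselves are invented:

# Hypothetical discipline weight tables; the real weights are not public.
DISCIPLINE_WEIGHTS = {
    "civil_engineering": {"books": 1.0, "articles": 1.0, "citations": 0.5,
                          "awards": 2.0, "grants": 0.000001},
    "art_history":       {"books": 8.0, "articles": 1.0, "citations": 0.05,
                          "awards": 2.0, "grants": 0.000001},
}

def discipline_score(counts, discipline):
    """Score one program per capita, using its discipline's weights."""
    w = DISCIPLINE_WEIGHTS[discipline]
    raw = sum(w[key] * counts[key] for key in w)
    return raw / counts["faculty"]

# The same raw counts yield different scores under different weightings.
program = {"faculty": 20, "books": 10, "articles": 60, "citations": 900,
           "awards": 3, "grants": 1_200_000}
print(discipline_score(program, "civil_engineering"))
print(discipline_score(program, "art_history"))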
Some programs that previously touted their rankings may choose not to mention them this year. Several programs that figured at the top of their fields last year have dropped off the top-10 list entirely, while others that languished in lowly spots have rocketed up the charts. As in previous years, the latest rankings yielded a number of eyebrow raisers. In the humanities, for example, Carnegie Mellon University and the University of Nevada at Las Vegas appear among the 15 most productive programs, alongside Columbia, Harvard, and Yale Universities and the University of California at Berkeley.
Mr. Martin relishes some of the rankings’ more “counterintuitive” results, which he hopes will highlight programs that are strong in certain niches or whose reputations lag behind their growth. Such is the case, he notes, with the Gerstner Sloan-Kettering Graduate School of Biomedical Sciences, in New York City, whose program ranks as the second most productive in the field. The newly created Ph.D. program is not yet accredited, but its high productivity score has already marked it as a heavyweight, says Mr. Martin.
Data Problems?
Ranking UNLV near the top may draw snickers from some, but critics of the index object most to the way the company collects data. Academic Analytics compiles faculty lists from departmental and graduate-program Web sites, which it then submits to universities for verification. Last year 172 out of 375 institutions sent back corrected lists of names. Even subscribers who say the index’s data about their own university are watertight are skeptical about the accuracy of data on other institutions.
Mr. Martin stands by his numbers. “I think people greatly overestimate the number of errors that are out there,” he says.
Other naysayers take issue with the weights assigned to the data and the rankings they produce. “Sometimes they simply give you results that are nuts,” says Lewis M. Siegel, a biochemistry professor and vice provost for graduate education at Duke University.
Among the nuttier results of last year’s rankings, in his view, was that they listed Duke’s math program as the second most productive in the nation, and the university’s English department — famed for the legacy of the literary luminary Stanley Fish — at 105th place.
“Duke’s math program is a good program,” he says. “But it’s not No. 2 in the country. They would have been ecstatic to have broken into the teens.” On the other hand, he says, when Duke’s English department is ranked that low, “you have to think that something strange is going on.” (This year Duke’s math program dropped out of the top 10, and its English program ranked seventh — the result, says Mr. Martin, of more-comprehensive data on books.)
Mr. Siegel approves of Academic Analytics’ move away from the reputational rankings that anchored the National Research Council’s last doctoral-program rankings, in 1995, but the last thing academe needs, he says, is another facile ranking of universities.
Mr. Martin couldn’t agree more. He is the first to admit that the rankings are relative. Academic Analytics can and will rejigger them to emphasize whatever the client feels is most interesting or important. If, for example, a university wants to see how it compares with peers on federal grants, the company can draw up rankings focusing only on that variable. But however you slice it, says Mr. Martin, the data give institutions a useful perspective on how they measure up.
Many institutions that have latched onto the rankings as a marketing tool fail to grasp that the index’s real value lies in its raw data, he says, which can be used to compare programs not just with their peers but with institutions across the country. Mr. Martin now has more information on faculty members’ honors and publications than many of their universities do, he says. His data, he believes, can help administrators monitor departmental progress and gauge the impact of strategic investments, new hires, an influx of grant money, or an increase in graduate stipends.
So far, most universities have been tentative in their embrace of Academic Analytics’ data. Many want to see the National Research Council’s long-anticipated new data, expected in February, before deciding how much legitimacy to grant the council’s more commercial rival.
The State University of New York at Albany has subscribed to Academic Analytics’ data since the beginning, says the dean of graduate studies, Marjorie Pryse, who has been impressed with the index’s increasing sophistication. But, she says, the university has no plans to use those numbers for any sort of strategic decision making until officials see how Mr. Martin’s findings correlate with the NRC’s.
Only One Piece of Puzzle
George E. Walker, dean of Florida International University’s graduate school, also urges caution in using the data, even though he is a great fan. In fact, this month he plans to use Academic Analytics’ reports as he begins a six-year review of 30 of the university’s doctoral programs. He intends to start with the six programs that fare best in this year’s index to determine how the university might replicate their successes.
Still, says Mr. Walker, a faculty’s average scholarly productivity represents only one piece of the puzzle. It does not, for example, tell you how graduate students experience a doctoral program. It says nothing about the program’s retention rates, how long students take to obtain a degree, the quality of relationships with mentors, or even what students are learning from the highly productive faculty.
Administrators should also be mindful of how they present the index reports to their departments. Some faculty members, he says, are understandably wary that the data might be used “mischievously or unwisely as a blunt instrument by administrators looking for a quick answer to something.”
He has found the index most useful as a conversation starter. When Mr. Walker joined Florida International a year ago, he used the Academic Analytics data to get a quick handle on his new institution. Even when the index’s results were off-kilter, they prompted lively discussions with department heads about their programs’ strengths, quirks, and self-assessment methods.
He compares the process to a physical checkup. “You go into the doctor’s and you take lab tests,” he says. “It may be that morning you made a mistake and didn’t fast and ate lots of cholesterol. But it also may be there’s something you really want to look at.”
http://chronicle.com
Section: The Faculty, Volume 54, Issue 12, Page A10