“Opaque, gameable, and just plain wrong!” That was the judgment rendered by one of my colleagues on Academic Analytics, our university’s shiny new measurement tool. Why had Rutgers University — proud of its intellectual culture and its rigorous standards for faculty promotion — chosen possibly the worst bibliometric program available? Let me explain.
In 2013 our administration signed a contract under which this company would make its database accessible to approved users. Academic Analytics crawls the Internet and, it says, has assembled profiles of more than 270,000 scholars at more than 385 colleges in the United States and abroad. The database enumerates “scholarly productivity” in a handful of categories: books, journal articles, citations, published conference proceedings, federal funding, and honorific awards. In the world of Academic Analytics, nothing else counts. In other words, the database tells faculty members what they already know about themselves in a fashion that is incomplete and often erroneous. From 2013 to 2017, Rutgers will waste $492,000 on this digital lemon.
Worse still, Academic Analytics actually presents a danger to higher education everywhere.
Acting on members’ concerns, Rutgers’s faculty union, of which I am president, has sought to protect the integrity of tenure procedures. As at many universities, a tenure candidate at Rutgers may inspect the entirety of his or her tenure file with the sole exception of external reference letters. Department chairs and deans may consult bibliometric data sets — Google Scholar, for instance — but only because the candidate also has access to those public resources.
Academic Analytics is different. The firm restricts access to its proprietary data set. In theory, subscribing universities could approve all faculty members as users, but, to my knowledge, none does. At Rutgers, rank-and-file faculty members do not receive the necessary passwords; I obtained my own scores only after a request under New Jersey’s Open Public Records Act. In December, therefore, we formally asked the administration to exclude Academic Analytics from the tenure-and-promotion process. The faculty of the School of Arts and Sciences in New Brunswick voted 92-20 in favor of this demand. In response, administrators have promised orally not to use Academic Analytics — but only one dean on our main New Brunswick campus has put that commitment in writing.
We have made even less progress on our second demand: that data from Academic Analytics not be used in any decisions involving the allocation of resources within the university. The Arts and Sciences faculty voted overwhelmingly in favor of that demand as well. Deans acknowledge using the database in comparing and ranking departments, and that is why I believe Academic Analytics poses a profound, long-term threat.
Because it counts achievement along only the axes mentioned above, the database — and any administrator relying even partially upon it — establishes incentives to do only what counts. Under this logic, the strategically minded professor or department might stop engaging in less conventional and less measurable activities: public scholarship, community engagement, software, patents, films, book chapters, articles in less-well-known journals, and nonfederal grants, not to mention teaching and service. The database even discourages book publishing, by counting edited volumes and single-author monographs as equivalent.
Ultimately, then, this company — and the universities that deal with it — may significantly distort and narrow the contributions that faculty members make to collective wisdom and democratic discourse. Colleges that subscribe to Academic Analytics are, in effect, recasting faculty members as makers of knowledge widgets.
Here, as in Britain, where metrics to evaluate research have been highly controversial, we need to claw our way back to standards that are qualitative, faculty-determined, and institution-specific.
I believe that Academic Analytics will fail for a simpler reason: its own embarrassing inaccuracies. Even within the narrow range it measures, the firm makes unpredictable mistakes. My profile, obtained through that open-records request, credited me with two books and three articles in the given time windows. In fact, I had published two books and one article. Where did Academic Analytics find the two (possibly brilliant) texts I didn’t write? Because of such errors, the database is losing legitimacy. Not one of the many deans with whom I have spoken actually trusts the spreadsheet. Still, they consider Academic Analytics useful for sales and branding. With metrics, an administration can claim to have the best [name of most opportune department] in the country.
Indeed, wily colleges can “massage” the data. The firm’s voluntary submission process allows institutions to curate the roster of faculty members that Academic Analytics reviews. To raise the institution’s profile, a savvy dean could keep out of view any professors who aren’t producing articles at full tilt. One Rutgers dean confessed to me his regret at having overlooked this loophole.
Rutgers is wasting its money. Your college may have entered the same high-priced contest of implausible boasts. In the end, Big Data — which ought to promote honesty and transparency — instead sets off a deceptive, meaningless game of numbers.
David M. Hughes is president of the AAUP-AFT faculty union and a professor of anthropology at Rutgers University at New Brunswick.