Can Data Measure Faculty Productivity? Rutgers Professors Say No
By Ellen Wexler
December 11, 2015
After submitting a records request, David M. Hughes received an Excel spreadsheet summing up his productivity as a scholar.
The numbers seemed straightforward: He had written three articles, won two awards, and published two books. He had received no grant money. And that, according to context provided on the spreadsheet, put him above the national average for publishing and awards.
But Mr. Hughes, president of the Rutgers University faculty union, knows that his data were flawed. In the time periods measured, he had written only one article and received one award. He had also received a $37,500 grant.
“They are undercutting my reputation in some ways and actually inflating my reputation in other ways,” he said. “It’s all intellectually dishonest.”
The data come from Academic Analytics, a company that measures scholarly productivity. It adds up professors’ journal articles, citations, books, research grants, and awards, and compares those numbers with national benchmarks. At the moment, the database includes more than 270,000 faculty members.
Rutgers bought a license for the service in 2013. And on Monday the School of Arts and Sciences faculty will vote on a resolution calling for the university to limit how it uses the data. On Wednesday union leaders will meet with the university’s academic and labor-relations team to discuss the issue.
“The way scholarship aids public discourse is by being innovative, being interdisciplinary, taking risks,” Mr. Hughes said. “Academic Analytics doesn’t measure or value those kinds of unconventional forms of research and publishing.”
Faculty-union leaders are wary of how the database will affect their profession. They’re worried that professors will feel obligated to produce work that’s reflected in their scores, and that the university will use flawed data to make decisions.
The resolution calls on the School of Arts and Sciences to promise not to consider the company’s data in tenure, promotion, or resource allocation. It also demands that the data be distributed to faculty members, who would be able to check the accuracy of the information and see how they compare to professors in their discipline across the country.
E.J. Miranda, a university spokesman, declined to comment on the matter, saying only that “if a resolution is forwarded, we will review it.” Academic Analytics also declined to comment.
Some deans have said informally that the database won’t be used for tenure or promotion. “It is a misconception to believe in or be worried about such use,” Robert Goodman, executive director of the Rutgers New Jersey Agricultural Experiment Station and executive dean of the Rutgers School of Environmental and Biological Sciences, wrote in an email to the faculty.
But Mr. Hughes wants an agreement in writing. According to the license agreement between the university and Academic Analytics, Rutgers is paying $492,500 for the database over four years. For that price, Mr. Hughes said, Rutgers must be taking the data seriously.
Under the union’s collective-bargaining agreement, faculty members already have access to most of the material in their personnel files. Most professors aren’t privy to what’s in the database, however, and Mr. Hughes thinks that’s a violation of the agreement.
He likened it to a credit score: When you’re buying a home, you need to know your score to see how you’ll fare in the market. As a faculty member, he said, you need to be able to see how you’re judged and how you’ve done so far.
“We have a right to know what is said about us, and therefore what the administration believes about us,” he said.
Mr. Hughes got access to his data through a public-records request. But when he received the material, the date ranges weren’t included. Did the data reflect two years of work, or 20? Without knowing the dates, Mr. Hughes couldn’t tell whether the numbers were accurate.
He asked a dean to grant him access to the password-protected database, where he found the dates. He said most of his data were incorrect.
When Yolanda Martínez-San Miguel, a professor of Latino and Hispanic Caribbean studies and comparative literature at Rutgers, viewed her data through her department chair, she made the same discovery. According to the database, she had published one book and two articles. In reality, she had published four books and 30 articles.
“I cannot go and correct their records,” Ms. Martínez-San Miguel said. “There’s no way I can actually tell them, ‘You’re missing three books and 28 articles.’”
Ms. Martínez-San Miguel does “what many people would consider nontraditional scholarship,” and she thinks that’s why the database missed so much of her work: It was published in smaller, nontraditional outlets. If the university continues to use the database, she worries that smaller disciplines will suffer.
“We’re putting aside not only quality but the fact that there are emerging fields, interdisciplinary knowledge, people like me who want to take risks,” she said. “Diversity of knowledge is curtailed.”
Mr. Hughes said the database could tempt professors to “game the system”: They may choose to submit only to well-known American journals, rather than to, say, a smaller Puerto Rican journal read by many in their discipline. They may focus less on other aspects of their work — like teaching and service — that aren’t represented in the data.
“If I’m a chair, and I know that my department rating will generate a greater chance of a new faculty line or a new seminar series,” he said, “I would be crazy not to tell my faculty to do the things that count.”