Publishing

Better Than Impact Factor? NIH Team Claims Key Advance in Ranking Journal Articles

By Paul Basken September 7, 2016

George Santangelo, director of the NIH’s Office of Portfolio Analysis: “Our claim is that this is an excellent method, and we haven’t seen any others that are better.” (Photo: NIH)

Universities and funding agencies typically measure the value of published research by the number of citations an article attracts or by how often the journal in which it appears is cited. Both methods have long been accepted as imperfect but necessary shorthands.


Going beyond pure citation numbers to assign value to an individual article can be both complicated and uncertain. But one leading attempt to do just that took a major leap forward on Tuesday with the formal endorsement of a team of analysts at the National Institutes of Health.

Known as the relative citation ratio, or RCR, the metric works by mapping an article’s network of citations, then weighting the result against a comparison group drawn from the article’s own field. Its developers say the methodology therefore better reflects how experts assess a paper’s influence, rather than relying on its raw citation total.
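In rough terms, the idea is to divide an article’s own citation rate by the rate expected for its field. The sketch below is a simplified illustration of that ratio, not the NIH’s actual algorithm: it assumes the field benchmark can be approximated by the average citation rate of journals in the article’s co-citation network, and it omits the NIH’s regression against a benchmark set of NIH-funded papers.

```python
# Simplified illustration of the relative-citation-ratio idea.
# Assumption: the field citation rate (FCR) is approximated by the
# mean citation rate of journals in the article's co-citation network.

def relative_citation_ratio(citations, years_since_publication,
                            cocited_journal_citation_rates):
    """Article's citation rate divided by its field's expected rate."""
    # Article citation rate: citations accrued per year since publication.
    acr = citations / years_since_publication
    # Field citation rate: average rate across the co-citation network.
    fcr = (sum(cocited_journal_citation_rates)
           / len(cocited_journal_citation_rates))
    return acr / fcr

# A paper with 60 citations over 4 years, in a field whose co-cited
# journals average 5 citations per paper per year, scores an RCR of 3:
print(relative_citation_ratio(60, 4, [4.0, 5.0, 6.0]))  # 3.0
```

An RCR near 1.0 would mean the paper is cited about as often as is typical for its field; values well above 1.0 flag unusually influential work regardless of which journal published it.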

“In that context,” said one of the ratio’s developers, George M. Santangelo, director of the NIH’s Office of Portfolio Analysis, “our claim is that this is an excellent method, and we haven’t seen any others that are better.”

That’s a significant proclamation, given the degree to which citation-based rankings drive hiring and promotion decisions at universities, and grant allocations by funding agencies. Citation counts alone can vary widely between disciplines. And the most popular journal-wide ranking methodology, known as journal impact factor, is facing growing skepticism due to the wide variety of articles that can appear in a single publication.

Impact factor’s dominance suffered a major hit in July, when a team of publishers posted an analysis to the bioRxiv preprint database showing that most published papers typically attract far fewer citations than their journals’ impact-factor rankings. At both Science and Nature, about 75 percent of published articles attract fewer citations than their journal-wide impact factors of 34.7 and 38.1, respectively. Such findings helped prompt the American Society for Microbiology, the world’s largest life-science society, to announce it would stop using impact factor in promoting its journals.

Impact factor “is a too-dominant metric,” said Ludo Waltman, a bibliometrics researcher at Leiden University, in the Netherlands. “There is too much in today’s scientific system that depends on the JIF. This is not a healthy situation.”

RCR, however, has its own shortcomings, Mr. Waltman said. One of the most glaring, he said, is that its complicated system for weighting networks of papers that cite other papers is field-specific, so it appears to discount the value of interdisciplinary science.

Mr. Santangelo said that complaint, which had been lodged against a previous version of RCR, has since been largely resolved. “We demonstrate that the values generated by this method strongly correlate with the opinions of subject-matter experts in biomedical research, and suggest that the same approach should be generally applicable to articles published in all areas of science,” he and his team of NIH analysts wrote on Tuesday in a report published by the open-access journal PLOS Biology.


Mr. Waltman said he does believe that it’s necessary for universities and funders to use statistics to measure the value of published science. He has developed his own standard, known as source normalized impact per paper, or SNIP. But that is also a “quite complex metric,” Mr. Waltman said, and neither SNIP nor any other measurement should be used as the sole basis for gauging the scientific value of research.

As a publisher, PLOS shares that point of view. PLOS recognizes the need for “a new, robust quantitative metric that focuses on the article” rather than the journal in which it appears, said David Knutson, a spokesman for PLOS. And PLOS agrees with Mr. Waltman on the need for even more alternative methods of assessment, Mr. Knutson said. “Metrics should support, not replace, expert judgment,” he said.

Paul Basken covers university research and its intersection with government policy. He can be found on Twitter @pbasken, or reached by email at paul.basken@chronicle.com.


Paul Basken Bio
About the Author
Paul Basken
Paul Basken was a government policy and science reporter with The Chronicle of Higher Education, where he won an annual National Press Club award for exclusives.