
Peer Review in Flux

The internet era has changed the landscape

By Paul Basken
March 4, 2018
Eduardo Luzzatti for The Chronicle

Beaten down by technological change and economic pressures, the long-held notion of scientific peer review is losing its status as the “gold standard” measure of scholarly reliability.

The problem facing universities in 2018, however, isn’t so much that peer review has inevitably evolved, but that scientists collectively have failed to respond with a better replacement.

Among the many troubles for peer-reviewed publications:

  • Subscription-based journals are proving far too slow for the speed of scientific exchange in the internet era and have long generated resentment about costs, while more streamlined open-access models raise widespread questions about their sustainability and reliability.
  • Publishing priorities and the financial rewards of research breakthroughs discourage scientists from reporting negative results and don’t sufficiently guard against potential bias.
  • Universities too often reward faculty members based on the quantity rather than the quality of publications.
  • Citation-based measures of journal reputation — a proxy for peer-review quality — have long been recognized as flawed and susceptible to manipulation.
  • Overworked researchers show a growing resistance to serving as reviewers or devoting adequate time to the task.

One result: The notion of what it means to have a highly respected “peer reviewed” work of science has become diminished, if not lost entirely. Another: Scientists caught up in uncertainties over the meaning and standards of “peer reviewed” research aren’t doing all they can to share their work and collectively advance their fields.


The solution for scientists, say analysts studying the problem, lies in helping scholars — and their employers and funders — better understand how researchers can collaborate, share, and self-correct their work, and be credited for it.

“We have a too-narrow focus on peer review at the stage of publication,” says Brian A. Nosek, co-founder and director of the Center for Open Science, “at the cost of appreciating how evidence becomes credible over time with all the other parts of continuous peer review in the community.”

Traditionally, peer review has meant the formal evaluation process of a scientist’s manuscript — by academic counterparts of the author — as a condition for journal publication.

Now, with advanced electronic methods of communication, the concept of peer review is evolving to mean any number of ways that a scientist receives useful feedback from colleagues, from the earliest stages of project design to post-publication critiques. The nonprofit Center for Open Science alone offers at least 15 pre-print servers (online repositories for publicly sharing manuscripts with no pretense of peer review) in fields that include business, education, engineering, law, and the life sciences.

TAKEAWAY

Open Science Needs Further Review

  • Scientific-journal peer review, long revered as the “gold standard” of scientific reliability, is increasingly seen as failing in the accelerating whirl of the internet era.
  • Key factors include heavy demand for published work due to financial pressures for scientific breakthroughs and stubbornly formulaic university reward structures.
  • Solutions may lie in greater “openness” and in making the scientific process itself more transparent and susceptible to sharing and feedback.
  • Universities, which control researchers’ salaries and other financial incentives, must be willing to make changes in their reward systems.

Major academic publishers, including Elsevier, are also joining in, offering a variety of online tools to help scientists record and immediately share their notes and data findings with colleagues around the world.


That’s a good thing, many experts argue. “It allows us to keep going, to be current, to be at the vanguard, and to understand what’s happening,” says Harlan M. Krumholz, a professor of medicine at Yale University who studies accuracy in science.

At the same time, however, many journals and universities cling to the idea that a final published article that passes some measure of “peer review” remains a defining measure of academic accomplishment — even in the face of growing evidence that the standards of those reviews are slipping.

At last year’s quadrennial Congress on Peer Review and Scientific Publication, Krumholz called on leading academic journals to tolerate the open sharing of findings among scientists and to stop making such activity a disqualification for the eventual publication of a manuscript. “If we wait for peer-review publication,” Krumholz said of his own research team, “we’ll be years behind in the field.”

Howard C. Bauchner, editor in chief of the Journal of the American Medical Association, pushed back, saying there had not yet been enough study of whether online sharing prior to peer-reviewed publication might produce more harm than benefit in fields like medicine. Nonscientists, for example, might see a preliminary finding and act upon it, with harmful results.

“I know it always feels better if we’re more transparent, if there’s more science, if there’s more information out there,” Bauchner, a professor of pediatrics at Boston University, told Krumholz. “But I think we’ve seen, over the last 10 or 15 years, there is the real capacity to do harm.”


Amid such fundamental disagreements, there appears to be little coordinated effort to determine what, exactly, “peer review” should look like in the future. Even among journals that make a good-faith effort at peer review, there’s no common understanding of whether the process should mean a single reader giving a quick scan for obvious errors, a team of highly qualified reviewers offering multiple rounds of feedback to the author, or something in between.

That uncertainty has helped erode collective trust in science, says Bruce V. Lewenstein, a professor of science communication at Cornell University. The solution, in the eyes of many reformers, centers on greater openness. But in the world of academic publishing, debates over “openness” have mostly meant the push to eliminate subscription fees, rather than opening up peer review and the broader scientific process.

Some journals are experimenting with notions of crowdsourced peer review. Pre-print servers may be the most developed form of that idea. But other variants not yet widely adopted include the open publication of exchanges between authors and their reviewers.

Advocates of that idea include Erin K. O’Shea, president of the Howard Hughes Medical Institute, who outlined the approach at a conference the institute hosted last year. Along with publishing peer reviews — either anonymously or with attribution — O’Shea called for journals to establish systems that display “robust post-publication evaluations.” She also suggested that authors, rather than editors, decide whether their manuscript is ultimately published, “removing the notion that publication itself is a quality-defining step.”


But in the end, says Michail Kovanis, a French researcher who studies ways of improving peer review, universities themselves hold the power over the future of peer review, because they control promotions and salaries, and therefore can insist on practices that reflect quality rather than quantity.

If they fail to do that, says Kovanis, a data scientist at Inserm, the French Institute of Health and Medical Research, journals will continue to grow beyond the realistic capacity of their reviewers to meaningfully evaluate scientific work. “The ones who give money,” he says, “are the ones who can enforce.”

Paul Basken covers university research and its intersection with government policy. He can be found on Twitter @pbasken, or reached by email at paul.basken@chronicle.com.

A version of this article appeared in the March 9, 2018, issue.
Read other items in The 2018 Trends Report package.
We welcome your thoughts and questions about this article. Please email the editors or submit a letter for publication.
Paul Basken was a government policy and science reporter with The Chronicle of Higher Education, where he won an annual National Press Club award for exclusives.