The cause of scientific transparency and accuracy got a boost on Tuesday with the decision by the publishing giant Elsevier to endorse a broad set of standards for open articles and data.
Elsevier agreed to add its 1,800 journals to the 3,200 that already accept the “Transparency and Openness Promotion” guidelines drafted in 2015 by a group of university researchers, funders and publishers.
The standards expand article-citation practices so that authors get credit for making clear the data, methods, and materials needed for replicating their work.
The guidelines also set expectations for preregistering research plans, so that chance findings can’t later be presented as predicted, meaningful outcomes.
Elsevier’s acceptance, joining Springer Nature and Wiley, means that the guidelines will be used by three of the world’s four largest scientific-journal publishers.
It’s Elsevier’s “biggest affirmative step toward promoting data sharing across its entire journal portfolio,” said Brian A. Nosek, co-founder and director of the Center for Open Science, who helped organize the drafting of the principles.
The center, in Charlottesville, Va., creates web-based platforms for scientists to compile and share their data. The Transparency and Openness Promotion effort is part of its accompanying work to get researchers to use such platforms.
It’s one of several approaches across higher education to improve scientific accuracy, amid deep disagreements over goals and best methods.
And even to the degree that improved data-sharing mitigates problems of scientific accuracy, Mr. Nosek and other advocates of the approach acknowledged that the success of Elsevier’s agreement depends on how, and how fully, the publisher cooperates. Acceptance means only that Elsevier will ask its journal editors to adopt the standards. And each of the eight standards in the guidelines offers a three-level range of implementation.
For example, with the standard aimed at promoting the sharing of the coding system used to compile data, the first level requires authors only to state whether and where the code is available. The second level requires editors to verify that the code has been placed in a legitimate recognized repository. The third level requires independent analysts to conduct tests on the data.
Elsevier’s embrace of that system appears to be “a great move” for both the publisher and advocates of data sharing, said Mark Hahnel, founder of figshare, an online service for managing, storing, and sharing research data.
The question is whether Elsevier might put the standards into effect in ways that are more about bolstering its own proprietary data-management systems than about making data freely available to everyone, Mr. Hahnel said. “I think we can all understand how everyone else in the community is looking for the asterisk,” he said.
An Elsevier spokesman said the company would allow its journals the use of both its own company-owned data repositories and outside options. And Mr. Nosek said he was confident in Elsevier’s intentions, noting that the company formally endorsed the standards only after implementing them at some of its journals. That puts Elsevier ahead of some earlier signatories that have not yet actually adopted or implemented the standards, Mr. Nosek said.
“Elsevier doing it,” he said, “is an important signal to the publishers that are slower to adopt open practices that this is becoming the new normal and it is time to catch up.”
Practical Objections
Not everyone agrees, however, on the wisdom of the data-sharing standards or their central role in alleviating any problems that may exist with scientific reproducibility. The journal Epidemiology, published by Wolters Kluwer Health, refused an invitation to join the standards in 2015, saying editorial evaluations of manuscripts should remain focused on the quality of the research and its presentation. The journal, in an editorial, also said it disputed the suggestion that scientific results are either “right or wrong.”
That argument was reiterated by Rebecca E. Irwin, an associate professor of applied ecology at North Carolina State University who was part of a team that last month published a study largely contradicting its earlier work suggesting that anabasine, a compound found in tobacco, might help protect bumblebees from a common parasitic infection.
Ms. Irwin and a colleague on the project, Lynn S. Adler, a professor of biology at the University of Massachusetts at Amherst, said that their continuing tests cast doubt on the protective benefits of anabasine — a difference probably due to random biological variations in the later tests — and that it seemed obvious that the updated findings should be made known.
Just putting that new information in a shared database probably would not be sufficient to make other scientists aware of the different conclusion, said Evan C. Palmer-Young, a doctoral student in organismic and evolutionary biology at UMass who helped write the new report.
Rather than point to the need for better data-sharing policies, the team saw practical obstacles to others’ doing what it did in developing its new assessment of anabasine. First, Mr. Palmer-Young said, prestigious journals typically decline to publish “negative” findings. His experience with such requests, he said, had taught him not even to ask the Proceedings of the Royal Society B: Biological Sciences, which published the original findings on anabasine and bumblebees in 2015, to publish the updated discovery.
Second, because the likelihood of such refusals meant that the team would need to find an open-access journal, it had to use scarce resources to pay the author fee, he said. And third, the original authors needed to agree to the idea of publishing a contradictory finding.
“It takes a certain kind of adviser” to agree to that, Mr. Palmer-Young said. “It helps that Professor Adler has tenure, but even so, a lesser faculty member might have been much less keen to disseminate results that might diminish the impact of the initial positive results.”
Paul Basken covers university research and its intersection with government policy. He can be found on Twitter @pbasken, or reached by email at paul.basken@chronicle.com.
Correction (9/5/2017, 1:13 p.m.): This article originally misstated the year “Transparency and Openness Promotion” guidelines were drafted. It was 2015, not 2005. The article has been updated to reflect this correction.