Something is wrong with science, or at least with how science is often done. Flashy research in prestigious journals later proves to be bogus. Researchers have built careers on findings that are dubious or even turn out to be fraudulent. Much of the conversation about that trend has focused on flaws in social psychology, but the problem is not confined to a single field. If you keep up with the latest retractions and scandals, it’s hard not to wonder how much research is trustworthy.
But Tuesday might just be a turning point. A new organization, called the Center for Open Science, is opening its doors in an attempt to harness and focus a growing movement to clean up science. The center’s organizers don’t put it quite like that; they say the center aims to “build tools to improve the scientific process and promote accurate, transparent findings in scientific research.” Now, anybody with an idea and some chutzpah can start a center. But what makes this effort promising is that it has some real money behind it: The center has been given $5.25-million by the Laura and John Arnold Foundation to help get started.
It’s also promising because a co-director of the center is Brian Nosek, an associate professor of psychology at the University of Virginia (the other director is a Virginia graduate student, Jeffrey Spies). Mr. Nosek is the force behind the Reproducibility Project, an effort to replicate every study published in 2008 in three psychology journals, in an attempt to gauge how much published research might actually be baseless.
Mr. Nosek is one of a number of strong voices in psychology arguing for more transparency and accountability. But until now there hasn’t been an organization devoted solely to solving those problems. “This gives real backing to show that this is serious and that we can really put the resources behind it to do it right,” Mr. Nosek said. “This whole movement, if it is a movement, has gathered sufficient steam to actually come to this.”
‘Rejigger Those Incentives’
So what exactly will the center do? Some of that grant money will go to finance the Reproducibility Project and to further develop the Open Science Framework, which already allows scientists to share and store findings and hypotheses. More openness is intended to combat, among other things, the so-called file-drawer effect, in which scientists publish their successful experiments while neglecting to mention their multiple flubbed attempts, giving a false impression of a finding’s robustness.
The center hopes to encourage scientists to “register” their hypotheses before they carry out experiments, a procedure that should help keep them honest. And the center is working with journals, like Perspectives on Psychological Science, to publish the results of experiments even if they don’t pan out the way the researchers hoped. Scientists are “reinforced for publishing, not for getting it right in the current incentives,” Mr. Nosek said. “We’re working to rejigger those incentives.”
Mr. Nosek and his colleagues didn’t solicit funds for the center; foundations have been knocking on their door. The Arnold Foundation sought out Mr. Nosek because of a concern about whether the research used to make policy decisions is really reliable.
“It doesn’t benefit anyone if the publications that get out there are in any way skewed toward the sexy results that might be a fluke, as opposed to the rigorous replication and testing of ideas,” said Stuart Buck, the foundation’s director of research.
Other foundations have been calling too. With more grants likely on the way, Mr. Nosek expects the center to have $8-million to $10-million in commitments before it has even written a grant proposal. The goal is an annual budget of $3-million. “There are other possibilities that we might be able to grow more dramatically than that,” Mr. Nosek said. “It feels like it’s raining money. It’s just ridiculous how much interest there is in these issues.”
The reason for the windfall, the foundations tell Mr. Nosek, is that they’re worried that science—not just psychology and not just social science—is not churning out the kind of real, reliable knowledge that the world needs.
“There is a concern that we are not making the kind of progress in translating basic science into effective application. A lot of research looks very promising in the first study, and it dies quickly,” Mr. Nosek said. “They’re asking, ‘What’s happening to our science that we can’t get any of this stuff to translate into Stage 2?’”