When Reproducibility Reformers Fail Their Own Test

A prominent piece of scientific research intended to ameliorate the irreproducibility crisis has itself been withdrawn for failing to adhere to proper reproducibility standards. One of the prime directives of reproducibility reform is to preregister your research: say in advance what you intend to do and how you will do it, so that we know you didn't repurpose your data to fit a hypothesis you made up after you saw the results. It turns out that this new research wasn't preregistered precisely because it started out as research for a very different project, one in which the data showed no effect. The reproducibility reformers were doing precisely what they knew they shouldn't do!

To make matters worse, a degree of disciplinary defensiveness, even groupthink, set in. Reproducibility reformers were reluctant to face the criticisms head-on, presumably for fear that doing so would endanger their own scientific project. It's not a good look.

All this matters because, as we wrote in our 2018 report on the subject, the irreproducibility crisis is the major challenge to modern scientific practice. As John Ioannidis pointed out as far back as 2005, most published research findings are probably false. The causes include procedural sloppiness; a professional culture that prizes positive results, and cares less whether they are true results, as the benchmark of success; disciplinary and political groupthink; and errors in statistical procedure, which particularly contaminate the many, many scientific and social-scientific disciplines that depend on statistical methods. Skewed research hasn't just damaged science: the National Association of Scholars' Shifting Sands reports have detailed how a great many government programs and policies rest on irreproducible research. A new subdiscipline of meta-research has founded itself on ways to improve scientific procedure, especially by promoting openness and transparency in scientific research.

That new subdiscipline is still challenging the scientific establishment, some of whose members are political activists with an ideological stake in preserving conveniently sloppy procedures, while a larger number are simply a herd of rhinos, thick-skinned and reluctant to change their ways. This very embarrassing withdrawal of a meta-research article, alas, is likely to encourage the activist and the inert alike to ask, with great complacency, why they should pay attention to those incompetent hypocrites.

Scientists should not be complacent. The irreproducibility crisis continues to be a grave challenge to the scientific status quo. But meta-researchers should profit from this incident through some self-examination and humility. Indeed, they must be scrupulous in practicing what they preach, for the flock, predictably, hearkeneth less to an erring pastor.

They should also be aware that a rabbit hole lies at the center of meta-research. We need meta-research, and meta-meta-research, and meta-meta-meta-research, ad infinitum, and all of it must be done scrupulously. It would also be helpful to set up procedures to determine exactly how many metas it is productive to add to research. An arbitrary standard of "reproducibility significance" could be gamed just as much as the arbitrary standard of "statistical significance"; but at some point, we need to stop chasing rabbits.

Then, too, this particular scandal may lead scientists to focus too much on preregistration as a proxy for transparency and openness. Preregistration matters, but it can also be gamed. In this case, the scientists could have inserted a preregistration halfway through the process, and who would have known, absent a whistleblower? Preregistration is a useful tool, not a cure-all.

Meta-researchers may be embarrassed, but they also should be encouraged. The habits of skepticism and presumptions of openness that they have fostered have gotten this paper retracted in one year instead of 50—or never. Their embarrassment is a sign that their struggle is succeeding. This must be a wan comfort at present—but, in the long run, it should cheer them.

We are confident that meta-researchers will learn from this upset and that their next article will not need to be withdrawn.

