The reproducibility challenge – what researchers need

Within the Open Science discussion, the current call for “reproducibility” stems from the rising awareness that results presented in research papers are not as easily reproducible as expected, and that some reproduction efforts have even contradicted the original findings. In this context, transparency and openness are seen as key components of good scientific practice, as well as of scientific discovery. As a result, many funding agencies now require the deposit of research datasets, institutions are improving training in the application of statistical methods, and journals are beginning to mandate a high level of detail on the methods and materials used. How can researchers be supported and encouraged to provide that level of transparency? An important component is the underlying research data, which is currently often only partly available within the article. At Elsevier we have therefore been working on journal data guidelines that clearly explain to researchers when and how they are expected to make their research data available. In parallel, we have developed the corresponding infrastructure to make it as easy as possible for researchers to share their data in a way that is appropriate to their field. To ensure researchers get credit for the work they do in managing and sharing data, all our journals support data citation in line with the FORCE11 data citation principles – a key step towards addressing the lack of credit and incentives identified in the Open Data analysis (Open Data – the Researcher Perspective, https://www.elsevier.com/about/open-science/research-data/open-data-report) recently carried out by Elsevier together with CWTS. Finally, the presentation will also touch on a number of initiatives to ensure the reproducibility of software, protocols and methods. With STAR Methods, for instance, methods are submitted in a Structured, Transparent, Accessible Reporting format; this approach promotes rigor and robustness, and makes reporting easier for the author and replication easier for the reader.

Code and Data for the Social Sciences: A Practitioner’s Guide

This handbook is about translating insights from experts in code and data into practical terms for empirical social scientists. We are not ourselves software engineers, database managers, or computer scientists, and we don’t presume to contribute anything to those disciplines. If this handbook accomplishes something, we hope it will be to help other social scientists realize that there are better ways to work. Much of the time, when you are solving problems with code and data, you are solving problems that have been solved before, better, and on a larger scale. Recognizing that will let you spend less time wrestling with your RA’s messy code, and more time on the research problems that got you interested in the first place.
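To make that point concrete, here is a small, purely illustrative Python sketch (ours, not taken from the handbook): the data values, column names, and the choice of pandas are assumptions, but the contrast it shows – a hand-rolled loop versus a mature library call – is the kind of "solved before, better, and on a larger scale" the authors have in mind.

```python
# Illustrative sketch (hypothetical data and column names): joining two
# datasets on a shared key, first with an ad hoc loop, then with an
# established, well-tested library call.
import pandas as pd

counties = pd.DataFrame({"fips": [1001, 1003], "pop": [55200, 208107]})
outcomes = pd.DataFrame({"fips": [1003, 1001], "rate": [0.12, 0.09]})

# Hand-rolled approach: a nested loop that re-implements a relational join,
# slowly, and is silent about how unmatched or duplicated keys are handled.
rows = []
for _, c in counties.iterrows():
    for _, o in outcomes.iterrows():
        if c["fips"] == o["fips"]:
            rows.append({**c.to_dict(), **o.to_dict()})
by_hand = pd.DataFrame(rows)

# Library approach: one line, widely tested, and explicit about the join type
# and key uniqueness -- a solved problem, solved better.
merged = counties.merge(outcomes, on="fips", how="inner", validate="one_to_one")
```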

Negative Results: A Crucial Piece of the Scientific Puzzle

Scientific advance relies on transparency, rigour and reproducibility. At PLOS ONE we have always supported the publication of rigorous research in all its forms, positive or negative, as showcased in our earlier Missing Pieces Collection. In this 10th Anniversary Collection, A Decade of Missing Pieces, Senior Editor Alejandra Clark revisits this important theme and highlights a decade of null and negative results, replication studies, and studies refuting previously published work.

Science retracts paper after Nobel laureate’s lab can’t replicate results

In January, Bruce Beutler, an immunologist at University of Texas Southwestern Medical Center and winner of the 2011 Nobel Prize in Physiology or Medicine, emailed Science editor-in-chief Jeremy Berg to report that attempts to replicate the findings in "MAVS, cGAS, and endogenous retroviruses in T-independent B cell responses" had weakened his confidence in the original results. The paper had found that virus-like elements in the human genome play an important role in the immune system’s response to pathogens. Although Beutler and several co-authors requested retraction right off the bat, the journal discovered that two co-authors disagreed, which Berg told us drew out the retraction process. In an attempt to resolve the situation, the journal waited for Beutler’s lab to perform another replication attempt. Those findings were inconclusive, and the dissenting authors continued to push back against retraction.