Since our debut in late 2006, PLOS ONE has striven to promote best practices in research reporting as a way to improve reproducibility in research. We have supported initiatives toward increased transparency, as well as the gathering of evidence that can inform improvements in the quality of reporting in research articles. In line with this commitment, PLOS ONE collaborated in a randomized controlled trial (RCT) to test the impact of an intervention asking authors to complete a reporting checklist at the time of manuscript submission. The results of this trial have recently been posted on bioRxiv (1) and represent a further step toward building the evidence base needed to inform editorial interventions aimed at improving reporting quality.
The Open Science Framework (OSF) has the mission to increase openness, integrity, and reproducibility in research. The Journal of Neurochemistry became a signatory of the Transparency and Openness Promotion (TOP) Guidelines in 2016, which provide eight modular standards (Citation Standards, Data Transparency, Analytic Methods/Code Transparency, Research Materials Transparency, Design and Analysis Transparency, Study Preregistration, Analysis Plan Transparency, Replication) with increasing levels of stringency. Furthermore, OSF recommends and offers a collection of practices intended to make scientific processes and results more transparent and available in a standardized way for reuse by people outside the research team. These practices include making research materials, data, and laboratory procedures freely accessible online to anyone. This editorial announces the decision of the Journal of Neurochemistry to introduce Open Science Badges, maintained by the Open Science Badges Committee and the Center for Open Science (COS). The badges, visual icons placed on publications, certify that an open practice was followed and signal to readers that an author has shared the corresponding research evidence, thus allowing independent researchers to understand how to reproduce the procedure.
Data sharing is increasingly recommended as a means of accelerating science by facilitating collaboration, transparency, and reproducibility. While few oppose data sharing philosophically, a range of barriers deters most researchers from implementing it in practice. To justify the significant effort required for sharing data, funding agencies, institutions, and investigators need clear evidence of benefit. Here, using the International Neuroimaging Data-sharing Initiative, we present a case study that provides direct evidence of the impact of open sharing on brain imaging data use and resulting peer-reviewed publications. We demonstrate that openly shared data can increase the scale of scientific studies conducted by data contributors and can recruit scientists from a broader range of disciplines. These findings dispel the myth that scientific findings using shared data cannot be published in high-impact journals, suggest the transformative power of data sharing for accelerating science, and underscore the need to implement data sharing universally.
This article summarizes the motivations, organization, and activities of the Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE5.1), held in Manchester, UK, in September 2017. The WSSSPE series promotes sustainable research software by positively impacting principles and best practices, careers, learning, and credit. This article discusses the Code of Conduct, idea papers, position papers, experience papers, demos, and lightning talks presented during the workshop. The main part of the article discusses the speed-blogging groups that formed during the meeting, along with the outputs of those sessions.
With contributions from the global natural product (NP) research community, and continuing the Raw Data Initiative, this review compiles a comprehensive demonstration of the immense scientific value of disseminating raw nuclear magnetic resonance (NMR) data, independently of, and in parallel with, classical publishing outlets. A comprehensive compilation of historic to present-day cases, as well as contemporary and future applications, shows that addressing the urgent need for a repository of publicly accessible raw NMR data has the potential to transform NP research and associated fields of chemical and biomedical research. The call to advance open sharing mechanisms for raw data is intended to enhance the transparency of experimental protocols, augment the reproducibility of reported outcomes (including biological studies), make raw data sharing a regular component of responsible research, and thereby enrich the integrity of NP research and related fields.
Many have raised concerns about the reproducibility of biomedical research. In this Perspective, the authors address this "reproducibility crisis" by distilling discussions around reproducibility into a simple guide to facilitate understanding of the topic.

Reproducibility applies both within and across studies. The following questions address reproducibility within studies: "Within a study, if the investigator repeats the data management and analysis, will she get an identical answer?" and "Within a study, if someone else starts with the same raw data, will she draw a similar conclusion?" In contrast, the following questions address reproducibility across studies: "If someone else tries to repeat an experiment as exactly as possible, will she draw a similar conclusion?" and "If someone else tries to perform a similar study, will she draw a similar conclusion?"

Many elements of reproducibility from clinical trials can be applied to preclinical research (e.g., changing the culture of preclinical research to focus more on transparency and rigor). For investigators, steps toward improving reproducibility include specifying data analysis plans ahead of time to decrease selective reporting, making data management and analysis protocols more explicit, and writing increasingly detailed experimental protocols that allow others to repeat experiments. Additionally, senior investigators should take greater ownership of the details of their research (e.g., implementing active laboratory management practices such as random audits of raw data [or at least reduced reliance on data summaries], more hands-on time overseeing experiments, and encouraging healthy skepticism from all contributors).
These actions will support a culture where rigor + transparency = reproducibility.