Posts about reproducible paper (old posts, page 4)

Open Science Badges in the Journal of Neurochemistry

The Open Science Framework (OSF) has the mission of increasing openness, integrity, and reproducibility in research. The Journal of Neurochemistry became a signatory of the Transparency and Openness Promotion (TOP) Guidelines in 2016, which provide eight modular standards (Citation Standards, Data Transparency, Analytic Methods/Code Transparency, Research Materials Transparency, Design and Analysis Transparency, Study Pre-registration, Analysis Plan Transparency, Replication) with increasing levels of stringency. Furthermore, the OSF recommends and offers a collection of practices intended to make scientific processes and results more transparent and available in a standardized way for reuse by people outside the research team, including making research materials, data, and laboratory procedures freely accessible online to anyone. This editorial announces the decision of the Journal of Neurochemistry to introduce Open Science Badges, which are maintained by the Open Science Badges Committee and the Center for Open Science (COS). These badges, visual icons placed on publications, certify that an open practice was followed and signal to readers that the authors have shared the corresponding research evidence, thus allowing an independent researcher to understand how to reproduce the procedure.

Assessment of the impact of shared brain imaging data on the scientific literature

Data sharing is increasingly recommended as a means of accelerating science by facilitating collaboration, transparency, and reproducibility. While few oppose data sharing philosophically, a range of barriers deter most researchers from implementing it in practice. To justify the significant effort required for sharing data, funding agencies, institutions, and investigators need clear evidence of benefit. Here, using the International Neuroimaging Data-sharing Initiative, we present a case study that provides direct evidence of the impact of open sharing on brain imaging data use and resulting peer-reviewed publications. We demonstrate that openly shared data can increase the scale of scientific studies conducted by data contributors, and can recruit scientists from a broader range of disciplines. These findings dispel the myth that scientific findings using shared data cannot be published in high-impact journals, suggest the transformative power of data sharing for accelerating science, and underscore the need for implementing data sharing universally.

The value of universally available raw NMR data for transparency, reproducibility, and integrity in natural product research

With contributions from the global natural product (NP) research community, and continuing the Raw Data Initiative, this review compiles a comprehensive demonstration of the immense scientific value of disseminating raw nuclear magnetic resonance (NMR) data independently of, and in parallel with, classical publishing outlets. A compilation of cases, historic to present-day, together with contemporary and future applications, shows that addressing the urgent need for a repository of publicly accessible raw NMR data has the potential to transform NP research and associated fields of chemical and biomedical research. The call for advancing open sharing mechanisms for raw data is intended to enhance the transparency of experimental protocols, improve the reproducibility of reported outcomes (including biological studies), become a regular component of responsible research, and thereby strengthen the integrity of NP research and related fields.
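Openly shared raw NMR data are only as valuable as they are reusable. As a minimal sketch of such reuse, assuming the open-source nmrglue Python library and a hypothetical downloaded Bruker dataset (neither is specified in the review):

```python
# Hedged illustration: load and reprocess a raw Bruker NMR dataset.
# The nmrglue library and the dataset path are assumptions, not part
# of the review or any specific repository it describes.
import nmrglue as ng

# Read the raw FID and acquisition parameters from a Bruker directory.
dic, fid = ng.bruker.read("downloaded_dataset/10")  # hypothetical path

# Remove Bruker's digital filter, then Fourier transform to a spectrum.
fid = ng.bruker.remove_digital_filter(dic, fid)
spectrum = ng.proc_base.fft(fid)

print(f"{fid.size} complex points -> spectrum with {spectrum.size} points")
```

Because the FID itself, not just a processed plot, is available, an independent researcher can reprocess, re-reference, or re-phase the spectrum to verify a reported assignment.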

A Guide to Reproducibility in Preclinical Research

Many have raised concerns about the reproducibility of biomedical research. In this Perspective, the authors address this "reproducibility crisis" by distilling discussions around reproducibility into a simple guide to facilitate understanding of the topic.

Reproducibility applies both within and across studies. The following questions address reproducibility within studies: "Within a study, if the investigator repeats the data management and analysis, will she get an identical answer?" and "Within a study, if someone else starts with the same raw data, will she draw a similar conclusion?" By contrast, the following questions address reproducibility across studies: "If someone else tries to repeat an experiment as exactly as possible, will she draw a similar conclusion?" and "If someone else tries to perform a similar study, will she draw a similar conclusion?"

Many elements of reproducibility from clinical trials can be applied to preclinical research (e.g., changing the culture of preclinical research to focus more on transparency and rigor). For investigators, steps toward improving reproducibility include specifying data analysis plans ahead of time to decrease selective reporting, making data management and analysis protocols more explicit, and writing increasingly detailed experimental protocols that allow others to repeat experiments. Additionally, senior investigators should take greater ownership of the details of their research (e.g., by implementing active laboratory management practices such as random audits of raw data [or at least reduced reliance on data summaries], more hands-on time overseeing experiments, and encouraging healthy skepticism from all contributors). These actions will support a culture where rigor + transparency = reproducibility.
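The "within a study" questions above can be made checkable. A minimal sketch, with hypothetical file names and a placeholder digest (none of this comes from the article): pin the raw data with a checksum and seed all randomness, so rerunning the data management and analysis yields an identical answer.

```python
# Hedged sketch of a rerun-identical analysis script; paths and the
# expected digest are placeholders, not from the Perspective.
import hashlib

import numpy as np

RAW_DATA = "raw/measurements.csv"             # hypothetical raw-data file
EXPECTED_SHA256 = "paste-the-audited-digest"  # recorded when data were audited

def sha256_of(path: str) -> str:
    """Checksum the raw data so silent edits are detected on rerun."""
    with open(path, "rb") as fh:
        return hashlib.sha256(fh.read()).hexdigest()

if sha256_of(RAW_DATA) != EXPECTED_SHA256:
    raise SystemExit("raw data differ from the audited version")

# A fixed seed makes the resampling, and hence the answer, identical on rerun.
rng = np.random.default_rng(seed=2024)
values = np.loadtxt(RAW_DATA, delimiter=",", skiprows=1)
boot_means = [rng.choice(values, size=values.size).mean() for _ in range(1000)]
print(f"bootstrap mean of means: {np.mean(boot_means):.4f}")
```

Archiving such a script next to the raw data also serves the second question: someone else starting from the same raw data can run it and compare conclusions.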

Preserving Workflow Reproducibility: The RePlay-DH Client as a Tool for Process Documentation

In this paper we present a software tool for the elicitation and management of process metadata. It follows our previously published design idea of an assistant for researchers that aims to minimize the additional effort of producing sustainable workflow documentation. With the ever-growing number of linguistic resources available, it becomes increasingly important to provide proper documentation that makes them comparable and allows meaningful evaluations for specific use cases. The prevailing practice of documenting resource generation or research processes post hoc bears the risk of information loss. Detailed documentation of a process not only aids reproducibility, it also increases the usefulness of the documented work for others, and is thus a cornerstone of good scientific practice. In practice, however, time pressure combined with the lack of simple documentation methods makes workflow documentation an arduous and often neglected task. Our tool supports clean documentation of common workflows in natural language processing and the digital humanities, and it can easily be integrated into existing institutional infrastructures.
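As an illustrative sketch only (this is not the RePlay-DH client's API), the core idea of eliciting process metadata while a step runs, rather than reconstructing it post hoc, can be reduced to logging each tool invocation as it happens:

```python
# Hedged illustration of in-workflow process documentation; the script
# name, file paths, and log format are invented for this sketch.
import datetime
import json
import subprocess

step = {
    "tool": "tokenizer.py",            # hypothetical processing script
    "input": "corpus/raw.txt",
    "output": "corpus/tokens.txt",
    "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
}

# Run the step; check=True aborts the workflow (and the log entry) on failure.
subprocess.run(["python", step["tool"], step["input"], step["output"]],
               check=True)

# Append the step's metadata immediately, while the context is still known.
with open("workflow_log.jsonl", "a", encoding="utf-8") as log:
    log.write(json.dumps(step) + "\n")
```

Recording the entry at execution time, rather than from memory afterwards, is what removes the information-loss risk described above.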

Writing Empirical Articles: Transparency, Reproducibility, Clarity, and Memorability

This article provides recommendations for writing empirical journal articles that enable transparency, reproducibility, clarity, and memorability. Recommendations for transparency include preregistering methods, hypotheses, and analyses; submitting registered reports; distinguishing confirmation from exploration; and showing your warts. Recommendations for reproducibility include documenting methods and results fully and cohesively by taking advantage of open-science tools, and citing sources responsibly. Recommendations for clarity include writing short paragraphs, composed of short sentences; writing comprehensive abstracts; and seeking feedback from a naive audience. Recommendations for memorability include writing narratively; embracing the hourglass shape of empirical articles; beginning articles with a hook; and synthesizing, rather than Mad Libbing, previous literature.