Practical open science: tools and techniques for improving the reproducibility and transparency of your research

Science progresses through critical evaluation of underlying evidence and independent replication of results. However, most research findings are disseminated without access to the supporting raw data, and findings are not routinely replicated. Furthermore, undisclosed flexibility in data analysis, such as incomplete reporting, unclear exclusion criteria, and optional stopping rules, allows exploratory findings to be presented using the tools of confirmatory hypothesis testing. These questionable research practices make results more publishable, but at the expense of their credibility and future replicability. The Center for Open Science builds tools and encourages practices that incentivize work that is good not only for the scientist but also for science. These include open-source platforms to organize research, archive results, preregister analyses, and disseminate findings. This poster presents an overview of those practices and gives practical advice for researchers who want to increase the rigor of their work.

Promoting and supporting credibility in neuroscience

Over the coming years, a core objective of the BNA is to promote and support credibility in neuroscience, facilitating a cultural shift away from ‘publish or perish’ towards a culture that is best for neuroscience, neuroscientists, policymakers and the public. Among our many credibility activities, we will lead by example by ensuring that our journal, Brain and Neuroscience Advances, exemplifies scientific practices that aim to improve the reproducibility, replicability and reliability of neuroscience research. To support these practices, we are implementing several of the Transparency and Openness Promotion (TOP) guidelines, including badges for open data, open materials and preregistered studies. The journal also offers the Registered Report (RR) article format. In this editorial, we describe our expectations for articles submitted to Brain and Neuroscience Advances.

Open and Reproducible Research on Open Science Framework

By implementing more transparent research practices, authors have the opportunity to stand out and showcase work that is more reproducible, easier to build upon, and more credible. Scientists gain by making work easier to share and maintain within their own laboratories, and the scientific community gains by making underlying data or research materials more available for confirmation or making new discoveries. The following protocol gives authors step‐by‐step instructions for using the free and open source Open Science Framework (OSF) to create a data management plan, preregister their study, use version control, share data and other research materials, or post a preprint for quick and easy dissemination.
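
As a concrete illustration of the kind of programmatic access that such a workflow can build on, below is a minimal sketch, assuming a public project, that reads a project's metadata and file listing through the OSF v2 REST API (https://api.osf.io/v2/) using Python's requests library. The node id "abc12" is a hypothetical placeholder; a private project would additionally require a personal access token sent in an Authorization header.

import requests

API = "https://api.osf.io/v2"
NODE_ID = "abc12"  # hypothetical placeholder; use your project's GUID

# Fetch the project's metadata (title, description, public/private flag).
resp = requests.get(f"{API}/nodes/{NODE_ID}/", timeout=30)
resp.raise_for_status()
node = resp.json()["data"]
print(node["attributes"]["title"])

# List files and folders in the project's default OSF Storage provider.
resp = requests.get(f"{API}/nodes/{NODE_ID}/files/osfstorage/", timeout=30)
resp.raise_for_status()
for item in resp.json()["data"]:
    attrs = item["attributes"]
    print(attrs["kind"], attrs["name"])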

Rigor, Reproducibility, and Responsibility: A Quantum of Solace

Lack of reproducibility in biomedical science is a serious and growing issue. Two publications, in 2011 and 2012, along with other analyses, documented failures to replicate key findings and other fundamental flaws in high-visibility research articles. This triggered action among funding bodies, journals, and other change-agents. Here, I examine well-recognized and underrecognized factors that contribute to experimental failure and suggest individual and community approaches that can be used to attack these factors and eschew the SPECTRE of irreproducibility.

Encouraging Reproducibility in Scientific Research of the Internet

Reproducibility of research in Computer Science (CS), and in the field of networking in particular, is a well-recognized problem. For several reasons, including the sensitive and/or proprietary nature of some Internet measurements, the networking research community pays limited attention to the reproducibility of results, instead tending to accept papers that appear plausible. This article summarises a 2.5-day Dagstuhl seminar on Encouraging Reproducibility in Scientific Research of the Internet held in October 2018. The seminar discussed challenges to improving the reproducibility of scientific Internet research and developed a set of recommendations that we as a community can undertake to initiate a cultural change toward reproducibility of our work. It brought together people from both academia and industry to set expectations and formulate concrete recommendations for reproducible research. This iteration of the seminar was scoped to computer networking research, although the outcomes are likely relevant for a broader audience from multiple interdisciplinary fields.