Posts about reproducibility report

Rigour and reproducibility in Canadian research: call for a coordinated approach

Shortcomings in the rigour and reproducibility of research have become well-known issues and persist despite repeated calls for improvement. A coordinated effort among researchers, institutions, funders, publishers, learned societies, and regulators may be the most effective way of tackling these issues. The UK Reproducibility Network (UKRN) has fostered collaboration across various stakeholders in research and is creating the infrastructure necessary to advance rigorous and reproducible research practices across the United Kingdom. Reproducibility Networks modelled on the UKRN are now emerging in other countries. Canada could benefit from a comparable network to unify voices around research quality and maximize the value of Canadian research.

Replication in the social sciences

Replication has received increasing attention in the social sciences in recent years, reflecting a growing understanding that research is credible only if studies can be repeated. Replication makes it possible to assess the validity of scientific conclusions. This FORS Guide provides an overview of the concept of replication. It also offers practical recommendations to social science researchers on what replication materials should include and elaborates on the role of scientific journals in encouraging replications.

The Art of Publishing Reproducible Research Outputs: Supporting emerging practices through cultural and technological innovation

Reproducibility and transparency can be regarded, at least in experimental work, as hallmarks of research. The ability to reproduce research results in order to check their reliability is an important cornerstone of research that helps to guarantee quality and to build on existing knowledge. The digital turn has brought more opportunities to document, share, and verify research processes and outcomes. Consequently, there is an increasing demand for more transparency with regard to research processes and outcomes. This fits well with the open science agenda, which calls for, among other things, open software, open data, and open access to publications, even though openness alone does not guarantee reproducibility. The purpose of this Knowledge Exchange activity was to explore current practices and barriers in the area of research reproducibility, with a focus on the publication and dissemination stage. We wanted to determine how technical and social infrastructures can support future developments in this area. In this work, we defined research reproducibility as cases where data and procedures shared by the authors of a study are used to obtain the same results as in their original work. We captured the views of research funding organisations, research performing organisations, learned societies, researchers, academic publishers, and infrastructure and service providers.

Practical open science: tools and techniques for improving the reproducibility and transparency of your research

Science progresses through critical evaluation of underlying evidence and independent replication of results. However, most research findings are disseminated without access to supporting raw data, and findings are not routinely replicated. Furthermore, undisclosed flexibility in data analysis, such as incomplete reporting, unclear exclusion criteria, and optional stopping rules, allows exploratory findings to be presented using the tools of confirmatory hypothesis testing. These questionable research practices make results more publishable, but at the expense of their credibility and future replicability. The Center for Open Science builds tools and encourages practices that incentivize work that is not only good for the scientist, but also good for science. These include open source platforms to organize research, archive results, preregister analyses, and disseminate findings. This poster presents an overview of those practices and gives practical advice for researchers who want to increase the rigor of their work.

Promoting and supporting credibility in neuroscience

Over the coming years, a core objective of the BNA is to promote and support credibility in neuroscience, facilitating a cultural shift away from ‘publish or perish’ towards one that is best for neuroscience, neuroscientists, policymakers, and the public. Among our many credibility activities, we will lead by example by ensuring that our journal, Brain and Neuroscience Advances, exemplifies scientific practices that aim to improve the reproducibility, replicability, and reliability of neuroscience research. To support these practices, we are implementing some of the Transparency and Openness Promotion (TOP) guidelines, including badges for open data, open materials, and preregistered studies. The journal also offers the Registered Report (RR) article format. In this editorial, we describe our expectations for articles submitted to Brain and Neuroscience Advances.

The State of Sustainable Research Software: Results from the Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE5.1)

This article summarizes motivations, organization, and activities of the Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE5.1) held in Manchester, UK in September 2017. The WSSSPE series promotes sustainable research software by positively impacting principles and best practices, careers, learning, and credit. This article discusses the Code of Conduct, idea papers, position papers, experience papers, demos, and lightning talks presented during the workshop. The main part of the article discusses the speed-blogging groups that formed during the meeting, along with the outputs of those sessions.