"Promoting Responsible Scientific Research" is the title of a new report just released by the American Academy of Microbiology, a component of ASM. It grew out of an Academy colloquium held last October to tackle an issue that is unfortunately becoming well known both inside and outside scientific circles—the lack of rigor in science. I am delighted that the Academy and ASM are taking on this difficult issue and am grateful to all the participants, the Academy steering committee, and especially to Dr. Arturo Casadevall of Johns Hopkins University, who chaired the colloquium.
Sanjay Srivastava’s joke syllabus ("A Joke Syllabus With a Serious Point: Cussing Away the Reproducibility Crisis," The Chronicle, August 15) and Lee Jussim's blog post on Psychology Today about educating psychology students in light of the reproducibility crisis led me to reflect on my department’s recent curriculum changes. We have retooled or created from scratch multiple courses that engage something few of my colleagues seem to consider relevant to the problem: intellectual history. They instead hold firmly to the dictates of positivism, insisting that better training in the methods of science will be the source of rescue. Where does this prejudice come from? Might it be time to chart a new path?
Amid discussions around scientific reproducibility, the leading biomedical journal Cell will introduce a redesigned methods section to help authors clearly communicate how experiments are conducted. The first papers using Structured, Transparent, Accessible Reporting (STAR) Methods, which builds on guidelines encouraged by reagent-labeling and animal-experimentation initiatives, appear in Cell on August 25. The format will then be adopted by other Cell Press journals over the next year, starting with Cell Systems in the fall.
A major contributor to the scientific reproducibility crisis has been that the results from homogeneous, single-center studies do not generalize to heterogeneous, real world populations. Multi-cohort gene expression analysis has helped to increase reproducibility by aggregating data from diverse populations into a single analysis. To make the multi-cohort analysis process more feasible, we have assembled an analysis pipeline which implements rigorously studied meta-analysis best practices. We have compiled and made publicly available the results of our own multi-cohort gene expression analysis of 103 diseases, spanning 615 studies and 36,915 samples, through a novel and interactive web application. As a result, we have made both the process of and the results from multi-cohort gene expression analysis more approachable for non-technical users.
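The core statistical step in multi-cohort analysis of this kind is combining per-study effect sizes for a gene into a single pooled estimate. The sketch below is a minimal illustration of one widely used approach, DerSimonian-Laird random-effects meta-analysis with inverse-variance weighting; it is not the pipeline described above, and the function name and example numbers are hypothetical.

```python
import math

def random_effects_meta(effects, variances):
    """Pool per-study effect sizes (e.g., Hedges' g for one gene across cohorts)
    using DerSimonian-Laird random-effects inverse-variance weighting.
    Returns (pooled_effect, standard_error)."""
    # Fixed-effect (inverse-variance) weights and pooled estimate
    w = [1.0 / v for v in variances]
    sum_w = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum_w

    # Cochran's Q measures between-study heterogeneity
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum_w - sum(wi ** 2 for wi in w) / sum_w

    # Between-study variance tau^2, truncated at zero
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights add tau^2 to each study's variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Hypothetical summary effects for one gene from three cohorts
pooled, se = random_effects_meta([0.8, 0.5, 0.6], [0.04, 0.09, 0.05])
```

When the studies are homogeneous (Q below its degrees of freedom), tau² truncates to zero and the estimate reduces to the fixed-effect result; when cohorts disagree, tau² widens the pooled standard error, which is precisely why aggregating heterogeneous populations yields more honest confidence intervals than any single-center study.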
Reproducibility is a defining feature of science. Lately, however, serious concerns have been raised regarding the extent to which the results of research, especially biomedical research, are easily replicated. In this Editorial, we discuss to what extent reproducibility is a significant issue in chemical research and then suggest steps to minimize problems involving irreproducibility in chemistry.
To make replication studies more useful, researchers must make more of them, funders must encourage them and journals must publish them. No scientist wants to be the first to try to replicate another’s promising study: much better to know what happened when others tried it. Long before replication or reproducibility became major talking points, scientists had strategies to get the word out. Gossip was one. Researchers would compare notes at conferences, and a patchy network would be warned about whether a study was worth building on. Or a vague comment might be buried in a related publication. Tell-tale sentences would start "In our hands", "It is unclear why our results differed …" or "Interestingly, our results did not …".