University of Minnesota School of Public Health Assistant Professor Julian Wolfson was named an associate editor for reproducibility for the Journal of the American Statistical Association (JASA). The appointment is in support of the journal’s new requirement for authors to submit scientific code and data for review along with their papers.
Scientists, public servants, and patient advocates alike increasingly question the validity of published scientific results, endangering the public's acceptance of science. Here, I argue that emerging flaws in the integrity of the peer review system are largely responsible. Distortions in peer review are driven by economic forces and enabled by a lack of accountability among journals, editors, and authors. One approach to restoring trust in the validity of published results may be to establish basic rules that render peer review more transparent, such as publishing the reviews (a practice already embraced by some journals) and monitoring not only the track records of authors but also those of editors and journals.
"Promoting Responsible Scientific Research" is the title of a new report just released by the American Academy of Microbiology, a component of ASM. It grew out of an Academy colloquium held last October to tackle an issue that is unfortunately becoming well known both inside and outside scientific circles—the lack of rigor in science. I am delighted that the Academy and ASM are taking on this difficult issue and am grateful to all the participants, the Academy steering committee, and especially to Dr. Arturo Casadevall of Johns Hopkins University, who chaired the colloquium.
Sanjay Srivastava’s joke syllabus ("A Joke Syllabus With a Serious Point: Cussing Away the Reproducibility Crisis," The Chronicle, August 15) and Lee Jussim's blog post on Psychology Today about educating psychology students in light of the reproducibility crisis led me to reflect on my department’s recent curriculum changes. We have retooled or created from scratch multiple courses that engage something few of my colleagues seem to consider relevant to the problem: intellectual history. They instead hold firmly to the dictates of positivism, insisting that better training in the methods of science will be the source of rescue. Where does this prejudice come from? Might it be time to pave a new way?
Amid discussions around scientific reproducibility, the leading biomedical journal Cell will introduce a redesigned methods section to help authors clearly communicate how experiments are conducted. The first papers using Structured, Transparent, Accessible Reporting (STAR) Methods, which incorporates guidelines promoted by reagent-labeling and animal-experimentation reporting initiatives, appear in Cell on August 25. The format will then be adopted by other Cell Press journals over the next year, starting with Cell Systems in the fall.
A major contributor to the scientific reproducibility crisis has been that results from homogeneous, single-center studies do not generalize to heterogeneous, real-world populations. Multi-cohort gene expression analysis has helped to increase reproducibility by aggregating data from diverse populations into a single analysis. To make the multi-cohort analysis process more feasible, we have assembled an analysis pipeline that implements rigorously studied meta-analysis best practices. We have compiled and made publicly available the results of our own multi-cohort gene expression analysis of 103 diseases, spanning 615 studies and 36,915 samples, through a novel interactive web application. As a result, we have made both the process of and the results from multi-cohort gene expression analysis more approachable for non-technical users.
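The abstract above does not specify the pipeline's internals, but a standard building block of multi-cohort gene expression meta-analysis is pooling per-study effect sizes for a gene under a random-effects model. As a minimal illustrative sketch (not the authors' actual pipeline), the following implements DerSimonian-Laird pooling; the study effect sizes and variances are invented for the example.

```python
import math

def random_effects_pool(effects, variances):
    """Pool per-study effect sizes (e.g., Hedges' g for one gene)
    using the DerSimonian-Laird random-effects estimator."""
    # Fixed-effect (inverse-variance) weights and pooled estimate
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q measures between-study heterogeneity
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    # Between-study variance tau^2, truncated at zero
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    # Random-effects weights add tau^2 to each within-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical standardized effect sizes for one gene in three cohorts
effects = [0.8, 0.2, 0.5]
variances = [0.02, 0.02, 0.02]
pooled, se, tau2 = random_effects_pool(effects, variances)
print(round(pooled, 3), round(se, 3), round(tau2, 3))
```

When the studies disagree (as here), tau² grows and the pooled standard error widens, which is exactly why random-effects pooling across heterogeneous cohorts gives more honest uncertainty than analyzing any single homogeneous study.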