The Solution to Science's Replication Crisis

The solution to science's replication crisis is a new ecosystem in which scientists sell what they learn from their research. In each pairwise transaction, the information seller makes (loses) money if he turns out to be correct (incorrect). Responsibility for the determination of correctness is delegated, with appropriate incentives, to the information purchaser. Each transaction is brokered by a central exchange, which holds money from the anonymous information buyer and anonymous information seller in escrow, and which enforces a set of incentives facilitating the transfer of useful, bluntly honest information from the seller to the buyer. This new ecosystem, capitalist science, directly addresses socialist science's replication crisis by explicitly rewarding accuracy and penalizing inaccuracy.
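As one way to make the escrow mechanism concrete, here is a minimal sketch of the settlement step under stated assumptions: the payoff rule (the correct party takes the whole escrowed pot) and all names are illustrative, and the buyer-side incentives for honest judging, which the abstract says the exchange also enforces, are omitted.

```python
from dataclasses import dataclass

@dataclass
class EscrowedTrade:
    """One pairwise transaction held by the central exchange."""
    seller_stake: float   # seller's money at risk on the claim
    buyer_payment: float  # buyer's money held for the information

def settle(trade: EscrowedTrade, seller_was_correct: bool) -> tuple[float, float]:
    """Return (seller_payout, buyer_payout) after the buyer has judged
    the information. Correct seller: keeps the stake and earns the
    buyer's payment. Incorrect seller: buyer is refunded and also
    receives the seller's forfeited stake.
    """
    if seller_was_correct:
        return trade.seller_stake + trade.buyer_payment, 0.0
    return 0.0, trade.buyer_payment + trade.seller_stake

# Example: a seller stakes 100 on a claim the buyer paid 40 to see.
trade = EscrowedTrade(seller_stake=100.0, buyer_payment=40.0)
print(settle(trade, seller_was_correct=True))   # (140.0, 0.0)
print(settle(trade, seller_was_correct=False))  # (0.0, 140.0)
```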

Reproducibility in wireless experimentation: need, challenges, and approaches

Wireless networks are the key enabling technology of the mobile revolution. However, experimental mobile and wireless research is still hindered by the lack of a solid framework for adequately evaluating the performance of the wide variety of techniques and protocols proposed by the community. In this talk, I will motivate the need for experimental reproducibility as a prerequisite for healthy progress, one already accepted by other communities, and illustrate how those communities went through similar processes. I will then present the unique challenges of mobile and wireless experimentation and discuss past, current, and future approaches to addressing them. Finally, I will discuss how reproducibility extends to mobile and wireless security research.

An International Inter-Laboratory Digital PCR Study Demonstrates High Reproducibility for the Measurement of a Rare Sequence Variant

This study tested the claim that digital PCR (dPCR) can offer highly reproducible quantitative measurements in disparate labs. Twenty-one laboratories measured four blinded samples containing different quantities of a KRAS fragment encoding G12D, an important genetic marker for guiding therapy of certain cancers. This marker is challenging to quantify reproducibly using qPCR or NGS due to the presence of competing wild-type sequences and the need for calibration. Using dPCR, eighteen laboratories quantified the G12D marker within 12% of each other across all samples. The remaining three laboratories produced consistently outlying results; however, proper application of a recommended follow-up analysis rectified their data. Our findings show that dPCR is demonstrably reproducible across a large number of laboratories without calibration and could enable reproducible molecular stratification to guide therapy, and potentially molecular diagnostics more broadly.
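One plausible reading of the "within 12% of each other" criterion is a bound on each laboratory's relative deviation from the inter-laboratory median. The sketch below implements that reading with fabricated numbers; it is an interpretation, not the study's published analysis.

```python
import statistics

def flag_outlier_labs(measurements: dict[str, float], tolerance: float = 0.12) -> list[str]:
    """Flag labs whose measurement deviates from the inter-lab median
    by more than `tolerance` (relative). One plausible reading of
    agreement 'within 12%', not the study's actual method.
    """
    median = statistics.median(measurements.values())
    return [lab for lab, value in measurements.items()
            if abs(value - median) / median > tolerance]

# Illustrative (fabricated) concentrations for one blinded sample:
sample = {"lab01": 98.0, "lab02": 103.5, "lab03": 101.2, "lab04": 131.0}
print(flag_outlier_labs(sample))  # ['lab04']
```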

Validate your antibodies to improve reproducibility? Easier said than done

It seems like the most elementary of research principles: Make sure the cells and reagents in your experiment are what they claim to be and behave as expected. But when it comes to antibodies—the immune proteins used in all kinds of experiments to tag a molecule of interest in a sample—that validation process is not straightforward. Research antibodies from commercial vendors are often screened and optimized for narrow experimental conditions, which means they may not work as advertised for many scientists. Indeed, problems with antibodies are thought to have led many drug developers astray and generated a host of misleading or irreproducible scientific results.

A Reproducibility Study of Information Retrieval Models

Developing effective information retrieval models has been a long-standing challenge in Information Retrieval (IR), and significant progress has been made over the years. With the increasing number of retrieval functions being developed and new data collections being released, it becomes difficult, if not impossible, to compare a new retrieval function with all existing retrieval functions over all available data collections. To tackle this problem, this paper describes our efforts to construct a platform that aims to improve the reproducibility of IR research and facilitate the evaluation and comparison of retrieval functions.
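To make the comparison task concrete, below is a minimal sketch of what such a platform automates: running each candidate retrieval function over each test collection. The BM25 formula is the standard one; everything else (function names, toy data) is an illustrative assumption, not the paper's actual platform or interface.

```python
import math
from collections import Counter

def bm25(query, doc_tf, df, n_docs, avg_len, k1=1.2, b=0.75):
    """Classic BM25 term weighting -- one example of the kind of
    retrieval function such a platform would re-implement and compare."""
    doc_len = sum(doc_tf.values())
    score = 0.0
    for term in query:
        if term not in doc_tf:
            continue
        # Lucene-style IDF, kept non-negative by the +1 inside the log.
        idf = math.log(1 + (n_docs - df[term] + 0.5) / (df[term] + 0.5))
        tf = doc_tf[term]
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * doc_len / avg_len))
    return score

def rank(query, docs, score_fn):
    """Rank a toy tokenized collection for a tokenized query; comparing
    retrieval functions means re-running this with each score_fn over
    each test collection and evaluation metric."""
    n_docs = len(docs)
    avg_len = sum(len(d) for d in docs) / n_docs
    df = Counter(term for d in docs for term in set(d))  # document frequencies
    scored = [(score_fn(query, Counter(d), df, n_docs, avg_len), i)
              for i, d in enumerate(docs)]
    return sorted(scored, reverse=True)

# Toy collection and query, purely illustrative.
docs = [["wireless", "network", "protocol"],
        ["digital", "pcr", "reproducibility"],
        ["retrieval", "model", "evaluation", "reproducibility"]]
print(rank(["reproducibility", "evaluation"], docs, bm25))
```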