While computer science offers a large set of tools for prototyping, writing, running, testing, validating, sharing and replicating results, computational science still lags behind. In the best case, authors may provide the sources of their research as a compressed archive and feel confident that their research is replicable. But this is rarely true. Buckheit & Donoho (1995) explained almost 20 years ago that an article about a computational result is advertising, not scholarship: the actual scholarship is the full software environment, code and data that produced the result. The "computational" part of computational science implies the use of computers, operating systems, tools, frameworks, libraries and data. This leads to such a large number of combinations (taking into account the version of each component) that the chances of having the exact same configuration as one of your colleagues are nearly zero. This has consequences for how each of us approaches computation if we want to make sure that computational research can actually and faithfully be replicated.
ReScience is a peer-reviewed journal that targets computational research and encourages the explicit replication of already published research, promoting new and open-source implementations in order to ensure that the original research is reproducible. To achieve this goal, the whole publishing chain is radically different from that of any traditional scientific journal. ReScience lives on GitHub, where each new implementation of a computational study is made available together with comments, explanations and tests. Each submission takes the form of a pull request that is publicly reviewed and tested in order to guarantee that any researcher can re-use it. If you have ever replicated computational results from the literature in your own research, ReScience is the perfect place to publish your new implementation.
The extent of the problem
Here is a (very limited) list of publications on the problem of replicability in computational science that motivated the creation of ReScience:
Writing Software Specifications
K. Hinsen, Computing in Science and Engineering, 2015.

A long journey into reproducible computational neuroscience
M. Topalidou, A. Leblois, T. Boraud and N.P. Rougier, Frontiers in Computational Neuroscience, 2015.

Four aspects to make science open "by design" and not as an after-thought
Y.O. Halchenko and M. Hanke, GigaScience, 2015.

Computational science: shifting the focus from tools to models
K. Hinsen, F1000Research, 2014.

Best Practices for Scientific Computing
G. Wilson, D.A. Aruliah, C.T. Brown, N.P. Chue Hong, M. Davis, R.T. Guy, S.H.D. Haddock, K.D. Huff, I.M. Mitchell, M.D. Plumbley, B. Waugh, E.P. White and P. Wilson, PLOS Biology, 2014.

Learning from the Past: Approaches for Reproducibility in Computational Neuroscience
S.M. Crook, A.P. Davison and H.E. Plesser, 20 Years of Computational Neuroscience, 2013
(→ contact Sharon Crook to request a copy).