Reproducible Science is good. Replicated Science is better.
ReScience C is a platinum open-access peer-reviewed journal that targets computational research and encourages the explicit replication of already published research, promoting new open-source implementations in order to ensure that the original research is reproducible. You can read about the ideas behind ReScience C in the article "Sustainable computational science: the ReScience initiative".
To achieve this goal, the whole publishing chain is radically different from that of traditional scientific journals. ReScience C lives on GitHub, where each new implementation of a computational study is made available together with comments, explanations and tests. Each submission takes the form of an issue that is publicly reviewed and tested in order to guarantee that any researcher can re-use it. If you have ever replicated (or failed to replicate) computational results from the literature in your research, ReScience C is the perfect place to publish your new implementation.
ReScience C is collaborative and open by design. Everything can be forked and modified. Don’t hesitate to write a submission, to join us, or to become a reviewer.
Latest publications
- Replication in Computational Neuroscience (Python) | 10.5281/zenodo.10257800 | PDF | Code | Data | Review | BibTeX
  Lima, V., Shimoura, R.O., Kamiji, N.L., Battaglia, D., and Roque, A.C. 2023. [Re] Inter-areal Balanced Amplification Enhances Signal Propagation in a Large-Scale Circuit Model of the Primate Cortex. ReScience C 9, 1, #3.
- Replication in Ecology (R) | 10.5281/zenodo.10371655 | PDF | Code | Review | BibTeX
  Doyen, G., Picoche, C., and Barraquand, F. 2023. [Re] Biodiversity of plankton by species oscillations and chaos. ReScience C 9, 1, #4.
- Replication in Computer Science (C) | 10.5281/zenodo.10275726 | PDF | Code | Review | BibTeX
  Legrand, A. and Velho, P. 2023. [Re] Velho and Legrand (2009) - Accuracy Study and Improvement of Network Simulation in the SimGrid Framework. ReScience C 6, 1, #20.
Latest News
- Ten Years Reproducibility Challenge
Have you ever tried to run old code that you wrote for a scientific article published years ago? Did you encounter any problems? Were you successful? We are curious to hear your story, and that is why we are editing a special issue of ReScience to collect these accounts.
The ten years reproducibility challenge is an invitation for researchers to try to run the code they created for a scientific publication that was published more than ten years ago. This code can be anything (statistical analysis, numerical simulation, data processing, etc.), can be written in any language, and can address any scientific domain. The only mandatory condition to enter the challenge is to have published, before 2010, a scientific article in a journal or a conference with proceedings that contains results produced by code, irrespective of whether this code was published in some form at the time or not.
Note that we do not ask you to write a new version of your old code. Instead, we ask you to try to make your old code run on modern hardware and software (with minimal modifications) in order to check whether you can obtain the exact same results that were published at least ten years ago (a small sketch of such a check is shown at the end of this section).
Sounds easy? We have good reasons to believe this might be more difficult than you think. And maybe the first problem you will have to solve is finding your own source code.
More information at: rescience.github.io/ten-years
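The challenge does not prescribe any particular way of comparing your rerun against the published numbers, but as an illustration, a minimal sketch of such a check in Python might look like the following. The file names, column name, and CSV layout are hypothetical placeholders, not part of the challenge rules.

```python
# Minimal sketch (assumptions: results stored as CSV files with a numeric
# "value" column; file names are hypothetical placeholders).
import csv

def load_column(path, column):
    """Read a single numeric column from a CSV file."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

original = load_column("results_2009.csv", "value")   # values from the old paper
rerun = load_column("results_rerun.csv", "value")     # values from the new run

# "Exact same results" ideally means value-for-value equality; if you observe
# small numerical drift instead, that is itself worth documenting in your report.
mismatches = [(i, a, b) for i, (a, b) in enumerate(zip(original, rerun)) if a != b]
print(f"{len(mismatches)} of {len(original)} values differ from the published ones")
```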