Ten Years Reproducibility Challenge
Have you ever tried to run old code that you wrote for a scientific article published years ago? Did you encounter any problems? Were you successful? We are curious to hear your story, which is why we are editing a special issue of ReScience to collect these stories.
The ten years reproducibility challenge is an invitation for researchers to try to run the code they created for a scientific publication that appeared more than ten years ago. This code can be anything (statistical analysis, numerical simulation, data processing, etc.), can be written in any language, and can address any scientific domain. The only mandatory condition to enter the challenge is to have published a scientific article before 2010, in a journal or a conference with proceedings, which contains results produced by code, irrespective of whether this code was published in some form at the time or not.
Note that we do not ask you to write a new version of your old code. We ask you instead to try to make your old code run on modern hardware/software (with minimal modifications) in order to check whether you can obtain the exact same results that were published at least ten years ago.
Sounds easy? We have good reason to think this might be more difficult than you expect. And perhaps the first problem you'll have to solve is finding your own source code.
More information at: rescience.github.io/ten-years
Four years ago, we launched ReScience, a new scientific journal aimed at publishing replications of existing computational research. Since publishing its first article, ReScience has progressed steadily. We are still alive, independent, and without a budget. In the meantime, we have published around 25 articles, and the initial board has grown from around 10 to roughly 100 members (editors and reviewers). We have advertised ReScience at several conferences worldwide, given some interviews, and published an article introducing ReScience in PeerJ Computer Science. Based on our experience managing the journal during these four years, we thought the time was ripe for some changes. Read our editorial if you want to know more.
Sustainable computational science: the ReScience initiative
We just published our white paper on ReScience in PeerJ
Abstract: Computer science offers a large set of tools for prototyping, writing, running, testing, validating, sharing and reproducing results; however, computational science lags behind. In the best case, authors may provide their source code as a compressed archive and feel confident their research is reproducible. But this is not exactly true. James Buckheit and David Donoho proposed more than two decades ago that an article about computational results is advertising, not scholarship. The actual scholarship is the full software environment, code, and data that produced the result. This implies new workflows, in particular in peer review. Existing journals have been slow to adapt: source code is rarely requested and is hardly ever actually executed to check that it produces the results advertised in the article. ReScience is a peer-reviewed journal that targets computational research and encourages the explicit replication of already published research, promoting new and open-source implementations in order to ensure that the original research can be replicated from its description. To achieve this goal, the whole publishing chain is radically different from that of other traditional scientific journals. ReScience resides on GitHub, where each new implementation of a computational study is made available together with comments, explanations, and software tests.
Reference: N.P. Rougier et al. (2017) Sustainable computational science: the ReScience initiative. PeerJ Computer Science 3:e142. DOI: doi.org/10.7717/peerj-cs.142
- Ten Years Reproducibility Challenge
- ReScience (R)evolution
- Sustainable computational science: the ReScience initiative
- Call for replication
- You can help by becoming a reviewer
- New year, new challenges ahead
- ReScience: A Scientific Journal Living on Github
- Official creation
- Review process test is over