ReScience C

ReScience C
Discipline: Reproducibility
Language: English
Edited by: Olivia Guest, Benoît Girard, Konrad Hinsen, Nicolas Rougier[1]
Publication details
History: 2015–present[1]
Open access: diamond/platinum
License: CC BY 4.0[2]
Standard abbreviations
ISO 4: ReSci. C
Indexing
ISSN: 2430-3658

ReScience C is a journal created in 2015 by Nicolas Rougier and Konrad Hinsen to publish researchers' attempts to replicate computations made by other authors, using independently written, free and open-source software (FOSS), with an open peer-review process.[1] The journal holds that requiring the replication software to be free and open-source ensures the reproducibility of the original research.[3]

Creation

ReScience C was created in 2015 by Nicolas Rougier and Konrad Hinsen in the context of the replication crisis of the early 2010s, during which concern about the difficulty of replicating (with different data or details of method) or reproducing (with the same data and method) peer-reviewed, published research was widely discussed.[4] ReScience C's scope is computational research, motivated by the observation that journals rarely require source code to be provided, and that, when it is provided, it is rarely checked against the results claimed in the research article.[5]

Policies and methods

The scope of ReScience C is mainly focused on researchers' attempts to replicate computations made by other authors, using independently written, free and open-source software (FOSS).[1] Articles are submitted using the "issues" feature of a Git repository hosted on GitHub, and accepted articles are archived with other online services, including Zenodo and Software Heritage. Peer review takes place publicly in the same issue threads.[2]

In 2020, Nature reported on the results of ReScience C's "Ten Years' Reproducibility Challenge", in which scientists were asked to try to reproduce the results of peer-reviewed articles that they had published at least ten years earlier, using the same data and software where possible, updated to a modern software environment and free licensing.[1] As of August 2020, 35 researchers had proposed to reproduce the results of 43 of their old articles; 28 reports had been written, 13 had been accepted after peer review and published, and 11 of those documented successful reproductions.[1]

References

  1. Perkel, Jeffrey M. (2020-08-24). "Challenge to scientists: does your ten-year-old code still run?". Nature 584 (7822): 656–658. doi:10.1038/d41586-020-02462-7. PMID 32839567. Bibcode: 2020Natur.584..656P.
  2. "Overview of the submission process". 2020. https://rescience.github.io/write.
  3. "Reproducible Science is good. Replicated Science is better.". 2020. https://rescience.github.io/.
  4. Pashler, Harold; Wagenmakers, Eric-Jan (2012). "Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?". Perspectives on Psychological Science 7 (6): 528–530. doi:10.1177/1745691612465253. PMID 26168108.
  5. Rougier, Nicolas P.; Hinsen, Konrad (2017-12-18). "Sustainable computational science: the ReScience initiative". PeerJ Computer Science 3: e142. doi:10.7717/peerj-cs.142. ISSN 2376-5992. PMID 34722870. Bibcode: 2017arXiv170704393R.
