CODECHECK: An Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility

Research output: Contribution to journal › Research article › Contributed › peer-review

Contributors

  • Daniel Nüst, University of Münster (Author)
  • Stephen J. Eglen, University of Cambridge (Author)

Abstract

The traditional scientific paper falls short of effectively communicating computational research. To help improve this situation, we propose a system by which the computational workflows underlying research articles are checked. The CODECHECK system uses open infrastructure and tools and can be integrated into review and publication processes in multiple ways. We describe these integrations along multiple dimensions (importance, who, openness, when). In collaboration with academic publishers and conferences, we demonstrate CODECHECK with 25 reproductions of diverse scientific publications. These CODECHECKs show that asking for reproducible workflows during a collaborative review can effectively improve executability. While CODECHECK has clear limitations, it may represent a building block in Open Science and publishing ecosystems for improving the reproducibility, appreciation, and, potentially, the quality of non-textual research artefacts. The CODECHECK website can be accessed here: https://codecheck.org.uk/.

Details

Original language: English
Article number: 253
Journal: F1000Research
Volume: 10
Publication status: Published - 2021
Peer-reviewed: Yes
Externally published: Yes

External IDs

PubMed: 34367614
ORCID: /0000-0002-0024-5046/work/142255093

Keywords

  • Code sharing
  • Data sharing
  • Open Science
  • Peer review
  • Quality control
  • Reproducibility
  • Reproducible research
  • Scholarly publishing