CODECHECK: An Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility
Research output: Contribution to journal › Research article › Contributed › peer-reviewed
Abstract
The traditional scientific paper falls short of effectively communicating computational research. To help improve this situation, we propose a system by which the computational workflows underlying research articles are checked. The CODECHECK system uses open infrastructure and tools and can be integrated into review and publication processes in multiple ways. We describe these integrations along multiple dimensions (importance, who, openness, when). In collaboration with academic publishers and conferences, we demonstrate CODECHECK with 25 reproductions of diverse scientific publications. These CODECHECKs show that asking for reproducible workflows during a collaborative review can effectively improve executability. While CODECHECK has clear limitations, it may represent a building block in Open Science and publishing ecosystems for improving the reproducibility, appreciation, and, potentially, the quality of non-textual research artefacts. The CODECHECK website can be accessed here: https://codecheck.org.uk/.
Details
| Original language | English |
| --- | --- |
| Article number | 253 |
| Journal | F1000Research |
| Volume | 10 |
| Publication status | Published - 2021 |
| Peer-reviewed | Yes |
| Externally published | Yes |
External IDs
| PubMed | 34367614 |
| --- | --- |
| ORCID | /0000-0002-0024-5046/work/142255093 |
Keywords
- Code sharing, Data sharing, Open Science, Peer review, Quality control, Reproducibility, Reproducible research, Scholarly publishing