Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty

Research output: Contribution to journal (research article, contributed, peer-reviewed)

Contributors

  • Crowdsourced Replication Initiative (Author)
  • Daniel Nüst, University of Münster (Author)
  • University of Bremen
  • University of Leeds
  • University of Mannheim
  • Ludwig Maximilian University of Munich
  • Bremen International Graduate School of Social Sciences
  • Indiana University Bloomington
  • German Institute for Economic Research
  • Max Planck Institute for Research On Collective Goods
  • Chemnitz University of Technology
  • University of Cambridge
  • Johannes Gutenberg University Mainz
  • Heidelberg University 
  • University Hospital Frankfurt
  • University of Konstanz
  • University of Bamberg
  • Peace Research Institute Frankfurt
  • The London School of Economics and Political Science
  • Leibniz Institute for the Social Sciences
  • Hertie School of Governance
  • Umeå University
  • University College London
  • University of Amsterdam
  • University of Texas Rio Grande Valley
  • Austrian Academy of Sciences
  • Gesundheit Österreich GmbH
  • University of Zurich
  • Universidad de Chile
  • Pontificia Universidad Católica de Chile
  • Loyola Marymount University
  • University of Edinburgh
  • Sciensano
  • Sciences Po
  • Free University of Berlin
  • KU Leuven
  • Europe University Viadrina
  • Leibniz Institute for Educational Trajectories

Abstract

This study explores how researchers’ analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers’ expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team’s workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers’ results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.
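The abstract's central quantitative claim is that coded analytic decisions explain almost none of the between-team variance in reported effects. A minimal, purely illustrative sketch of that variance-decomposition logic (all numbers simulated; not the paper's data or code) looks like this: simulate per-team effect estimates that vary mostly idiosyncratically, then compute how much variance a single coded decision accounts for.

```python
import random

random.seed(1)

# Hypothetical illustration of a many-analysts variance decomposition:
# each "team" reports one effect estimate; we ask how much of the
# between-team variance is explained by a single coded analytic decision.
n_teams = 73
decision = [random.choice([0, 1]) for _ in range(n_teams)]  # e.g. a model choice
# Estimates vary mostly idiosyncratically; the decision shifts them only slightly.
estimate = [0.05 * d + random.gauss(0, 1) for d in decision]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# R^2 for a one-predictor "model": variance of group means over total variance.
mean0 = sum(e for e, d in zip(estimate, decision) if d == 0) / decision.count(0)
mean1 = sum(e for e, d in zip(estimate, decision) if d == 1) / decision.count(1)
fitted = [mean1 if d else mean0 for d in decision]
r2 = variance(fitted) / variance(estimate)
print(f"share of variance explained: {r2:.3f}")  # small; most variance unexplained
```

Under these simulated conditions the explained share stays near zero, mirroring (but not reproducing) the paper's finding that identifiable decisions account for under 5% of the variation.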

Details

Original language: English
Article number: e2203150119
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 119
Issue number: 44
Publication status: Published - 1 Nov 2022
Peer-reviewed: Yes
Externally published: Yes

External IDs

PubMed: 36306328
ORCID: /0000-0002-0024-5046/work/144671595

Keywords

  • Sustainable Development Goals
  • analytical flexibility
  • immigration
  • many analysts
  • metascience
  • policy preferences
  • researcher degrees of freedom