I4R’s First Meta-Research Paper, Published in Nature
January 15, 2026 · Institute for Replication
At the Institute for Replication, our mission has always been simple: make reproduction and replication a normal, scalable part of scientific research.
Over the past few years, we have been working toward that goal by mass reproducing and replicating research across leading journals in economics, political science, and beyond. Through partnerships, Replication Games, and open calls, we have already reproduced hundreds of studies—focusing in particular on recent publications (typically 2022 onward), where results are most policy-relevant and where engagement with original authors is still active.
This work is collaborative by design. We engage directly with authors, share findings, and encourage dialogue.
The paper first appeared as a preprint in early April 2024, and all reproducers who contributed a reproducibility report were offered coauthorship. We are excited to share that this meta-research paper, synthesizing these efforts, is now published in Nature.
From individual reproductions/replications to a systematic evidence base
Until now, most discussions of reproducibility in economics and political science have relied on small samples of studies or one-off replication projects. Our goal was to move beyond that and generate systematic evidence on reproducibility and robustness at scale.
In this project, we reproduced and replicated claims from 110 papers published in leading economics and political science journals. Each study underwent structured computational reproducibility checks, as well as robustness and replication analyses conducted by independent research teams. The scale and coordination of this effort make it one of the most comprehensive examinations of reproducibility in the social sciences to date.
What did we learn?
The paper provides a rich set of findings about the state of reproducibility:
• Computational Reproducibility. Using a standardized scoring system, teams found substantial heterogeneity—from studies that could not be reproduced due to missing materials to those that were fully reproducible end-to-end. But overall, we find high rates of computational reproducibility, potentially due to the recency of the articles and our focus on leading journals.
• Excluding minor issues (e.g., missing packages or broken file paths), about 25% of studies contain coding errors, with some papers having multiple errors.
• Based on 5,511 re-analyses, the overall robustness rate is 70%. Robustness is higher when re-analyses introduce new data, and lower when they modify the sample or redefine the dependent variable.
• Using a many-analysts approach with six independent teams, results show that more experienced replicators tend to find lower robustness, while data availability (including raw data and cleaning code) is not strongly associated with robustness.
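To make the robustness figures above concrete, here is a minimal sketch of how an overall robustness rate can be computed from individual re-analysis outcomes. The binary coding scheme (1 = result robust, 0 = not robust) and the toy data are illustrative assumptions, not I4R's actual scoring protocol.

```python
def robustness_rate(outcomes):
    """Share of re-analyses coded as robust (values are 0 or 1)."""
    return sum(outcomes) / len(outcomes)

# Toy data: 10 hypothetical re-analyses, 7 of which confirm the original result.
outcomes = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]
print(f"{robustness_rate(outcomes):.0%}")  # → 70%
```

In the study itself, the analogous rate is computed over 5,511 re-analyses, yielding the 70% figure reported above.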
Taken together, the results reinforce a key message: reproducibility is not binary. It exists on a spectrum, and improving it requires attention to both technical practices and research design.
What comes next
This meta-study is only the beginning.
As I4R continues to expand its collaborations with journals and research communities, we aim to:
• Better understand the predictors of robustness and reproducibility, building on insights from large-scale meta-research.
• Scale up to 500 reproduced studies by the end of 2026, significantly expanding the evidence base.
• Deepen collaborations with leading journals such as Psychological Science to embed reproducibility into the publication process.
Reach out to us if you want to participate!
References
Preprint: Brodeur, A. et al. (2024). Mass Reproducibility and Replicability: A New Hope. I4R Discussion Paper No. 107.
Article: Brodeur, A. et al. (2026). Reproducibility and robustness of economics and political science research. Nature. https://www.nature.com/articles/s41586-026-10251-x