Pulse Brain · Growing Health Evidence Index
Tier 3 — Observational / field trial · Peer-reviewed

Reproducibility of real-world evidence studies using clinical practice data to inform regulatory and coverage decisions

Shirley Wang, Sushama Kattinakere Sreedhara, Sebastian Schneeweiss, REPEAT Initiative, Jessica M. Franklin, Joshua J. Gagne, Krista F. Huybrechts, Elisabetta Patorno, Yinzhu Jin, Moa Lee, Mufaddal Mahesri, Ajinkya Pawar, Julie Barberio, Lily G. Bessette, Kristyn Chin, Nileesa Gautam, Adrian Santiago Ortiz, Ellen Sears, Kristina Stefanini, Mimi Zakarian, Sara Z. Dejene, James R. Rogers, Gregory Brill, Joan Landon, Joyce Lii, Theodore Tsacogianis, Seanna Vine, Elizabeth M. Garry, Liza R. Gibbs, Monica Gierada, Danielle L. Isaman, Emma Payne, Sarah Alwardt, Peter Arlett, Dorothee B. Bartels, Andrew Bate, Jesse A. Berlin, Alison Bourke, Brian D. Bradbury, Jeffrey S. Brown, K.L. Burnett, Troyen A. Brennan, K. Arnold Chan, Nam‐Kyong Choi, Frank de Vries, Hans‐Georg Eichler, Kristian B. Filion, Lisa Freeman, Jesper Hallas, Laura E. Happe, Sean Hennessy, Páll Jónsson, John P. A. Ioannidis, Javier Jiménez, Kristijan H. Kahler, Christine Laine, Elizabeth Loder, Amr Makady, David Martin, Michael Nguyen, Brian A. Nosek, Richard Platt, Robert W. Platt, John D. Seeger, William H. Shrank, Liam Smeeth, Henrik Toft Sørensen, Peter Tugwell, Yoshiaki Uyama, Richard J. Willke, Wolfgang C. Winkelmayer, Deborah A. Zarin

Nature Communications · 2022


Summary

This multi-centre reproducibility initiative assessed 150 studies that used real-world clinical practice data, to evaluate the reliability of findings intended to inform regulatory and coverage decisions. Original and reproduction effect sizes showed a strong positive correlation (r = 0.85), with a median relative effect magnitude of 1.0, indicating that whilst most results were closely reproduced, a subset showed meaningful divergence, largely attributable to incomplete reporting and updates to the underlying databases. The authors conclude that greater methodological transparency and adherence to reporting guidance would improve reproducibility and validity assessment, supporting more robust evidence-based decision-making.

UK applicability

The findings are broadly applicable to UK healthcare systems and regulatory bodies (MHRA, NICE) that rely on real-world evidence for medical product assessment and coverage decisions. Recommendations for improved reporting and methodological transparency align with UK standards for evidence evaluation and would strengthen the credibility of studies informing NHS policy.

Key measures

Pearson's correlation between original and reproduction effect sizes (0.85); median relative magnitude of effect, defined as the ratio of the original hazard ratio to the reproduction hazard ratio, of 1.0 (IQR 0.9–1.1; range 0.3–2.1); completeness of methodological reporting

Outcomes reported

The study reproduced results from 150 peer-reviewed studies analysing real-world evidence from digital clinical practice data and evaluated reporting completeness for 250 studies. Reproducibility was assessed by comparing original and reproduction effect sizes across healthcare databases.

Theme
Measurement & metrics
Subject
Measurement methods & nutrient profiling
Study type
Research
Study design
Systematic reproducibility assessment
Source type
Peer-reviewed study
Status
Published
Geography
International
System type
Human clinical
DOI
10.1038/s41467-022-32310-3
Catalogue ID
SNmoixnvw4-2i4s1d
