Published in: Insights into Imaging 1/2020

01-12-2020 | Original Article

An analysis of key indicators of reproducibility in radiology

Authors: Bryan D. Wright, Nam Vo, Johnny Nolan, Austin L. Johnson, Tyler Braaten, Daniel Tritz, Matt Vassar



Abstract

Background

Given the central role of radiology in patient care, it is important that radiological research is grounded in reproducible science. Whether radiologic research is reproducible and transparent, however, remains unclear.

Purpose

To analyze the published radiology literature for the presence or absence of key indicators of reproducibility.

Methods

This retrospective cross-sectional study was performed by searching the National Library of Medicine (NLM) for publications in radiology journals. Publications were eligible if they were MEDLINE-indexed, written in English, and published between January 1, 2014, and December 31, 2018. We randomly sampled 300 publications for this study. A pilot-tested Google form was used to record information from each publication regarding indicators of reproducibility. Following peer review, we extracted data from an additional 200 publications in an attempt to reproduce our initial results. These additional 200 publications were selected from the initially randomized list.
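The random-sampling step described above can be sketched as follows. This is an illustrative sketch only: the authors' actual randomization procedure is not described in the abstract, and the seed value is an assumption introduced here so the sample itself is reproducible.

```python
import random

# Record count taken from the Results section; the seed is a hypothetical
# choice, not something reported by the authors.
N_RECORDS = 295_543
SAMPLE_SIZE = 300

random.seed(2018)  # fixed seed makes the draw repeatable (assumed, not from the paper)

# Simple random sample of 300 record indices without replacement.
sample = random.sample(range(N_RECORDS), SAMPLE_SIZE)
```

Fixing the seed (or publishing the sampled record IDs) is itself one of the transparency practices the study measures: it lets an independent team regenerate exactly the same sample.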

Results

Our initial search returned 295,543 records, from which 300 were randomly selected for analysis. Of these 300 records, 294 met inclusion criteria and 6 did not. Among the empirical publications, 5.6% (11/195, [3.0–8.3]) contained a data availability statement, 0.51% (1/195) provided clearly documented raw data, 12.0% (23/191, [8.4–15.7]) provided a materials availability statement, 0% provided analysis scripts, 4.1% (8/195, [1.9–6.3]) provided a pre-registration statement, 2.1% (4/195, [0.4–3.7]) provided a protocol statement, and 3.6% (7/195, [1.5–5.7]) were pre-registered. The validation study of the 5 key indicators of reproducibility—availability of data, materials, protocols, analysis scripts, and pre-registration—resulted in 2 indicators (availability of protocols and analysis scripts) being reproduced, as they fell within the 95% confidence intervals for the proportions from the original sample. However, the materials availability and pre-registration proportions from the validation sample were lower than those found in the original sample.
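The bracketed ranges above are 95% confidence intervals for binomial proportions. The abstract does not state which interval method the authors used, so the sketch below uses the Wilson score interval, one common choice; its bounds for the data-availability indicator will therefore be close to, but not identical with, the reported [3.0–8.3].

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% when z = 1.96)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Data availability statements: 11 of 195 empirical publications.
lo, hi = wilson_ci(11, 195)
```

Unlike the simpler Wald interval, the Wilson interval behaves well for the small proportions reported here (it cannot dip below 0 even when only 1 of 195 publications has the indicator).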

Conclusion

Our findings demonstrate that key indicators of reproducibility are missing from the radiology literature. The ability to reproduce studies reported in radiology publications may therefore be limited, with potential clinical implications.
Metadata
Publisher: Springer Berlin Heidelberg
Electronic ISSN: 1869-4101
DOI: https://doi.org/10.1186/s13244-020-00870-x
