
01-04-2019 | Pediatric Radiology | Minisymposium: Quality and safety

Survey of peer review programs among pediatric radiologists: report from the SPR Quality and Safety Committee

Authors: Ramesh S. Iyer, David W. Swenson, Neil Anand, Einat Blumfield, Tushar Chandra, Govind B. Chavhan, Thomas R. Goodman, Naeem Khan, Michael M. Moore, Thang D. Ngo, Christina L. Sammet, Raymond W. Sze, Chido D. Vera, A. Luana Stanescu

Published in: Pediatric Radiology | Issue 4/2019


Abstract

During the last 15 years, peer review has been widely incorporated into radiology quality improvement programs. However, current implementations are variable and carry concerns, including subjectivity of numerical scores and a sense of merely satisfying regulatory requirements. The Society for Pediatric Radiology (SPR) Quality and Safety Committee sought to evaluate the state of peer review programs in pediatric radiology practices, including implementation methods, perceived functions, strengths and weaknesses, and opportunities for improvement. We distributed an online 16-question survey to SPR members. Questions pertained to the type of peer review system, the use of numerical scores and comments, how feedback on discordances is given and received, and the use of peer learning conferences. We collected 219 responses (15% of survey invitations), 80% of which were from children’s hospitals. Fifty percent of respondents said they use a picture archiving and communication system (PACS)-integrated peer review system. Comment-enhanced feedback for interpretive discordances was either very important or somewhat important to performance improvement in 86% of responses, compared to 48% with a similar perception of numerical scores. Sixty-eight percent of respondents said they either rarely or never check their numerical scores, and 82% either strongly or somewhat agreed that comments are more effective feedback than numerical scores. Ninety-three percent either strongly or somewhat agreed that peer learning conferences would be beneficial to their practice. Forty-eight percent thought that their current peer review system should be modified. Survey results demonstrate that peer review systems in pediatric radiology practices are implemented variably, and nearly half of respondents believe their systems should be modified. Most respondents prefer feedback in the form of comments and peer learning conferences, which are thought to be more beneficial for performance improvement than numerical scores.
Metadata
Publication date
01-04-2019
Publisher
Springer Berlin Heidelberg
Published in
Pediatric Radiology / Issue 4/2019
Print ISSN: 0301-0449
Electronic ISSN: 1432-1998
DOI
https://doi.org/10.1007/s00247-018-4289-3
