Open Access 01-12-2013 | Research article

A workstation-integrated peer review quality assurance program: pilot study

Authors: Margaret M O’Keeffe, Todd M Davis, Kerry Siminoski

Published in: BMC Medical Imaging | Issue 1/2013

Abstract

Background

Consistency of assessment between radiologists has become the accepted surrogate indicator of radiological excellence, and peer review has become the standard technique for evaluating concordance. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice.

Methods

Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if neither the reviewing radiologist nor the original interpreting radiologist had exceeded their review targets, the case was scored using the modified RADPEER system.
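
This selection logic can be illustrated with a short sketch. The Python below is a hypothetical rendering of the eligibility rules just described; the class, function, and target value are assumptions for illustration, not the actual Intelerad implementation.

    import random
    from dataclasses import dataclass

    @dataclass
    class Case:
        accession: str
        has_prior_study: bool      # an appropriate prior report exists for comparison
        reviewer: str              # radiologist currently reporting (the peer reviewer)
        original_reader: str       # radiologist who interpreted the prior study

    # Hypothetical cap; the paper does not state the numeric review targets.
    REVIEW_TARGET = 20

    def select_for_review(active_cases, reviews_done):
        """Randomly choose one eligible case from those being actively reported.

        Eligibility mirrors the rules described above: an appropriate prior
        study exists, and neither the reviewing radiologist nor the original
        interpreting radiologist has exceeded their review target.
        """
        eligible = [
            c for c in active_cases
            if c.has_prior_study
            and reviews_done.get(c.reviewer, 0) < REVIEW_TARGET
            and reviews_done.get(c.original_reader, 0) < REVIEW_TARGET
        ]
        return random.choice(eligible) if eligible else None

In the study itself these checks ran within the reporting software, so eligible cases surfaced during the normal workday rather than in a separate review session.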

Results

There were 2,241 cases randomly assigned for peer review. Of selected cases, 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%) and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found scores appropriate, and 65% felt feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable.
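
The headline rates follow directly from the counts reported above; a minimal arithmetic check in Python (variable names chosen here for illustration):

    assigned = 2241            # cases randomly assigned for peer review
    interpreted = 1705         # reviews actually completed
    positive_feedback = 3      # score 0
    significant = 10           # scores 3 or 4 (clinically significant discrepancy)
    concordant = interpreted - significant   # scores 0 to 2

    print(f"completion rate: {interpreted / assigned:.0%}")              # 76%
    print(f"concordance: {concordant / interpreted:.1%}")                # 99.4%
    print(f"significant discrepancy: {significant / interpreted:.1%}")   # 0.6%
    print(f"positive feedback: {positive_feedback / interpreted:.1%}")   # 0.2%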

Conclusions

The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
Metadata
Publisher: BioMed Central
Electronic ISSN: 1471-2342
DOI: https://doi.org/10.1186/1471-2342-13-19
