Open Access 01-12-2013 | Research article

Reliability of medical record abstraction by non-physicians for orthopedic research

Authors: Michael Y Mi, Jamie E Collins, Vladislav Lerner, Elena Losina, Jeffrey N Katz

Published in: BMC Musculoskeletal Disorders | Issue 1/2013

Abstract

Background

Medical record review (MRR) is one of the most commonly used research methods in clinical studies because it provides rich clinical detail. However, because MRR involves subjective interpretation of information found in the medical record, it is critically important to understand the reproducibility of data obtained from MRR. Furthermore, because medical record review is both technically demanding and time intensive, it is important to establish whether research staff without clinical training can be trained to abstract medical records reliably.

Methods

We assessed the reliability of abstraction of medical record information in a sample of patients who underwent total knee replacement (TKR) at a referral center. An orthopedic surgeon instructed two research coordinators (RCs) in the abstraction of inpatient medical records and operative notes for patients undergoing primary TKR. The two RCs and the surgeon each independently reviewed 75 patients’ records, and one RC reviewed the records twice. Agreement was assessed using the proportion of items on which reviewers agreed and the kappa statistic.
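
As a concrete illustration of the two agreement measures named above, here is a minimal sketch in Python of percent agreement and Cohen's kappa for a single abstracted item scored by two reviewers. The paper does not specify the software used for its analysis, and the item name and data below are hypothetical.

```python
from collections import Counter

def percent_agreement(rater1, rater2):
    """Proportion of records on which two reviewers recorded the same value."""
    return sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters on a nominal item."""
    n = len(rater1)
    p_o = percent_agreement(rater1, rater2)
    # Expected agreement by chance, from each rater's marginal category frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[cat] * c2[cat] for cat in set(c1) | set(c2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no abstraction item (e.g., "patellar resurfacing documented"),
# one value per reviewed record: surgeon vs. research coordinator.
surgeon     = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
coordinator = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]

print(f"percent agreement = {percent_agreement(surgeon, coordinator):.2f}")  # 0.88
print(f"kappa             = {cohens_kappa(surgeon, coordinator):.2f}")       # 0.75
```

Kappa adjusts observed agreement for the agreement expected by chance, which is why an item with very skewed category frequencies can show high percent agreement yet a noticeably lower kappa.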

Results

The kappa for agreement between the surgeon and each RC ranged from 0.59 to 1 for one RC and 0.49 to 1 for the other; the percent agreement ranged from 82% to 100% for one RC and 70% to 100% for the other. The repeated abstractions by the same RC showed high intra-rater agreement, with kappas ranging from 0.66 to 1 and percent agreement ranging from 97% to 100%. Inter-rater agreement between the two RCs was moderate, with kappas ranging from 0.49 to 1 and percent agreement ranging from 76% to 100%.

Conclusion

The MRR method used in this study showed excellent reliability for abstraction of information that had low technical complexity and moderate to good reliability for information that had greater complexity. Overall, these findings support the use of non-surgeons to abstract surgical data from operative notes.
Metadata
Title
Reliability of medical record abstraction by non-physicians for orthopedic research
Authors
Michael Y Mi
Jamie E Collins
Vladislav Lerner
Elena Losina
Jeffrey N Katz
Publication date
01-12-2013
Publisher
BioMed Central
Published in
BMC Musculoskeletal Disorders / Issue 1/2013
Electronic ISSN: 1471-2474
DOI
https://doi.org/10.1186/1471-2474-14-181
