Published in: Perspectives on Medical Education 5/2015

Open Access 01-10-2015 | Original Article

The impact of item flaws, testing at low cognitive level, and low distractor functioning on multiple-choice question quality

Authors: Syed Haris Ali, Kenneth G. Ruit



Abstract

Background

This study investigated the impact of addressing item-writing flaws, testing at a low cognitive level, and non-functioning distractors (< 5 % selection frequency) in multiple-choice assessment in preclinical medical education.

Method

Multiple-choice questions with too high or too low difficulty (difficulty index < 0.4 or > 0.8) and insufficient discriminatory ability (point-biserial correlation < 0.2) on a previous administration were identified. Items in Experimental Subgroup A (21 multiple-choice questions) underwent removal of item-writing flaws along with enhancement of the tested cognitive level, while items in Experimental Subgroup B (11 multiple-choice questions) underwent replacement or removal of non-functioning distractors. A control group of items (Group C, 23 multiple-choice questions) did not undergo any intervention.
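The screening statistics named above come from classical test theory: the difficulty index is the proportion of examinees answering an item correctly, and the point-biserial correlation relates item score to total test score. A minimal sketch of both, assuming a 0/1 score matrix (rows = examinees, columns = items); this is an illustrative implementation, not the authors' analysis code, and it uses the uncorrected point-biserial (the item is included in the total score):

```python
import math

def item_statistics(responses):
    """Per-item (difficulty index, point-biserial correlation)
    from a 0/1 score matrix: rows = examinees, columns = items."""
    n = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n
    sd_total = math.sqrt(sum((t - mean_total) ** 2 for t in totals) / n)
    stats = []
    for j in range(n_items):
        scores = [row[j] for row in responses]
        p = sum(scores) / n  # difficulty index: proportion correct
        if 0 < p < 1 and sd_total > 0:
            # mean total score of examinees who got item j right
            correct = [t for t, s in zip(totals, scores) if s == 1]
            mean_correct = sum(correct) / len(correct)
            # point-biserial: (M_correct - M_all) / SD * sqrt(p / (1 - p))
            r_pb = (mean_correct - mean_total) / sd_total * math.sqrt(p / (1 - p))
        else:
            r_pb = 0.0  # undefined when everyone (or no one) is correct
        stats.append((p, r_pb))
    return stats
```

Under the study's criteria, an item would be flagged for revision when its difficulty index falls outside 0.4–0.8 or its point-biserial correlation falls below 0.2.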

Result

Post-intervention, the average number of functioning distractors (≥ 5 % selection frequency) per multiple-choice question increased from 0.67 to 0.81 in Subgroup A and from 0.91 to 1.09 in Subgroup B; a statistically significant increase in the number of multiple-choice questions with sufficient point-biserial correlation was also noted. No significant changes were noted in psychometric characteristics of the control group of items.
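The distractor-functioning measure reported here counts, per item, how many incorrect options reach the 5 % selection threshold. A small sketch of that count, assuming per-option selection tallies are available (the function name and input shape are illustrative, not from the paper):

```python
def functioning_distractors(option_counts, key, threshold=0.05):
    """Number of distractors chosen by at least `threshold` of examinees.

    option_counts: dict mapping option label -> number of examinees
    choosing it; `key` is the correct answer's label."""
    total = sum(option_counts.values())
    if total == 0:
        return 0
    return sum(
        1 for opt, cnt in option_counts.items()
        if opt != key and cnt / total >= threshold
    )
```

For example, with 100 examinees distributed A = 60 (key), B = 25, C = 12, D = 3, options B and C function (≥ 5 %) while D does not, giving a count of 2.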

Conclusion

Correction of item flaws, removal or replacement of non-functioning distractors, and enhancement of tested cognitive level positively impact the discriminatory ability of multiple-choice questions. This helps prevent construct-irrelevant variance from affecting the evidence of validity of scores obtained in multiple-choice questions.
Metadata
Publisher: Bohn Stafleu van Loghum
Print ISSN: 2212-2761
Electronic ISSN: 2212-277X
DOI: https://doi.org/10.1007/s40037-015-0212-x
