Open Access 01-04-2016 | Original Article
Beyond standard checklist assessment: Question sequence may impact student performance
Published in: Perspectives on Medical Education | Issue 2/2016
Abstract
Introduction
Clinical encounters are often assessed using a checklist. However, without direct faculty observation, the timing and sequence of questions are not captured. We theorized that the sequence of questions can be captured and measured using coherence scores, which may distinguish between low- and high-performing candidates.
Methods
A logical sequence of key features was determined using the standard case checklist for an objective structured clinical examination (OSCE). An independent clinician educator reviewed each encounter to provide a global rating. Coherence scores were calculated based on question sequence. These scores were compared with global ratings and checklist scores.
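The abstract does not publish the coherence formula, but the idea of scoring how well a candidate's question order matches the logical key-feature sequence can be sketched. The following is a minimal illustration, assuming a simple concordance measure (the fraction of key-feature pairs asked in the same order as the reference sequence, akin to a normalized Kendall-tau statistic); the function name, feature labels, and scoring rule are hypothetical, not the authors' method.

```python
from itertools import combinations

def coherence_score(reference: list[str], asked: list[str]) -> float:
    """Hypothetical coherence measure: fraction of key-feature pairs
    asked in the reference (logical) order, ranging from 0 to 1.
    Questions not on the checklist are ignored."""
    rank = {feature: i for i, feature in enumerate(reference)}
    covered = [q for q in asked if q in rank]
    pairs = list(combinations(covered, 2))
    if not pairs:
        return 0.0
    # A pair is concordant when it appears in the same relative
    # order as in the reference key-feature sequence.
    concordant = sum(1 for a, b in pairs if rank[a] < rank[b])
    return concordant / len(pairs)

# Illustrative (invented) key-feature sequence for a history-taking case.
reference = ["onset", "location", "radiation", "severity", "alleviating"]
print(coherence_score(reference, ["onset", "location", "severity", "alleviating"]))  # 1.0
print(coherence_score(reference, ["alleviating", "severity", "location", "onset"]))  # 0.0
```

Under this toy metric, a logically ordered history scores 1.0 and a fully scattered one scores 0.0, while a checklist score alone would credit both candidates identically.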
Results
Coherence scores were positively correlated with checklist scores and with global ratings, and these correlations strengthened as global ratings improved. Coherence scores also explained more of the variance in student performance as global ratings improved.
Discussion
Logically structured question sequences may indicate a higher-performing student, and this information is often lost when only overall checklist scores are used.
Conclusions
The sequence in which test takers ask questions can be accurately recorded and is correlated with checklist scores and with global ratings. The sequence of questions during a clinical encounter is not captured by traditional checklist scoring and may represent an important dimension of performance.