Open Access | 01-09-2015 | Original Research
The ABCs of DKA: Development and Validation of a Computer-Based Simulator and Scoring System
Authors:
Catherine H. Y. Yu, MD FRCPC MHSc, Sharon Straus, MD FRCPC MSc, Ryan Brydges PhD
Published in: Journal of General Internal Medicine | Issue 9/2015
Abstract
Background
Clinical management of diabetic ketoacidosis (DKA) remains suboptimal. Simulation-based training may bridge this gap and is particularly well suited to teaching DKA management, given that it enables learning of basic knowledge as well as clinical reasoning and patient management skills.
Objectives
1) To develop, test, and refine a computer-based simulator of DKA management; 2) to collect validity evidence according to the National Standards' validity framework; and 3) to judge whether the simulator scoring system is an appropriate measure of the DKA management skills of undergraduate and postgraduate medical trainees.
Design
After developing the DKA simulator, we completed usability testing to optimize its functionality. We then conducted a preliminary validation of the scoring system for measuring trainees’ DKA management skills.
Participants
We recruited year 1 and year 3 medical students, year 2 postgraduate trainees, and endocrinologists (n = 75); each completed a simulator run, and we collected their simulator-computed scores.
Main Measures
We collected validity evidence related to content, internal structure, relations with other variables, and consequences.
Key Results
Our simulator consists of six cases highlighting DKA management priorities. Each case progresses in real time and includes interactive order entry, laboratory and clinical data, and individualized feedback. Usability assessment identified issues with clarity of system status, user control, efficiency of use, and error prevention. Regarding validity evidence, Cronbach's α was 0.795 across the seven subscales, indicating favorable internal-structure evidence. Participants' scores showed a significant effect of training level (p < 0.001). Scores also correlated with the number of DKA patients participants reported treating, weeks spent on Medicine rotation, and comfort with managing DKA. A simulation score of 75% had a sensitivity of 94.7% and a specificity of 51.8% for discriminating between expert staff physicians and trainees.
Conclusions
We demonstrate how a simulator and scoring system can be developed, tested, and refined to establish their quality for use as an assessment modality. Our evidence suggests the simulator can be used for formative assessment of trainees' DKA management skills.