Published in: Perspectives on Medical Education 3/2018

Open Access 01-06-2018 | Original Article

Examining the effects of gaming and guessing on script concordance test scores

Authors: Stuart Lubarsky, Valérie Dory, Sarkis Meterissian, Carole Lambert, Robert Gagnon


Abstract

Introduction

In a script concordance test (SCT), examinees are asked to judge the effect of a new piece of clinical information on a proposed hypothesis. Answers are collected using a Likert-type scale (ranging from −2 to +2, with ‘0’ indicating no effect), and compared with those of a reference panel of ‘experts’. It has been argued, however, that SCT may be susceptible to the influences of gaming and guesswork. This study aims to address some of the mounting concern over the response process validity of SCT scores.

Method

Using published datasets from three independent SCTs, we investigated examinee response patterns and computed the score a hypothetical examinee would obtain on each test if they (1) guessed random answers or (2) deliberately answered ‘0’ on all test items.
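The two hypothetical response strategies can be sketched in code. This is a minimal illustration under stated assumptions: the five-expert panel data below are made up for demonstration, not drawn from the study's datasets, and the scoring function implements the standard aggregate-scoring rule described in the article's footnote.

```python
import random
from collections import Counter

def aggregate_score(panel, answers):
    """Aggregate SCT score (percentage): the modal expert response earns full
    credit; any other response earns partial credit proportional to the number
    of experts who chose it; responses chosen by no expert earn 0."""
    earned = 0.0
    for experts, answer in zip(panel, answers):
        counts = Counter(experts)
        earned += counts.get(answer, 0) / max(counts.values())
    return 100.0 * earned / len(panel)

# Hypothetical 5-expert panel responses for a 4-item test (made-up data).
panel = [
    [0, 0, 1, -1, 0],     # modal response '0'
    [2, 1, 2, 2, 1],      # modal response '+2'
    [-1, -1, 0, -2, -1],  # modal response '-1'
    [0, 1, 0, 0, -1],     # modal response '0'
]

random.seed(42)
# Strategy 1: guess a random point on the -2..+2 scale for every item.
guess_scores = [aggregate_score(panel, [random.randint(-2, 2) for _ in panel])
                for _ in range(1000)]
mean_guess = sum(guess_scores) / len(guess_scores)

# Strategy 2: deliberately answer '0' on every item.
all_zero = aggregate_score(panel, [0] * len(panel))

print(f"random guessing (mean of 1000 trials): {mean_guess:.1f}%")
print(f"all-0 strategy: {all_zero:.1f}%")
```

On this toy panel the all-0 strategy outscores random guessing, mirroring the pattern the study reports, because '0' is frequently the modal panel response.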

Results

A simulated random guessing strategy led to scores 2 SDs below the mean scores of actual respondents (Z-scores −3.6 to −2.1). A simulated ‘all-0’ strategy led to scores at least 1 SD above those obtained by random guessing (Z-scores −2.2 to −0.7). In one dataset, stepwise exclusion of items whose modal panel response was ‘0’, until such items made up fewer than 10% of the total number of test items, brought the hypothetical ‘all-0’ scores to 2 SDs below the mean scores of actual respondents.

Discussion

Random guessing was not an advantageous response strategy. An ‘all-0’ response strategy, however, demonstrated evidence of artificial score inflation. Our findings pose a significant threat to the SCT’s validity argument. ‘Testwiseness’ is a potential hazard to all testing formats, and appropriate countermeasures must be established. We propose an approach that might be used to mitigate a potentially real and troubling phenomenon in script concordance testing. The impact of this approach on the content validity of SCTs merits further discussion.
Footnotes
1
For each SCT item, a maximum score of 1 is given for the response chosen by most of the experts (i.e., the modal response). Other responses are given partial credit, depending on the fraction of experts choosing them. Responses not selected by experts receive zero. An examinee’s total score for the test is the sum of the credit obtained for each of the questions, divided by the total obtainable credit for the test, and multiplied by 100 to derive a percentage score. Psychometricians support the use of this type of system, referred to as ‘aggregate scoring’ [8, 9].
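The aggregate-scoring rule in this footnote can be made concrete with a short sketch. The function name and the two-item panel below are hypothetical illustrations, assuming the usual convention that partial credit is the ratio of the number of experts choosing a response to the number choosing the modal response.

```python
from collections import Counter

def aggregate_score(panel_responses, examinee_answers):
    """Aggregate scoring for an SCT, per the footnote: on each item, the modal
    expert response earns 1 credit; other responses earn credit in proportion
    to the fraction of experts who chose them; responses chosen by no expert
    earn 0. The total is expressed as a percentage of obtainable credit."""
    earned = 0.0
    for experts, answer in zip(panel_responses, examinee_answers):
        counts = Counter(experts)
        earned += counts.get(answer, 0) / max(counts.values())
    # Maximum obtainable credit is 1 per item, so divide by the item count.
    return 100.0 * earned / len(panel_responses)

# Hypothetical 5-expert panel for a 2-item test (illustrative only).
panel = [[0, 0, 1, -1, 0], [2, 2, 1, 2, -2]]
print(aggregate_score(panel, [0, 2]))  # modal answer on both items -> 100.0
print(aggregate_score(panel, [1, 1]))  # minority answers earn partial credit
```

Each '1' answer above was chosen by one of five experts against a three-expert mode, so it earns 1/3 credit, giving a total of roughly 33.3%.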
 
Literature
1. Charlin B, van der Vleuten C. Standardized assessment of reasoning in contexts of uncertainty: the script concordance approach. Eval Health Prof. 2014;27:304–19.
2. Schmidt HG, Norman GR, Boshuizen HPA. A cognitive perspective on medical expertise: theory and implications. Acad Med. 1990;65:611–21.
3. Lemieux M, Bordage G. Propositional versus structural semantic analyses of medical diagnostic thinking. Cogn Sci. 1992;16:185–204.
4. Feltovich PJ, Barrows HS. Issues of generality in medical problem solving. In: Schmidt H, De Volder ML, editors. Tutorials in problem-based learning: a new direction in teaching the health professions. Assen: Van Gorcum; 1984.
5. Charlin B, Boshuizen H, Custers E, Feltovitch P. Scripts and clinical reasoning. Med Educ. 2007;41:1178–84.
6. Custers EJFM. Thirty years of illness scripts: theoretical origins and practical applications. Med Teach. 2015;37:457–62.
7. Lubarsky S, Dory V, Duggan P, Gagnon R, Charlin B. Script concordance testing: from theory to practice: AMEE guide no. 75. Med Teach. 2013;35:184–93.
8. Norman GR. Objective measurement of clinical performance. Med Educ. 1985;19:43–7.
9. Norcini JJ, Shea JA, Day SC. The use of the aggregate scoring for a recertification examination. Eval Health Prof. 1990;13:241–51.
10. Charlin B, Brailovsky CA, Leduc C, Blouin D. The diagnosis script questionnaire: a new tool to assess a specific dimension of clinical competence. Adv Health Sci Educ Theory Pract. 1998;3:51–8.
11. Lubarsky S, Charlin B, Cook DA, Chalk C, van der Vleuten C. Script concordance testing: a review of published validity evidence. Med Educ. 2011;45:329–38.
12. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119:166.e7–166.e16.
13. Gagnon R, Charlin B, Roy L, et al. The cognitive validity of the script concordance test: a time processing study. Teach Learn Med. 2006;18:22–7.
14. Ahmadi SF, Khoshkish S, Soltani-Arabshahi K. Challenging script concordance test reference standard by evidence: do judgments by emergency medicine consultants agree with likelihood ratios? Int J Emerg Med. 2014;7:34.
15. Kreiter C. Commentary: the response process validity of a script concordance item. Adv Health Sci Educ Theory Pract. 2011;17:7–9.
16. Lineberry M, Kreiter CD, Bordage G. Threats to the validity in the use and interpretation of script concordance test scores. Med Educ. 2013;47:1175–83.
18. Bland A, Kreiter C, Gordon J. The psychometric properties of five scoring methods applied to the Script Concordance Test. Acad Med. 2005;80:395–9.
19. Gagnon R, Charlin B, Coletti M, Sauve E, van der Vleuten C. Assessment in the context of uncertainty: how many members are needed on the panel of reference of a script concordance test? Med Educ. 2005;39:284–91.
20. Lambert C, Gagnon R, Nguyen D, Charlin B. The script concordance test in radiation oncology: validation study of a new tool to assess clinical reasoning. Radiat Oncol. 2009;4:7.
21. Lubarsky S, Chalk C, Kazitani D, Gagnon R, Charlin B. The Script Concordance Test: a new tool assessing clinical judgement in neurology. Can J Neurol Sci. 2009;36:326–31.
22. Nouh T, Boutros M, Gagnon R, et al. The script concordance test as a measure of clinical reasoning: a national validation study. Am J Surg. 2012;203:530–4.
23. Wilson AB, Pike GR, Humbert A. Analyzing script concordance test: scoring methods and items by difficulty and type. Teach Learn Med. 2014;26:135–45.
24. Downing SM. Threats to the validity of locally developed multiple-choice tests in medical education: construct-irrelevant variance and construct underrepresentation. Adv Health Sci Educ Theory Pract. 2002;7:235–41.
25. Williams RG, Klamen DA, McGaghie WC. Special article: cognitive, social and environmental sources of bias in clinical performance ratings. Teach Learn Med. 2003;15:270–92.
26. See KC, Tan KL, Lim TK. The script concordance test for clinical reasoning: re-examining its utility and potential weakness. Med Educ. 2014;48:1069–77.
27. Fournier JP, Demeester A, Charlin B. Script concordance tests: guidelines for construction. BMC Med Inform Decis Mak. 2008;8:18.
28. Friedman Ben-David M. Principles of assessment. In: Dent J, Harden RM, editors. A practical guide for medical teachers. 2nd ed. Edinburgh: Churchill Livingstone Elsevier; 2005.
29. Boulouffe C, Charlin B, Vanpee D. Evaluation of clinical reasoning in basic emergencies using a script concordance test. Am J Pharm Educ. 2010;74:1–6.
30. Ramaekers S, Kremer W, Pilot A, van Keulen H. Assessment of competence in clinical reasoning and decision-making under uncertainty: the script concordance test method. Assess Eval High Educ. 2010;35:661–73.
31. Dawson T, Comer L, Kossick MA, Neubrander J. Can script concordance testing be used in nursing education to accurately assess clinical reasoning skills? J Nurs Educ. 2014;53:281–6.
32. Van den Broek WES, van Asperen MV, Custers EJFM, Valk GD, ten Cate O. Effects of two different instructional formats on scores and reliability of a script concordance test. Perspect Med Educ. 2012;1:119–28.
Metadata
Publisher
Bohn Stafleu van Loghum
Print ISSN: 2212-2761
Electronic ISSN: 2212-277X
DOI
https://doi.org/10.1007/s40037-018-0435-8