Published in: Perspectives on Medical Education 2/2016

Open Access 01-04-2016 | Original Article

Tablet versus paper marking in assessment: feedback matters

Authors: Alan Denison, Emily Bate, Jessica Thompson


Abstract

Background

The Objective Structured Clinical Examination (OSCE) is a cornerstone in healthcare assessment. As a potential tool for providing learner-centred feedback on a large scale, the use of tablet devices has been proposed for the recording of OSCE marks, moving away from the traditional, paper-based checklist.

Methods

Examiner-recorded comments were collated from successive first-year formative and summative OSCE examinations, with paper-based checklists used in 2012 and iPad-based checklists used in 2013. The January OSCE examinations comprised 558 (2012) and 498 (2013) examiner-candidate interactions, and the May examinations 1402 (2012) and 1344 (2013). Examiner comments were analysed for quantity and quality, and a tool was developed and validated to assess the quality of the comments left by examiners for use as feedback (Kappa = 0.625).

Results

A direct comparison of paper-based checklists and iPad-recorded examinations showed an increase in the quantity of comments left, from 41 % to 51 % (+10 %). Furthermore, the number of comments left for students deemed ‘borderline’ increased by 22 %. In terms of their quality as feedback, there was a significant improvement (p < 0.001) between comments left in paper-recorded and iPad-recorded examinations.

Conclusions

iPad-marked examinations resulted in a greater quantity and higher quality of examiner comments for use as feedback, particularly for students performing less well, enabling tutors to direct further learning for these students.
Metadata
Publisher
Bohn Stafleu van Loghum
Published in
Perspectives on Medical Education / Issue 2/2016
Print ISSN: 2212-2761
Electronic ISSN: 2212-277X
DOI
https://doi.org/10.1007/s40037-016-0262-8
