Published in: Journal of Robotic Surgery 4/2013

01-12-2013 | Original Article

Multidisciplinary validation study of the da Vinci Skills Simulator: educational tool and assessment device

Authors: Kirsten Foell, Alexander Furse, R. John D’A. Honey, Kenneth T. Pace, Jason Y. Lee


Abstract

Despite the increased dexterity and precision afforded by robotic surgery, it is, like any new surgical technology, associated with a learning curve that can impact patient outcomes. The use of surgical simulators outside of the operating room, in a low-stakes environment, has been shown to shorten such learning curves. We present a multidisciplinary validation study of a robotic surgery simulator, the da Vinci® Skills Simulator (dVSS). Trainees and attending faculty from the Departments of Surgery and Obstetrics and Gynecology (ObGyn) at the University of Toronto were recruited to participate in this validation study. All participants completed seven different exercises on the dVSS (Camera Targeting 1, Peg Board 1, Peg Board 2, Ring Walk 2, Match Board 1, Thread the Rings, Suture Sponge 1) and, using the da Vinci S robot (dVR), completed two standardized skill tasks (Ring Transfer, Needle Passing). Participants were categorized as novice robotic surgeons (NRS) or experienced robotic surgeons (ERS) based on the number of robotic cases performed. Statistical analysis was conducted using the independent t test and non-parametric Spearman correlation. A total of 53 participants were included in the study: 27 from urology, 13 from ObGyn, and 13 from thoracic surgery (Table 1). Most participants (89 %) either had no prior console experience or had performed <10 robotic cases, while one (2 %) had performed 10–20 cases and five (9 %) had performed ≥20 robotic surgeries. The dVSS demonstrated excellent face and content validity; 97 and 86 % of participants agreed that it was useful for residency training and post-graduate training, respectively. The dVSS also demonstrated construct validity, with NRS performing significantly worse than ERS on most exercises with respect to overall score, time to completion, economy of motion, and errors (Table 2). Excellent concurrent validity was also demonstrated, as dVSS scores for most exercises correlated with performance on the two standardized skill tasks using the dVR (Table 3). This multidisciplinary validation study of the dVSS provides excellent face, content, construct, and concurrent validity evidence, supporting its integration into a comprehensive robotic surgery training program, both as an educational tool and potentially as an assessment device.
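The two statistical analyses described above (independent t test comparing NRS and ERS scores for construct validity; non-parametric Spearman correlation between simulator scores and dVR task metrics for concurrent validity) can be sketched in Python with SciPy. The scores and task times below are illustrative made-up values, not the study's data:

```python
# Sketch of the study's statistical comparisons, on illustrative
# (made-up) data -- not the actual values from Tables 2 and 3.
from scipy import stats

# Hypothetical overall scores (%) for one dVSS exercise
nrs_scores = [67, 55, 72, 80, 61, 70, 74, 58]   # novice robotic surgeons
ers_scores = [92, 88, 95, 90, 91, 89]           # experienced robotic surgeons

# Construct validity: independent-samples t test (NRS vs ERS)
t_stat, p_val = stats.ttest_ind(nrs_scores, ers_scores)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}")

# Concurrent validity: non-parametric Spearman correlation between
# dVSS overall score and time to complete a dVR task (hypothetical)
dvss_score = nrs_scores + ers_scores
dvr_time_s = [310, 420, 290, 250, 380, 300, 280, 400,
              150, 180, 140, 160, 155, 170]
rho, p_rho = stats.spearmanr(dvss_score, dvr_time_s)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")
```

The sign of the Spearman coefficient indicates whether higher simulator scores accompany faster (negative) or slower (positive) dVR task completion; the t test's p value indicates whether the NRS–ERS score gap exceeds what chance would produce.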
Table 1
dVSS validation study participant demographic information

Gender
  Male                           36 (67.9)
  Female                         17 (32.1)
Handedness
  Right-hand dominant            45 (84.9)
  Left-hand dominant              4 (7.5)
  Ambidextrous                    3 (5.7)
Level of training
  Junior Resident (R1–R3)        17 (32.1)
  Senior Resident (R4–R5)        12 (22.6)
  Fellow                         16 (30.2)
  Staff Surgeon                   8 (15.1)
Specialty
  Urology                        27 (50.9)
  ObGyn                          13 (24.5)
  Thoracics                      13 (24.5)
Previous MIS experience (laparoscopic or thoracoscopic)
  None/minimal                   17 (32.1)
  Moderate                       11 (20.8)
  Significant                    18 (34.0)
  Fellowship-trained in MIS       4 (7.5)
Previous robotic surgery experience
  None                           32 (60.4)
  Yes                            21 (39.6)
If yes, number of operative cases as surgical assistant
  0 cases                        33 (62.3)
  <10 cases                       9 (17.0)
  10–20 cases                     3 (5.7)
  >20 cases                       8 (15.1)
If yes, number of operative cases at robotic console for at least 30 min
  0 cases                        41 (77.4)
  <10 cases                       6 (11.3)
  10–20 cases                     1 (1.9)
  >20 cases                       5 (9.4)

MIS minimally invasive surgery
Table 2
dVSS construct validity evidence

dVSS exercise        All subjects (%, mean ± SD)   NRS (%, mean ± SD)    ERS (%, mean ± SD)    p value
Camera Targeting 1   69.943 ± 21.7489              67.170 ± 21.5258      91.667 ± 4.2269       0.008
Peg Board 1          78.596 ± 11.9824              76.913 ± 11.6616      91.500 ± 3.8341       0.004
Match Board 1        69.880 ± 17.7691              67.864 ± 17.9075      84.667 ± 6.1860       0.028
Thread the Rings     74.152 ± 16.4289              71.825 ± 16.2605      89.667 ± 5.8878       0.011
Suture Sponge 1      74.787 ± 14.3086              73.171 ± 14.5067      85.833 ± 5.6716       0.042
Ring Walk 2          75.098 ± 20.0861              73.333 ± 20.1099      88.333 ± 15.4100      0.086
Peg Board 2          84.308 ± 11.7633              83.283 ± 12.0861      92.167 ± 3.6009       0.082

Overall scores for novice robotic surgeons (NRS) and experienced robotic surgeons (ERS)
Table 3
dVSS concurrent validity evidence

dVSS exercise (overall score)   NP time          NP errors        RT time          RT errors
Camera Targeting 1              0.471 (0.001)    0.083 (0.575)    0.291 (0.045)    0.061 (0.685)
Peg Board 1                     0.486 (0.001)    0.141 (0.344)    0.325 (0.026)    0.088 (0.555)
Match Board 1                   0.543 (<0.001)   0.096 (0.530)    0.295 (0.050)    0.215 (0.162)
Thread the Rings                0.432 (0.005)    0.231 (0.147)    0.533 (<0.001)   0.163 (0.310)
Suture Sponge 1                 0.592 (<0.001)   0.105 (0.509)    0.437 (0.004)    0.015 (0.925)
Ring Walk 2                     0.454 (0.002)    0.179 (0.234)    0.399 (0.006)    0.022 (0.884)
Peg Board 2                     0.675 (<0.001)   0.058 (0.696)    0.073 (0.626)    0.045 (0.762)

Subjects' overall score for each dVSS exercise is correlated with the time to complete (time) and number of errors (errors) for the Needle Passing (NP) and Ring Transfer (RT) tasks performed using the dVR. Data are expressed as correlation coefficient (p value).
Metadata
Publisher: Springer London
Print ISSN: 1863-2483
Electronic ISSN: 1863-2491
DOI: https://doi.org/10.1007/s11701-013-0403-6
