Published in: BMC Musculoskeletal Disorders 1/2022

Open Access 01-12-2022 | Research article

Introducing a brain-computer interface to facilitate intraoperative medical imaging control – a feasibility study

Authors: Hooman Esfandiari, Pascal Troxler, Sandro Hodel, Daniel Suter, Mazda Farshad, Philipp Fürnstahl, Collaboration Group



Abstract

Background

Safe and accurate execution of surgery to date relies mainly on preoperative plans generated from preoperative imaging. Frequent intraoperative interaction with such patient images is needed during the intervention, which is currently a cumbersome process: the images are generally displayed on peripheral two-dimensional (2D) monitors and controlled through interface devices located outside the sterile field. This study proposes a new medical image control concept based on a Brain-Computer Interface (BCI) that allows for hands-free, direct image manipulation without relying on gesture recognition methods or voice commands.

Method

A software environment was designed to display three-dimensional (3D) patient images on external monitors, with the functionality of hands-free image manipulation based on the user's brain signals detected by the BCI device (i.e., visually evoked signals). In a user study, ten orthopedic surgeons completed a series of standardized image manipulation tasks to navigate to and locate predefined 3D points in a Computed Tomography (CT) image using the developed interface. Accuracy was assessed as the mean error between the predefined locations (ground truth) and the locations navigated to by the surgeons. All surgeons rated the performance and potential intraoperative usability in a standardized survey using a five-point Likert scale (1 = strongly disagree to 5 = strongly agree).
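The accuracy metric described above, the mean Euclidean distance between the predefined ground-truth points and the points the surgeons navigated to, could be computed along these lines. This is a minimal sketch, not the authors' actual evaluation code; the function name and the point data are illustrative only:

```python
import math
import statistics

def mean_navigation_error(ground_truth, navigated):
    """Mean and sample SD (in mm) of Euclidean distances between paired 3D points."""
    errors = [math.dist(g, n) for g, n in zip(ground_truth, navigated)]
    return statistics.mean(errors), statistics.stdev(errors)

# Illustrative data: three predefined targets and the points a user navigated to
targets   = [(10.0, 20.0, 30.0), (0.0, 0.0, 0.0), (5.0, 5.0, 5.0)]
navigated = [(12.0, 20.0, 30.0), (0.0, 3.0, 0.0), (5.0, 5.0, 9.0)]
mean_err, sd = mean_navigation_error(targets, navigated)  # distances: 2, 3, 4 mm
```

Reporting both the mean and the SD, as the study does (15.51 mm, SD 9.57), conveys not only the typical error but also how much it varied across tasks and users.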

Results

When using the developed interface, the mean image control error was 15.51 mm (SD: 9.57 mm). User acceptance was rated with a Likert score of 4.07 (SD: 0.96), while the overall impression of the interface was rated 3.77 (SD: 1.02). We observed a significant correlation between the users' overall impression and the calibration score they achieved.
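The reported correlation between overall impression (an ordinal Likert rating) and calibration score would typically be assessed with a rank correlation such as Spearman's rho. The abstract does not name the test the authors used, so the following pure-Python sketch, with made-up per-user data, is only an illustration of that kind of analysis:

```python
def _ranks(values):
    """Average ranks (1-based); tied values share their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative data: per-user overall impression (Likert 1-5) vs. calibration score
impression  = [3, 4, 5, 2, 4]
calibration = [60, 75, 90, 50, 80]
rho = spearman_rho(impression, calibration)
```

A rank-based statistic is a natural choice here because Likert ratings are ordinal: the distance between "agree" and "strongly agree" need not equal the distance between "neutral" and "agree".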

Conclusions

The developed BCI, which allowed for purely brain-guided medical image control, yielded promising results and showed its potential for future intraoperative applications. The interaction delay was identified as the major limitation to overcome.
Metadata
Publisher
BioMed Central
Electronic ISSN: 1471-2474
DOI
https://doi.org/10.1186/s12891-022-05384-9
