Abstract
Background
Safe and accurate execution of surgeries to date relies mainly on preoperative plans generated from preoperative imaging. Frequent intraoperative interaction with such patient images is needed during the intervention, which is currently a cumbersome process because the images are generally displayed on peripheral two-dimensional (2D) monitors and controlled through interface devices located outside the sterile field. This study proposes a new medical image control concept based on a Brain Computer Interface (BCI) that allows for hands-free and direct image manipulation without relying on gesture recognition methods or voice commands.
Method
A software environment was designed for displaying three-dimensional (3D) patient images on external monitors, with the functionality of hands-free image manipulation based on the user's brain signals detected by the BCI device (i.e., visually evoked signals). In a user study, ten orthopedic surgeons completed a series of standardized image manipulation tasks to navigate to and locate predefined 3D points in a Computed Tomography (CT) image using the developed interface. Accuracy was assessed as the mean error between the predefined locations (ground truth) and the locations navigated to by the surgeons. All surgeons rated the performance and potential intraoperative usability in a standardized survey using a five-point Likert scale (1 = strongly disagree to 5 = strongly agree).
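The accuracy metric described above is a mean Euclidean distance between target and navigated 3D points. The following is a minimal sketch of that computation; the function name, the example coordinates, and the use of NumPy are illustrative assumptions, not the authors' actual analysis code.

```python
import numpy as np

def mean_navigation_error(targets, navigated):
    """Mean and SD of the Euclidean distance (mm) between predefined
    ground-truth 3D points and the points navigated to by the users."""
    targets = np.asarray(targets, dtype=float)      # shape (n_points, 3)
    navigated = np.asarray(navigated, dtype=float)  # shape (n_points, 3)
    errors = np.linalg.norm(targets - navigated, axis=1)
    return errors.mean(), errors.std()

# Hypothetical example: two target points and the navigated positions (mm).
targets = [[12.0, 40.5, -7.2], [30.1, 22.4, 15.0]]
navigated = [[14.3, 38.9, -5.0], [28.0, 25.1, 16.2]]
mean_err, sd_err = mean_navigation_error(targets, navigated)
print(f"mean error = {mean_err:.2f} mm (SD {sd_err:.2f})")
```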
Results
When using the developed interface, the mean image control error was 15.51 mm (SD: 9.57). The users' acceptance was rated with a Likert score of 4.07 (SD: 0.96), while the overall impression of the interface was rated 3.77 (SD: 1.02). We observed a significant correlation between the users' overall impression and the calibration score they achieved.
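A correlation between per-surgeon impression ratings and BCI calibration scores can be tested as sketched below. The data values are hypothetical and the paper does not state which correlation coefficient was used; Spearman's rank correlation is shown as one plausible choice for ordinal Likert ratings.

```python
import numpy as np
from scipy import stats

# Hypothetical per-surgeon data: overall impression (1-5 Likert scale)
# and the calibration score achieved with the BCI device.
impression = np.array([4, 3, 5, 2, 4, 3, 4, 5, 3, 4])
calibration = np.array([82, 70, 95, 55, 80, 68, 78, 90, 65, 76])

rho, p_value = stats.spearmanr(impression, calibration)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```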
Conclusions
The use of the developed BCI, which allowed for purely brain-guided medical image control, yielded promising results and showed potential for future intraoperative applications. Interaction delay was noted as the major limitation to overcome.
Funder
SURGENT under the umbrella of University Medicine Zurich/Hochschulmedizin Zürich
Publisher
Springer Science and Business Media LLC
Subject
Orthopedics and Sports Medicine, Rheumatology
Cited by
4 articles.