Multisensory Extended Reality Applications Offer Benefits for Volumetric Biomedical Image Analysis in Research and Medicine
Published: 2024-06-11
ISSN: 2948-2933
Container-title: Journal of Imaging Informatics in Medicine
Language: en
Short-container-title: J Imaging Inform Med
Author: Krieger Kathrin, Egger Jan, Kleesiek Jens, Gunzer Matthias, Chen Jianxu
Abstract
3D data from high-resolution volumetric imaging is a central resource for diagnosis and treatment in modern medicine. While the rapid development of AI is enhancing imaging and analysis, commonly used visualization methods lag far behind. Recent research used extended reality (XR) for perceiving 3D images with visual depth perception and touch, but relied on restrictive haptic devices. While unrestricted touch benefits volumetric data examination, implementing natural haptic interaction in XR is challenging. The research question is whether a multisensory XR application with intuitive haptic interaction adds value and should be pursued. In a study, 24 experts in biomedical imaging from research and medicine explored 3D medical shapes with three applications: a multisensory virtual reality (VR) prototype using haptic gloves, a simple VR prototype using controllers, and a standard PC application. Results of standardized questionnaires showed no significant differences between the application types regarding usability and no significant difference between the two VR applications regarding presence. Participants agreed with statements that VR visualizations provide better depth information, that using the hands instead of controllers simplifies data exploration, that the multisensory VR prototype allows intuitive data exploration, and that it offers benefits over traditional methods of data examination. While most participants named manual interaction as the best aspect, they also found it the aspect most in need of improvement. We conclude that a multisensory XR application with improved manual interaction adds value for the examination of volumetric biomedical data. We will proceed with our open-source research project ISH3DE (Intuitive Stereoptic Haptic 3D Data Exploration) to serve medical education, therapeutic decisions, surgical preparation, and research data analysis.
Funder
KITE (Plattform für KI-Translation Essen) from the REACT-EU initiative; Bundesministerium für Bildung und Forschung; Ministerium für Kultur und Wissenschaft des Landes Nordrhein-Westfalen; Universität Bielefeld
Publisher
Springer Science and Business Media LLC