Manipulating Underfoot Tactile Perceptions of Flooring Materials in Augmented Virtuality
Published: 2023-12-08
Issue: 24
Volume: 13
Page: 13106
ISSN: 2076-3417
Container-title: Applied Sciences
Language: en
Short-container-title: Applied Sciences
Authors:
Jack Topliss 1, Stephan Lukosch 2, Euan Coutts 1, Tham Piumsomboon 1
Affiliations:
1. School of Product Design, University of Canterbury, Christchurch 8041, New Zealand
2. HIT Lab NZ, University of Canterbury, Christchurch 8041, New Zealand
Abstract
Underfoot haptics is a largely unexplored area, yet it offers tactile information nearly as rich as that of hand-based interactions. Haptic feedback gives a sense of physicality to virtual environments, making for a more realistic and immersive experience. Augmented Virtuality makes it possible to render virtual materials on a physical object, or haptic proxy, so that users see their own body but remain unaware of the object’s actual appearance. In this research, we investigate how the visual appearance of physical objects can be altered virtually to affect the tactile perception of those objects. An Augmented Virtuality system was developed for this purpose, and two tactile perception experiments were conducted with 18 participants. In a within-subjects experiment, we explore whether changing the visual appearance of materials affects a person’s underfoot tactile perception and which tactile quality is most affected by the change. In a between-subjects experiment, we additionally examine whether people notice changes in visual appearance while focused on other tasks. The study showed that a change in visual appearance significantly affects the tactile perception of roughness, and that matching the visual appearance to the physical material increases awareness of tactile perception.
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science