Affiliation:
1. Medizinische Physik and Cluster of Excellence Hearing4all, Universität Oldenburg, 26111 Oldenburg, Germany
Abstract
Awareness of space, and subsequent orientation and navigation in rooms, is dominated by the visual system. However, humans are able to extract auditory information about their surroundings from early reflections and reverberation in enclosed spaces. To better understand orientation and navigation based on acoustic cues only, three virtual corridor layouts (I-, U-, and Z-shaped) were presented using real-time virtual acoustics in a three-dimensional 86-channel loudspeaker array. Participants were seated on a rotating chair in the center of the loudspeaker array and navigated using real rotation and virtual locomotion by "teleporting" in steps on a grid in the invisible environment. A head-mounted display showed control elements and, in a visual reference condition, the environment. Acoustical information about the environment originated from a virtual sound source at the collision point of a virtual ray with the boundaries. In different control modes, the ray was cast either in view or hand direction or in a rotating, "radar"-like fashion in 90° steps to all sides. Time to completion, number of collisions, and movement patterns were evaluated. Navigation and orientation were possible based on the direct sound, with little effect of room acoustics or control mode. Underlying acoustic cues were analyzed using an auditory model.
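To illustrate the ray-cast control scheme summarized above, the following sketch shows how a ray cast from the listener in the view direction (or in 90° "radar" steps around the current heading) can be intersected with corridor walls, with a virtual sound source placed at the collision point. This is not the authors' implementation; the rectangular corridor geometry, function names, and parameters are assumptions chosen purely for illustration.

```python
"""Minimal sketch of ray-cast auditory navigation (illustrative assumptions only)."""
import math

# Hypothetical corridor section: axis-aligned rectangle (x_min, x_max, y_min, y_max) in metres.
CORRIDOR = (0.0, 2.0, 0.0, 10.0)


def ray_wall_hit(pos, angle_deg, bounds):
    """Return (distance, point) of the closest intersection of a 2-D ray with the walls.

    pos       -- (x, y) listener position inside the corridor
    angle_deg -- ray direction in degrees (0 deg = +x, 90 deg = +y)
    bounds    -- (x_min, x_max, y_min, y_max) of the corridor section
    """
    x, y = pos
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    x_min, x_max, y_min, y_max = bounds

    hits = []
    # Intersections with the two vertical walls (x = x_min, x = x_max).
    if abs(dx) > 1e-9:
        for wall_x in (x_min, x_max):
            t = (wall_x - x) / dx
            if t > 0:
                hits.append((t, (wall_x, y + t * dy)))
    # Intersections with the two horizontal walls (y = y_min, y = y_max).
    if abs(dy) > 1e-9:
        for wall_y in (y_min, y_max):
            t = (wall_y - y) / dy
            if t > 0:
                hits.append((t, (x + t * dx, wall_y)))

    # Keep only hits that lie on the rectangle boundary, then take the nearest one.
    valid = [(t, p) for t, p in hits
             if x_min - 1e-6 <= p[0] <= x_max + 1e-6
             and y_min - 1e-6 <= p[1] <= y_max + 1e-6]
    return min(valid, key=lambda h: h[0]) if valid else None


def radar_sources(pos, heading_deg, bounds):
    """'Radar' control mode: cast rays in 90-degree steps relative to the heading."""
    sources = []
    for offset in (0, 90, 180, 270):
        hit = ray_wall_hit(pos, heading_deg + offset, bounds)
        if hit:
            distance, point = hit
            sources.append({"offset_deg": offset,
                            "distance_m": distance,
                            "source_pos": point})
    return sources


if __name__ == "__main__":
    listener = (1.0, 2.5)   # somewhere inside the corridor section
    heading = 90.0          # facing along the corridor (+y direction)
    # View-direction mode: one virtual source where the gaze ray meets a wall.
    print("view-direction hit:", ray_wall_hit(listener, heading, CORRIDOR))
    # Radar mode: up to four sources, one per 90-degree step around the listener.
    for source in radar_sources(listener, heading, CORRIDOR):
        print(source)
```

In a full system, the hit point returned here would drive a spatial audio renderer (e.g., the loudspeaker-array auralization described in the abstract) rather than a print statement; the geometry would also have to follow the actual I-, U-, or Z-shaped layouts instead of a single rectangle.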
Funder
Deutsche Forschungsgemeinschaft
Publisher
Acoustical Society of America (ASA)
Subject
Acoustics and Ultrasonics, Arts and Humanities (miscellaneous)
Cited by
3 articles.