Authors:
Chen Ji, Wright William Geoffrey, Keshner Emily, Darvish Kurosh
Abstract
The ability to control head orientation relative to the body is a multisensory process that depends mainly on the proprioceptive, vestibular, and visual sensory systems. A system to study the sensory integration of head orientation was developed and tested. A test seat with a five-point harness was assembled to provide passive postural support. A lightweight head-mounted display was designed for mounting multiaxis accelerometers and a mini-CCD camera that provided the visual input to virtual reality goggles with a 39° horizontal field of view. A digitally generated sinusoidal signal was delivered to a motor-driven, computer-controlled sled on a 6-m linear railing system, and a data acquisition system was designed to collect the acceleration data. A pilot study was conducted to test the system. Four young, healthy subjects were seated with their trunks fixed to the seat. The subjects received a sinusoidal anterior–posterior translation with peak accelerations of 0.06g at 0.1 Hz and 0.12g at 0.2, 0.5, and 1.1 Hz. Four visual conditions were randomly presented along with the translation: eyes open while looking forward, backward, or sideways, and eyes closed. Acceleration data were collected from linear accelerometers placed on the head, trunk, and seat and were processed using MATLAB. Head motion was analyzed using the fast Fourier transform to derive the gain and phase of head pitch acceleration relative to seat linear acceleration. A randomization test for two independent variables assessed the significance of visual and inertial effects on response gain and phase shift. Results show that the gain was close to one, with no significant difference among visual conditions across frequencies, whereas the phase depended on the head-stabilization strategy each subject used.
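The gain and phase analysis described above (the abstract reports MATLAB was used) can be sketched in Python. This is a minimal illustration, not the authors' code: it assumes two equal-length, synchronously sampled acceleration records and a known sinusoidal drive frequency, and it compares the FFT coefficients of the head and seat signals at the bin nearest that frequency. The function name and all parameters are hypothetical.

```python
import numpy as np

def gain_phase(seat_acc, head_acc, fs, f_drive):
    """Estimate the gain and phase (degrees) of the head response
    relative to the seat drive at the stimulus frequency f_drive (Hz),
    from the ratio of FFT coefficients at the nearest frequency bin.
    (Illustrative sketch; not the authors' actual analysis code.)"""
    n = len(seat_acc)
    window = np.hanning(n)                      # reduce spectral leakage
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    S = np.fft.rfft(seat_acc * window)
    H = np.fft.rfft(head_acc * window)
    k = np.argmin(np.abs(freqs - f_drive))      # bin nearest the drive frequency
    gain = np.abs(H[k]) / np.abs(S[k])
    phase_deg = np.degrees(np.angle(H[k]) - np.angle(S[k]))
    phase_deg = (phase_deg + 180.0) % 360.0 - 180.0  # wrap into (-180, 180]
    return gain, phase_deg

# Synthetic check: a head signal lagging the seat by 30 deg with gain 0.9
# at one of the stimulus frequencies used in the study (0.5 Hz).
fs, f0 = 100.0, 0.5
t = np.arange(0, 20, 1 / fs)                    # 20 s = 10 full cycles
seat = np.sin(2 * np.pi * f0 * t)
head = 0.9 * np.sin(2 * np.pi * f0 * t - np.radians(30))
g, p = gain_phase(seat, head, fs, f0)
```

In this synthetic case the recovered gain is approximately 0.9 and the phase approximately -30°, matching the signals as constructed. On real data, a gain near one with small phase shift, as reported in the abstract, would indicate the head moving nearly rigidly with the seat.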
Subject
General Materials Science