Abstract
Self-motion perception is a vital skill for all species. It is an inherently multisensory process that combines inertial (body-based) and relative (with respect to the environment) motion cues. While self-motion perception has been studied extensively in human and non-human primates, there is currently no paradigm for testing it in rodents using both inertial and relative self-motion cues. We developed a novel rodent motion simulator that uses two synchronized robotic arms to generate inertial, relative, or combined (inertial and relative) cues of self-motion. Eight rats were trained to perform a heading-discrimination task similar to the popular primate paradigm. Strikingly, the rats relied heavily on airflow for relative self-motion perception, with little contribution from optic flow (performance in the dark was almost as good). Relative self-motion (airflow) was perceived more reliably than inertial self-motion. Disrupting airflow (using a fan or a windshield) impaired relative, but not inertial, self-motion perception; the whiskers, however, were not needed for this function. Lastly, the rats integrated relative and inertial self-motion cues in a reliability-based (Bayesian-like) manner. These results implicate airflow as a dominant cue for self-motion perception in rats and provide a new avenue for investigating the neural bases of self-motion perception and multisensory processing in awake, behaving rodents.
Publisher
Cold Spring Harbor Laboratory