Abstract
Perceptual representations of tactile motion are thought to emerge from computations that integrate cutaneous cues such as the direction, speed, and saliency of the object moving on the skin. However, this knowledge was accrued from studies that presented stimuli with the hand in a fixed proprioceptive state. We studied how perception of tactile motion is modulated by proprioception, and how interactions between proprioceptive and tactile motion inputs are controlled by the reference frame of motion judgements. Participants judged the direction of motion of stimuli presented to their index finger in two reference frames (relative to the hand or to their sternum) while the stimulated hand was placed in different positions. Our data show that tactile motion can be flexibly perceived in different reference frames, with proprioceptive modulations occurring only for motion judgements made in a sternum-centric reference frame. We developed a Bayesian model that robustly accounts for participants' perceptual decisions. Taken together, our data highlight the flexibility of the brain to represent tactile motion information in different coordinate systems. Our study also provides a computational framework for how task-dependent reference frame signals regulate the integration of proprioceptive and tactile inputs to generate flexible perception of tactile motion.
Publisher
Cold Spring Harbor Laboratory