Abstract
Perceiving the spatial location and physical dimensions of objects that we touch is crucial for goal-directed actions. To achieve this, our brain transforms skin-based coordinates into an external reference frame by integrating visual and proprioceptive cues, a process known as tactile remapping. In the current study, we examine the role of proprioception in the remapping process when information from the more dominant visual modality is withheld. We developed a new visual-to-touch sensory substitution device and asked participants to perform a spatial localization task in three different arm postures, including posture switches between blocks of trials. We observed that, in the absence of visual information, novel proprioceptive inputs can be overridden after switching postures. This behavior demonstrates effective top-down modulation of proprioception and points to the unequal contribution of different sensory modalities to tactile remapping.
Publisher
Cold Spring Harbor Laboratory