Abstract
Proper interpretation of visual information requires capturing the structural regularities in the visual signal, and this frequently occurs in conjunction with movement. Perceptual interpretation is complicated both by transient perceptual changes that accompany motor activity and, as found in audition and somatosensation, by more persistent changes that accompany the learning of new movements. Here we asked whether motor learning also results in sustained changes to visual perception. We designed a reaching task in which participants directly controlled the visual information they received, which we term self-operated stimuli. Specifically, participants trained to make movements in a number of directions, with directional information provided by the motion of an intrinsically ambiguous moving stimulus that was directly tied to the motion of the hand. We find that movement training improves perception of coherent stimulus motion and that changes in movement are correlated with the perceptual change. No perceptual changes are observed in passive observers, even when they are provided with an explicit strategy for solving the perceptual grouping problem. Comparison of empirical perceptual data with simulations based on a Bayesian generative model of motion perception suggests that movement training promotes fine-tuning of the internal representation of stimulus geometry. These results emphasize the role of sensorimotor interaction in determining the persistent properties in space and time that define a percept.
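The Bayesian account mentioned above can be illustrated with a minimal sketch: a directional prior is combined with a noisy observation of stimulus motion, and sharpening the internal representation is modeled as an increase in observation precision. This is a hypothetical toy illustration, not the authors' actual generative model; the von Mises parameters and the precision values are assumptions chosen for demonstration.

```python
import numpy as np

# Discrete grid of candidate motion directions on the circle (radians).
directions = np.linspace(-np.pi, np.pi, 360, endpoint=False)

def von_mises(x, mu, kappa):
    """Unnormalized von Mises density: a circular analogue of a Gaussian."""
    return np.exp(kappa * np.cos(x - mu))

def posterior(observed_dir, prior_mu, prior_kappa, like_kappa):
    """Combine a directional prior with a noisy observation via Bayes' rule."""
    prior = von_mises(directions, prior_mu, prior_kappa)
    likelihood = von_mises(directions, observed_dir, like_kappa)
    post = prior * likelihood
    return post / post.sum()  # normalize over the grid

# Training that fine-tunes the internal representation is modeled here
# (as an assumption) as increased likelihood precision `like_kappa`:
# the posterior narrows and its mode shifts toward the true direction.
p_before = posterior(0.5, prior_mu=0.0, prior_kappa=1.0, like_kappa=2.0)
p_after = posterior(0.5, prior_mu=0.0, prior_kappa=1.0, like_kappa=8.0)
```

With higher precision, the posterior peak is sharper and lies closer to the observed direction, qualitatively matching the idea that training refines the perceptual estimate.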
Publisher
Cold Spring Harbor Laboratory