Abstract
Whilst everyday interactions with objects often involve multiple tactile contacts, the integration of tactile signals remains poorly understood. Here we characterise the integration of tactile motion across multiple fingerpads. Across four experiments, participants averaged the direction of two simultaneous tactile motion trajectories delivered to different fingerpads. Averaging performance differed between within-hand and between-hand conditions in terms of sensitivity and precision, but was unaffected by the somatotopic proximity of the stimulated fingers. Sensitivity to the average direction was influenced by the discrepancy between the individual motion signals, but only in within-hand conditions. This was explained by a model in which the 'virtually leading finger' received a higher perceptual weighting. Precision was greater in between-hand than in within-hand conditions. While biased weighting accounted for the differences in sensitivity, it was not sufficient to explain the difference in precision, implying additional sensory limitations during within-hand integration. We suggest that unimanual integration is limited and thus exploits a 'natural' cognitive prior involving a single object moving relative to the hand to maximise information gain.

Author summary

Tactile stimulation is always on. Yet little is known about how the brain combines widespread tactile inputs for perception. Most tactile studies emphasise a single point of tactile stimulation (e.g., location or intensity of a static stimulus) and minimal units of tactile perception (e.g., acuity or selectivity). However, our daily interactions with the world involve encoding spatially and temporally extended tactile signals. Perceiving tactile objects and events as coherent entities requires the somatosensory system to aggregate tactile afferent signals across separate skin regions (e.g., separate digits). Across four experiments, we asked participants to average the direction of two tactile motion trajectories delivered simultaneously to two different fingerpads, either on the same hand or on different hands. Our results show strong integration of multiple tactile inputs, subject to limitations for inputs delivered within a single hand. Our model suggests that tactile inputs are weighted according to an integrative model of hand-object interaction that operates within hands on purely geometric information, prioritising 'novel' information from a 'virtually leading finger' (VLF).
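To make the averaging computation concrete, the sketch below shows a weighted circular average of two motion directions, with the weight parameter standing in for the elevated perceptual weighting of the 'virtually leading finger'. This is a minimal illustration under stated assumptions: the function name, weight values, and example angles are hypothetical and do not reproduce the authors' fitted model.

import numpy as np

def weighted_direction_average(theta1, theta2, w1=0.5):
    """Weighted circular mean of two motion directions (radians).

    theta1, theta2 : directions of the two tactile motion trajectories
    w1             : perceptual weight on the first signal (hypothetically
                     raised for the 'virtually leading finger' in
                     within-hand conditions); the second gets 1 - w1
    """
    w2 = 1.0 - w1
    # Average unit vectors rather than raw angles so that directions
    # near the 0/2*pi wrap-around are handled correctly.
    x = w1 * np.cos(theta1) + w2 * np.cos(theta2)
    y = w1 * np.sin(theta1) + w2 * np.sin(theta2)
    return np.arctan2(y, x)

# Two trajectories 40 degrees apart: equal weighting predicts the
# midpoint, whereas overweighting one signal pulls the perceived
# average direction toward it.
a, b = np.deg2rad(10.0), np.deg2rad(50.0)
print(np.rad2deg(weighted_direction_average(a, b, w1=0.5)))  # 30.0
print(np.rad2deg(weighted_direction_average(a, b, w1=0.7)))  # ~21.7

On this scheme, a biased weight shifts the perceived average (a sensitivity effect) without by itself changing the trial-to-trial variability of the estimate, which is consistent with the abstract's point that biased weighting alone cannot account for the precision difference between within-hand and between-hand conditions.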
Publisher
Cold Spring Harbor Laboratory