Abstract
Touch perception is an inherently multisensory process in which vision plays an essential role. However, our understanding of how vision encodes the sensory and emotional-affective aspects of observed touch, and of the timing of these processes, is still limited. Here we address this gap by investigating the neural dynamics of visual touch observation, analysing electroencephalographic (EEG) data from participants viewing detailed hand interactions from the Validated Touch-Video Database. We examined how the brain encodes basic body cues, such as hand orientation and viewing perspective; sensory aspects, including the type of touch (e.g., stroking vs. pressing; hand vs. object touch) and the object involved (e.g., knife, brush); and emotional-affective dimensions. Using multivariate decoding, we found that information about body cues was present within 60 ms, and that sensory and emotional details, including valence, arousal, and pain, were present around 130 ms, demonstrating efficient early visual processing. Threat was most clearly identified by approximately 265 ms, similarly involving visual regions, suggesting that such evaluations require slightly extended neural engagement. Our findings reveal that bottom-up, automatic visual processing is integral to complex tactile assessments and is important for rapidly extracting both the personal relevance and the sensory and emotional dimensions of touch.
Publisher
Cold Spring Harbor Laboratory