Author:
Pápai Márta Szabina, Torralba Mireia, Soto-Faraco Salvador
Abstract
According to many reports, cross-modal interactions can enhance visual perception, even when visual events appear below awareness. Yet, the mechanism underlying this cross-modal enhancement is still unclear. The present study addressed whether cross-modal integration based on bottom-up processing can break through the threshold of awareness. We used a binocular rivalry protocol and measured ERP responses and perceptual switches time-locked to flashes, sounds, or flash-sound co-occurrences. Behaviourally, perceptual switches happened earliest when subthreshold flashes co-occurred with sounds. Yet, this cross-modal facilitation never surpassed the benchmark set by probability summation, suggesting independence rather than integration of sensory signals. Likewise, the ERPs to audiovisual events did not differ from the summed unimodal ERPs, indicating that the cross-modal behavioural benefit for unaware visual events can be explained by the independent contributions of unisensory signals, without the need for a multisensory integration mechanism. Hence, even though cross-modal benefits appeared behaviourally, we suggest that this facilitation might originate from well-known bottom-up attentional capture processes driven by each individual sensory stimulus.
Publisher
Cold Spring Harbor Laboratory