Author:
Hanada G.M., Ahveninen J., Calabro F.J., Yengo-Kahn A., Vaina L.M.
Abstract
The everyday environment presents our sensory systems with many competing inputs from different modalities. The ability to filter these multisensory inputs and identify useful spatial cues is necessary for detecting and processing relevant information efficiently. In the present study, we investigate how feature-based attention affects the detection of motion across sensory modalities. Specifically, we sought to determine how subjects use intramodal, crossmodal auditory, and combined audiovisual motion cues to attend to specific visual motion signals. The results show that in most cases, both visual and auditory cues enhance feature-based orienting to a visual motion pattern presented among distractor patterns. Furthermore, in many cases, detection of transparent motion patterns was significantly more accurate after combined visual-auditory cues than after unimodal attention cues. Whereas previous studies have demonstrated crossmodal effects of spatial attention, our results demonstrate a spread of crossmodal feature-based attention cues, which were matched for the detection threshold of the visual target. These effects were evident in comparisons between cued and uncued conditions, as well as in analyses comparing the effects of valid vs. invalid cues.
Publisher
Cold Spring Harbor Laboratory