Abstract
To interpret the world and make accurate perceptual decisions, the brain must combine information across sensory modalities. For instance, it must combine vision and hearing to localize objects based on their image and sound. Probability theory suggests that evidence from multiple independent cues should be combined additively, and several studies support this hypothesis1–14. However, other studies suggest that mammals, particularly mice, can exhibit sensory bias during audiovisual integration15–19; furthermore, the cortical substrates of multisensory integration remain uncertain20. Here we show that to localize a stimulus, mice combine auditory and visual spatial cues additively, a computation supported by additive multisensory integration in frontal cortex. We developed an audiovisual localization task where mice turn a wheel to indicate the joint position of an image and a sound. Scanning optogenetic inactivation21–24 during stimulus presentation showed that visual cortex contributes unisensory information whereas frontal cortex contributes multisensory information to the mouse’s decision. Neuropixels recordings of >10,000 neurons indicated that neural activity in frontal area MOs (secondary motor cortex) reflects an additive combination of visual and auditory signals. An accumulator model2,25,26 applied to the sensory representations of MOs neurons reproduced both the observed choices and reaction times. Sensory signals in MOs were specific to trained mice, and indeed the same accumulator applied to naïve mice failed to perform the task. These results suggest that frontal cortex adapts through learning to integrate information across sensory cortices, providing a signal that is then transformed into a binary decision by a downstream accumulator.
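The computation the abstract describes can be illustrated with a minimal simulation: evidence from each modality is summed into a single decision variable (the additive combination), which drifts toward a bound; the bound crossed determines the choice and the crossing time gives the reaction time. This is a generic bounded-accumulation sketch, not the authors' fitted model; all parameter names and values below are illustrative assumptions.

```python
import random

def simulate_trial(visual_drift, auditory_drift, bound=1.0, noise_sd=0.1,
                   dt=0.001, max_time=2.0, rng=None):
    """Additive accumulator sketch (illustrative, not the paper's fitted model).

    Evidence from the two modalities is combined additively into one drift
    rate; a noisy decision variable integrates that drift until it crosses
    +bound ("right") or -bound ("left"), yielding a choice and reaction time.
    """
    rng = rng or random.Random()
    drift = visual_drift + auditory_drift  # additive combination of cues
    x, t = 0.0, 0.0
    while t < max_time:
        # Euler step of a drift-diffusion process: deterministic drift
        # plus Gaussian noise scaled by sqrt(dt).
        x += drift * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
        t += dt
        if x >= bound:
            return "right", t
        if x <= -bound:
            return "left", t
    return "no_choice", t  # timed out before either bound was reached
```

With congruent cues (both drifts pointing the same way) the combined drift is large and bound crossings are fast; with conflicting cues the drifts partially cancel, slowing and destabilizing the choice, which is the behavioral signature of additive integration.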
Publisher: Cold Spring Harbor Laboratory
Cited by 12 articles.