Dynamic Weighting of Time-Varying Visual and Auditory Evidence During Multisensory Decision Making

Authors:

Rosanne R. M. Tuip (1,2), Wessel van der Ham (1), Jeannette A. M. Lorteije (1,3), Filip Van Opstal (2)

Affiliations:

1. Swammerdam Institute for Life Sciences, Center for Neuroscience, Faculty of Science, University of Amsterdam, 1098 XH Amsterdam, The Netherlands

2. Department of Psychology, Brain and Cognition, University of Amsterdam, 1098 XH Amsterdam, The Netherlands

3. Animal Welfare Body, Radboud University/UMC, 6525 EZ Nijmegen, The Netherlands

Abstract

Perceptual decision-making in a dynamic environment requires two integration processes: integration of sensory evidence from multiple modalities to form a coherent representation of the environment, and integration of evidence across time to accurately make a decision. Only recently have studies started to unravel how evidence from two modalities is accumulated across time to form a perceptual decision. One important question is whether information from the individual senses contributes equally to multisensory decisions. We designed a new psychophysical task that measures how visual and auditory evidence is weighted across time. Participants were asked to discriminate between two visual gratings and/or two sounds presented to the right and left ear, based on contrast and loudness, respectively. We varied the evidence, i.e., the contrast of the gratings and the amplitude of the sounds, over time. Results showed a significant increase in accuracy on multisensory trials compared to unisensory trials, indicating that discriminating between two sources improves when multisensory information is available. Furthermore, we found that early evidence contributed most to sensory decisions. The weighting of unisensory information during audiovisual decision-making changed dynamically over time: a first epoch was characterized by both visual and auditory weighting, vision dominated during a second epoch, and a third epoch finalized the weighting profile with auditory dominance. Our results suggest that, in our task, multisensory improvement is generated by a mechanism that requires cross-modal interactions but also dynamically evokes dominance switching.
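
A common way to quantify how time-varying evidence from each modality is weighted in the final choice is to regress trial-by-trial decisions on the time-binned evidence from both modalities with logistic regression, yielding a temporal weighting profile (psychophysical kernel) per modality. The abstract does not specify the authors' exact analysis, so the sketch below is only an illustration of that general approach on simulated data; all variable names, bin counts, and simulation parameters are hypothetical.

```python
# Illustrative sketch (not the authors' analysis): estimating temporal weighting
# of visual and auditory evidence with logistic regression on simulated trials.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_bins = 2000, 6  # hypothetical number of trials and time bins

# Simulated per-bin evidence: signed contrast (visual) and loudness (auditory)
# differences between the two stimuli on each trial.
vis_evidence = rng.normal(0.0, 1.0, (n_trials, n_bins))
aud_evidence = rng.normal(0.0, 1.0, (n_trials, n_bins))

# Simulated choices generated with early bins weighted more strongly,
# mimicking a primacy effect like the one reported in the abstract.
true_w = np.linspace(1.0, 0.3, n_bins)
drive = vis_evidence @ true_w + aud_evidence @ true_w
choice = (drive + rng.normal(0.0, 1.0, n_trials) > 0).astype(int)  # 1 = chose "right" stimulus

# Regress choice on all time-binned evidence; the fitted coefficients give the
# visual and auditory temporal weighting profiles (large C ~ no regularization).
X = np.hstack([vis_evidence, aud_evidence])
model = LogisticRegression(C=1e6).fit(X, choice)
vis_kernel = model.coef_[0][:n_bins]
aud_kernel = model.coef_[0][n_bins:]
print("visual weights:  ", np.round(vis_kernel, 2))
print("auditory weights:", np.round(aud_kernel, 2))
```

Comparing the two recovered kernels bin by bin is one way to read out epochs of visual versus auditory dominance over the course of a trial.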

Funder

Universiteit van Amsterdam

Amsterdam Neuroscience

Publisher

Brill

Subject

Cognitive Neuroscience, Computer Vision and Pattern Recognition, Sensory Systems, Ophthalmology, Experimental and Cognitive Psychology
