Preliminary Evidence for Global Properties in Human Listeners During Natural Auditory Scene Perception

Authors:

McMullin, Margaret A. (1); Kumar, Rohit (2); Higgins, Nathan C. (3); Gygi, Brian (4); Elhilali, Mounya (2); Snyder, Joel S. (1)

Affiliations:

1. Department of Psychology, University of Nevada, Las Vegas, Las Vegas, NV, USA

2. Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD, USA

3. Department of Communication Sciences & Disorders, University of South Florida, Tampa, FL, USA

4. East Bay Institute for Research and Education, Martinez, CA, USA

Abstract

Theories of auditory and visual scene analysis suggest that the perception of scenes relies on the identification and segregation of objects within them, resembling a detail-oriented processing style. However, a more global process may also occur while analyzing scenes, as has been demonstrated in the visual domain. To our knowledge, a similar line of research has not been pursued in the auditory domain; we therefore evaluated the contributions of high-level global and low-level acoustic information to auditory scene perception. An additional aim was to increase the field's ecological validity by using, and making openly available, a new collection of high-quality auditory scenes. Participants rated scenes on eight global properties (e.g., open vs. enclosed), and an acoustic analysis evaluated which low-level features predicted the ratings. We submitted the acoustic measures and the average ratings of the global properties to separate exploratory factor analyses (EFAs). The EFA of the acoustic measures revealed a seven-factor structure explaining 57% of the variance in the data, while the EFA of the global-property ratings revealed a two-factor structure explaining 64% of the variance. Regression analyses revealed that each global property was predicted by at least one acoustic variable (R2 = 0.33–0.87). We extended these findings with deep neural network models, examining correlations between human ratings of the global properties and deep embeddings from two computational models: an object-based model and a scene-based model. The results indicate that participants' ratings are more strongly explained by a global analysis of the scene setting, though the relationship between scene perception and auditory perception is multifaceted, with differing correlation patterns evident between the two models. Taken together, our results provide evidence for the ability to perceive auditory scenes from a global perspective.
Some of the acoustic measures predicted ratings of global scene properties, suggesting that representations of auditory objects may be transformed through many stages of processing in the ventral auditory stream, similar to what has been proposed for the ventral visual stream. These findings, and the open availability of our scene collection, will make future studies on perception, attention, and memory for natural auditory scenes possible.
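The factor-then-regression pipeline summarized above can be sketched as follows. This is a minimal, hypothetical illustration with synthetic data, not the authors' code: PCA via SVD stands in for the rotated EFA reported in the paper, and "openness" and the acoustic measures are invented placeholders.

```python
# Hypothetical sketch: extract latent factors from acoustic measures,
# then regress a global-property rating onto the factor scores.
# All data are synthetic; PCA is a simple stand-in for a rotated EFA.
import numpy as np

rng = np.random.default_rng(0)
n_scenes, n_measures, n_factors = 200, 12, 7

# Synthetic acoustic measures (stand-ins for, e.g., spectral centroid,
# modulation energy); rows are scenes, columns are measures.
X = rng.normal(size=(n_scenes, n_measures))

# Synthetic mean ratings of one global property (e.g., "openness"),
# driven partly by the first acoustic measure plus noise.
y = 0.8 * X[:, 0] + rng.normal(scale=0.5, size=n_scenes)

# Factor extraction via SVD of the centered data (PCA as an EFA stand-in).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
factors = Xc @ Vt[:n_factors].T  # (n_scenes, n_factors) factor scores

# Ordinary least-squares regression of the rating on the factor scores.
design = np.column_stack([np.ones(n_scenes), factors])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
resid = y - design @ beta
r2 = 1.0 - resid.var() / y.var()
print(f"R^2 = {r2:.2f}")
```

On real data, the ratings of each global property would be regressed on the extracted acoustic factors in this way, yielding the per-property R2 values the abstract reports.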

Funder

National Defense Science & Engineering Graduate (NDSEG) Fellowship Program

Publisher

MIT Press

