How visual cues to speech rate influence speech perception

Authors:

Hans Rutger Bosker 1,2 (ORCID), David Peeters 1,2,3, Judith Holler 1,2

Affiliation:

1. Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands

2. Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands

3. Tilburg Center for Cognition and Communication (TiCC), Department of Communication and Cognition, Tilburg University, Tilburg, The Netherlands

Abstract

Spoken words are highly variable and therefore listeners interpret speech sounds relative to the surrounding acoustic context, such as the speech rate of a preceding sentence. For instance, a vowel midway between short /ɑ/ and long /a:/ in Dutch is perceived as short /ɑ/ in the context of preceding slow speech, but as long /a:/ if preceded by a fast context. Despite the well-established influence of visual articulatory cues on speech comprehension, it remains unclear whether visual cues to speech rate also influence subsequent spoken word recognition. In two “Go Fish”–like experiments, participants were presented with audio-only (auditory speech + fixation cross), visual-only (muted videos of a talking head), and audiovisual (speech + videos) context sentences, followed by ambiguous target words containing vowels midway between short /ɑ/ and long /a:/. In Experiment 1, target words were always presented auditorily, without visual articulatory cues. Although the audio-only and audiovisual contexts induced a rate effect (i.e., more long /a:/ responses after fast contexts), the visual-only condition did not. When, in Experiment 2, target words were presented audiovisually, rate effects were observed in all three conditions, including visual-only. This suggests that visual cues to speech rate in a context sentence influence the perception of following visual target cues (e.g., duration of lip aperture), which at an audiovisual integration stage bias participants’ target categorisation responses. These findings contribute to a better understanding of how what we see influences what we hear.

Funder

Nederlandse Organisatie voor Wetenschappelijk Onderzoek

Max-Planck-Gesellschaft

Publisher

SAGE Publications

Subject

Physiology (medical), General Psychology, Experimental and Cognitive Psychology, General Medicine, Neuropsychology and Physiological Psychology, Physiology

Cited by 6 articles

1. Individual differences in the use of top-down versus bottom-up cues to resolve phonetic ambiguity;Attention, Perception, & Psychophysics;2024-05-29

2. Audiovisual Mandarin Lexical Tone Perception in Quiet and Noisy Contexts: The Influence of Visual Cues and Speech Rate;Journal of Speech, Language, and Hearing Research;2022-11-17

3. Evidence For Selective Adaptation and Recalibration in the Perception of Lexical Stress;Language and Speech;2021-07-06

4. Beat gestures influence which speech sounds you hear;Proceedings of the Royal Society B: Biological Sciences;2021-01-27

5. Eye-tracking the time course of distal and global speech rate effects;Journal of Experimental Psychology: Human Perception and Performance;2020-10
