Abstract
During multisensory speech perception, slow δ oscillations (∼1–3 Hz) in the listener's brain synchronize with the speech signal, likely engaging in speech signal decomposition. Notable fluctuations in the speech amplitude envelope, reflecting speaker prosody, temporally align with articulatory and body gestures, and both provide complementary sensory cues that temporally structure speech. Further, δ oscillations in the left motor cortex appear to align with speech and musical beats, suggesting a possible role in the temporal structuring of (quasi-)rhythmic stimulation. We extended the role of δ oscillations to audiovisual asynchrony detection as a test case of the temporal analysis of multisensory prosodic fluctuations in speech. We recorded electroencephalographic (EEG) responses during an audiovisual asynchrony detection task while participants watched videos of a speaker. We filtered the speech signal to remove verbal content and examined how visual and auditory prosodic features temporally (mis-)align. Results show that (1) participants accurately detected audiovisual asynchrony; (2) δ power in the left motor cortex increased in response to audiovisual asynchrony, and the difference in δ power between asynchronous and synchronous conditions predicted behavioral performance; and (3) δ–β coupling in the left motor cortex decreased when listeners could not accurately map visual and auditory prosodies. Finally, both behavioral and neurophysiological effects were altered when the speaker's face was degraded by a visual mask. Together, these findings suggest that motor δ oscillations support asynchrony detection of multisensory prosodic fluctuations in speech.

Significance Statement
Speech perception is facilitated by regular prosodic fluctuations that temporally structure the auditory signal. Auditory speech processing involves the left motor cortex and associated δ oscillations.
However, visual prosody (i.e., a speaker's body movements) complements auditory prosody, and it is unclear how the brain temporally analyzes different prosodic features in multisensory speech perception. We combined an audiovisual asynchrony detection task with electroencephalographic (EEG) recordings to investigate how δ oscillations support the temporal analysis of multisensory speech. Results confirmed that asynchrony detection of visual and auditory prosodies leads to increased δ power in the left motor cortex that correlates with performance. We conclude that δ oscillations are recruited to resolve detected temporal asynchrony in multisensory speech perception.