Abstract
Purpose
To investigate cortical tracking of speech (CTS) in adults who stutter (AWS) compared to typically fluent adults (TFA), in order to test the involvement of the speech-motor network in tracking auditory information.
Method
Participants’ EEG was recorded while they either simply listened to sentences (listening only) or completed them by naming a picture (listening-for-speaking), thus manipulating the upcoming involvement of speech production. We analyzed speech-brain coherence and brain connectivity during listening.
Results
During the listening-for-speaking task, AWS exhibited reduced CTS in the 3-5 Hz range (theta), corresponding to the syllabic rhythm. The effect was localized in left inferior parietal and right pre-/supplementary motor regions. Connectivity analyses revealed that TFA had stronger information transfer in the theta range in both tasks in fronto-temporo-parietal regions. When considering the whole sample of participants, increased connectivity from the right superior temporal cortex to the left sensorimotor cortex was correlated with faster naming times in the listening-for-speaking task.
Conclusions
Atypical speech-motor functioning in stuttering also impacts speech perception, especially in situations requiring articulatory alertness. The involvement of frontal and (pre-)motor regions in CTS in typically fluent adults is highlighted. Speech perception in individuals with speech-motor deficits should be further investigated, especially when smooth transitioning between listening and speaking is required, such as in real-life conversational settings.
Publisher
Cold Spring Harbor Laboratory