Alpha and beta oscillations index semantic congruency between speech and gestures in clear and degraded speech
Source
Journal of Cognitive Neuroscience, vol. 30, iss. 8 (2018), pp. 1086-1097
Publication type
Article / Letter to editor
Organization
PI Group Neurobiology of Language
Taalwetenschap
SW OZ DCC PL
PI Group Neuronal Oscillations
Journal title
Journal of Cognitive Neuroscience
Volume
vol. 30
Issue
iss. 8
Languages used
English (eng)
Page start
p. 1086
Page end
p. 1097
Subject
110 000 Neurocognition of Language; DI-BCB_DCC_Theme 1: Language and Communication; Giving cognition a hand: Linking spatial cognition to linguistic expression in native and late signers and bimodal bilinguals; Giving speech a hand: How functional brain networks support gestural enhancement of language; Language & Communication; Language in our hands: Acquisition of spatial language in deaf and hearing children; Multimodal language and communication; Psycholinguistics; Language in Interaction
Abstract
Previous work revealed that visual semantic information conveyed by gestures can enhance degraded speech comprehension, but the mechanisms underlying these integration processes under adverse listening conditions remain poorly understood. We used magnetoencephalography (MEG) to investigate how oscillatory dynamics support speech-gesture integration when integration load is manipulated by auditory (e.g., speech degradation) and visual semantic (e.g., gesture congruency) factors. Participants were presented with videos of an actress uttering an action verb in clear or degraded speech, accompanied by a matching (mixing gesture + "mixing") or mismatching (drinking gesture + "walking") gesture. In clear speech, alpha/beta power was more suppressed in the left inferior frontal gyrus and motor and visual cortices when integration load increased in response to mismatching versus matching gestures. In degraded speech, beta power was less suppressed over the posterior superior temporal sulcus (STS) and medial temporal lobe for mismatching compared with matching gestures, showing that integration load was lowest when speech was degraded and mismatching gestures could not be integrated to disambiguate the degraded signal. Our results thus provide novel insights into how low-frequency oscillatory modulations in different parts of the cortex support the semantic audiovisual integration of gestures in clear and degraded speech: When speech is clear, the left inferior frontal gyrus and motor and visual cortices engage because higher-level semantic information increases semantic integration load. When speech is degraded, the posterior STS/middle temporal gyrus and medial temporal lobe are less engaged because integration load is lowest when visual semantic information does not aid lexical retrieval and speech and gestures cannot be integrated.
Funder
NWO (Grant code: info:eu-repo/grantAgreement/NWO/Gravitation/024.001.006)
This item appears in the following Collection(s)
- Academic publications [246515]
- Donders Centre for Cognitive Neuroimaging [4040]
- Electronic publications [134102]
- Faculty of Arts [30004]
- Faculty of Social Sciences [30494]
- Open Access publications [107633]