
Fulltext:
225307.pdf
Embargo:
until further notice
Size:
1.152Mb
Format:
PDF
Description:
publisher's version
Publication year
2020
Number of pages
9 p.
Source
The Journal of Neuroscience, 40, 49, (2020), pp. 9467-9475
ISSN
Publication type
Article / Letter to editor

Organization
SW OZ DCC PL
PI Group Language and Computation in Neural Systems
Journal title
The Journal of Neuroscience
Volume
vol. 40
Issue
iss. 49
Languages used
English (eng)
Page start
p. 9467
Page end
p. 9475
Subject
270 Language and Computation in Neural Systems; Psycholinguistics
Abstract
Neural oscillations track linguistic information during speech comprehension (e.g., Ding et al., 2016; Keitel et al., 2018), and are known to be modulated by acoustic landmarks and speech intelligibility (e.g., Doelling et al., 2014; Zoefel & VanRullen, 2015). However, studies investigating linguistic tracking have either relied on non-naturalistic isochronous stimuli or failed to fully control for prosody. Therefore, it is still unclear whether low-frequency activity tracks linguistic structure during natural speech, where linguistic structure does not follow such a palpable temporal pattern. Here, we measured electroencephalography (EEG) and manipulated the presence of semantic and syntactic information apart from the timescale of their occurrence, while carefully controlling for the acoustic-prosodic and lexical-semantic information in the signal. EEG was recorded while 29 adult native speakers (22 women, 7 men) listened to naturally-spoken Dutch sentences, jabberwocky controls with morphemes and sentential prosody, word lists with lexical content but no phrase structure, and backwards acoustically-matched controls. Mutual information (MI) analysis revealed sensitivity to linguistic content: MI was highest for sentences at the phrasal (0.8-1.1 Hz) and lexical timescale (1.9-2.8 Hz), suggesting that the delta-band is modulated by lexically-driven combinatorial processing beyond prosody, and that linguistic content (i.e., structure and meaning) organizes neural oscillations beyond the timescale and rhythmicity of the stimulus. This pattern is consistent with neurophysiologically inspired models of language comprehension (Martin, 2016, 2020; Martin & Doumas, 2017) where oscillations encode endogenously generated linguistic content over and above exogenous or stimulus-driven timing and rhythm information.

SIGNIFICANCE STATEMENT: Biological systems like the brain encode their environment not only by reacting in a series of stimulus-driven responses, but by combining stimulus-driven information with endogenous, internally-generated, inferential knowledge and meaning. Understanding language from speech is the human benchmark for this. Much research focusses on the purely stimulus-driven response, but here, we focus on the goal of language behavior: conveying structure and meaning. To that end, we use naturalistic stimuli that contrast acoustic-prosodic and lexical-semantic information to show that, during spoken language comprehension, oscillatory modulations reflect computations related to inferring structure and meaning from the acoustic signal. Our experiment provides the first evidence to date that compositional structure and meaning organize the oscillatory response, above and beyond prosodic and lexical controls.
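The abstract reports mutual information between the speech signal and EEG in the phrasal (0.8-1.1 Hz) and lexical (1.9-2.8 Hz) bands. The following is a minimal Python sketch of how such a band-limited MI estimate could be computed; it is not the authors' analysis pipeline. The sampling rate, the histogram-based phase MI estimator, and all function and variable names (bandpass, phase_mi, FS) are illustrative assumptions.

import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from sklearn.metrics import mutual_info_score

FS = 250  # assumed sampling rate (Hz) for envelope and EEG; illustrative only

def bandpass(x, lo, hi, fs=FS, order=2):
    # Zero-phase Butterworth band-pass filter, in second-order sections for stability.
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def phase_mi(speech_env, eeg, lo, hi, n_bins=16):
    # Histogram-based MI between band-limited speech-envelope phase and EEG phase.
    ph_speech = np.angle(hilbert(bandpass(speech_env, lo, hi)))
    ph_eeg = np.angle(hilbert(bandpass(eeg, lo, hi)))
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    return mutual_info_score(np.digitize(ph_speech, edges),
                             np.digitize(ph_eeg, edges))

# Synthetic demo: a shared 1 Hz component yields nonzero MI in the phrasal band.
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / FS)
speech_env = np.sin(2 * np.pi * 1.0 * t) + 0.5 * rng.standard_normal(t.size)
eeg = np.sin(2 * np.pi * 1.0 * t + 0.3) + 0.5 * rng.standard_normal(t.size)
print("phrasal-band MI (0.8-1.1 Hz):", phase_mi(speech_env, eeg, 0.8, 1.1))
print("lexical-band MI (1.9-2.8 Hz):", phase_mi(speech_env, eeg, 1.9, 2.8))

In practice such estimates would be computed per participant and condition and compared against surrogate or control data (e.g., the backwards speech condition described above); the simple histogram estimator here stands in for whatever MI estimator the published analysis actually used.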
This item appears in the following Collection(s)
- Academic publications [227942]
- Donders Centre for Cognitive Neuroimaging [3570]
- Electronic publications [107434]
- Faculty of Social Sciences [28476]