Evidence for children's online integration of simultaneous information from speech and iconic gestures: An ERP study
Publication year
2020
Number of pages
12 p.
Source
Language, Cognition and Neuroscience, 35, 10, (2020), pp. 1283-1294
Publication type
Article / Letter to editor
Organization
Toegepaste Taalwetenschap
SW OZ DCC PL
Journal title
Language, Cognition and Neuroscience
Volume
vol. 35
Issue
iss. 10
Languages used
English (eng)
Page start
p. 1283
Page end
p. 1294
Subject
Language & Communication; Multimodal language and communication; Psycholinguistics
Abstract
Children perceive iconic gestures along with the speech they hear. Previous studies have shown that children integrate information from both modalities. Yet it is not known whether children can integrate both types of information simultaneously, as soon as they are available (as adults do), or whether they initially process them separately and integrate them later. Using electrophysiological measures, we examined the online neurocognitive processing of gesture-speech integration in 6- to 7-year-old children. We focused on the N400 event-related potential component, which is modulated by semantic integration load. Children watched video clips of matching or mismatching gesture-speech combinations, which varied the semantic integration load. The ERPs showed that the amplitude of the N400 was larger in the mismatching condition than in the matching condition. This finding provides the first neural evidence that by the age of 6 or 7, children integrate multimodal semantic information in an online fashion comparable to that of adults.