Gesture-speech coupling in persons with aphasia: A kinematic-acoustic analysis
Source
Journal of Experimental Psychology: General, 152(5), (2023), pp. 1469-1483
Publication type
Article / Letter to editor
Organization
SW OZ DCC PL
Journal title
Journal of Experimental Psychology: General
Volume
vol. 152
Issue
iss. 5
Languages used
English (eng)
Page start
p. 1469
Page end
p. 1483
Subject
Psycholinguistics
Abstract
Aphasia is a profound language pathology hampering speech production and/or comprehension. People with aphasia (PWA) use more manual gestures than non-brain-injured (NBI) individuals. This intuitively suggests that gesture is compensatory in some way, but evidence for a gesture-boosting effect on speech processes is mixed. The status quo in gesture research with PWA is an emphasis on categorical analyses of gesture types, focusing on how often they are recruited and on whether more or less gesturing aids communication or speaking. However, there are increasingly loud calls to investigate gesture and speech as continuous, entangled modes of expression. In NBI adults, expressive moments of gesture and speech are synchronized at the prosodic level, but how this multimodal prosody is instantiated in PWA has been neglected. In the current study, we perform the first acoustic-kinematic gesture-speech analysis in persons with aphasia (i.e., Wernicke's, Broca's, anomic) relative to age-matched controls, applying several multimodal signal-analysis methods. Specifically, we related peaks in the speech signal (smoothed amplitude-envelope change) to the nearest peaks in the gesture acceleration profile. We found that the magnitudes of gesture and speech peaks are positively related across the groups, though more variably so for PWA, and that such coupling was related to less severe aphasia-related symptoms. No differences were found between controls and PWA in the temporal ordering of speech-envelope versus acceleration peaks. Finally, we show that both gesture and speech have a slower quasi-rhythmic structure, indicating that, alongside speech, gesture is slowed down too. The current results indicate a basic gesture-speech coupling mechanism that is not fully reliant on core linguistic competences, as it is found relatively intact in PWA.
This resonates with a recent biomechanical theory of gesture, which renders gesture-vocal coupling as fundamental and prior to the (evolutionary) development of core linguistic competences.
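The peak-coupling analysis described in the abstract can be sketched in outline: extract a smoothed amplitude envelope from the speech signal, take its rate of change, locate its peaks, and pair each one with the nearest peak in the gesture acceleration profile, yielding paired magnitudes and temporal offsets. The sketch below is illustrative only; the synthetic signals, smoothing window, and sampling rate are assumptions, not the authors' actual pipeline.

```python
# Illustrative sketch of envelope-change vs. acceleration peak coupling.
# All signals here are synthetic stand-ins; parameters are assumptions.
import numpy as np
from scipy.signal import hilbert, find_peaks

fs = 100.0                               # shared sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)

# Stand-in "speech": a 20 Hz carrier with quasi-prosodic amplitude modulation.
speech = (np.sin(2 * np.pi * 20 * t)
          * (1 + 0.8 * np.sin(2 * np.pi * 2 * t))
          * (1 + 0.3 * np.sin(2 * np.pi * 0.3 * t)))
# Stand-in "gesture": a quasi-rhythmic wrist-position trace.
gesture_pos = np.sin(2 * np.pi * 2 * t + 0.3)

# Smoothed amplitude envelope of speech, then its rate of change.
envelope = np.abs(hilbert(speech))
kernel = np.hanning(int(0.2 * fs))       # 200 ms smoothing window, assumed
kernel /= kernel.sum()
envelope = np.convolve(envelope, kernel, mode="same")
env_change = np.gradient(envelope, 1 / fs)

# Gesture acceleration: second time-derivative of position.
accel = np.gradient(np.gradient(gesture_pos, 1 / fs), 1 / fs)

# Locate peaks in both profiles.
env_peaks, _ = find_peaks(env_change)
acc_peaks, _ = find_peaks(accel)

# Pair each envelope-change peak with its nearest acceleration peak,
# collecting paired magnitudes and the temporal offset between them.
pairs = []
for p in env_peaks:
    q = acc_peaks[np.argmin(np.abs(acc_peaks - p))]
    pairs.append((env_change[p], accel[q], (q - p) / fs))

env_mag, acc_mag, lags = map(np.array, zip(*pairs))
r = np.corrcoef(env_mag, acc_mag)[0, 1]  # magnitude coupling across peak pairs
print(f"n pairs = {len(pairs)}, r = {r:.2f}, mean lag = {lags.mean():.3f} s")
```

In a real analysis, the correlation of paired magnitudes and the distribution of lags would then be compared between PWA and control groups.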