Subject: Action, intention, and motor control
Former Organization: F.C. Donders Centre for Cognitive Neuroimaging SW OZ NICI CO
Journal title: Journal of Cognitive Neuroscience
Abstract:
Although generally studied in isolation, action observation and speech comprehension go hand in hand during everyday human communication: people gesture while they speak. Previous research has shown that a tight link exists between spoken language and such hand gestures. This study investigates for the first time the neural correlates of co-speech gestures and the neural locus of the integration of speech and gesture in a naturally occurring situation, i.e. as an integrated whole embedded in contextual information. fMRI data were gathered while subjects viewed a) meaningful and meaningless gestures in the absence of speech and b) the same meaningful gestures in the context of speech, time-locked to the verbal information in a sentence. First, we tested whether co-speech gestures are recognised as meaningful actions in the absence of speech. In the second part, verbal and/or gestural content matched or mismatched the preceding speech context. Integration load was expected to vary with this manipulation, revealing regions specific to and common to gesture and speech processing. No areas were activated more strongly by meaningful than by meaningless gestures when presented without speech. With speech, increased integration load for both language and gesture recruited the left inferior frontal cortex. Parietal and temporal regions showed gesture- and speech-specific responses. We argue a) that co-speech gestures do not convey meaning on their own and b) that both types of information are integrated into the preceding context by the left inferior frontal cortex.