Semantically related gestures move alike: Towards a distributional semantics of gesture kinematics
Publication year
2021
Publisher
Cham: Springer
ISBN
9783030778170
In
Duffy, V.G. (ed.), Digital human modeling and applications in health, safety, ergonomics and risk management. Human body, motion and behavior: Proceedings HCII 2021, pp. 269-287
Annotation
HCII 2021: 12th International Conference on Human-Computer Interaction (July 24-29, 2021)
Publication type
Article in monograph or in proceedings
Editor(s)
Duffy, V.G.
Organization
SW OZ DCC PL
PI Group Intention & Action
Toegepaste Taalwetenschap
PI Group Neurobiology of Language
Languages used
English (eng)
Book title
Duffy, V.G. (ed.), Digital human modeling and applications in health, safety, ergonomics and risk management. Human body, motion and behavior: Proceedings HCII 2021
Page start
p. 269
Page end
p. 287
Subject
110 000 Neurocognition of Language; 111 000 Intention & Action; Cultural Cognition and Multimodal Interaction; Language & Communication; Psycholinguistics; Language in Interaction
Abstract
Most manual communicative gestures that humans produce cannot be looked up in a dictionary, as these gestures derive their meaning in large part from the communicative context and are not conventionalized. However, it remains understudied to what extent the communicative signal as such - bodily postures in movement, or kinematics - can inform us about gesture semantics. Can we construct, in principle, a distribution-based semantics of gesture kinematics, similar to how word vectorization methods in NLP (Natural Language Processing) are now widely used to study semantic properties of text and speech? For such a project to get off the ground, we need to know the extent to which semantically similar gestures are also likely to be kinematically similar. In Study 1 we assess whether semantic word2vec distances between the concepts participants were explicitly instructed to convey in silent gestures relate to the kinematic distances of these gestures, as obtained from Dynamic Time Warping (DTW). In a second, dyadic director-matcher study we assess kinematic similarity between spontaneous co-speech gestures produced by interacting participants. Participants were asked, before and after they interacted, how they would name the objects. The semantic distances between the resulting names were related to the kinematic distances between the gestures made while conveying those objects in the interaction. We find that the gestures' semantic relatedness is reliably predictive of their kinematic relatedness across these highly divergent studies, which suggests that developing an NLP-style method for deriving semantic relatedness from kinematics is a promising avenue for future work in automated multimodal recognition. Deeper implications for statistical learning processes in multimodal language are discussed.
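The abstract's core comparison - semantic distance between concept labels versus kinematic distance between the gestures conveying them - can be illustrated with a minimal sketch. This is not the authors' implementation: the DTW routine below is a textbook one-dimensional version, and the word vectors are toy stand-ins for word2vec embeddings (in practice one would load pretrained embeddings, e.g. via gensim, and use multidimensional motion-tracking traces).

```python
from math import sqrt

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D movement traces."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def cosine_distance(u, v):
    """1 - cosine similarity, the usual word2vec semantic distance."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = sqrt(sum(x * x for x in u)) * sqrt(sum(x * x for x in v))
    return 1.0 - dot / norm

# Toy data (hypothetical): wrist-height traces for two gestures, plus
# made-up 3-D "embeddings" for the concepts they convey.
gesture_hammer = [0.0, 0.5, 1.0, 0.5, 0.0, 0.5, 1.0]
gesture_nail   = [0.0, 0.4, 0.9, 0.5, 0.1, 0.5, 0.9]
vec_hammer = [0.9, 0.1, 0.2]
vec_nail   = [0.8, 0.2, 0.3]

kin_dist = dtw_distance(gesture_hammer, gesture_nail)
sem_dist = cosine_distance(vec_hammer, vec_nail)
print(kin_dist, sem_dist)
```

In the studies, such pairwise distances would be computed for all gesture/concept pairs and the two distance matrices related statistically (e.g. with a Mantel-style or mixed-model analysis) to test whether semantic relatedness predicts kinematic relatedness.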
Funder
NWO (Grant code:info:eu-repo/grantAgreement/NWO/Gravitation/024.001.006)
This item appears in the following Collection(s)
- Academic publications [246216]
- Donders Centre for Cognitive Neuroimaging [4037]
- Electronic publications [133894]
- Faculty of Arts [30004]
- Faculty of Social Sciences [30432]
- Open Access publications [107414]