Automated recognition of grooming behavior in wild chimpanzees
Publication year
2024
In
Samuelson, L.K.; Frank, S.L.; Toneva, M.; Mackey, A.; Hazeltine, E. (eds.), Proceedings of the 46th Annual Meeting of the Cognitive Science Society, pp. 691-697
Annotation
CogSci 2024: 46th Annual Meeting of the Cognitive Science Society (Rotterdam, the Netherlands, 24-27 July, 2024)
Publication type
Article in monograph or in proceedings
Editor(s)
Samuelson, L.K.
Frank, S.L.
Toneva, M.
Mackey, A.
Hazeltine, E.
Organization
AI Taal, Spraak en Communicatie
SW OZ DCC PL
Languages used
English (eng)
Book title
Samuelson, L.K.; Frank, S.L.; Toneva, M. (eds.), Proceedings of the 46th Annual Meeting of the Cognitive Science Society
Page start
p. 691
Page end
p. 697
Subject
Language & Communication; Psycholinguistics
Abstract
Video recording is a widely used tool for studying animal behavior, especially in fields such as primatology. Primatologists rely on video data to analyze and research topics such as social grooming to uncover subtle mechanisms behind complex social behavior and structures. Insights into these social behaviors may provide us with a better understanding of our closest living relatives, but also new theories and insights into our own behavior. However, analyzing this type of data using manual annotation is currently a time-consuming task. Here we present an end-to-end pipeline to track chimpanzee (Pan troglodytes) poses using DeepLabCut (DLC) which then serves as input to a support vector machine. This classifier was trained to detect role transitions within grooming interactions. We replicate a recent study showing that DLC has usability value for chimpanzee data collected in natural environments. Our combined method of tracking and classification is remarkably successful in detecting the presence of grooming, indicating the directionality and a change in turn with an accuracy above 86% on unseen videos. We can identify particular pose features used in the classification of grooming, which will contribute to the exploration of turn-taking dynamics on a scale that would otherwise be difficult to attain with traditional methods. Finally, our pipeline can in principle be applied to recognize a variety of other socially interactive behaviors that are largely recognizable by (joint) postural states.
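The pipeline described above feeds per-frame pose features (as produced by DeepLabCut) into a support vector machine that classifies grooming states and role transitions. The paper does not reproduce its feature set or labels here, so the following is only a minimal sketch under stated assumptions: the feature vectors, label scheme, and data are synthetic stand-ins, and the scikit-learn SVC with default RBF kernel is one plausible choice of classifier, not necessarily the authors' exact configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical pose features per video frame: e.g. flattened (x, y)
# keypoint coordinates for two interacting chimpanzees, in the shape
# DeepLabCut output might take after post-processing.
n_frames, n_features = 400, 16
X = rng.normal(size=(n_frames, n_features))

# Hypothetical labels: 0 = no grooming, 1 = A grooms B, 2 = B grooms A.
# The labels are tied to simple linear pose statistics so the toy
# classifier has real signal to learn from.
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 1.0).astype(int)

# Hold out frames to mimic evaluation on unseen video material.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train the SVM and measure held-out classification accuracy.
clf = SVC(kernel="rbf").fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

In practice, per-frame predictions like these would still need temporal smoothing to locate role transitions (turn changes) within a grooming bout, since single-frame classification is noisy.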
This item appears in the following Collection(s)
- Academic publications [245186]
- Electronic publications [132505]
- Faculty of Arts [29859]
- Faculty of Social Sciences [30339]
- Open Access publications [106105]