The role of head and hand movements for infants' predictions of others' actions
Source: Psychological Research, 83(6), (2019), pp. 1269-1280
Article / Letter to editor
SW OZ DCC SMN
Subject: Action, intention, and motor control
In everyday life, both the head and the hand movements of another person reveal that person's action target. However, studies on the development of action prediction have primarily used displays in which only hand movements, and no head movements, were visible. Given that infants acquire in their first year both the ability to follow others' gaze and the ability to predict others' reaching actions, the question is whether they rely mostly on the hand or on the head when predicting others' manual actions. The current study aimed to answer this question using a screen-based eye-tracking setup. Thirteen-month-old infants observed a model transporting plastic rings from one side of the screen to the other and placing them on a pole. In randomized trials, the model's head was either visible or occluded. The dependent variable was gaze-arrival time, which indicated whether participants predicted the model's action targets. Gaze-arrival times did not differ between trials in which the head was visible and trials in which it was occluded. Furthermore, target looks that followed looks at the hand were predictive, whereas target looks that followed looks at the head were reactive. In sum, the study shows that 13-month-olds are capable of predicting an individual's action target based on the observed hand movements but not on the head movements. The data suggest that earlier findings on infants' action prediction in screen-based tasks, in which often only the hands were visible, may well generalize to real-life settings in which infants have visual access to the actor's head.