Author’s version (preprint)
Source: Cerebral Cortex, 30(2), (2020), pp. 597–606
Article / Letter to editor
Subject: Action, intention, and motor control
Sounds (e.g., barking) help us to visually identify objects (e.g., a dog) that are distant or ambiguous. While neuroimaging studies have revealed neuroanatomical sites of audiovisual interactions, little is known about the time-course by which sounds facilitate visual object processing. Here we used magnetoencephalography (MEG) to reveal the time-course of the facilitatory influence of natural sounds (e.g., barking) on visual object processing, and compared this to the facilitatory influence of spoken words (e.g., "dog"). Participants viewed images of blurred objects preceded by a task-irrelevant natural sound, a spoken word, or uninformative noise. A classifier was trained to discriminate multivariate sensor patterns evoked by animate and inanimate intact objects with no sounds, presented in a separate experiment, and tested on sensor patterns evoked by the blurred objects in the three auditory conditions. Results revealed that both sounds and words, relative to uninformative noise, significantly facilitated visual object category decoding between 300-500 ms after visual onset. We found no evidence for earlier facilitation by sounds than by words. These findings provide evidence for a semantic route of facilitation by both natural sounds and spoken words, whereby the auditory input first activates semantic object representations, which then modulate the visual processing of objects.
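The cross-decoding analysis described above — training a classifier on sensor patterns evoked by intact objects and testing it on patterns evoked by blurred objects, separately at each time point — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: the array shapes, the injected class signal, and the choice of linear discriminant analysis are all assumptions made for the example.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic MEG data: trials x sensors x time points (hypothetical sizes).
n_train, n_test, n_sensors, n_times = 80, 60, 30, 50
y_train = rng.integers(0, 2, n_train)  # 0 = inanimate, 1 = animate
y_test = rng.integers(0, 2, n_test)

# Inject a class-dependent sensor topography so decoding is above chance.
signal = rng.standard_normal(n_sensors)
X_train = rng.standard_normal((n_train, n_sensors, n_times))
X_test = rng.standard_normal((n_test, n_sensors, n_times))
X_train += y_train[:, None, None] * signal[None, :, None]
X_test += y_test[:, None, None] * signal[None, :, None]

# Time-resolved cross-decoding: fit on "intact object" trials,
# evaluate on "blurred object" trials at each time point in turn.
accuracy = np.empty(n_times)
for t in range(n_times):
    clf = LinearDiscriminantAnalysis()
    clf.fit(X_train[:, :, t], y_train)
    accuracy[t] = clf.score(X_test[:, :, t], y_test)
```

In the actual study, the training trials came from a separate experiment with no sounds, and decoding accuracy in the three auditory conditions was compared over time to localize the facilitation window (300–500 ms after visual onset).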
This item appears in the following Collection(s)
- Academic publications 
- Electronic publications 
- Faculty of Social Sciences 
- Open Access publications 