Generative adversarial networks for reconstructing natural images from brain activity
Publication year
2018
Number of pages
11 p.
Source
NeuroImage, 181, (2018), pp. 775-785
ISSN
Publication type
Article / Letter to editor

Organization
SW OZ DCC AI
SW OZ DCC SMN
SW OZ DCC CO
Journal title
NeuroImage
Volume
vol. 181
Languages used
English (eng)
Page start
p. 775
Page end
p. 785
Subject
Action, intention, and motor control; Cognitive artificial intelligence; DI-BCB_DCC_Theme 2: Perception, Action and Control; DI-BCB_DCC_Theme 4: Brain Networks and Neuronal Communication
Abstract
We explore a method for reconstructing visual stimuli from brain activity. Using large databases of natural images, we trained a deep convolutional generative adversarial network capable of generating grayscale photos similar to the stimuli presented during two functional magnetic resonance imaging experiments. Using a linear model, we learned to predict the generative model's latent space from measured brain activity. The objective was to create an image similar to the presented stimulus image through the previously trained generator. Using this approach, we were able to reconstruct structural and some semantic features of a proportion of the natural image sets. A behavioural test showed that subjects were capable of identifying a reconstruction of the original stimulus in 67.2% and 66.4% of cases in a pairwise comparison for the two natural image datasets, respectively. Our approach does not require end-to-end training of a large generative model on limited neuroimaging data. Rapid advances in generative modeling promise further improvements in reconstruction performance.
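
The abstract describes a two-stage pipeline: a generator is pre-trained on large natural image databases, and a separate linear model then maps measured fMRI responses to the generator's latent space, so reconstruction is a single forward pass through the fixed generator. The Python sketch below illustrates that idea only; the generator placeholder, ridge regularizer, array sizes, and variable names are illustrative assumptions and are not taken from the paper.

    # Minimal sketch of the latent-space reconstruction pipeline, under
    # stated assumptions; not the authors' actual code or model.
    import numpy as np
    from sklearn.linear_model import Ridge

    def generator(z):
        """Placeholder for a pre-trained DCGAN generator mapping a latent
        vector z to a grayscale image; a trained network would go here."""
        raise NotImplementedError

    z_dim = 100                      # latent dimensionality (assumed)
    n_train, n_voxels = 1500, 5000   # example sizes, for illustration only

    # Training pairs: fMRI responses X and latent codes Z of the
    # corresponding stimulus images (e.g. obtained by encoding the
    # stimuli into the generator's latent space). Random stand-ins here.
    X_train = np.random.randn(n_train, n_voxels)
    Z_train = np.random.randn(n_train, z_dim)

    # Linear model from brain activity to the generator's latent space.
    model = Ridge(alpha=1.0)
    model.fit(X_train, Z_train)

    # Reconstruction: predict a latent code from a new brain response,
    # then pass it through the fixed, pre-trained generator.
    x_test = np.random.randn(1, n_voxels)
    z_pred = model.predict(x_test)
    # reconstruction = generator(z_pred)  # would yield the reconstructed image

Because only the low-dimensional linear map is fit to neuroimaging data, the approach avoids end-to-end training of a large generative model on a limited number of fMRI samples, as the abstract notes.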
This item appears in the following Collection(s)
- Academic publications [203812]
- Electronic publications [102283]
- Faculty of Social Sciences [27301]
- Open Access publications [70926]