Distinguishing between target and nontarget fixations in a visual search task using fixation-related potentials
Publication year
2013
Number of pages
10 p.
Source
Journal of Vision, 13, 3, (2013), article 17
Publication type
Article / Letter to editor
Organization
SW OZ DCC AI
Journal title
Journal of Vision
Volume
vol. 13
Issue
iss. 3
Languages used
English (eng)
Subject
Cognitive artificial intelligence; DI-BCB_DCC_Theme 4: Brain Networks and Neuronal Communication
Abstract
The P300 event-related potential (ERP) can be used to infer whether an observer is looking at a target or not. Common practice in P300 experiments and applications is to ask observers to keep their eyes fixated while stimuli are presented. We investigated the possibility of differentiating between single target and nontarget fixations in a target search task involving eye movements, using EEG epochs synchronized to fixation onset (fixation-related potentials: FRPs). Participants systematically scanned search displays consisting of six small Landolt Cs in search of Cs with a particular orientation. After each search display, they indicated whether and where target Cs had been presented. As expected, an FRP component consistent with the P300 reliably distinguished between target and nontarget fixations. Single FRPs could be classified as target or nontarget above chance (on average 62% correct, where 50% would be chance). These results are the first step toward practical applications such as covertly monitoring observers' interests and supporting search tasks.
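The abstract outlines the pipeline: EEG epochs are time-locked to fixation onsets, and single FRPs are then classified as target or nontarget. Below is a minimal sketch of such a pipeline using MNE-Python and scikit-learn; the file names, epoch window, channel selection, and LDA classifier are illustrative assumptions and do not reproduce the authors' implementation.

```python
# Minimal sketch (assumptions throughout): epoch EEG at fixation onsets and
# classify single fixation-related potentials (FRPs) as target vs. nontarget.
# Not the authors' pipeline; file names and parameters are hypothetical.
import mne
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical preprocessed EEG recording and fixation-onset events
# (event code 1 = fixation on a target C, 2 = fixation on a nontarget C).
raw = mne.io.read_raw_fif("sub01_search_raw.fif", preload=True)
events = mne.read_events("sub01_fixations-eve.fif")
event_id = {"target": 1, "nontarget": 2}

# Fixation-related potentials: epochs time-locked to fixation onset.
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), picks="eeg", preload=True)

# Features: flattened post-fixation samples per epoch; labels: target or not.
X = epochs.copy().crop(tmin=0.0, tmax=0.8).get_data().reshape(len(epochs), -1)
y = (epochs.events[:, 2] == event_id["target"]).astype(int)

# Single-trial classification with plain LDA (illustrative only; the paper
# reports ~62% correct on average, which this sketch does not guarantee).
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Mean cross-validated AUC: {scores.mean():.2f}")
```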
This item appears in the following Collection(s)
- Academic publications
- Electronic publications
- Faculty of Social Sciences
- Open Access publications