Abstract
Contextual cues guide and facilitate visual search in both synthetic (Chun & Jiang, 1998) and natural images (Eckstein et al., 2006; Torralba et al., 2006). Although studies have identified the neural correlates of attentional shifts with synthetic cues (Woodman & Luck, 1999; Johnson et al., 2007; Woodman et al., 2009), little is known about the neural basis of contextual cueing in natural scenes. We used multivariate pattern classifiers to analyze neural activity (electroencephalography, EEG) recorded during search for objects in natural scenes and to predict the contextual location of the expected target from single trials.

Methods: Ten naive observers searched for a target specified by a word (500 ms duration) presented prior to a natural scene (100 ms). Targets were present in 50% of the 800 images. Critically, target-absent images were selected so that a single expected location (left/right lateralized) was consistent with the sought target. Observers reported their decision (target present/absent) using a 10-point confidence rating.

Results: The results showed a weak (nonsignificant) N2pc event-related potential (ERP) component, which is often found in visual search tasks. Classifier performance (area under the ROC curve) in identifying the expected location (left/right) of target-absent images from single-trial EEG over a 100-700 ms epoch was 0.7 ± 0.02. Classification using the electrooculogram (EOG) was not significantly above chance (50%), suggesting that our results cannot be explained by eye movements to the expected target locations. Training the classifier on target-present images and then using it to predict the expected location in target-absent images resulted in chance performance.

Conclusion: Our findings suggest that contextual locations in natural scenes can be predicted reliably from neural activity recorded while observers search for targets. The neural mechanisms that predict context are distinct from those coding the physical presence of the target.
Army grant W911NF-09-D-0001.
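As a concrete illustration of the decoding analysis described above, the following is a minimal sketch, not the authors' pipeline: it simulates epoched EEG data and estimates the cross-validated area under the ROC curve, the performance measure reported in the Results. The array shapes, the injected lateralized signal, and the choice of a regularized logistic regression as the pattern classifier are all illustrative assumptions.

```python
# Minimal sketch of single-trial decoding of expected target location from EEG.
# All data here are simulated; shapes and parameters are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated epochs: trials x channels x time samples
# (e.g., a 100-700 ms epoch at a 256 Hz sampling rate).
n_trials, n_channels, n_samples = 400, 64, 154
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)  # expected location: 0 = left, 1 = right

# Inject a small lateralized signal so the sketch has something to decode.
X[y == 1, :32, :] += 0.05

# Flatten each epoch into a single feature vector (channels x time).
X_flat = X.reshape(n_trials, -1)

# Regularized linear classifier on standardized features.
clf = make_pipeline(StandardScaler(), LogisticRegression(C=0.01, max_iter=1000))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

# Cross-validated area under the ROC curve, as in the Results above.
auc = cross_val_score(clf, X_flat, y, cv=cv, scoring="roc_auc")
print(f"decoding AUC: {auc.mean():.2f} ± {auc.std() / np.sqrt(len(auc)):.2f}")
```

In the actual study, the feature vectors would come from recorded single-trial EEG epochs rather than simulated noise, and the same logic would support the cross-decoding analysis (training on target-present trials and testing on target-absent trials) by fitting on one trial subset and scoring on the other.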