August 2010
Volume 10, Issue 7
Vision Sciences Society Annual Meeting Abstract  |   August 2010
Predicting contextual locations in natural scenes from neural activity
Author Affiliations
  • Koel Das
    Department of Psychology, University of California, Santa Barbara
  • Fei Guo
    Department of Psychology, University of California, Santa Barbara
  • Barry Giesbrecht
    Department of Psychology, University of California, Santa Barbara
    Institute for Collaborative Biotechnologies, University of California, Santa Barbara
  • Miguel P. Eckstein
    Department of Psychology, University of California, Santa Barbara
    Institute for Collaborative Biotechnologies, University of California, Santa Barbara
Journal of Vision August 2010, Vol.10, 1295. doi:10.1167/10.7.1295
Koel Das, Fei Guo, Barry Giesbrecht, Miguel P. Eckstein; Predicting contextual locations in natural scenes from neural activity. Journal of Vision 2010;10(7):1295. doi: 10.1167/10.7.1295.

Abstract

Contextual cues guide and facilitate visual search in both synthetic (Chun & Jiang, 1998) and natural images (Eckstein et al., 2006; Torralba et al., 2006). Although studies have identified the neural correlates of attentional shifts with synthetic cues (Woodman & Luck, 1999; Johnson et al., 2007; Woodman et al., 2009), little is known about the neural basis of contextual cueing in natural scenes. We used multivariate pattern classifiers to analyze neural activity (electroencephalography, EEG) recorded during search for objects in natural scenes and to predict, from a single trial, the contextual location of the expected target.

Methods: Ten naive observers searched for a target specified by a word (500 ms duration) presented prior to a natural scene (100 ms). Targets were present in 50% of the 800 images. Critically, target-absent images were selected so that a single expected location (left/right lateralized) was consistent with the sought target. Observers reported their decision (target present/absent) using a 10-point confidence rating.

Results: The results showed a weak (nonsignificant) N2pc event-related potential (ERP) component, a component often found in visual search tasks. Classifier performance (area under the ROC curve) in identifying the expected location (left/right) of target-absent images from single-trial EEG over a 100-700 ms epoch was 0.7 ± 0.02. Classification using the electrooculogram (EOG) was not significantly above chance (50%), suggesting that our results cannot be explained by eye movements to the expected target locations. Training the classifier on target-present images and then predicting the expected location in target-absent images resulted in chance performance.

Conclusion: Our findings suggest that contextual locations in natural scenes can be predicted reliably from neural activity recorded while observers search for targets. The neural mechanisms that predict context are distinct from those coding the physical presence of the target.
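The abstract does not specify the classifier or software used; the sketch below is only an illustration of the general approach it describes — training a multivariate pattern classifier on flattened single-trial EEG epochs and scoring left/right predictions with the area under the ROC curve. All data here are simulated, and the epoch dimensions (32 channels, a 100-700 ms window) and the choice of shrinkage linear discriminant analysis are assumptions, not details from the study.

```python
# Illustrative sketch (not the authors' actual pipeline): decode a binary
# contextual location (left/right) from simulated single-trial EEG epochs
# and report cross-validated area under the ROC curve.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical dimensions: 400 trials, 32 channels, 100 time samples
# (e.g., a 100-700 ms epoch at a modest sampling rate).
n_trials, n_channels, n_samples = 400, 32, 100
labels = rng.integers(0, 2, n_trials)  # 0 = left, 1 = right expected location

# Simulated epochs: noise plus a small class-dependent shift on a few channels.
epochs = rng.standard_normal((n_trials, n_channels, n_samples))
epochs[labels == 1, :8, :] += 0.05

# Flatten channels x time into one feature vector per trial.
X = epochs.reshape(n_trials, -1)

# Shrinkage LDA is a common choice when features far outnumber trials.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

# Cross-validated decision values give an honest single-trial AUC estimate.
scores = cross_val_predict(clf, X, labels, cv=5, method="decision_function")
auc = roc_auc_score(labels, scores)
print(f"Area under the ROC: {auc:.2f}")
```

Because the AUC is computed from held-out decision values only, it is not inflated by overfitting — the same concern that motivates the abstract's separate EOG control analysis.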

Das, K., Guo, F., Giesbrecht, B., & Eckstein, M. P. (2010). Predicting contextual locations in natural scenes from neural activity [Abstract]. Journal of Vision, 10(7):1295, 1295a, http://www.journalofvision.org/content/10/7/1295, doi:10.1167/10.7.1295. [CrossRef]
Footnotes
 Army grant W911NF-09-D-0001.