Vision Sciences Society Annual Meeting Abstract  |   September 2016
The new best model of visual search can be found in the brain
Author Affiliations
  • Gregory Zelinsky
    Department of Psychology, Stony Brook University
  • Hossein Adeli
    Department of Psychology, Stony Brook University
  • Françoise Vitu
    Laboratoire de Psychologie Cognitive, CNRS, Aix-Marseille Université
Journal of Vision September 2016, Vol.16, 996. doi:10.1167/16.12.996
© ARVO (1962-2015); The Authors (2016-present)

Abstract

Modern image-based models of search prioritize fixation locations using target maps that capture visual evidence for a target goal. But while many such models are biologically plausible, none has looked to the oculomotor system for design inspiration or parameter specification. These models also focus disproportionately on specific target exemplars, ignoring the fact that many important targets are categories (e.g., weapons, tumors). We introduce MASC, a Model of Attention in the Superior Colliculus (SC). MASC differs from other image-based models in that it is grounded in the neurophysiology of the SC, a midbrain structure implicated in programming saccades, the very behaviors to be predicted. It first creates a target map in one of two ways: by comparing a target image to objects in a search display (exemplar search), or by using an SVM classifier trained on the target category to estimate the probability that objects in the search display are members of the target category (categorical search). MASC then projects this target map into the foveally magnified space of the SC, where cascading operations average priority signals over visual and motor neural populations. Motor populations compete, with the vector average of the winning population determining the next saccade. We evaluated MASC against exemplar and categorical search datasets in which two groups of 15 subjects viewed identical search displays after presentation of exemplar or categorical target cues. MASC predicted saccade distance traveled to the target and the proportion of immediate target fixations nearly as well as a Subject model created using the leave-one-out method. MASC's success stems from its incorporation of constraints from the saccade-programming literature. Whereas most models of search explore different algorithms and parameter spaces to minimize prediction error, MASC takes its parameters directly from the brain. The brain has already found the optimal parameters for search, and it is in the brain that we should look for model inspiration.
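The population vector-average readout described in the abstract (the winning motor population's activity-weighted average determining the saccade endpoint) can be sketched as follows. This is an illustrative simplification, not MASC's actual implementation; the function name, the thresholding rule for selecting the winning population, and the toy priority map are all hypothetical.

```python
import numpy as np

def vector_average_saccade(priority, xs, ys, threshold=0.5):
    """Hypothetical sketch of a population vector-average readout:
    treat cells whose activity exceeds a fraction of the peak as the
    winning motor population, then take the activity-weighted average
    of their preferred positions as the next saccade endpoint."""
    win = priority >= threshold * priority.max()  # winning population mask
    w = priority[win]                             # activity = weights
    x = np.average(xs[win], weights=w)            # weighted vector average
    y = np.average(ys[win], weights=w)
    return x, y

# Toy priority map on a 41x41 grid: one Gaussian bump of activity,
# standing in for the motor-layer priority signal after averaging.
gx, gy = np.meshgrid(np.linspace(-10, 10, 41), np.linspace(-10, 10, 41))
bump = np.exp(-((gx - 3.0) ** 2 + (gy + 2.0) ** 2) / 8.0)
print(vector_average_saccade(bump, gx, gy))  # endpoint near (3.0, -2.0)
```

Because the weighted average pools over the whole winning population rather than taking the single most active cell, the readout is robust to noise in individual cells, which is one reason population coding is attributed to the SC motor map.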

Meeting abstract presented at VSS 2016
