June 2006, Volume 6, Issue 6
Vision Sciences Society Annual Meeting Abstract
Learning predictive cues to optimize visual search
Author Affiliations
  • Jason A. Droll
    Dept. of Psychology, University of California Santa Barbara
  • Binh T. Pham
    Dept. of Psychology, University of California Santa Barbara
  • Craig K. Abbey
    Dept. of Psychology, University of California Santa Barbara
  • Miguel P. Eckstein
    Dept. of Psychology, University of California Santa Barbara
Journal of Vision June 2006, Vol.6, 834. doi:https://doi.org/10.1167/6.6.834

Observers performing visual search tasks that require both detection and localization of a target often show improved performance as the experiment progresses, especially when the target is accompanied by a predictive cue. Is this increase in performance due to a general improvement in signal encoding, or to a re-weighting of visual information with respect to each cue? Twelve subjects performed 300 trials in which they searched for a bright target among dimmer distractors with contrast noise (means 110 and 60, SD 20), displayed for 2 s. Each of the six stimuli was surrounded by a colored circle. Subjects were told that some colors were more likely to contain the target, although the distribution of this likelihood was not specified. Subjects reported the presence and location of the target. Performance in the perceptual task improved by the final quarter of trials, and this improvement accompanied a change in gaze strategy: by the final quarter of trials, first saccades most frequently targeted predictive cues (24% vs. 13%) and fixation durations were longer (348 ms vs. 243 ms), even on trials in which no target was in fact present. False-alarm trials for perceptual decisions and first saccades also suggested that localization was most affected by the noise in valid cues. However, this learning was suboptimal compared to an ideal Bayesian learner model exposed to the same cue statistics. We conclude that visual-saccadic and perceptual decisions during search may be influenced by learned statistics of cue validity, allowing observers to weight sensory evidence more optimally when seeking relevant information.
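The ideal Bayesian learner against which observers were compared can be sketched as conjugate updating of cue-validity beliefs across trials. The sketch below is purely illustrative and assumes a Dirichlet prior over cue colors and a simulated 70%-valid predictive color; the color names, prior, validity rate, and session structure are assumptions, not the authors' actual model or stimulus parameters.

```python
# Illustrative sketch of an ideal Bayesian learner for cue validity.
# Assumptions (not from the abstract): a Dirichlet prior over colors,
# three cue colors, and a 70%-valid "red" cue in the simulated session.
import random

def bayesian_cue_learner(trials, colors, alpha=1.0):
    """Track posterior cue-validity estimates with a Dirichlet prior.

    trials: sequence of colors that contained the target on each trial.
    Returns the posterior mean validity of each color after all trials.
    """
    counts = {c: alpha for c in colors}  # Dirichlet pseudo-counts
    for target_color in trials:
        counts[target_color] += 1        # conjugate update per trial
    total = sum(counts.values())
    return {c: counts[c] / total for c in colors}

# Simulated 300-trial session: "red" is predictive 70% of the time.
random.seed(0)
colors = ["red", "green", "blue"]
trials = [("red" if random.random() < 0.7
           else random.choice(["green", "blue"])) for _ in range(300)]
posterior = bayesian_cue_learner(trials, colors)
print(posterior)  # the predictive color should carry most of the mass
```

An ideal learner of this kind converges on the true validity distribution; the abstract's finding is that human observers re-weight cues in the same direction but fall short of this optimum.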

Droll, J. A., Pham, B. T., Abbey, C. K., & Eckstein, M. P. (2006). Learning predictive cues to optimize visual search [Abstract]. Journal of Vision, 6(6):834, 834a, http://journalofvision.org/6/6/834/, doi:10.1167/6.6.834.
Supported by NIH grant EY015925.
