Vision Sciences Society Annual Meeting Abstract  |   August 2012
Visual search in natural scenes: fixation positions predicted by local color properties
Author Affiliations
  • David H. Foster
    School of Electrical & Electronic Engineering, University of Manchester, Manchester M13 9PL, UK
  • Kinjiro Amano
    School of Electrical & Electronic Engineering, University of Manchester, Manchester M13 9PL, UK
  • Matthew S. Mould
    School of Electrical & Electronic Engineering, University of Manchester, Manchester M13 9PL, UK
  • John P. Oakley
    School of Electrical & Electronic Engineering, University of Manchester, Manchester M13 9PL, UK
Journal of Vision August 2012, Vol.12, 1004. doi:10.1167/12.9.1004
Citation: David H. Foster, Kinjiro Amano, Matthew S. Mould, John P. Oakley; Visual search in natural scenes: fixation positions predicted by local color properties. Journal of Vision 2012;12(9):1004. doi: 10.1167/12.9.1004.
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Searching for an object in the world around us is an everyday experience. Success depends on many factors, including the structure of the scene being searched, the nature of the object being sought, and, critically, where we look. Color has been found to be influential in such tasks, but primarily as a way of defining differences between a target and its background, and usually with abstract geometric displays rather than with more natural scenes. The aim of this study was to determine to what extent the local color properties of natural scenes can predict the distribution of observers' fixation positions. Seven observers with normal color vision and visual acuity were presented with 1-s images of 20 natural scenes, each subtending 17 x 13 deg visual angle on a color monitor. The target to be detected was a shaded gray sphere, subtending 0.25 deg, embedded randomly in the scene and matched in mean luminance to its local surround to avoid producing accidental luminance cues. Observers' gaze position was simultaneously monitored with an infra-red video eye-tracker sampling at 250 Hz. The spatial distribution of observers' fixations was fitted by linear combinations of the spatial distributions of local color properties, namely lightness and the red-green and blue-yellow components of chroma. Goodness of fit was quantified by the proportion of variance explained by local color properties, after adjustments for degrees of freedom in the fit and smoothing of the distributions. It was found that when averaged over scenes the proportion of the variance explained was 36-40%, but there were large differences between scenes, and for some the proportion reached 75-84%, depending on the degree of smoothing. Despite a common assumption that achromatic features dominate gaze behavior, the present results suggest that local color information can be at least as important in influencing where we look.
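The fitting procedure described in the abstract — regressing the spatial distribution of fixations on maps of local color properties (lightness and the red-green and blue-yellow components of chroma) and quantifying goodness of fit by the proportion of variance explained after adjusting for degrees of freedom — can be sketched roughly as follows. This is a minimal illustration, not the authors' analysis code; the function name, the use of ordinary least squares, and the standard adjusted-R² formula are assumptions, and the smoothing step mentioned in the abstract is omitted.

```python
import numpy as np

def adjusted_r2(fixation_map, lightness, red_green, blue_yellow):
    """Fit a 2-D fixation-density map by a linear combination of 2-D
    local color-property maps (plus a constant term) and return the
    proportion of variance explained, adjusted for the degrees of
    freedom consumed by the fit.  Illustrative sketch only."""
    # Flatten each map into one column of the design matrix.
    X = np.column_stack([
        lightness.ravel(),
        red_green.ravel(),
        blue_yellow.ravel(),
        np.ones(fixation_map.size),  # intercept
    ])
    y = fixation_map.ravel()

    # Ordinary least-squares fit of the linear combination.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef

    n, p = X.shape                        # samples, fitted parameters
    ss_res = np.sum(resid ** 2)           # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot

    # Standard adjustment for degrees of freedom in the fit.
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p)
```

If the fixation map really is a linear combination of the three color maps, this statistic approaches 1; for a fixation map unrelated to the color maps it stays near 0, which is the sense in which the abstract's 36-40% (and up to 75-84% for some scenes) figures quantify predictive power.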

Meeting abstract presented at VSS 2012
