Abstract
Searching for an object in the world around us is an everyday experience. Success depends on many factors, including the structure of the scene being searched, the nature of the object being sought, and, critically, where we look. Color has been found to be influential in such tasks, but primarily as a way of defining differences between a target and its background, and usually with abstract geometric displays rather than with more natural scenes. The aim of this study was to determine to what extent the local color properties of natural scenes can predict the distribution of observers' fixation positions. Seven observers with normal color vision and visual acuity were presented with 1-s images of 20 natural scenes, each subtending 17 × 13 deg visual angle on a color monitor. The target to be detected was a shaded gray sphere, subtending 0.25 deg, embedded randomly in the scene and matched in mean luminance to its local surround to avoid producing accidental luminance cues. Observers' gaze position was simultaneously monitored with an infrared video eye-tracker sampling at 250 Hz. The spatial distribution of observers' fixations was fitted by linear combinations of the spatial distribution of local color properties, namely lightness and the red-green and blue-yellow components of chroma. Goodness of fit was quantified by the proportion of variance explained by local color properties, after adjustments for degrees of freedom in the fit and smoothing of the distributions. It was found that when averaged over scenes the proportion of the variance explained was 36-40%, but there were large differences between scenes, and for some the proportion reached 75-84%, depending on the degree of smoothing. Despite a common assumption that achromatic features dominate gaze behavior, the present results suggest that local color information can be at least as important in influencing where we look.
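The fitting procedure described above can be sketched as an ordinary least-squares regression of the fixation-density map on the three color-property maps, with variance explained adjusted for the degrees of freedom used in the fit. The sketch below is illustrative only, not the authors' analysis code: the maps, grid size, and mixing weights are hypothetical synthetic data standing in for the real scene statistics.

```python
# Illustrative sketch (not the study's actual code): fit a fixation-density
# map with a linear combination of local color-property maps and report the
# proportion of variance explained, adjusted for degrees of freedom.
import numpy as np

rng = np.random.default_rng(0)
h, w = 13, 17                      # coarse grid standing in for a 17 x 13 deg scene
lightness = rng.random((h, w))     # hypothetical local lightness map
red_green = rng.random((h, w))     # hypothetical red-green chroma map
blue_yellow = rng.random((h, w))   # hypothetical blue-yellow chroma map

# Synthetic "fixation density": a known mix of the predictors plus noise,
# so the regression has something meaningful to recover.
fixations = (0.5 * lightness + 0.3 * red_green + 0.2 * blue_yellow
             + 0.05 * rng.standard_normal((h, w)))

# Stack the predictor maps (plus an intercept) and solve by least squares.
X = np.column_stack([m.ravel() for m in (lightness, red_green, blue_yellow)])
X = np.column_stack([np.ones(X.shape[0]), X])
y = fixations.ravel()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Proportion of variance explained, then the usual adjustment for the
# number of fitted parameters p over n grid points.
resid = y - X @ beta
n, p = X.shape
r2 = 1.0 - resid.var() / y.var()
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p)
print(round(r2, 3), round(r2_adj, 3))
```

The smoothing step mentioned in the abstract would enter before the regression, by convolving both the fixation and color-property maps with a common kernel; it is omitted here for brevity.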
Meeting abstract presented at VSS 2012