Abstract
Significant effort has been spent evaluating the performance of saliency algorithms at predicting human fixations in natural images. However, many other aspects of human visual attention have received relatively little focus in the saliency literature despite being richly characterized by psychophysical investigations. In order to make use of this data, Bruce et al. (2015) have recommended the development of an axiomatic set of model constraints grounded in this body of psychophysical knowledge. We aim to provide a step towards this goal by linking human visual search response time to saliency algorithm output. Duncan and Humphreys (1989) theorized that subject response time in visual search tasks is correlated with similarity between search items (with search time increasing as targets become more similar to distractors). This result fits well with the widely held notion that saliency is largely driven by stimulus uniqueness, but has not been explicitly tested against the performance of saliency algorithms. To do so systematically, we need a well-characterized human performance curve for a given set of visual search stimuli. Arun (2012) produced a performance curve for oriented bars which shows the relationship between human response time and target-distractor orientation differences over the range 7-60°. Here we use Arun's stimuli as input to a range of current saliency algorithms and find that performance falls into three broad categories: algorithms which cannot consistently find the target, those which consistently find the target but whose performance does not vary with target-distractor difference, and those which are able to deliver a human performance-like curve. In this way we provide a new performance criterion that is more closely aligned with the use of saliency as an early selection mechanism. Future work will look at the full set of Wolfe's (1998) features which can elicit efficient search for singleton targets.
Meeting abstract presented at VSS 2016
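The evaluation procedure described in the abstract can be illustrated with a minimal sketch: build a grid of oriented items containing a single target, score each item's uniqueness relative to the others, and check whether the most salient location is the target. The grid layout, the mean pairwise orientation-distance measure, and all function names here are illustrative assumptions, not any of the saliency algorithms actually tested.

```python
import numpy as np

def make_search_array(n=5, target_pos=(2, 2), distractor_ori=0.0, delta=30.0):
    """Grid of item orientations (degrees): homogeneous distractors plus one
    target rotated by `delta` (a toy stand-in for Arun-style bar arrays)."""
    oris = np.full((n, n), distractor_ori, dtype=float)
    oris[target_pos] = distractor_ori + delta
    return oris

def orientation_distance(a, b):
    """Smallest angular difference between bar orientations (period 180 deg)."""
    d = np.abs(a - b) % 180.0
    return np.minimum(d, 180.0 - d)

def saliency_map(oris):
    """Per-item saliency: mean orientation distance to every other item --
    a toy stimulus-uniqueness measure, not a published model."""
    flat = oris.ravel()
    dist = orientation_distance(flat[:, None], flat[None, :])
    return dist.mean(axis=1).reshape(oris.shape)

# Does the saliency peak land on the target for a 30-degree difference?
oris = make_search_array(delta=30.0)
sal = saliency_map(oris)
target_found = np.unravel_index(sal.argmax(), sal.shape) == (2, 2)
```

Under this toy measure, the target's saliency grows with target-distractor orientation difference, so a response-time proxy that falls as peak saliency rises would reproduce the qualitative Duncan-and-Humphreys trend; the tested algorithms differ in whether their output shows such graded behavior at all.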