Abstract
Purpose: A number of studies have demonstrated that people often integrate information from multiple perceptual cues in a statistically optimal manner when judging properties of surfaces in a scene. For example, subjects typically weight the information provided by each cue in inverse proportion to the variance of the distribution of a scene property given that cue's value. We sought to determine whether subjects similarly use information about the reliabilities of low-level visual features when making image-based discriminations, such as visual texture discriminations.

Methods: Three experiments used a modified version of the classification image technique. A basis set of 20 low-level visual features (resembling narrow-band textures) was generated, and prototypes for visual categories were defined as specific linear combinations of elements from this basis set. Stimuli were constructed by corrupting the prototypes with additive Gaussian noise defined in our basis space. On each trial, subjects were asked to determine which of two prototypes was embedded in a noisy test stimulus. The variance structure of the noise was varied across experimental conditions, and the template for an ideal observer was computed from the feature covariance matrix. Logistic regression was used to estimate classification images for the subjects over the course of training, and these images were cross-correlated with the ideal template.

Conclusions: As the variance structure of the noise was changed, subjects modified their templates in a manner corresponding to the resulting changes in the ideal template. The data suggest that subjects were sensitive to our features and weighted each feature according to its reliability, as defined by the feature covariance matrix. We conclude that human observers indeed use information about the reliabilities of low-level features when making image-based discriminations.
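For two equal-covariance Gaussian categories, the ideal observer's template is the prototype difference weighted by the inverse of the noise covariance matrix, w = Σ⁻¹(μ_A − μ_B), so that less reliable (higher-variance) features receive proportionally less weight. The following Python sketch illustrates the pipeline described in the Methods, with a simulated observer standing in for a subject; the 20-dimensional feature space matches the abstract, but the prototypes, the diagonal covariance, the observer model, and all numerical settings are illustrative assumptions rather than the study's actual parameters.

```python
# Hypothetical sketch of the ideal-template and classification-image
# analysis; parameter values are illustrative, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_features, n_trials = 20, 5000

# Two category prototypes: linear combinations of the 20 basis features.
proto_a = rng.normal(size=n_features)
proto_b = rng.normal(size=n_features)

# Noise covariance in feature space; here an arbitrary diagonal matrix
# giving each feature its own variance (one "variance structure" condition).
sigma = np.diag(rng.uniform(0.5, 2.0, size=n_features))

# Ideal-observer template: prototype difference weighted by the inverse
# covariance, so higher-variance (less reliable) features get less weight.
ideal = np.linalg.inv(sigma) @ (proto_a - proto_b)

# Simulate trials: pick a prototype, add Gaussian noise in the basis space.
labels = rng.integers(0, 2, size=n_trials)              # 1 = prototype A
stims = np.where(labels[:, None] == 1, proto_a, proto_b)
stims = stims + rng.multivariate_normal(np.zeros(n_features), sigma, n_trials)

# Stand-in observer: applies the ideal template plus internal decision noise.
responses = (stims @ ideal + rng.normal(scale=2.0, size=n_trials)) > 0

# Classification image: logistic-regression weights on the noisy features.
clf = LogisticRegression(max_iter=1000).fit(stims, responses.astype(int))
template = clf.coef_.ravel()

# Compare the recovered template with the ideal one (cross-correlation).
r = np.corrcoef(template, ideal)[0, 1]
print(f"template-ideal correlation: {r:.3f}")
```

Rerunning the simulation with a different covariance matrix changes the ideal template, and the recovered classification image should shift with it; this is the qualitative prediction the experiments tested against human observers.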
This work was supported by NIH research grant R01-EY13149.