Marissa Nederhouser, Xiaomin Yue, Irving Biederman; Predicting psychophysical similarity of complex shapes from measures of physical similarity. Journal of Vision 2006;6(6):320. doi: 10.1167/6.6.320.
© 2017 Association for Research in Vision and Ophthalmology.
Subjects performed a match-to-sample task in which they judged which of two briefly presented comparison stimuli was identical to the sample. The stimuli were unfamiliar, smooth, asymmetric, complex 3D blobs produced by varying the orientations of the second and third harmonics of a sphere and then adding these harmonics to the sphere and the fourth harmonic. The (dis)similarity between the matching and distracting blobs was assessed by four measures: a) subjective pair-wise ratings made by human subjects, b) Euclidean distances in a 2D stimulus space defined by the differences in the angles of rotation of the orientation-varying harmonics, c) mean pixel luminance energy differences between pairs of images, and d) von der Malsburg's Gabor-jet model (Lades, et al., 1993), designed to model aspects of V1 simple-cell filtering. The last is based on a wavelet-like filtering of the image by a lattice of Gabor jets, each composed of kernels over multiple scales and orientations. Similarity in the model is a function of the correlation of the activation values between corresponding kernels in corresponding jets. All four measures correlated negatively with error rates on the match-to-sample trials: Euclidean distance = −.804, judged similarity = −.846, pixel energy = −.891, and Gabor jet = −.940. In the absence of salient nonaccidental differences (e.g., differences in parts, or whether contours are straight vs. curved), a model based on V1 computations does remarkably well at scaling the psychophysical similarity of complex, novel shapes.
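The Gabor-jet scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the Lades et al. (1993) implementation: the kernel sizes, the two spatial scales, four orientations, and the lattice spacing below are arbitrary choices, and similarity is computed as the Pearson correlation of activation magnitudes pooled over all jets, one of several pooling schemes consistent with the abstract's description.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """One Gabor kernel: a cosine carrier under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def jet(image, cy, cx, size=15, scales=(4, 8), n_orient=4):
    """Activation magnitudes of one 'jet': kernels at several scales and
    orientations, all centered on (cy, cx). Assumes the patch fits in the image."""
    h = size // 2
    patch = image[cy - h:cy + h + 1, cx - h:cx + h + 1]
    acts = []
    for wavelength in scales:
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            kern = gabor_kernel(size, wavelength, theta, sigma=wavelength / 2)
            acts.append(abs(np.sum(patch * kern)))
    return acts

def gabor_jet_similarity(img_a, img_b, grid_step=16, size=15):
    """Similarity as the correlation of activation values between corresponding
    kernels in corresponding jets, pooled over a lattice of jet centers."""
    h, w = img_a.shape
    a_vals, b_vals = [], []
    for cy in range(size // 2, h - size // 2, grid_step):
        for cx in range(size // 2, w - size // 2, grid_step):
            a_vals.extend(jet(img_a, cy, cx, size))
            b_vals.extend(jet(img_b, cy, cx, size))
    return np.corrcoef(a_vals, b_vals)[0, 1]
```

Under this measure, an image compared with itself yields a similarity of 1, and the correlation falls as the filter responses of the two images diverge, which is the quantity the abstract correlates with match-to-sample error rates.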