Henry Galperin, Peter Bex, Jozsef Fiser; Orientation integration in complex visual processing. Journal of Vision 2009;9(8):1020. doi: https://doi.org/10.1167/9.8.1020.
© ARVO (1962-2015); The Authors (2016-present)
How does the visual system integrate local features to represent global object forms? Previously, we quantified human orientation sensitivity in complex natural images and found that orientation is encoded with only limited precision, defined by an internal threshold set by the predictability of the stimulus (VSS 2007). Here we tested the generality of this finding by asking whether local orientation information is integrated differently when orientation noise is distributed across a scene, and in an object identification task using natural images reconstructed from a fixed number of Gabor wavelets. In the noise discrimination task, subjects viewed pairs of images in which orientation noise was added to all the elements of only one image, of both images, or was distributed evenly between the two images, and were required to identify the noisier pair of images. Sensitivity to orientation noise with the addition of external noise produced a dipper function that did not change with the manner in which noise was distributed, suggesting that orientation information is integrated consistently irrespective of how orientation information is distributed across the scene. In the identification task, subjects identified an object from four categories, randomly selected from a total of 40 categories. The proportion of signal Gabors, whose orientation and position were taken from the object, to noise Gabors, whose positions were randomly assigned, was adjusted to find the form coherence threshold for 75% correct object identification. Signal elements consisted of pairs of adjacent Gabors whose orientation difference was low (contour-defining), high (corner-defining), or randomly selected. Thresholds for image identification were only slightly elevated compared with earlier discrimination results, and were equal for all types of signal elements used.
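The stimulus construction described above — a scene composed of a fixed number of Gabor elements, of which a "coherence" fraction are signal elements carrying the object's orientations and positions while the rest are randomly placed noise elements — can be sketched roughly as follows. This is a minimal illustrative sketch, not the study's actual stimulus code; all parameter values, function names, and the `signal_params` format are assumptions.

```python
import numpy as np

def gabor_patch(size, sigma, wavelength, theta, phase=0.0):
    """Render one Gabor wavelet: a sinusoidal carrier at orientation
    `theta` (radians) under an isotropic Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)        # rotate carrier axis
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength + phase)
    return envelope * carrier

def gabor_scene(n_elements, coherence, signal_params, img_size=256, rng=None):
    """Compose a scene from `n_elements` Gabors. A `coherence` fraction are
    signal elements whose (theta, (row, col)) come from `signal_params`
    (e.g. extracted from an object image); the remainder are noise elements
    with random orientation and random position."""
    rng = np.random.default_rng(rng)
    scene = np.zeros((img_size, img_size))
    n_signal = int(round(coherence * n_elements))
    size, sigma, wavelength = 21, 4.0, 8.0           # illustrative values
    half = size // 2
    for i in range(n_elements):
        if i < n_signal:                              # signal: from the object
            theta, (cy, cx) = signal_params[i % len(signal_params)]
        else:                                         # noise: random
            theta = rng.uniform(0, np.pi)
            cy, cx = rng.integers(size, img_size - size, 2)
        patch = gabor_patch(size, sigma, wavelength, theta)
        scene[cy - half:cy + half + 1, cx - half:cx + half + 1] += patch
    return scene
```

In a threshold procedure of the kind the abstract describes, `coherence` would be varied (e.g. by a staircase) to find the proportion of signal elements supporting 75% correct object identification.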
These results suggest that orientation information is integrated by perceptual templates that depend on orientation predictability but not on the complexity level of the visual task.