Abstract
Despite extensive research on stereoscopic cues, whether stereoscopic slant is based on point or orientation disparity is still a matter of debate. We measured slant sensitivity for stereoscopic planar patches covered by back-projected straight lines with random orientations (Experiment 1) or by random dots (Experiment 2). Surfaces were inclined around both the horizontal (H) and vertical (V) axes and were viewed through a circular aperture in a fronto-parallel screen that hid their boundaries. Three reference patches were used, with increasing simulated tilt: 113, 123, and 143 deg. The second patch had a simulated slant of 66 deg, while the first and third were less slanted (56 deg) but had opposite H/V inclinations. In a sequential matching task, observers judged whether the reference patch was more or less slanted than a test patch with the same tilt but a variable amount of maximum disparity (from −0.33 to 0.33 in seven steps, including 0). In both experiments a slant-by-tilt anisotropy was found, with slant sensitivity decreasing as the tilt of the reference increased. Discriminability for test patches with smaller disparity than the reference decreased as tilt increased, and vice versa for test patches with larger disparity. The results are consistent with a model that extracts surface orientation via implicit knowledge of the family of (H, V, convergence angle)-triplets compatible with the stereoscopic images; they cannot be explained by either a linear combination of the (H, V, convergence angle)-parameters or by the maximum amount of disparity. We show that the family of compatible (H, V, convergence angle)-triplets is derivable from the relation between the local orientation disparity and the average orientation of projected surface markings (inferable, in random-dot stereograms, from the horizontal shear angle of corresponding groups of dots), and we discuss how the orientation disparity field can account for slant biases without assumptions about the viewing geometry.
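To make the central quantity concrete, the following is a minimal numerical sketch, not the authors' model or stimulus code: it projects a single surface marking lying on a slanted, tilted planar patch into the two eyes and reports the resulting orientation disparity and mean image orientation. The interocular distance, viewing distance, marking length and orientation, and the projection geometry are all illustrative assumptions.

```python
import numpy as np

# Illustrative assumptions (not taken from the study).
IOD = 0.065               # interocular distance (m)
D = 0.50                  # viewing distance from the eyes to the screen (m)
slant = np.radians(56)    # simulated slant of the patch
tilt = np.radians(113)    # simulated tilt of the patch

def surface_normal(slant, tilt):
    # Unit normal of a plane with the given slant/tilt; the line of sight is the z-axis.
    return np.array([np.sin(slant) * np.cos(tilt),
                     np.sin(slant) * np.sin(tilt),
                     np.cos(slant)])

def project(point, eye_x):
    # Perspective projection onto the fronto-parallel screen plane z = 0,
    # with the eye at (eye_x, 0, D).
    x, y, z = point
    s = D / (D - z)
    return np.array([eye_x + (x - eye_x) * s, y * s])

def image_orientation(p0, p1):
    # Orientation of the projected segment in the image plane (radians).
    d = p1 - p0
    return np.arctan2(d[1], d[0])

# Build a short marking through the patch centre, oriented at 30 deg within
# the slanted plane (an arbitrary choice for this example).
n = surface_normal(slant, tilt)
u = np.cross(n, [0.0, 0.0, 1.0]); u /= np.linalg.norm(u)  # tilt-axis direction
v = np.cross(n, u)                                         # in-plane axis orthogonal to u
theta = np.radians(30)
marking = 0.02 * (np.cos(theta) * u + np.sin(theta) * v)   # 2 cm segment on the surface

centre = np.array([0.0, 0.0, 0.0])                         # patch centre on the screen plane
p0, p1 = centre - marking / 2, centre + marking / 2

left = image_orientation(project(p0, -IOD / 2), project(p1, -IOD / 2))
right = image_orientation(project(p0, +IOD / 2), project(p1, +IOD / 2))

print("orientation disparity (deg):", np.degrees(right - left))
print("mean image orientation (deg):", np.degrees((right + left) / 2))
```

Sweeping `theta` over many markings would give the orientation disparity field for one (slant, tilt) pair; repeating that for different assumed convergence angles illustrates, in the same spirit as the abstract, that a family of (H, V, convergence angle)-triplets is compatible with the same pair of images.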
MUR-PRIN grant n. 2005119851.