Abstract
Veridical perception of surface slant is important for everyday tasks such as traversing terrain and interacting with, or placing objects on, surfaces. However, natural surfaces contain higher-order depth variation, or curvature, which may affect how slant is perceived. We propose a computational model that predicts that curvature, whether real or distortion-induced, biases the perception of surface slant. The model is based on the perspective projection of surfaces to form “retinal images” containing monocular and binocular texture cues (gradients) for slant estimation. Curvature was either intrinsic to the modelled surface or induced by non-uniform magnification, i.e., radial distortion (typical of wide-angle lenses and head-mounted display optics). The binocular and monocular texture gradients derived from these conditions make specific predictions about perceived surface slant. In a series of psychophysical experiments, we tested these predictions using slant discrimination and magnitude estimation tasks. Our results confirm that local slant estimation is biased in a manner consistent with apparent surface curvature. Further, we show that for concave surfaces, irrespective of whether the curvature is intrinsic or distortion-induced, there is a net underestimation of global surface slant. Somewhat surprisingly, we also find that the observed biases in global slant are driven largely by the texture gradients rather than by the concurrent changes in binocular disparity; this is due to a vertical asymmetry in the texture gradients of curved surfaces with overall slant. Our results show that while there is a potentially complex interaction between surface curvature and slant perception, much of the perceptual data can be predicted by a relatively simple model based on perspective projection. This work highlights the importance of evaluating the impact of higher-order depth variations on perceived surface attitude, particularly in virtual environments, in which curvature may be intrinsic or caused by optical distortion.
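To make the projection geometry concrete, below is a minimal Python/NumPy sketch of how texture points on a slanted plane project to an image, and how radial distortion warps that projection. The pinhole model, the single-coefficient Brown-Conrady distortion term (r′ = r(1 + k₁r²)), and all parameter names are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def project_slanted_plane(points_xy, slant_deg, distance, f=1.0, k1=0.0):
    """Project texture points on a plane slanted about the horizontal (x) axis.

    points_xy : (N, 2) array of coordinates on the surface.
    slant_deg : slant of the plane away from frontoparallel, in degrees.
    distance  : viewing distance to the plane's centre.
    f         : focal length of the pinhole projection (arbitrary units).
    k1        : radial distortion coefficient; k1 < 0 gives barrel,
                k1 > 0 gives pincushion distortion.
    """
    s = np.deg2rad(slant_deg)
    x, y = points_xy[:, 0], points_xy[:, 1]
    # Rotate the plane about the x-axis, then place it at the viewing distance.
    y_cam = y * np.cos(s)
    z_cam = distance + y * np.sin(s)
    # Pinhole perspective projection.
    u = f * x / z_cam
    v = f * y_cam / z_cam
    # Radial distortion about the image centre (Brown-Conrady, one term).
    r2 = u**2 + v**2
    scale = 1.0 + k1 * r2
    return np.column_stack([u * scale, v * scale])

# Example: a square grid texture on a plane slanted 30 degrees, viewed at 1 m.
grid = np.stack(np.meshgrid(np.linspace(-0.2, 0.2, 9),
                            np.linspace(-0.2, 0.2, 9)), axis=-1).reshape(-1, 2)
undistorted = project_slanted_plane(grid, slant_deg=30, distance=1.0)
distorted = project_slanted_plane(grid, slant_deg=30, distance=1.0, k1=-0.3)
```

Comparing the two projected grids shows how the distortion alters the texture gradient (the rate at which projected element size and spacing change across the image), i.e., how a flat slanted surface can acquire the image signature of a curved one.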