Although it is widely believed that perception must be veridical for action to be accurate, an alternative view is that some systematic perceptual errors (e.g., scale expansion) may improve motor performance by enhancing coding precision. Sloped surfaces look, and feel when stood upon, much steeper than they are. The visual perception of geographical slant (GS: surface slant relative to the horizontal) can, in theory, be estimated by combining estimates of optical slant (OS: surface slant relative to the line of gaze) and gaze declination (GD: direction of gaze relative to the horizontal): GS = OS − GD. In studies of downhill slant perception (Li & Durgin, JOV 2009), we found that this simple geometric model predicted visual slant perception from the measured scale expansion of both perceptual variables: estimates of optical slant and estimates of the orientation of the axis of gaze itself. Here we show that the same model can be applied to uphill surface orientation. Using an immersive VR system with corrected and calibrated optics, in two experiments we measured (1) the perceived geographical slant of irregular 3D surfaces presented straight ahead or displaced above or below eye level by 22.5 or 45 deg, and (2) the perceived direction of gaze when looking at targets ranging in visual direction from −52.5 to +52.5 deg relative to horizontal. All surfaces were simulated at a viewing distance of 1.5 m. The best-fitting model of the slant-estimation data indicated that gaze direction was exaggerated by a factor of 1.5, exactly the value obtained when gaze direction was measured directly. The same fit indicated that changes in optical slant over the entire measured range were perceptually scaled by a factor of 1.4 at this viewing distance. Scale expansion of optical slant may serve a functional role in evaluating the orientation of the upcoming ground plane during locomotion.
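The geometric model and the fitted gains above can be sketched numerically. The following is a minimal illustration, not the authors' fitting code: it assumes the two scale-expanded estimates combine by the simple subtraction GS = OS − GD, using the abstract's fitted gains of 1.4 on optical slant and 1.5 on gaze declination. The function names and the sign convention (positive GD = gaze raised above horizontal) are ours, chosen only for illustration.

```python
# Sketch of the linear scale-expansion model described in the abstract:
#   geometry:   GS = OS - GD   (so OS = GS + GD)
#   perception: perceived GS ~ K_OS * OS - K_GD * GD
# The gains below are the abstract's fitted values; the assumption that
# the expanded estimates combine by simple subtraction is ours.

K_OS = 1.4  # fitted scale-expansion gain on optical slant
K_GD = 1.5  # fitted scale-expansion gain on gaze declination

def optical_slant(gs_deg, gd_deg):
    """Optical slant implied by geographical slant and gaze declination."""
    return gs_deg + gd_deg

def perceived_geographical_slant(gs_deg, gd_deg):
    """Model prediction: expanded optical slant minus expanded gaze direction."""
    os_deg = optical_slant(gs_deg, gd_deg)
    return K_OS * os_deg - K_GD * gd_deg

# Example: a 30 deg uphill surface viewed with gaze raised 22.5 deg.
print(round(perceived_geographical_slant(30.0, 22.5), 1))
```

Under this sketch, a 30 deg uphill surface viewed with gaze raised 22.5 deg yields an optical slant of 52.5 deg and a predicted perceived geographical slant of about 40 deg, illustrating how scale expansion of both variables can still yield a roughly consistent (if exaggerated) geographical-slant estimate.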
Swarthmore College Faculty Research Grant.