Abstract
Yu et al. (2018, doi: 10.1167/18.10.720) showed that the apparent shape of a 3D object depends on viewing distance: far objects appear compressed in depth compared to near ones. The present experiment investigates how well people can notice this perceptual distortion when viewing distance changes due to either object motion or self-motion.
Method: The virtual-reality stimuli were presented on an HTC Vive head-mounted display. On each trial, participants saw two side-by-side polyhedral objects, similar to the stimuli of Yu et al. (2018), floating at eye level within a virtual room. In the object-moving condition, both objects moved synchronously back and forth relative to a stationary observer. In the participant-moving condition, participants walked back and forth relative to the static objects. In either condition, one of the objects systematically stretched or compressed in depth, whereas the other object remained rigid. The task was to identify the nonrigid object. Both the nonrigid deformation and the rigid motion occurred along the line from the object's to the participant's current location. The amount of deformation was a linear function of viewing distance. The slope of this function (the rate of nonrigidity) was manipulated by staircase procedures. A rate of 0 produced a rigid object; a positive or negative rate produced a nonrigid object that stretched or compressed in depth, respectively, as distance increased. The 75%-correct threshold was estimated by a psychometric-function fit.
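The distance-dependent deformation described above can be sketched as follows. This is not the authors' code; the reference distance `d_ref` and the exact parameterization of the depth-scale function are assumptions for illustration only.

```python
# Minimal sketch of the deformation model: the nonrigid object's depth
# extent is scaled by a linear function of viewing distance.
# `d_ref` (the distance at which the scale is 1) is a hypothetical
# parameter, not taken from the abstract.

def depth_scale(distance, rate, d_ref=1.0):
    """Scale factor applied to the object's depth extent.

    rate = 0  -> rigid object (scale is 1 at every distance)
    rate > 0  -> object stretches in depth as distance increases
    rate < 0  -> object compresses in depth as distance increases
    """
    return 1.0 + rate * (distance - d_ref)
```

For example, with a positive rate the object's depth extent grows as the observer (or object) recedes, while `rate=0` leaves the object rigid regardless of distance.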
Results: Data from 3 participants showed that the deformation-discrimination threshold was higher for positive rates than for negative rates in both conditions, meaning that deformation that compensated for the previously found distance-dependent shape distortion was harder to detect than deformation that exaggerated this distortion. Moreover, the threshold was lower in the participant-moving condition than in the object-moving condition, suggesting the importance of self-motion in detecting nonrigidity.