Abstract
Visual cues to depth are inherently ambiguous and uncertain. Classic research in vision science has demonstrated that human judgments of depth are closely predicted by Bayesian cue integration, which weights individual depth cues by their relative uncertainties to reach a more reliable estimate. While this weighting of cues is linear in the case of Gaussian variability, more complex interactions of cues result in strongly non-linear cue weighting and therefore provide a stronger test of the predictions of Bayesian computation. One such scenario is perceptual explaining-away, in which an auxiliary cue helps disambiguate the influence of two causes of a sensory measurement. Here we investigate whether human subjects use a known texture as an auxiliary cue to infer the size and depth of a ball, both when only a relative size cue is given on a 2D display and when a relative size cue is given together with stereo disparity on a VR display. In our experiment, subjects decided in a 2AFC task which of two spherical objects, shown either on a computer screen or in an HMD, was closer. The objects differed in their textures so as to appear as soccer, tennis, or golf balls, and the size ratios used were adjusted to match the realistic size ratios of these objects. Further, we gathered eye-tracking data in the 3D condition to investigate how decisions were related to looking times. Based on a probabilistic computational model in the Bayesian framework, we inferred subjects' prior beliefs about size ratios. Our model takes uncertainty into account both in the perceived ratio and in participants' prior belief, enabling us to use the collected behavioral data to infer the shape of subjects' internal belief structure. The results show that human decisions in size judgments can be explained as perceptual explaining-away, that prior size ratios are quite accurate, and that response probabilities scale linearly with looking times.
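The linear, reliability-weighted combination of Gaussian cues mentioned above can be sketched as follows. This is a minimal illustration of standard inverse-variance cue weighting, not the authors' model; the cue values and uncertainties are hypothetical:

```python
import numpy as np

def integrate_cues(estimates, sigmas):
    """Combine independent Gaussian cue estimates by inverse-variance weighting.

    Each cue's weight is its reliability (1/sigma^2) normalized over all cues;
    the result is the minimum-variance (Bayesian) combined estimate.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = reliabilities / reliabilities.sum()
    combined = weights @ estimates
    combined_sigma = np.sqrt(1.0 / reliabilities.sum())
    return combined, combined_sigma

# Hypothetical example: a disparity cue (depth 2.0 m, sd 0.2 m) and a
# relative-size cue (depth 2.6 m, sd 0.4 m); the more reliable disparity
# cue receives the larger weight, and the combined sd is below either cue's.
depth, sigma = integrate_cues([2.0, 2.6], [0.2, 0.4])
```

Note that the combined uncertainty is always smaller than that of the most reliable single cue, which is what makes integration advantageous; explaining-away arises only when this simple linear scheme is replaced by inference over a shared causal structure.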
Meeting abstract presented at VSS 2018