Abstract
Many stereotypical behaviors in contour extrapolation, cue integration, and motion perception can be understood as the result of optimal inference under sensory uncertainty. In 3D motion perception, observers' estimates of object motion are typically biased laterally: objects on a collision course with the head are judged as missing the head (Harris & Dean, 2003; Welchman et al., 2004). These biases have been modeled as the result of differential reliability in the estimation of lateral motion and motion in depth (Lages, 2006; Welchman, 2008) combined with a prior expectation for slow speeds (Simoncelli, 1993; Weiss et al., 2002; Stocker & Simoncelli, 2006). A critical test of this explanation requires investigating 3D motion estimation under variable levels of uncertainty. Here we aim to advance understanding of these motion estimation biases by developing a framework that supports systematic manipulation of sensory uncertainty. On each trial, a target moved along a linear trajectory in the x-z plane for 1 s before disappearing. The observer then adjusted the position of a 'paddle' along a circular orbit so that the paddle would have intercepted the target, providing an estimate of the observer's perceived motion direction. We manipulated noise in two ways. In the external noise manipulation, the target appeared at fixation at one of three contrast levels: 100%, 10%, or 6.5%. In the internal noise manipulation, a 100% contrast target appeared at a location in front of, behind, or at the fixation plane. Estimated motion direction of 100% contrast targets at fixation was near veridical. However, significant lateral biases in trajectory estimates emerged under both noise manipulations. Analyses of the depth and lateral components of the estimates both reflected increased uncertainty due to noise. In conclusion, these results provide direct evidence that biases in visual processing arise under sensory uncertainty, whether from internal or external noise.
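The modeling idea the abstract invokes (differential reliability plus a slow-speed prior producing lateral bias) can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes independent Gaussian likelihoods for the lateral (x) and depth (z) velocity components, with more sensory noise on motion in depth, and a zero-mean Gaussian slow-speed prior. All variable names and noise values are illustrative assumptions.

```python
import math

def map_velocity(obs_vx, obs_vz, sigma_x, sigma_z, sigma_prior):
    """MAP estimate of (vx, vz) under independent Gaussian likelihoods
    and a zero-mean Gaussian slow-speed prior: each component shrinks
    toward zero in proportion to its sensory noise."""
    shrink_x = sigma_prior**2 / (sigma_prior**2 + sigma_x**2)
    shrink_z = sigma_prior**2 / (sigma_prior**2 + sigma_z**2)
    return obs_vx * shrink_x, obs_vz * shrink_z

# True trajectory: mostly toward the observer (large vz), slight lateral drift.
true_vx, true_vz = 0.2, 1.0

# Illustrative noise levels: motion in depth (z) is estimated less reliably.
vx_hat, vz_hat = map_velocity(true_vx, true_vz,
                              sigma_x=0.1, sigma_z=0.5, sigma_prior=0.4)

# Trajectory angle relative to the line of sight (0 deg = straight at the head).
true_angle = math.degrees(math.atan2(true_vx, true_vz))
est_angle = math.degrees(math.atan2(vx_hat, vz_hat))
# Because vz is shrunk more strongly than vx, the estimated trajectory
# rotates laterally (est_angle > true_angle): a collision course is
# judged as missing the head.
```

Increasing either noise term (e.g., to mimic lower contrast or an eccentric depth position) strengthens the shrinkage and hence the lateral bias, matching the qualitative pattern the abstract reports.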
Meeting abstract presented at VSS 2014