Abstract
Blur is determined by the geometry of the viewed scene, where in the scene the eye is focused, and the size of the pupil. Blur can provide useful information for discriminating distances (Held et al., 2012) and the scale of a scene (Held et al., 2010; Vishwanath & Blaser, 2010). But to make appropriate inferences about scene properties, the visual system needs statistical information about the relationship between blur, scenes, and fixations. We used a mobile eye-tracking and scene-tracking device to investigate this relationship during natural viewing. The device measured fixations and 3D scene layout while participants engaged in everyday activities. We reconstructed the distances from the eye to points in each scene and calculated the point-spread function for all positions in the central 20° of the visual field. The pattern of blur varies from one task to another, but is remarkably consistent between participants. A weighted combination of the patterns across tasks reveals the natural distribution of blur for each position in the visual field. There is a significant vertical gradient of blur, while the horizontal gradient is much smaller. The magnitude of likely blurs in different field positions is reasonably consistent with the variation in blur discrimination capability across the visual field. These data reveal the prior distribution of blur as a function of field position and can be used in probabilistic models that infer depth from blur.
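The abstract does not give the blur calculation explicitly, but the standard geometric-optics approximation used in this literature (e.g., Held et al., 2012) relates defocus blur to the difference in diopters between the fixation distance and the distance to a scene point, scaled by pupil diameter. The sketch below illustrates that relationship; the function name and parameter values are illustrative assumptions, not taken from the study.

```python
import numpy as np

def defocus_blur_diameter(object_dist_m, focal_dist_m, pupil_diam_m=4e-3):
    """Approximate retinal blur-circle diameter (radians) from defocus.

    Geometric-optics approximation:
        blur diameter ~= pupil diameter * |1/z_focus - 1/z_object|,
    i.e., pupil diameter (m) times the defocus magnitude in diopters.
    Illustrative sketch only; not the authors' exact pipeline.
    """
    defocus_diopters = np.abs(1.0 / focal_dist_m - 1.0 / object_dist_m)
    return pupil_diam_m * defocus_diopters  # small-angle approximation

# Example: eye focused at 0.5 m, scene point at 2 m, 4 mm pupil
blur_rad = defocus_blur_diameter(object_dist_m=2.0, focal_dist_m=0.5)
print(f"Blur circle diameter: {np.degrees(blur_rad):.3f} deg")
```

Applying a calculation of this form to the measured fixation distance and the reconstructed distance at every position in the central 20° yields a blur value per field position, which can then be aggregated across tasks and participants.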
Meeting abstract presented at VSS 2015