Vision Sciences Society Annual Meeting Abstract  |   August 2009
Making big things look small: Blur combined with other depth cues affects perceived size and distance
Author Affiliations
  • Robert Held
    Joint Graduate Group in Bioengineering, University of California, San Francisco and University of California, Berkeley
  • Emily Cooper
    Helen Wills Neuroscience Institute, University of California, Berkeley
  • James O'Brien
    Department of Electrical Engineering and Computer Science, University of California, Berkeley
  • Martin Banks
    Joint Graduate Group in Bioengineering, University of California, San Francisco and University of California, Berkeley, and Helen Wills Neuroscience Institute, University of California, Berkeley
Journal of Vision August 2009, Vol. 9, 959. https://doi.org/10.1167/9.8.959
Abstract
 

Blur is commonly considered a weak distance cue, but photographic techniques that manipulate blur cause significant and compelling changes in the perceived distance and size of objects. One such technique is “tilt-shift miniaturization,” in which a camera's lens is translated and slanted relative to the film plane. The result is an exaggerated vertical blur gradient that makes scenes containing a vertical distance gradient (e.g., a bird's-eye view of a landscape) appear significantly nearer and therefore smaller. We will begin by demonstrating this compelling effect and then describe how we used it to examine the visual system's use of blur as a cue to distance and size. In a psychophysical experiment, we presented computer-generated, bird's-eye images of a highly realistic model of a city. Blur was manipulated in four ways: 1) sharp images with no blur; 2) horizontal blur gradients applied to those images; 3) vertical blur gradients; and 4) rendering with a large aperture (diameter up to 60 m), which produces blur that is accurately correlated with depth for a realizable, small-scale scene. Observers indicated the perceived distance to objects in the images. Technique 1 produced a convincing impression of a full-sized scene. Technique 2 produced no systematic miniaturization. Techniques 3 and 4 produced significant and similar miniaturization. Thus, the correlation between blur and the depth indicated by other cues affects perceived distance and size, and the correlation need only be reasonably accurate to produce a significant and systematic effect. We developed a probabilistic model of the relationship between blur and distance. An interesting prediction of the model is that blur affects perceived distance only when coupled with other distance cues, which is manifested in the tilt-shift effect we observed in humans. Thus, blur is a useful cue to absolute distance when coupled with other depth information.
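As a rough illustration of the vertical-gradient manipulation (technique 3 above), the sketch below applies an approximately linear vertical blur gradient to an image, in the spirit of tilt-shift "fake miniature" processing. It is not the rendering pipeline used in the experiment: the function name, parameters, and the stepped approximation of the gradient are illustrative assumptions. The intuition follows standard geometric optics, in which the blur-circle diameter grows roughly in proportion to the aperture diameter times the dioptric difference between the object and the focal plane, so a steep blur gradient across an image is consistent with a small, nearby scene.

import numpy as np
from PIL import Image, ImageFilter

def vertical_blur_gradient(img, focus_frac=0.5, max_radius=8.0, steps=8):
    # Blend progressively blurred copies of img so that blur increases with
    # vertical distance from a sharp band at focus_frac (0 = top, 1 = bottom).
    # The gradient is approximated in discrete steps rather than continuously.
    w, h = img.size
    out = np.asarray(img, dtype=np.float32).copy()
    rows = np.arange(h, dtype=np.float32) / max(h - 1, 1)
    dist = np.abs(rows - focus_frac)
    dist = dist / dist.max()                    # target blur per row, 0..1
    for i in range(1, steps + 1):
        radius = max_radius * i / steps
        blurred = np.asarray(img.filter(ImageFilter.GaussianBlur(radius)),
                             dtype=np.float32)
        lo, hi = (i - 1) / steps, i / steps
        mask = (dist > lo) & (dist <= hi)       # rows assigned to this step
        out[mask] = blurred[mask]
    return Image.fromarray(np.clip(out, 0, 255).astype(np.uint8))

# Hypothetical usage on an aerial photograph:
# miniature = vertical_blur_gradient(Image.open("aerial.jpg"))

In the experiment, by contrast, the blur in technique 4 was generated by rendering the scene through a simulated large aperture, so blur and depth were physically consistent rather than imposed as an image-space gradient.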

 
Held, R., Cooper, E., O'Brien, J., & Banks, M. (2009). Making big things look small: Blur combined with other depth cues affects perceived size and distance [Abstract]. Journal of Vision, 9(8):959, 959a, http://journalofvision.org/9/8/959/, doi:10.1167/9.8.959.