Paul V. McGraw, Neil W. Roach, David R. Badcock, David Whitaker; Size-induced distortions in perceptual maps of visual space. Journal of Vision 2012;12(4):8. doi: https://doi.org/10.1167/12.4.8.
In order to interact with our environment, the human brain constructs maps of visual space. The orderly mapping of external space across the retinal surface, termed retinotopy, is maintained at subsequent levels of visual cortical processing and underpins our capacity to make precise and reliable judgments about the relative locations of objects around us. While these maps, at least in the visual system, support high-precision judgments of relative location, they are prone to significant perceptual distortion. Here, we ask observers to estimate the separation of two visual stimuli, a spatial interval discrimination task. We show that large stimuli must be separated by a much greater distance than small stimuli in order to be perceived as equally separated. The relationship is linear, task independent, and unrelated to the perceived position of object edges. We also show that this type of spatial distortion is not restricted to the object itself: it can also be revealed by changing the spatial scale of the background while object size remains constant. These results indicate that fundamental spatial properties, such as retinal image size or the scale at which an object is analyzed, exert a marked influence on spatial coding.