Abstract
The ability to accurately perceive, represent, and remember spatial information is among the most foundational capacities of mobile organisms. Yet in the present work we find that even the simplest possible spatial tasks reveal surprising systematic deviations from the ground truth — such as biases wherein objects are perceived and remembered as being nearer to the centers of their surrounding quadrants. We employed both a relative-location placement task (in which observers see two differently sized shapes, one of which has a dot in it, and must then place a second dot in the other shape so that the two dots' relative locations are equated) and a matching task (in which observers see two dots, each inside a separate shape, and must simply report whether their relative locations are matched). Some of the resulting biases were shape-specific. For example, when dots appeared in a triangle during the placement task, the dots placed by observers were biased away from the axes that join the midpoints of each side to the triangle's center. But many of the systematic biases were not shape-specific, and seemed instead to reflect differences in the grain of resolution for different regions of space itself. For example, with both a circle and a shapeless configuration (with only a central landmark) in the matching task, the data revealed an unexpected dissociation in the acuity for angle vs. distance: in oblique sectors, observers were better at discriminating radial differences (i.e., when a dot moved inward or outward); but in cardinal sectors, observers were better at discriminating angular differences (i.e., when a dot moved around the circle). These data provide new insights into the format of visuospatial representations: the locations of objects may be represented in terms of polar rather than Cartesian coordinates.
Acknowledgements: SRY was supported by an NSF Graduate Research Fellowship. BJS was supported by ONR MURI #N00014-16-1-2007.