Vision Sciences Society Annual Meeting Abstract | September 2018
Perceptual integration of perspective and stereoscopic cues in macaque monkeys
Author Affiliations
  • Ting-Yu Chang
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA; Physiology Graduate Training Program, University of Wisconsin-Madison, Madison, WI, USA
  • Byounghoon Kim
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
  • Adhira Sunkara
    Department of Surgery, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
  • Ari Rosenberg
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
Journal of Vision September 2018, Vol. 18, 489. https://doi.org/10.1167/18.10.489
Abstract

Successful interactions with the world often require robust (i.e., accurate and precise) three-dimensional (3D) visual perception. Robust 3D perception can be achieved through the weighted integration of distinct sensory signals, such as perspective and stereoscopic cues, based on their reliabilities. Because the reliabilities of these cues are differentially affected by viewing conditions (e.g., object distance and slant), dynamic cue reweighting is critical for robust perception. Here we test whether robust 3D perception is achieved through dynamic cue reweighting in non-human primates. Two rhesus monkeys were trained to report the tilt (0° to 315° in 45° steps) of a planar surface in an eight-alternative forced-choice task. The planes were rendered virtually and defined by perspective cues only, stereoscopic cues only, or both cues. To manipulate cue reliability, the distances (37 to 137 cm) and slants (15° to 60°) of the planes were varied. All stimulus combinations were interleaved and presented pseudo-randomly. To quantify tilt perception for each combination of cue type, distance, and slant, the probability of reporting each tilt was computed, and the data were fit with a von Mises density function over tilt. The accuracy and precision of tilt perception were defined by the mean and concentration parameters of the fit, respectively. Tilt perception was unbiased, and therefore accurate, in all conditions. The precision of tilt perception based on perspective cues was independent of distance and increased with slant. The precision of tilt perception based on stereoscopic cues showed a distance × slant interaction, falling off more slowly with distance at larger slants. A comparison of combined-cue tilt perception measured empirically and predicted using cue integration theory further revealed that the cues were optimally integrated to maximize the precision of tilt perception. These results demonstrate dynamic, reliability-dependent reweighting of perspective and stereoscopic cues by non-human primates to achieve robust 3D visual perception.
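
The fitting step described above is compact enough to illustrate. The Python sketch below is a minimal illustration, not the authors' analysis code; the report counts are hypothetical and the use of scipy is an assumption. It fits a von Mises density to tilt reports by maximum likelihood, recovering the mean (accuracy) and concentration (precision) parameters:

    # Sketch: fit a von Mises density to tilt-report counts.
    # The counts below are hypothetical, not data from the study.
    import numpy as np
    from scipy.stats import vonmises

    tilts = np.deg2rad(np.arange(0, 360, 45))        # the 8 reportable tilts
    counts = np.array([2, 5, 14, 38, 25, 10, 4, 2])  # hypothetical report counts

    # Expand counts into individual circular observations and fit by maximum
    # likelihood, fixing scale = 1 as is conventional for circular data.
    samples = np.repeat(tilts, counts)
    kappa, loc, scale = vonmises.fit(samples, fscale=1)
    print(f"mean (accuracy): {np.rad2deg(loc) % 360:.1f} deg; "
          f"concentration (precision): {kappa:.2f}")

The combined-cue prediction can be made explicit in the same terms. The abstract does not spell out the model, but in the standard formulation (assuming von Mises likelihoods with means \mu_p, \mu_s and concentrations \kappa_p, \kappa_s for the perspective and stereoscopic cues), optimal integration yields a von Mises posterior with concentration

    \kappa_{pred} = \sqrt{\kappa_p^2 + \kappa_s^2 + 2\,\kappa_p\,\kappa_s \cos(\mu_p - \mu_s)}

which, when the single-cue means agree (as here, since tilt perception was unbiased), reduces to \kappa_{pred} = \kappa_p + \kappa_s, the circular analog of the Gaussian reliability-summation rule 1/\sigma_c^2 = 1/\sigma_p^2 + 1/\sigma_s^2. Comparing \kappa_{pred} against the empirically fit combined-cue concentration is what tests for optimal integration.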

Meeting abstract presented at VSS 2018.
