December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract  |   December 2022
Binocular viewing geometry shapes neural processing of slanted planes: Results from theoretical V1 modeling and human psychophysics
Author Affiliations & Notes
  • Stephanie M. Shields
    The University of Texas at Austin
  • Alexander C. Huk
    The University of Texas at Austin
  • Lawrence K. Cormack
    The University of Texas at Austin
  • Footnotes
    Acknowledgements: NIH NEI 5R01EY020592 (LKC, ACH); NSF GRFP (supporting SMS)
Journal of Vision December 2022, Vol.22, 3946. doi:https://doi.org/10.1167/jov.22.14.3946
Abstract

An important component of real-world scene perception is the estimation of 3D surface orientation (slant and tilt). However, neither the neural circuits that compute 3D orientation nor the supporting computations have been fully uncovered. To help address that gap, we apply a projective geometry framework to the study of both the neural processing and perception of 3D orientation, building on recent work suggesting that the mapping of the 3D environment onto the two retinae shapes both the tuning of neurons in area MT to motion-in-depth and corresponding patterns of perceptual errors in human psychophysics (Bonnen et al., 2020). The projection of slanted planes results in interocular disparities in retinal orientation, and interestingly, binocular V1 neurons can have different monocular orientation preferences (Bridge & Cumming, 2001), which could make them sensitive to orientation disparities. Therefore, we have constructed an encoding model that predicts responses of binocular V1 neurons to the orientation disparities produced by a stimulus plane with varying slant, given each theoretical neuron’s monocular orientation preferences and ocular dominance level. The resulting slant tuning curves are noncanonical in shape, do not tile slant evenly, and are strongly affected by viewing distance. Notably, at greater viewing distances, extreme slants (beyond approximately +/- 70 degrees) are encoded by steep portions of the curves, but moderate and low slants are encoded by relatively flat portions. Consistent with those observations, our model population decoder performs best at extreme slants, and performance degrades as slant approaches zero and as viewing distance increases. We additionally observed these patterns in human perceptual performance on a slant discrimination task. These results support the theory that the neural processing of 3D scene components is shaped by projective geometry and represent a step toward understanding the computations that support 3D orientation perception.
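
To make the geometric setup concrete, the following is a minimal numerical sketch, not the authors' actual model, of how the interocular orientation disparity produced by a line on a slanted plane could be computed and fed to a hypothetical binocular V1-like unit. It assumes parallel optic axes (no vergence), von Mises monocular orientation tuning, a linear ocular-dominance-weighted combination of the two eyes, and arbitrary illustrative parameters (interocular distance, preferred orientations, viewing distances); none of these specifics come from the abstract.

"""
Sketch (assumptions only; not the authors' implementation): orientation
disparity from a slanted plane driving a hypothetical binocular unit.
Simplifications: parallel optic axes (no vergence), a single oblique line
on the plane, von Mises monocular orientation tuning, and a linear
ocular-dominance-weighted combination of the two eyes.
"""
import numpy as np

IOD = 6.5  # interocular distance in cm (illustrative value)

def retinal_orientation(eye_x, p1, p2):
    """Orientation (radians, mod pi) of the image of segment p1->p2 under
    perspective projection into an eye at (eye_x, 0, 0) looking along +z
    with unit focal length."""
    def project(p):
        x, y, z = p
        return np.array([(x - eye_x) / z, y / z])
    v = project(p2) - project(p1)
    return np.arctan2(v[1], v[0]) % np.pi

def monocular_orientations(slant_deg, dist_cm, iod=IOD):
    """Retinal orientations in the two eyes of an oblique line lying on a
    plane at distance dist_cm, slanted about the vertical axis."""
    s = np.deg2rad(slant_deg)
    u_axis = np.array([np.cos(s), 0.0, np.sin(s)])  # slanted horizontal axis
    v_axis = np.array([0.0, 1.0, 0.0])              # vertical axis of the plane
    center = np.array([0.0, 0.0, dist_cm])
    p1 = center - u_axis - v_axis  # endpoints of a 45-deg line on the plane
    p2 = center + u_axis + v_axis
    th_l = retinal_orientation(-iod / 2, p1, p2)
    th_r = retinal_orientation(+iod / 2, p1, p2)
    return th_l, th_r

def unit_response(th_l, th_r, pref_l, pref_r, dominance=0.5, kappa=4.0):
    """Hypothetical binocular unit: von Mises orientation tuning (period pi)
    in each eye, combined with an ocular-dominance weight."""
    tune = lambda th, pref: np.exp(kappa * (np.cos(2 * (th - pref)) - 1))
    return dominance * tune(th_l, pref_l) + (1 - dominance) * tune(th_r, pref_r)

if __name__ == "__main__":
    slants = np.arange(-85, 86, 5)
    for dist in (30.0, 100.0):  # near vs. far viewing distance, in cm
        resp = []
        for s in slants:
            th_l, th_r = monocular_orientations(s, dist)
            # Example unit with slightly different preferred orientations in
            # the two eyes, i.e., tuned to a nonzero orientation disparity.
            resp.append(unit_response(th_l, th_r,
                                      pref_l=np.deg2rad(42),
                                      pref_r=np.deg2rad(48)))
        peak = int(slants[np.argmax(resp)])
        print(f"dist={dist:5.1f} cm  peak response near slant {peak:+d} deg")

Running the script prints the slant at which this particular example unit responds most strongly at a near and a far viewing distance; sweeping the preferred orientations and ocular dominance across a population would yield a family of slant tuning curves in the spirit of the encoding model described above.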
