December 2022, Volume 22, Issue 14 | Open Access
Vision Sciences Society Annual Meeting Abstract
Movement-related signals support classification invariance for stable visual perception
Author Affiliations & Notes
  • Andrea Benucci
    RIKEN Center for Brain Science
  • Footnotes
    Acknowledgements: This work was funded by RIKEN BSI and RIKEN CBS institutional funding, JSPS grants 26290011, 17H06037, and C0219129, and a Fujitsu collaborative grant.
Journal of Vision, December 2022, Vol. 22(14), 3484. https://doi.org/10.1167/jov.22.14.3484
Abstract

Stable visual perception during eye and body movements suggests neural algorithms that convert location information (“where”-type signals) across multiple frames of reference, for instance, from retinocentric to craniocentric coordinates. Accordingly, numerous theoretical studies have proposed biologically plausible computational processes to achieve such transformations. However, how coordinate transformations can then be used by the hierarchy of cortical visual areas to produce stable perception remains largely unknown. Here, we explore the hypothesis that perceptual stability equates to robust classification of visual features relative to movements, that is, a “what” type of information processing. We demonstrate in CNNs that neural signals related to eye and body movements support accurate image classification by making “where”-type computations for coordinate transformations faster to learn and more robust to input perturbations. Consistent with this, movement signals contributed to the emergence of activity manifolds associated with image categories in late CNN layers, and to movement-related response modulations in network units as observed experimentally during saccadic eye movements. Therefore, by equating perception to classification, we provide a simple unifying computational framework that explains the role of movement signals in support of stable perception during dynamic interactions with the environment.
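
The abstract does not report the network architecture or training details. The following is a minimal, hypothetical sketch of the general idea it describes: a small PyTorch CNN that classifies retinally shifted images while also receiving the shift itself (a stand-in for an eye- or body-movement signal) as an auxiliary input, so that a movement-informed network can be compared against one whose movement input is ablated. All layer sizes, the shift range, and the late-fusion-by-concatenation scheme are illustrative assumptions, not details taken from the study.

# Illustrative sketch only; not the model used in the study.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MovementConditionedCNN(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        # Two extra input features carry the movement signal (dx, dy).
        self.fc1 = nn.Linear(32 * 7 * 7 + 2, 128)
        self.fc2 = nn.Linear(128, n_classes)

    def forward(self, image: torch.Tensor, movement: torch.Tensor) -> torch.Tensor:
        # image: (B, 1, 28, 28); movement: (B, 2) displacement in pixels.
        x = F.max_pool2d(F.relu(self.conv1(image)), 2)   # -> (B, 16, 14, 14)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)       # -> (B, 32, 7, 7)
        x = torch.cat([x.flatten(1), movement], dim=1)   # late fusion of the "where" signal
        return self.fc2(F.relu(self.fc1(x)))

def shift_batch(images: torch.Tensor, max_shift: int = 4):
    """Translate each image by a random (dx, dy) to mimic a movement-induced
    retinal displacement; return both the shifted images and the shifts."""
    b = images.size(0)
    shifts = torch.randint(-max_shift, max_shift + 1, (b, 2))
    shifted = torch.stack([
        torch.roll(img, shifts=(int(s[1]), int(s[0])), dims=(-2, -1))
        for img, s in zip(images, shifts)
    ])
    return shifted, shifts.float()

if __name__ == "__main__":
    model = MovementConditionedCNN()
    images = torch.randn(8, 1, 28, 28)
    shifted, shifts = shift_batch(images)
    logits = model(shifted, shifts)                       # movement-informed classification
    ablated = model(shifted, torch.zeros_like(shifts))    # movement signal removed
    print(logits.shape, ablated.shape)                    # torch.Size([8, 10]) twice

Training such a model on shifted images and comparing classification accuracy with and without the movement input would mirror, at a toy scale, the kind of comparison the abstract describes between networks that do and do not receive movement-related signals.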
