Vision Sciences Society Annual Meeting Abstract | December 2022
Journal of Vision, Volume 22, Issue 14 | Open Access
Determining how color and form are integrated within macaque V1 neurons through combined neurophysiology and computational modeling
Author Affiliations & Notes
  • Felix Bartsch
    Department of Biology, University of Maryland, College Park, MD, United States of America
    Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, United States of America
    Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Bethesda, Maryland, United States of America
  • Bevil R. Conway
    Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Bethesda, Maryland, United States of America
  • Daniel A. Butts
    Department of Biology, University of Maryland, College Park, MD, United States of America
    Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, United States of America
  • Footnotes
    Acknowledgements: NIH/NEI Intramural Program; NSF Grant 2113197
Journal of Vision December 2022, Vol.22, 3991. doi:https://doi.org/10.1167/jov.22.14.3991
Abstract

Color vision requires comparing the activity of cone types across visual space. V1 is likely important in carrying out these computations, but establishing the underlying neural mechanisms has been complicated by several issues: (1) cone density is highest at the fovea, where even tiny fixational eye movements can be disruptive for mapping spatial receptive fields; (2) V1 processing is nonlinear; and (3) the integration of spatial and chromatic information likely depends on circuit-level interactions that occur across layers within V1. A further challenge is the lack of a theoretical framework for interpreting neurophysiological measurements. We designed two-dimensional spatiochromatic noise stimuli that contain the wide range of spatiochromatic combinations necessary to fit data-driven models to V1 neurons recorded during stimulus presentation. The stimuli are defined in cone-opponent color space, corresponding to the subcortical inputs to V1. Inspired by preliminary recordings from awake macaque V1, we simulated different receptive field (RF) arrangements consistent with observations of spatial and chromatic selectivity in V1 cells. We then applied different classes of data-driven models to these simulated neurons to determine which modeling approaches best distinguish the circuit-based construction of their selectivity, including subunit structure and excitatory-inhibitory interactions. The simulations show that linear and rudimentary nonlinear approaches fail to identify complex RF structures, whereas a linear-nonlinear cascade-style subunit model succeeds. Together, these simulations demonstrate that both nonlinear models and stimuli spanning a wide range of spatiochromatic combinations are necessary to address how color and space are represented by neural populations in V1, and that our approach can recover a rich diversity of ways in which V1 cells integrate spatial and chromatic information. These results provide testable predictions for laminar multielectrode recordings from macaque V1 that will uncover how color and form are integrated in V1.
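
To illustrate the kind of stimulus described above, the sketch below generates frames of noise whose pixels modulate along cone-opponent axes and are then mapped to cone (LMS) contrast. This is a minimal sketch and not the authors' stimulus code: the frame count, pixel grid, ternary noise values, and the uncalibrated opponent-to-LMS matrix are all illustrative assumptions.

# Minimal sketch (Python/NumPy) of spatiochromatic noise defined along
# cone-opponent axes; sizes and the opponent-to-LMS matrix are illustrative
# assumptions, not the calibrated transform used in an actual experiment.
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_pix = 1000, 16                     # hypothetical stimulus dimensions

# Each pixel modulates independently along three cone-opponent axes:
# L-M, S-(L+M), and L+M (luminance); here as ternary noise.
opponent = rng.choice([-1.0, 0.0, 1.0], size=(n_frames, 3, n_pix, n_pix))

# Illustrative opponent-axis definitions in terms of LMS cone contrast,
# so that opponent = M @ lms for each pixel.
M = np.array([[ 1.0, -1.0, 0.0],               # L-M
              [-0.5, -0.5, 1.0],               # S-(L+M)
              [ 1.0,  1.0, 0.0]])              # L+M
# Convert the sampled opponent-axis modulations back to LMS cone contrast.
lms = np.einsum('ca,faxy->fcxy', np.linalg.inv(M), opponent)

In practice, the resulting cone contrasts would be converted to monitor RGB through a calibrated display model before presentation.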

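The linear-nonlinear cascade-style subunit model referred to in the abstract can be written compactly: each subunit applies a spatiochromatic filter to the stimulus and rectifies its output, and a weighted sum of subunit outputs (positive weights for excitatory subunits, negative for suppressive ones) passes through an output nonlinearity to give a firing rate. The sketch below is a schematic under that assumed form, not the authors' fitting code; the filter shapes, subunit count, and softplus output nonlinearity are placeholders.

# Schematic LN-LN ("subunit") model: filter -> rectify -> weighted sum -> rate.
# Assumed illustrative form, not the authors' implementation.
import numpy as np

def ln_ln_rate(stim, subunit_filters, subunit_weights, bias=0.0):
    """stim: (n_frames, n_dims) flattened spatiochromatic stimulus;
    subunit_filters: (n_subunits, n_dims); subunit_weights: (n_subunits,),
    positive for excitatory and negative for suppressive subunits."""
    drive = stim @ subunit_filters.T           # linear stage of each subunit
    subunit_out = np.maximum(drive, 0.0)       # rectifying subunit nonlinearity
    generator = subunit_out @ subunit_weights + bias
    return np.log1p(np.exp(generator))         # softplus output nonlinearity (rate)

# Toy usage with random filters; in practice the filters and weights would be
# fit to recorded spike trains.
rng = np.random.default_rng(1)
stim = rng.standard_normal((500, 3 * 16 * 16))   # frames x (color x space) dims
filters = 0.1 * rng.standard_normal((4, 3 * 16 * 16))
weights = np.array([1.0, 1.0, -0.5, -0.5])       # two excitatory, two suppressive
rates = ln_ln_rate(stim, filters, weights)
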