December 2014
Volume 14, Issue 15
OSA Fall Vision Meeting Abstract  |   December 2014
Natural scene statistics predict allocation of resources to nonlinear visual feature extraction
Author Affiliations
  • Ann Hermundstad
    Department of Physics and Astronomy, University of Pennsylvania
  • John Briguglio
    Department of Physics and Astronomy, University of Pennsylvania
  • Mary Conte
    Brain and Mind Institute, Weill Cornell Medical College
  • Jonathan Victor
    Brain and Mind Institute, Weill Cornell Medical College
  • Vijay Balasubramanian
    Department of Physics and Astronomy, University of Pennsylvania
  • Gasper Tkacik
    Institute of Science and Technology Austria
Journal of Vision December 2014, Vol.14, 26. doi:10.1167/14.15.26

      Ann Hermundstad, John Briguglio, Mary Conte, Jonathan Victor, Vijay Balasubramanian, Gasper Tkacik; Natural scene statistics predict allocation of resources to nonlinear visual feature extraction. Journal of Vision 2014;14(15):26. doi: 10.1167/14.15.26.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Barlow's “efficient coding principle” has provided a powerful conceptual framework for the early stages of sensory processing. Specifically, in the regime in which channel capacity is the limiting factor (e.g., the optic-nerve bottleneck), sensory systems make efficient use of their limited resources by magnifying signal components with low variance and attenuating those with high variance. It is much less widely recognized, however, that the efficient coding principle also makes predictions in another regime of operation: when no transmission bottleneck is present, but meaningful feature identification is limiting. This is the regime relevant to the cortex, and here the efficient coding principle takes on a different character and makes the opposite prediction: greater resources are devoted to signals with high variance. We test this hypothesis in the human visual system by measuring cortically determined perceptual thresholds for a mathematically well-defined set of textures (synthetic images with complex but controlled statistical properties). Our theory unambiguously predicts tens of independent and nontrivial parameters without fitting, based on the statistics of natural scenes, and we find a strikingly detailed quantitative match between the predicted and measured values of these parameters. This study confirms that the efficient coding principle applies beyond the sensory periphery to describe cortical processing, and it provides clues to the mechanisms that enable a detailed match between the statistics of sensory inputs and the allocation of central resources.
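
The two regimes contrasted above can be illustrated by a standard textbook calculation for independent Gaussian signal components — a generic sketch of efficient coding under Gaussian assumptions, not the specific texture model used in this study:

```latex
% Generic efficient-coding sketch (Gaussian assumptions); illustrative only.
% Independent signal components with variances \sigma_i^2 are encoded with
% gains g_i and corrupted by additive noise of variance N. The information
% carried per component is
\[
  I_i \;=\; \tfrac{1}{2}\,\ln\!\left(1 + \frac{g_i^2\,\sigma_i^2}{N}\right).
\]
% Bottleneck regime: maximize \sum_i I_i subject to a fixed total output
% power \sum_i g_i^2 \sigma_i^2 \le P. The stationarity condition equalizes
% output power across components,
\[
  g_i^2\,\sigma_i^2 = \text{const}
  \quad\Longrightarrow\quad
  g_i^2 \;\propto\; \frac{1}{\sigma_i^2},
\]
% so low-variance components are magnified (whitening).
% No-bottleneck regime: maximize \sum_i I_i subject to a constraint on the
% encoding resources themselves, \sum_i g_i^2 \le G. The solution is the
% familiar water-filling allocation
\[
  g_i^2 \;=\; \left(\frac{1}{\lambda} \;-\; \frac{N}{\sigma_i^2}\right)_{\!+},
\]
% where \lambda is the Lagrange multiplier and (\cdot)_+ denotes the positive
% part: more gain now goes to components with larger variance \sigma_i^2.
```

The reversal between the two solutions (gain decreasing vs. increasing with signal variance) is the qualitative switch the abstract refers to.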
