September 2019, Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Neural coding of non-visual properties inferred from images of natural scenes
Author Affiliations & Notes
  • Yaelan Jung
    Department of Psychology, University of Toronto
  • Dirk B Walther
    Department of Psychology, University of Toronto
Journal of Vision September 2019, Vol.19, 189b. doi:https://doi.org/10.1167/19.10.189b
© ARVO (1962-2015); The Authors (2016-present)
Abstract

When people see an image of a natural scene, they can extract non-visual properties from the visual stimulus: Is the depicted environment likely to be hot or cold? Is it noisy or quiet? How are these properties represented in the brain, and how do their representations compare to those elicited by actual sensations of the same non-visual properties? In the present study, we addressed these questions using fMRI. Twenty participants viewed images of natural scenes while their brain activity was recorded. The scene images were selected so that their thermal (hot versus cold) and auditory (noisy versus quiet) features were orthogonal to their basic-level categories. To examine how thermal and auditory information is represented in the brain, we performed a leave-one-run-out classification analysis on the neural activity patterns. We found that non-visual properties inferred from images could be decoded both from the sensory cortices dedicated to the corresponding direct senses (auditory or thermal) and from regions in prefrontal cortex (PFC). Thermal features inferred from images could be decoded from the postcentral gyrus, which is known to process thermal sensations, and from the pars opercularis of the inferior frontal gyrus (IFG). Auditory features could be decoded from the superior temporal gyrus, which processes auditory information, and also from IFG pars opercularis and IFG pars orbitalis. Furthermore, neural representations of inferred non-visual cues in prefrontal regions were similar to those elicited by direct stimulation with sounds and thermal stimuli, respectively. These findings indicate that non-visual properties inferred from images evoke neural activation patterns in PFC similar to those evoked by direct sensations. Together, these results demonstrate that non-visual aspects of visual stimuli are represented in brain areas that are not exclusively dedicated to vision.
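The leave-one-run-out classification analysis mentioned in the abstract can be sketched with scikit-learn's `LeaveOneGroupOut` cross-validation, where each fMRI run serves as one held-out fold. This is a minimal illustrative sketch only: the synthetic voxel data, run structure, class labels, and classifier choice (a linear SVM) below are assumptions for demonstration, not the authors' actual pipeline or data.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic stand-in for trial-wise voxel patterns:
# 8 runs x 10 trials per run, 50 voxels per pattern (all invented numbers).
n_runs, trials_per_run, n_voxels = 8, 10, 50
X = rng.normal(size=(n_runs * trials_per_run, n_voxels))
y = rng.integers(0, 2, size=n_runs * trials_per_run)  # e.g. hot = 1, cold = 0
runs = np.repeat(np.arange(n_runs), trials_per_run)   # run label for each trial

# Inject a weak class-dependent signal into a few voxels so that
# decoding in this toy example can rise above chance.
X[y == 1, :5] += 0.8

# Leave-one-run-out decoding: train on 7 runs, test on the held-out run.
clf = make_pipeline(StandardScaler(), LinearSVC())
logo = LeaveOneGroupOut()
scores = cross_val_score(clf, X, y, cv=logo, groups=runs)
print(f"per-run accuracies: {np.round(scores, 2)}")
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

Holding out entire runs (rather than random trials) is the standard safeguard in fMRI decoding: it prevents temporally correlated trials from the same run from appearing in both the training and test sets, which would inflate accuracy.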

Acknowledgement: NSERC 