Vision Sciences Society Annual Meeting Abstract | September 2016
Low-level auditory and visual features can be decoded across early sensory cortices.
Author Affiliations
  • JOO HUANG TAN
    Neuroscience and Behavioural Disorders Programme, Duke-NUS Graduate Medical School Singapore
  • PO-JANG HSIEH
    Neuroscience and Behavioural Disorders Programme, Duke-NUS Graduate Medical School Singapore
Journal of Vision September 2016, Vol.16, 581. doi:https://doi.org/10.1167/16.12.581
      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Hierarchical models of sensory processing suggest that incoming low-level sensory features are first processed in their respective unisensory cortices, and that information is then processed and integrated as it is passed serially up the hierarchy. However, recent evidence suggests that the cortex is not strictly organized in a hierarchical manner. Here we ask: are neural representations of low-level visual and auditory features confined to their own early sensory cortices? In this fMRI study, we sought to identify brain regions containing pattern information that discriminates between different dimensions of low-level auditory and visual features. The four low-level features selected for this study were visual spatial frequency, grating orientation, auditory amplitude modulation rate, and pitch. During the experiment, subjects were presented with the visual and auditory stimuli alternately in separate experimental blocks. We demonstrate that the patterns of activation elicited by these low-level sensory features are not constrained to their respective sensory cortices. Instead, these features can be decoded from many regions of the cortex, including the sensory cortices of other modalities. More importantly, we demonstrate that the cross-modal signals in "unisensory" regions carry specific information about the different dimensions of another modality's low-level features. These results suggest that early sensory processing involves a larger network of brain regions than previously thought, and that multisensory interactions may already be involved at the early stages of sensory processing.
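
The abstract does not specify the decoding pipeline, so the following is a minimal, hypothetical sketch of the kind of cross-modal ROI decoding analysis it implies: a linear classifier (here scikit-learn's LinearSVC) is trained to discriminate two conditions of one modality's feature from block-wise activation patterns in an early sensory region of the other modality. All data arrays, ROI choices, and condition labels below are illustrative placeholders, not the authors' data or code.

```python
# Hypothetical sketch of cross-modal decoding: classify two auditory
# conditions (e.g., slow vs. fast amplitude-modulation rate) from
# activation patterns in an early *visual* ROI. Placeholder data only.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated block-wise beta estimates: 40 blocks x 500 voxels for one ROI.
n_blocks, n_voxels = 40, 500
X = rng.standard_normal((n_blocks, n_voxels))   # activation pattern per block
y = np.repeat([0, 1], n_blocks // 2)            # 0 = slow AM rate, 1 = fast AM rate

# Cross-validated decoding accuracy within the ROI.
clf = make_pipeline(StandardScaler(), LinearSVC())
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)

print(f"Mean cross-validated decoding accuracy: {scores.mean():.2f}")
```

Above-chance accuracy in the other modality's sensory ROI, assessed against a suitable permutation-based null distribution, would constitute the kind of cross-modal pattern information the abstract reports.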

Meeting abstract presented at VSS 2016
