Vision Sciences Society Annual Meeting Abstract  |   August 2023
Visual cortical regions carry information about auditory attention
Author Affiliations & Notes
  • Abigail Noyce
    Carnegie Mellon University
  • Weizhe Guo
    Carnegie Mellon University
  • Wenkang An
    Boston Children's Hospital
  • Barbara Shinn-Cunningham
    Carnegie Mellon University
  • Footnotes
    Acknowledgements  Supported by the Office of Naval Research, Grant/Award Number: N00014-20-1-2709.
Journal of Vision August 2023, Vol.23, 5809. doi:https://doi.org/10.1167/jov.23.9.5809
      Abigail Noyce, Weizhe Guo, Wenkang An, Barbara Shinn-Cunningham; Visual cortical regions carry information about auditory attention. Journal of Vision 2023;23(9):5809. https://doi.org/10.1167/jov.23.9.5809.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

So-called visual regions of the human brain often participate in tasks that have no visual elements. Visual-biased frontal and some parietal regions are active during spatial auditory tasks (Michalka 2015, 2016; Deng 2019); visual-biased frontal regions also support non-spatial auditory cognition (Noyce 2017, 2021). Here, we used functional magnetic resonance imaging (fMRI) and representational similarity analysis (RSA) to compare spatial and non-spatial attention in the visual cortical network. On each trial, subjects were first cued to deploy spatial attention, to deploy non-spatial attention, or to listen passively, then cued to the exact target feature (a location or pitch). Four temporally overlapping syllables, spoken by different talkers and spatialized to different locations, were presented, and subjects reported the target’s identity (/ba/, /da/, or /ga/). After preprocessing, fMRI data were fitted with a separate general linear model for each trial, including a regressor for that trial and nuisance regressors for each attention condition (Turner 2012), yielding trial-wise activation maps. For each subject, we defined anatomical regions of interest (ROIs), then trained support vector machines (SVMs) for pairwise classification of all conditions. SVM classifier accuracy measures the dissimilarity between conditions within that ROI. The resulting representational dissimilarity matrices summarize the information about task state encoded in each ROI. Active attention could be classified from passive listening across a broad network of brain areas, including visual-biased superior and inferior precentral sulcus (sPCS, iPCS) and superior parietal lobule (SPL), as well as auditory-biased superior frontal gyrus, superior temporal gyrus, and planum temporale. All of these regions also encoded spatial versus non-spatial attention. Bilateral SPL, right sPCS, and bilateral calcarine sulcus encoded the direction of spatial attention (left vs. right).
These results further demonstrate that brain regions within the well-established visual processing network can be more generally recruited, especially for spatial processing.
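The SVM-based dissimilarity measure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' analysis code: synthetic Gaussian patterns stand in for trial-wise activation maps within one ROI, and all names (`n_voxels`, `pairwise_accuracy`, the condition means) are illustrative assumptions. Cross-validated pairwise classification accuracy serves as the entry for each condition pair in the representational dissimilarity matrix (RDM).

```python
# Hedged sketch: cross-validated pairwise SVM classification as a
# dissimilarity measure for an RDM. Synthetic data stand in for
# trial-wise fMRI activation maps; sizes and means are arbitrary.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 100  # trials per condition, voxels in one ROI

# Simulated trial-wise activation patterns for three attention conditions.
conditions = ["spatial", "nonspatial", "passive"]
means = {"spatial": 0.5, "nonspatial": 0.4, "passive": 0.0}
patterns = {c: rng.normal(means[c], 1.0, size=(n_trials, n_voxels))
            for c in conditions}

def pairwise_accuracy(a, b):
    """Mean cross-validated linear-SVM accuracy for one condition pair."""
    X = np.vstack([a, b])
    y = np.r_[np.zeros(len(a)), np.ones(len(b))]
    return cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()

# RDM: one classification accuracy per condition pair (symmetric,
# zero diagonal); higher accuracy = more dissimilar patterns.
n = len(conditions)
rdm = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        acc = pairwise_accuracy(patterns[conditions[i]],
                                patterns[conditions[j]])
        rdm[i, j] = rdm[j, i] = acc

print(np.round(rdm, 2))
```

In this scheme, chance-level accuracy (~0.5) indicates that an ROI carries no information distinguishing two task states, while above-chance accuracy indicates that the ROI encodes the distinction.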
