Vision Sciences Society Annual Meeting Abstract | December 2022
Journal of Vision, Volume 22, Issue 14 | Open Access
Early visual cortex represents human sounds more distinctly than non-human sounds.
Author Affiliations & Notes
  • Giusi Pollicina
    Royal Holloway, University of London
  • Polly Dalton
    Royal Holloway, University of London
  • Petra Vetter
    University of Fribourg
  • Footnotes
    Acknowledgements: Departmental PhD studentship to GP; research grant from the John Templeton Foundation (Prime Award No. 48365) as part of the Summer Seminars in Neuroscience and Philosophy (SSNAP, subcontract No. 283-2608) to PV; PRIMA grant from the Swiss National Science Foundation (PR00P1_185918/1) to PV.
Journal of Vision December 2022, Vol.22, 3705. doi:https://doi.org/10.1167/jov.22.14.3705
Abstract

A large number of feedback connections link early visual cortex to several other cortical areas. Among these, feedback from auditory cortex is sufficient to produce distinguishable neural activity in early visual cortex when participants listen to different natural sounds in the absence of visual stimulation or sight (Vetter, Smith & Muckli, 2014, Current Biology; Vetter, Bola et al., 2020, Current Biology). However, the content of this flow of information has not yet been fully explored. Our study investigated the degree of specificity with which auditory information is fed back to visual cortex. We presented 36 natural sounds to 18 blindfolded participants while acquiring functional MRI data. The sounds were selected to span a hierarchy of semantic categories: animate sounds, divided into human and animal sounds and further into specific species or types of sound, and inanimate sounds, divided analogously. The boundaries of V1, V2 and V3 were drawn using individual retinotopic mapping. We analysed the fMRI activity patterns evoked by these sounds in each early visual region using Multivoxel Pattern Analysis (MVPA). The MVPA classifier distinguished animate from inanimate sounds, as well as between human, animal, vehicle and object sounds, significantly above chance in early visual cortex. Pairwise classification showed that sounds produced by humans were generally better distinguished than sounds from other semantic categories. Searchlight analyses showed that decoding also succeeded in regions of higher-level visual and multisensory processing. These results suggest that auditory feedback relays categorical information about sounds, particularly human sounds, to areas once believed to be exclusively specialised for vision. We conclude that early visual cortex function is not restricted to the processing of low-level visual features, but includes the representation, and potentially the use, of semantic and categorical sound information, which might serve to predict visual stimuli.
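
For readers unfamiliar with the analysis, the sketch below illustrates ROI-based MVPA decoding of the kind described above: a linear classifier is trained to predict a sound's semantic category from voxel-wise activity patterns in a retinotopically defined visual region, with cross-validation across scanner runs. This is a minimal sketch under stated assumptions, not the authors' pipeline: the file names, array shapes, and the specific classifier and cross-validation scheme are all illustrative, as the abstract does not specify them.

    # Minimal sketch of ROI-based MVPA decoding, assuming one activity
    # pattern (beta estimate) per sound presentation and a binary
    # animate-vs-inanimate label. All file names and shapes are hypothetical.
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    X = np.load("sub-01_V1_betas.npy")   # (n_trials, n_voxels): V1 patterns (assumed)
    y = np.load("sub-01_labels.npy")     # 0 = animate, 1 = inanimate (assumed)
    runs = np.load("sub-01_runs.npy")    # scanner run per trial, for grouped CV

    # Linear SVM with leave-one-run-out cross-validation, a common choice
    # for fMRI decoding (not necessarily the authors' exact setup).
    clf = make_pipeline(StandardScaler(), LinearSVC())
    scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
    print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = 0.5)")

The searchlight analyses mentioned above extend the same idea by sliding a small spherical ROI across the whole brain and running the classifier at every location; nilearn's SearchLight class, for example, implements this pattern.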
