Vision Sciences Society Annual Meeting Abstract  |   December 2022
Journal of Vision, Volume 22, Issue 14
Open Access
Impact of task-irrelevant auditory information on a visual rate categorization task
Author Affiliations & Notes
  • Mattia Zanzi
    SISSA (International School for Advanced Studies), Trieste, Italy
  • Silene Fornasaro
    SISSA (International School for Advanced Studies), Trieste, Italy
  • Davide Zoccolan
    SISSA (International School for Advanced Studies), Trieste, Italy
  • Footnotes
    Acknowledgements: We acknowledge the financial support of the European Research Council Consolidator Grant project no. 616803-LEARN2SEE (DZ).
Journal of Vision December 2022, Vol. 22, 3560.
      Mattia Zanzi, Silene Fornasaro, Davide Zoccolan; Impact of task-irrelevant auditory information on a visual rate categorization task. Journal of Vision 2022;22(14):3560.


In naturalistic contexts, when unisensory information is weak or unreliable, successful discrimination often depends on the ability to integrate information from different sensory domains. Traditionally, multimodal association cortices were thought to be responsible for integrating multisensory information to guide decision-making. However, recent anatomical and functional studies have revealed cross-modal interactions already at the level of the primary sensory cortices. For instance, V1 neurons of naive rodents were shown to encode the temporal congruency of synchronous audiovisual stimuli. Yet, the perceptual role of multisensory interactions in primary sensory cortex remains poorly explored at the behavioral level. To address this question, we designed a novel audiovisual perceptual task. We presented six rats with temporally modulated audiovisual stimuli. Rats were first trained to make a decision based on the temporal frequency (TF) of outward-moving circular gratings, paired with a fixed-amplitude white-noise sound. To explore the impact of auditory information on visual perception, we then introduced amplitude modulation to the sounds, delivering both temporally congruent (i.e., matched TFs) and incongruent audiovisual stimuli. Although rats were not explicitly trained to judge the rate of the acoustic feature, the psychometric curves revealed a higher sensitivity to the TF of the visual stimuli when these were paired with rate-changing sounds rather than with fixed-amplitude sounds. Surprisingly, this sharpening was independent of the temporal coherence of the visual and auditory features. By contrast, rats reacted faster when visual stimuli were paired with fixed-amplitude rather than temporally modulated sounds. Our data suggest that information about task-irrelevant but behaviorally salient sounds is reflexively routed to early visual areas, where it spontaneously facilitates the detection of visual events.
Our experimental approach thus paves the way for investigating the functional impact of auditory inputs on visual cortical representations, mediated by well-established cortico-cortical connections.
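For readers unfamiliar with how a sensitivity difference of this kind is typically quantified, the sketch below fits a logistic psychometric function to choice data and compares the fitted slopes between two sound conditions: a steeper curve (smaller slope parameter) indicates higher sensitivity to the visual temporal frequency. All TF values, trial counts, and slope parameters here are synthetic placeholders for illustration only; they are not the study's data, and the fitting procedure is a generic one, not necessarily the one used by the authors.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(tf, mu, sigma):
    """Logistic psychometric function: probability of a 'high-rate' choice
    as a function of grating temporal frequency (TF).
    mu = point of subjective equality; sigma = slope parameter
    (smaller sigma -> steeper curve -> higher sensitivity)."""
    return 1.0 / (1.0 + np.exp(-(tf - mu) / sigma))

rng = np.random.default_rng(0)          # fixed seed for reproducibility
tfs = np.linspace(2.0, 10.0, 9)         # hypothetical TF levels (Hz)
n_trials = 200                          # hypothetical trials per TF level

def simulate_choices(sigma_true, mu_true=6.0):
    """Simulate binomial choice proportions from a known psychometric curve."""
    p = psychometric(tfs, mu_true, sigma_true)
    return rng.binomial(n_trials, p) / n_trials

def fit_curve(proportions):
    """Least-squares fit of (mu, sigma) to observed choice proportions."""
    popt, _ = curve_fit(psychometric, tfs, proportions, p0=[6.0, 1.0])
    return popt

# Synthetic conditions: rate-modulated sound assumed to yield a steeper curve.
prop_fixed = simulate_choices(sigma_true=1.5)   # fixed-amplitude sound
prop_mod   = simulate_choices(sigma_true=0.8)   # amplitude-modulated sound

mu_f, sig_f = fit_curve(prop_fixed)
mu_m, sig_m = fit_curve(prop_mod)
print(f"slope (fixed sound): {sig_f:.2f}  slope (modulated sound): {sig_m:.2f}")
```

A smaller recovered `sig_m` than `sig_f` corresponds to the sharpening reported in the abstract: higher sensitivity to visual TF when gratings are paired with rate-changing sounds.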

