September 2021, Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Seeing sounds: Brain mechanisms underlying auditory contributions to visual detection
Author Affiliations & Notes
  • Alexis Perez-Bellido
    Department of Cognition, Development and Educational Psychology, University of Barcelona, Spain
    Institute of Neurosciences, University of Barcelona, Barcelona, Spain
    Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
  • Eelke Spaak
    Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
  • Floris P. de Lange
    Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
  • Footnotes
    Acknowledgements: A.P.B. is supported by grant RTI2018-100977-J-I00 from MINECO (Spain), and F.P.d.L. is supported by a grant from the Horizon 2020 Framework Programme (ERC Starting Grant 678286).
Journal of Vision September 2021, Vol. 21, 2563. https://doi.org/10.1167/jov.21.9.2563
Abstract

How auditory information interacts with visual detection is a recurrent question in visual neuroscience. Whereas some studies propose that sounds interact automatically with incoming visual input, others claim that audiovisual interactions depend on top-down controlled processes such as attention. In this study, we recorded magnetoencephalography (MEG) data while participants performed a visual detection task (in which the audiovisual events were task-relevant) or a working memory task (in which the audiovisual events were task-irrelevant). We trained multivariate pattern analysis classifiers and tested them at different time points to characterize how auditory information shaped visual stimulus representations over time in each task. Our results showed that sounds interact with visual detection via two different mechanisms. First, observers actively used the auditory stimulus to orient their attention to the target onset, maintaining a stable representation of the visual stimulus throughout the trial. This mechanism allowed participants to improve their visual sensitivity, and it was not automatic, as it required participants to attend to the audiovisual signals. Second, sounds elicited a neural response pattern akin to the one evoked by an actual visual stimulus. This latter mechanism was associated with an increase in false alarms, and it was automatic, as it was independent of participants' attention to the audiovisual signals. This work sheds light on a classic debate regarding the automaticity of auditory-dependent modulations of visual detection by showing that 1) sounds improve visual detection sensitivity via a top-down controlled mechanism; and 2) changes in criterion (i.e., the signal detection theory bias parameter) due to sound presentation in visual detection experiments do not merely reflect decisional biases. Instead, our results suggest that sounds automatically evoke neural activity patterns that the brain could interpret as a veridical visual stimulus.
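The train-at-one-time-point, test-at-another procedure described in the abstract is commonly implemented as a temporal generalization analysis. The sketch below is illustrative only and is not the authors' analysis code: it substitutes synthetic placeholder data for the real MEG recordings and uses MNE-Python's GeneralizingEstimator to show the general shape of such an analysis.

    # Illustrative temporal-generalization decoding sketch (not the authors' code).
    # Synthetic placeholder data stand in for real MEG epochs: trials x sensors x time.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from mne.decoding import GeneralizingEstimator, cross_val_multiscore

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 102, 120))   # 200 trials, 102 sensors, 120 time points
    y = rng.integers(0, 2, size=200)       # placeholder labels: stimulus present/absent

    # A linear classifier is trained at every time point and tested at every other,
    # yielding a time-by-time generalization matrix of decoding performance.
    clf = make_pipeline(StandardScaler(), LogisticRegression(solver="liblinear"))
    time_gen = GeneralizingEstimator(clf, scoring="roc_auc", n_jobs=1)

    # scores[i, j]: performance when training at time i and testing at time j
    scores = cross_val_multiscore(time_gen, X, y, cv=5).mean(axis=0)

A stable ("square") region in the resulting matrix would indicate a sustained stimulus representation of the kind associated above with the attention-dependent mechanism.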
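The abstract's distinction between sensitivity and criterion comes from signal detection theory. For reference, a minimal computation of both quantities from hit and false alarm counts; the numbers below are placeholders, not data from the study.

    # Signal detection theory measures from hypothetical detection counts.
    from scipy.stats import norm

    hits, misses = 70, 30                       # placeholder target-present trials
    false_alarms, correct_rejections = 20, 80   # placeholder target-absent trials

    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)

    # d' (sensitivity): separation of the signal and noise distributions
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    # c (criterion): response bias; negative values indicate a liberal criterion
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))

    print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")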
