Vision Sciences Society Annual Meeting Abstract
September 2024, Volume 24, Issue 10
Open Access
Top-down alpha dynamics mediate the neural representation of coherent visual experiences
Author Affiliations & Notes
  • Daniel Kaiser
    Mathematical Institute, Justus Liebig University Giessen
    Center for Mind, Brain and Behavior, Philipps University Marburg and Justus Liebig University Giessen
  • Lixiang Chen
    Mathematical Institute, Justus Liebig University Giessen
    Department of Education and Psychology, Freie Universität Berlin
  • Radoslaw M Cichy
    Department of Education and Psychology, Freie Universität Berlin
  • Footnotes
    Acknowledgements  This work is supported by the DFG (CI241/1-1, CI241/3-1, CI241/7-1, KA4683/5-1, SFB/TRR 135), the ERC (ERC-2018-STG 803370, ERC-2022-STG 101076057), the China Scholarship Council, and “The Adaptive Mind”, funded by the Hessian Ministry of Higher Education, Science, Research and Art.
Journal of Vision September 2024, Vol. 24, 492. https://doi.org/10.1167/jov.24.10.492
      Daniel Kaiser, Lixiang Chen, Radoslaw M Cichy; Top-down alpha dynamics mediate the neural representation of coherent visual experiences. Journal of Vision 2024;24(10):492. https://doi.org/10.1167/jov.24.10.492.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

To create coherent visual experiences, our visual system needs to aggregate inputs across space and time in a seamless manner. Here, we combine spectrally resolved EEG recordings and spatially resolved fMRI recordings to characterize the neural dynamics that mediate the integration of multiple spatiotemporally coherent inputs into a unified percept. To unveil integration-related brain dynamics, we experimentally manipulated the spatiotemporal coherence of two naturalistic videos presented in the left and right visual hemifields. In a first study, we show that EEG alpha dynamics carry stimulus-specific information only when spatiotemporally consistent information across both hemifields affords integration. Combining the EEG data with regional mappings obtained from fMRI, we further show that these alpha dynamics can be localized to early visual cortex, indicating that integration-related alpha dynamics traverse the hierarchy in the top-down direction, all the way to the earliest stages of cortical vision. In a second study, we delineate boundary conditions for triggering integration-related alpha dynamics. Such alpha dynamics are observed when videos are coherent in their basic-level category and share critical features, but not when they are coherent only in their superordinate category, thus characterizing the range of flexibility in cortical integration processes. Together, our results indicate that the construction of coherent visual experiences is not implemented within the visual bottom-up processing cascade. Our findings rather stress that integration relies on cortical feedback rhythms that fully traverse the visual hierarchy.
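
The abstract's central claim rests on decoding stimulus-specific information from alpha-band EEG activity. The following Python sketch illustrates the general logic of such an analysis; it is not the authors' pipeline. The simulated data, channel and trial counts, the 8-12 Hz band limits, the Hilbert-envelope power estimate, and the logistic-regression classifier are all assumptions made for illustration.

    # Illustrative sketch (not the authors' code): decode which of two
    # stimuli was shown from alpha-band (8-12 Hz) EEG power. All data
    # below are simulated; parameters are assumptions for illustration.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    fs = 250                                        # sampling rate (Hz), assumed
    n_trials, n_channels, n_times = 200, 64, 500    # 2-s epochs, assumed
    y = rng.integers(0, 2, n_trials)                # two video stimuli

    # Simulated EEG: noise plus a weak stimulus-specific alpha component
    t = np.arange(n_times) / fs
    eeg = rng.standard_normal((n_trials, n_channels, n_times))
    alpha = np.sin(2 * np.pi * 10 * t)              # 10 Hz carrier
    weights = rng.standard_normal(n_channels)       # stimulus-specific topography
    eeg += 0.3 * y[:, None, None] * weights[None, :, None] * alpha

    # Band-pass to the alpha range, then take the amplitude envelope
    b, a = butter(4, [8, 12], btype="band", fs=fs)
    alpha_band = filtfilt(b, a, eeg, axis=-1)
    envelope = np.abs(hilbert(alpha_band, axis=-1))

    # Trial-wise features: mean alpha power per channel; decode stimulus
    X = envelope.mean(axis=-1)
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"Cross-validated decoding accuracy: {scores.mean():.2f}")

Above-chance cross-validated accuracy in such an analysis is what licenses the statement that alpha dynamics "carry stimulus-specific information"; in the study, this decoding is additionally compared across coherent and incoherent video conditions and localized with fMRI-derived regional mappings.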
