Vision Sciences Society Annual Meeting Abstract | December 2022, Volume 22, Issue 14
Open Access
Oscillatory brain signatures of dynamic visual integration in natural context
Author Affiliations & Notes
  • Lixiang Chen
    Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
  • Radoslaw Martin Cichy
    Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
  • Daniel Kaiser
    Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus-Liebig-Universität Gießen, Gießen, Germany
    Center for Mind, Brain and Behavior (CMBB), Philipps-Universität Marburg and Justus-Liebig-Universität Gießen, Marburg, Germany
  • Footnotes
    Acknowledgements  R.M.C. and D.K. are supported by Deutsche Forschungsgemeinschaft (DFG) grants CI241/1-1, CI241/3-1, and KA4683/2-1. R.M.C. is supported by European Research Council (ERC) grant 803370. L.C. is supported by the China Scholarship Council (CSC).
Journal of Vision December 2022, Vol. 22, 3282. https://doi.org/10.1167/jov.22.14.3282

      Lixiang Chen, Radoslaw Martin Cichy, Daniel Kaiser; Oscillatory brain signatures of dynamic visual integration in natural context. Journal of Vision 2022;22(14):3282. https://doi.org/10.1167/jov.22.14.3282.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

During our daily visual experience, our eyes constantly receive complex information from the environment. This information is characterized by spatiotemporal regularities, with predictable distributions of visual features across the visual field and across time. To create our unitary experience of reality, the brain needs to integrate these inputs efficiently. Here, we tested whether such integration processes are mediated by oscillatory neural codes. Specifically, we hypothesized that the integration of spatiotemporally regular information is governed by low-frequency oscillations in the alpha band, which have previously been linked to top-down modulations of sensory processing. In an EEG experiment, participants viewed short video clips (3 s) depicting everyday situations, shown through circular apertures in the left and right visual fields. Videos were presented (1) through the right aperture only, (2) through the left aperture only, (3) through both apertures in a spatiotemporally congruent way, with the two apertures showing parts of the same video, or (4) through both apertures in a spatiotemporally incongruent way, with the apertures showing parts of different videos. To quantify oscillatory activity, we first computed trial-wise EEG power spectra during video presentation. We then used multivariate classification analysis to decode the different videos in each condition from multi-electrode patterns of oscillatory power in three discrete frequency bands: alpha (8-12 Hz), beta (13-30 Hz), and gamma (31-70 Hz). When videos were presented in only one hemifield, or when two incongruent videos were presented, we could decode the videos from activity in the gamma range, indexing differences in feedforward visual processing. By contrast, spatiotemporally congruent videos were primarily decodable from alpha activity, confirming our hypothesis that alpha oscillations mediate the dynamic integration of natural information into seamless visual experiences. Together, our results highlight differential oscillatory signatures for independent versus integrative processing of natural inputs.
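The abstract does not specify the spectral estimator or classifier used, so the following is only a minimal sketch of the band-wise decoding pipeline it describes, assuming Welch power spectra (via MNE-Python) and a cross-validated linear SVM (via scikit-learn). The variable names epochs_data, sfreq, and labels are hypothetical placeholders, not identifiers from the original study.

```python
# Hedged sketch of band-wise EEG decoding: per-trial power spectra are
# averaged within each frequency band, and video identity is decoded from
# the resulting multi-electrode power patterns.
# Assumed inputs (placeholders, not from the original study):
#   epochs_data : ndarray (n_trials, n_channels, n_times) of epoched EEG
#   sfreq       : sampling frequency in Hz
#   labels      : ndarray (n_trials,) of video-identity labels
import numpy as np
from mne.time_frequency import psd_array_welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

bands = {"alpha": (8, 12), "beta": (13, 30), "gamma": (31, 70)}

def band_power_features(epochs_data, sfreq, fmin, fmax):
    """Trial-wise band power per electrode -> shape (n_trials, n_channels)."""
    psds, freqs = psd_array_welch(epochs_data, sfreq=sfreq,
                                  fmin=fmin, fmax=fmax, verbose=False)
    return psds.mean(axis=-1)  # average power across frequencies in the band

for name, (fmin, fmax) in bands.items():
    X = band_power_features(epochs_data, sfreq, fmin, fmax)
    clf = make_pipeline(StandardScaler(), LinearSVC())
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: mean decoding accuracy = {acc:.3f}")
```

In this sketch, the per-band channel-power vectors stand in for the multi-electrode patterns of oscillatory power described above; the original analysis may well differ in spectral estimator, classifier, and cross-validation scheme, and would be run separately within each presentation condition.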
