Abstract
To create coherent visual experiences, our visual system must seamlessly aggregate inputs across space and time. Here, we combine spectrally resolved EEG recordings and spatially resolved fMRI recordings to characterize the neural dynamics that mediate the integration of multiple spatiotemporally coherent inputs into a unified percept. To unveil integration-related brain dynamics, we experimentally manipulated the spatiotemporal coherence of two naturalistic videos presented in the left and right visual hemifields. In a first study, we show that EEG alpha dynamics carry stimulus-specific information only when spatiotemporally consistent information across both hemifields affords integration. Combining the EEG data with regional mappings obtained from fMRI, we further show that these alpha dynamics can be localized to early visual cortex, indicating that integration-related alpha dynamics traverse the hierarchy in the top-down direction, all the way to the earliest stages of cortical vision. In a second study, we delineate boundary conditions for triggering integration-related alpha dynamics. Such alpha dynamics are observed when the videos are coherent in their basic-level category and share critical features, but not when they are coherent only in their superordinate category, thus characterizing the range of flexibility in cortical integration processes. Together, our results indicate that the construction of coherent visual experiences is not implemented within the visual bottom-up processing cascade. Rather, our findings stress that integration relies on cortical feedback rhythms that fully traverse the visual hierarchy.