September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Neural interpolation of dynamic visual information in natural scenes
Author Affiliations
  • Lu-Chun Yeh
    Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus Liebig University Gießen
  • Max Bardelang
    Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus Liebig University Gießen
  • Daniel Kaiser
    Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus Liebig University Gießen
    Center for Mind, Brain and Behavior (CMBB), Philipps University Marburg and Justus Liebig University Gießen
Journal of Vision, September 2024, Vol. 24, 542. https://doi.org/10.1167/jov.24.10.542
Citation: Lu-Chun Yeh, Max Bardelang, Daniel Kaiser; Neural interpolation of dynamic visual information in natural scenes. Journal of Vision 2024;24(10):542. https://doi.org/10.1167/jov.24.10.542.

Abstract

Adaptive natural vision requires our brain to interpolate missing information about occluded objects in the environment. Previous studies suggest that this process is supported by visual cortex “filling in” occluded parts of scene images. However, we live in a dynamic world, and objects keep moving in and out of occlusion (e.g., when trains move through tunnels). Here, we used multivariate pattern analysis on time-frequency-resolved EEG data to track neural representations during dynamic occlusion. Participants watched 4-second videos of a person walking across a scene (either left-to-right or right-to-left) while performing an unrelated fixation task. The videos featured three conditions: the person walking across a blank background (isolated condition), across the scene without occlusion (visible condition), or across the scene while being dynamically occluded between 1.5 and 3 seconds (occluded condition). We trained linear classifiers on EEG response patterns to discriminate rightward from leftward walking in the isolated condition and tested them on the visible and occluded conditions. Classifiers trained on time-locked broadband responses, as well as on alpha (8-12 Hz) and beta (13-30 Hz) rhythms, successfully discriminated walking direction in the visible condition. However, only classifiers trained on alpha rhythms could discriminate walking direction in the occluded condition. Critically, we introduced an additional condition in which the person stopped in front of a natural obstacle (e.g., a river). We found that alpha dynamics tracked the termination of motion in this condition, even when it was hidden by the occluder. Together, our results provide evidence for an automatic interpolation of information during dynamic occlusion. The alpha dynamics that mediate this interpolation may constitute a neural correlate of top-down processes that “fill in” missing information based on context.
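To make the decoding logic concrete, the sketch below shows how cross-condition decoding of walking direction from band-limited EEG might look in MNE-Python with scikit-learn: train on the isolated condition, then test generalization to the visible and occluded videos. This is a minimal illustration under stated assumptions, not the authors' code; the file name, the "condition/direction" event labels, and all filter and classifier parameters are hypothetical.

```python
# Minimal sketch of cross-condition decoding on alpha-band EEG.
# File name, event labels, and parameters are illustrative assumptions.
import numpy as np
import mne
from mne.decoding import SlidingEstimator
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Hypothetical epoched EEG: 4 s trials tagged like "isolated/left",
# "visible/right", or "occluded/left".
epochs = mne.read_epochs("sub-01_task-occlusion-epo.fif")

# Band-limit to alpha (8-12 Hz) and take the Hilbert envelope, so the
# classifiers operate on alpha power rather than broadband voltage.
alpha = epochs.copy().filter(8.0, 12.0).apply_hilbert(envelope=True)

def get_xy(ep, condition):
    """Stack left/right trials of one condition; label 0 = left, 1 = right."""
    left = ep[f"{condition}/left"].get_data()
    right = ep[f"{condition}/right"].get_data()
    X = np.concatenate([left, right])  # shape: (trials, channels, times)
    y = np.r_[np.zeros(len(left)), np.ones(len(right))]
    return X, y

# Fit one linear classifier per time point ("sliding" decoding).
decoder = SlidingEstimator(
    make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    scoring="accuracy",
)

# Train on the isolated condition only...
X_train, y_train = get_xy(alpha, "isolated")
decoder.fit(X_train, y_train)

# ...then test generalization to the visible and occluded conditions,
# summarizing accuracy inside the 1.5-3 s occlusion window.
window = (alpha.times >= 1.5) & (alpha.times <= 3.0)
for cond in ("visible", "occluded"):
    X_test, y_test = get_xy(alpha, cond)
    scores = decoder.score(X_test, y_test)  # accuracy per time point
    print(f"{cond}: mean accuracy in occlusion window = {scores[window].mean():.2f}")
```

Because a separate classifier is fit at each time point, above-chance direction decoding inside the 1.5-3 s window of the occluded condition is the signature of interpolated motion information, which, as reported above, broadband and beta-band classifiers fail to show.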
