Journal of Vision, September 2024, Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Asymmetry for shading direction in visual search persists in inattentional blindness
Author Affiliations
  • ShuiEr Han
    I2R and CFAR, Agency for Science, Technology and Research (A*STAR)
    Nanyang Technological University (NTU), Singapore
  • Yang Shen
    I2R and CFAR, Agency for Science, Technology and Research (A*STAR)
    Nanyang Technological University (NTU), Singapore
  • Mengmi Zhang
    I2R and CFAR, Agency for Science, Technology and Research (A*STAR)
    Nanyang Technological University (NTU), Singapore
Journal of Vision September 2024, Vol.24, 317. doi:https://doi.org/10.1167/jov.24.10.317
Abstract

When human observers search among shaded visual stimuli, they find targets with vertical shading much faster than targets with horizontal shading. Here, we demonstrate that this asymmetry persists in an inattentional blindness (IAB) paradigm. Subjects viewed naturalistic simulations of moving balls that were vertically or horizontally shaded. On a portion of the trials, an unexpected target with a reversed shading gradient was introduced into the simulation at a random time. During each trial, subjects tracked a ball, counted the number of midline crossings it made, and were instructed to report any unexpected target they noticed. Almost twice as many vertically shaded targets were detected as horizontally shaded ones, and this difference could not be attributed to differences in target visibility, false-detection rate, or average ball-counting accuracy. To gain insight into the underlying mechanisms, we propose a biologically inspired computational IAB model based on predictive coding. The model is trained without supervision to anticipate upcoming video frames by minimizing the prediction errors propagated from its structural analysis of naturalistic video sequences, and is then tested on the same videos used in the human psychophysics experiments. Remarkably, the model exhibits a more pronounced variance in prediction errors when the unexpected target is horizontally shaded. Together, our findings point to the emergence of the IAB asymmetry through top-down expectation biases derived from the visual stimuli presented to both humans and the model.
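The variance-of-prediction-errors measure used to compare the model's responses can be illustrated with a deliberately minimal sketch. The predictor below (copy the previous frame) and the toy "videos" are hypothetical stand-ins, not the authors' predictive-coding model; the sketch only shows how an unexpected event inflates the variance of frame-wise prediction errors.

```python
import numpy as np

def prediction_error_variance(frames):
    """Variance of frame-wise prediction errors for a toy predictor
    that forecasts each frame as a copy of the previous one.
    (Illustrative only; the actual model is a trained predictive-coding network.)"""
    errors = [np.mean((frames[t + 1] - frames[t]) ** 2)
              for t in range(len(frames) - 1)]
    return float(np.var(errors))

# Toy sequences: a smoothly brightening field vs. the same field with an
# abrupt change partway through, standing in for the unexpected target.
smooth = [np.full((8, 8), t * 0.1) for t in range(10)]
abrupt = [f.copy() for f in smooth]
abrupt[5] += 1.0  # the unexpected event inflates one prediction error

# The surprise sequence yields a markedly larger error variance.
print(prediction_error_variance(smooth) < prediction_error_variance(abrupt))
```

Under this toy predictor, the smooth sequence produces near-identical errors on every frame (variance near zero), while the abrupt event produces two outlier errors that dominate the variance.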
