September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Object representations in visual cortex are scaled to account for viewing distance during visual search
Author Affiliations & Notes
  • Surya Gayet
    Donders Institute, Radboud University
  • Maëlle Lerebourg
    Donders Institute, Radboud University
  • Marius Peelen
    Donders Institute, Radboud University
  • Footnotes
    Acknowledgements: This research was funded by a VENI grant (191G.085) from the Netherlands Organization for Scientific Research (NWO) to Surya Gayet, and a Consolidator grant (grant agreement No 725970) from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme.
Journal of Vision September 2021, Vol.21, 1886. doi:
      Surya Gayet, Maëlle Lerebourg, Marius Peelen; Object representations in visual cortex are scaled to account for viewing distance during visual search. Journal of Vision 2021;21(9):1886.

      © ARVO (1962-2015); The Authors (2016-present)

Humans are remarkably proficient at finding objects within a complex visual world. It has been proposed that observers strategically increase their visual system's responsivity to objects of interest by pre-activating a visual representation of the target object during search preparation. Although widely accepted, this mechanism fails to account for an inherent property of real-world vision: the image that any given object will project onto the retinae is unknown in advance, because it depends on the object's eventual location. For instance, the color and shape of the retinal image are determined by the illumination and the viewpoint on the object, and, most dramatically, its size can vary by orders of magnitude depending on the distance to the object. How can preparatory activity in visual cortex benefit search in the real world, where the retinal image of an object is context-dependent? We addressed this question by testing whether human observers generate visual object representations during search preparation and scale those representations to account for viewing distance. In two fMRI experiments, participants (N=58) were cued to search for real-world objects at different distances within naturalistic scenes. We measured BOLD responses following the onset of the scene, from which the viewing distance could be inferred, and analyzed a subset of trials in which, unexpectedly, no object array appeared. This allowed us to isolate brain activity related to search preparation alone. Using multivariate pattern analysis, we related the patterns of brain activity evoked during search preparation to those evoked by viewing isolated objects of different sizes. The data show that (1) observers generate visual representations of their target object during search preparation in object-selective regions, and (2) scale these representations to flexibly account for search distance.
These findings reconcile current theories on visual selection with the functional demands of real-world vision.
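The multivariate pattern analysis described above can be illustrated with a minimal simulation. The sketch below is not the authors' analysis pipeline; it uses simulated voxel patterns and illustrative variable names to show the core logic: correlate a preparation-phase pattern with perception-phase templates for the same object shown at different retinal sizes, where the distance-scaling hypothesis predicts higher similarity to the size-matched template.

```python
# Illustrative sketch of correlation-based pattern similarity (not the
# study's actual pipeline). All voxel patterns here are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 200

# Simulated "perception" templates: responses to an isolated object
# presented small (as if far away) vs. large (as if nearby).
template_small = rng.normal(size=n_voxels)
template_large = rng.normal(size=n_voxels)

# Simulated "preparation" pattern on a far-distance search trial: under
# the scaling hypothesis it resembles the small-object template, plus noise.
prep_far = template_small + 0.5 * rng.normal(size=n_voxels)

def pattern_similarity(a, b):
    """Pearson correlation between two voxel patterns."""
    return np.corrcoef(a, b)[0, 1]

r_small = pattern_similarity(prep_far, template_small)
r_large = pattern_similarity(prep_far, template_large)

# Distance scaling predicts the preparation pattern correlates more
# strongly with the size-matched (small) template.
print(r_small > r_large)
```

In the real analysis, such similarity scores would be computed per participant and compared statistically across size-matched and size-mismatched conditions; the simulation only shows the direction of the predicted effect.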

