Vision Sciences Society Annual Meeting Abstract | August 2012
Searching for objects in a virtual apartment: the effect of experience on scene memory
Author Affiliations
  • Leor Katz
    Institute for Neuroscience, Center for Perceptual Systems, University of Texas
  • Dmitry Kit
    Computer Science Department, Center for Perceptual Systems, University of Texas
  • Brian Sullivan
    Psychology Department, Center for Perceptual Systems, University of Texas
  • Kat Snyder
    Psychology Department, Center for Perceptual Systems, University of Texas
  • Mary Hayhoe
    Psychology Department, Center for Perceptual Systems, University of Texas
Journal of Vision, August 2012, Vol. 12, 264. https://doi.org/10.1167/12.9.264

      Leor Katz, Dmitry Kit, Brian Sullivan, Kat Snyder, Mary Hayhoe; Searching for objects in a virtual apartment: the effect of experience on scene memory. Journal of Vision 2012;12(9):264. https://doi.org/10.1167/12.9.264.

Abstract

How do we form memories for real-world environments over time, and how do these memories influence gaze behavior? There is considerable evidence that humans develop fairly extensive, often implicit, memory representations of natural scenes. Most investigations of memory for natural environments, however, have used static 2D images, often involving the presentation of multiple unrelated scenes. In contrast, natural experience entails immersion in a limited number of 3D environments for extended periods of time, which may facilitate the build-up of more extensive memory representations. To investigate scene memory development in natural settings, we recorded sequences of saccades and body movements while observers searched for and touched a series of different objects in a three-room virtual apartment, over 30-minute periods on two consecutive days. Subjects rapidly learnt the global layout of the apartment and largely restricted gaze to regions where surfaces (e.g., counters) were located. For objects that appeared as search targets on repeated occasions, both search time and number of fixations diminished gradually over repeated search episodes (by factors of 3 and 2, respectively). Thus, the binding of particular objects to particular locations is learnt fairly slowly, despite the presence of a constant context. Surprisingly, learning appeared to require active search: when an object first became a search target, there was no measurable reduction in the time or number of fixations required to locate it, even if it had been spontaneously fixated multiple times (~40) while the subject was searching for other objects. This lack of passive learning may be a consequence of the highly task-specific processing that occurs during search, which might suppress the encoding of task-irrelevant distracters. Thus, visual search in natural environments appears to be largely guided by memory representations that depend on task-directed attentional constraints.

Meeting abstract presented at VSS 2012.
