Vision Sciences Society Annual Meeting Abstract  |  August 2023
Volume 23, Issue 9
Open Access
Memory-based predictions facilitate perceptual judgements across head-turns in naturalistic scene perception
Author Affiliations
  • Anna Mynick
    Dartmouth College
  • Thomas L Botch
    Dartmouth College
  • Allie Burrows
    Dartmouth College
  • Brenda D Garcia
    Dartmouth College
  • Adithi Jayaraman
    Dartmouth College
  • Adam Steel
    Dartmouth College
  • Caroline E Robertson
    Dartmouth College
Journal of Vision August 2023, Vol. 23, 5883. https://doi.org/10.1167/jov.23.9.5883
Abstract

Memory-based predictions influence perception, but how they are instantiated in naturalistic environments is unclear. Across five experiments, we used head-mounted virtual reality (VR) to test whether memory for a 360° environment facilitates rapid perceptual judgements of views from within that environment across head-turns. In Experiment 1, participants (N=26) studied 18 real-world panoramas in VR. On each trial of a subsequent Priming Test, participants head-turned left or right toward a snapshot from a studied panorama (target; 110° width) and made a perceptual (open/closed) judgement. Before target onset, another view from the same panorama was briefly presented to prime memory of the target image (prime; 110° width). Before prime onset, an arrow (left/right; valid/invalid) indicated the direction in which to plan a head-turn to see the target. We found an interaction between prime condition (Same-scene/Neutral) and arrow validity (p=0.04). On trials with valid arrows (which correctly predicted where the target would appear), Same-scene primes speeded reaction times (RTs) relative to Neutral primes (p<0.001), but this effect was absent on trials with invalid arrows (p=0.52), showing that memory-based priming occurs in 360° environments and is biased toward the direction of a planned head-turn. Four additional experiments support these results. We replicate the RT advantage of Same-scene over Neutral primes (Experiment 2: N=18, p<0.001) and show that unfamiliar panoramas produce no RT difference between prime conditions, indicating that the priming effect hinges on memory (Experiment 3: N=20, p=0.61). Additionally, by manipulating the spatial structure of studied panoramas, we show that priming respects both the broad spatial structure of studied panoramas (Experiment 4: N=18, p<0.001) and the local spatial structure of upcoming scene views (Experiment 5: N=22, p<0.01). Together, these results indicate that memory of a visuospatial environment influences ongoing perception, priming perceptual judgements of scene views across head-turns in service of efficient perception.
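To make the 2 (prime: Same-scene vs. Neutral) x 2 (arrow: valid vs. invalid) design concrete, the sketch below shows one way the reaction-time interaction could be tested from trial-level data. It is an illustrative example only, not the authors' analysis code: the file name and column names are hypothetical, and the interaction is computed here as a within-participant contrast, whereas the original study may have used a different statistical model.

```python
# Minimal sketch (assumptions noted above): RT analysis of a 2 x 2 priming design.
import pandas as pd
from scipy import stats

# Hypothetical trial-level columns: participant, prime ("same"/"neutral"),
# arrow ("valid"/"invalid"), rt (seconds; correct trials only).
trials = pd.read_csv("priming_test_trials.csv")  # hypothetical file

# Mean RT per participant for each cell of the 2 x 2 design.
cell_means = (trials.groupby(["participant", "prime", "arrow"])["rt"]
                    .mean()
                    .unstack(["prime", "arrow"]))

# Priming effect (Neutral minus Same-scene RT) on valid and invalid arrow trials.
priming_valid = cell_means[("neutral", "valid")] - cell_means[("same", "valid")]
priming_invalid = cell_means[("neutral", "invalid")] - cell_means[("same", "invalid")]

# Interaction contrast: does the priming effect depend on arrow validity?
interaction = priming_valid - priming_invalid

print("Interaction:", stats.ttest_1samp(interaction, 0.0))
print("Priming effect, valid arrows:", stats.ttest_1samp(priming_valid, 0.0))
print("Priming effect, invalid arrows:", stats.ttest_1samp(priming_invalid, 0.0))
```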
