Vision Sciences Society Annual Meeting Abstract | July 2013
Visual statistical learning guides perceptual selection
Author Affiliations
  • Rachel Denison
    Helen Wills Neuroscience Institute, University of California, Berkeley
  • Maxwell Schram
    Helen Wills Neuroscience Institute, University of California, Berkeley
  • Jacob Sheynin
    Helen Wills Neuroscience Institute, University of California, Berkeley
  • Michael Silver
    Helen Wills Neuroscience Institute, University of California, Berkeley
    School of Optometry, University of California, Berkeley
Journal of Vision July 2013, Vol.13, 1102. doi:https://doi.org/10.1167/13.9.1102
Citation: Rachel Denison, Maxwell Schram, Jacob Sheynin, Michael Silver; Visual statistical learning guides perceptual selection. Journal of Vision 2013;13(9):1102. https://doi.org/10.1167/13.9.1102.

Abstract

The visual system is primed to exploit temporal patterns in the visual environment. Humans can rapidly learn statistical dependencies between sequentially presented images without conscious intention or effort, a phenomenon called visual statistical learning. The consequences of statistical learning for subsequent perception, however, are largely unknown. To investigate the perceptual effects of visual statistical learning, we used binocular rivalry to measure the perceptual biases induced by learning sequences of natural images. Participants first viewed three-item image sequences (triplets) with sequential structure at both the image and category levels. In separate groups of participants, we manipulated the allocation of attention to images or categories during exposure. Next, participants performed a rivalry test. On each trial, the first two images from a learned triplet were presented unambiguously to the two eyes. A rivalry display immediately followed, in which the third image from the learned triplet was presented to one eye and the third image from a different triplet was presented to the other eye. We found that perceptual selection of an image in a given rivalrous pair depended on whether it was the image or category predicted by the preceding two unambiguous images. Further, the allocation of attention to images or categories during the exposure period influenced the strength of the subsequent rivalry effects. A 2AFC post-test, in which participants judged the familiarity of trained vs. foil triplets, confirmed that our attention manipulation affected the degree of statistical learning for images and categories. Our results show that recent, arbitrary visual statistical learning can alter subsequent perceptual selection. Such effects cannot be due to low-level priming or adaptation and provide evidence for flexible integration of visual memory with incoming sensory information in perception.

Meeting abstract presented at VSS 2013
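
As an illustration of the design described in the abstract, the following is a minimal sketch of how an exposure stream of structured triplets and a rivalry test trial could be constructed. This is not the authors' code or stimulus set; the image names, category labels, repetition count, and function names are hypothetical placeholders.

    import random

    # Hypothetical stimulus set: each triplet is a fixed three-image sequence,
    # and each image carries a category label, so the stream has predictive
    # structure at both the image level and the category level.
    TRIPLETS = [
        (("barn", "building"), ("beagle", "dog"), ("oak", "tree")),
        (("tulip", "flower"), ("robin", "bird"), ("jeep", "car")),
    ]

    def exposure_stream(triplets, n_repeats=20, seed=0):
        """Concatenate triplets in random order; order within a triplet is
        fixed, so each image (and its category) predicts what comes next."""
        rng = random.Random(seed)
        stream = []
        for _ in range(n_repeats):
            block = list(triplets)
            rng.shuffle(block)
            for triplet in block:
                stream.extend(image for image, _category in triplet)
        return stream

    def rivalry_trial(learned, other):
        """One test trial: the first two images of a learned triplet are shown
        unambiguously, then the predicted third image is presented to one eye
        and the third image of a different triplet to the other eye."""
        cue = [image for image, _category in learned[:2]]
        predicted, unpredicted = learned[2][0], other[2][0]
        return {"cue": cue, "eye_1": predicted, "eye_2": unpredicted}

    print(exposure_stream(TRIPLETS)[:6])
    print(rivalry_trial(TRIPLETS[0], TRIPLETS[1]))

The key property of such a design is that the order of images within a triplet is fixed while the order of the triplets themselves is shuffled, so the first two images of a triplet predict the third at both the image and the category level.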
