September 2011
Volume 11, Issue 11
Vision Sciences Society Annual Meeting Abstract  |   September 2011
Contributions of visual and temporal similarity to statistical learning
Author Affiliations
  • Anna C. Schapiro
    Princeton University, USA
  • Lauren V. Kustner
    Princeton University, USA
  • Nicholas B. Turk-Browne
    Princeton University, USA
Journal of Vision September 2011, Vol.11, 986. doi:https://doi.org/10.1167/11.11.986
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Objects appear in reliable patterns over time, and these regularities are extracted through visual statistical learning (VSL). Prior VSL studies have focused on the learning of arbitrary groupings of objects. In the natural environment, however, objects are not grouped arbitrarily and instead often share some degree of visual similarity: multiple views of the same object, or multiple objects from a particular context (e.g., grocery stores or forests). VSL may exploit visual similarity to learn temporal statistics. Here we investigate the contributions of visual similarity and temporal co-occurrence (temporal similarity) to VSL. Observers viewed sequences of fractal images presented one at a time while performing an orthogonal task. Unbeknownst to them, the images were grouped into eight pairs: four in which the images always occurred successively (high temporal similarity) and four in which the images occurred successively 1/3 of the time (low temporal similarity). Two pairs in each condition contained images that were each other's color inverse (high visual similarity); the images in the remaining pairs were visually unrelated (low visual similarity). We then administered a surprise familiarity test in which observers viewed each pair and rated its familiarity using a slider. High temporal similarity pairs were rated as more familiar than low temporal similarity pairs, providing evidence for VSL and revealing sensitivity to subtle probabilistic gradations. There was also an effect of visual similarity, but in the opposite direction: high visual similarity pairs were rated as less familiar than low visual similarity pairs (p < 0.05). We are investigating this surprising effect in a follow-up behavioral study using an implicit measure and in a follow-up fMRI study examining the impact of temporal and visual similarity on neural representations. Preliminary results suggest that high temporal similarity and low visual similarity increase pattern correlations in medial temporal lobe sub-regions. These studies begin to characterize how VSL operates over naturalistic visual regularities.
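
The temporal-similarity manipulation described above — a pair's second image following its first with probability 1.0 versus 1/3 — can be made concrete with a small simulation. The sketch below is purely illustrative: it is not the authors' stimulus code, and the rule for what appears when a partner does not follow (a second image drawn from a different pair) is an assumption of this sketch, not a detail reported in the abstract.

```python
import random

def make_sequence(n_trials, high_pairs, low_pairs, seed=0):
    """Generate an illustrative stimulus stream. On each trial a pair's
    first image appears; its partner follows with probability 1.0
    (high temporal similarity) or 1/3 (low temporal similarity).
    When the partner does not follow, a second image from a different
    pair is shown instead (a simplifying assumption of this sketch)."""
    rng = random.Random(seed)
    pairs = [(p, 1.0) for p in high_pairs] + [(p, 1 / 3) for p in low_pairs]
    # Fillers are drawn only from other pairs' second images, so each
    # pair's first image always acts as a predictive cue.
    second_images = [s for (_, s) in high_pairs + low_pairs]
    seq = []
    for _ in range(n_trials):
        (first, second), p = rng.choice(pairs)
        seq.append(first)
        if rng.random() < p:
            seq.append(second)
        else:
            seq.append(rng.choice([s for s in second_images if s != second]))
    return seq

def transition_prob(seq, first, second):
    """Empirical probability that `second` immediately follows `first`."""
    follows = [seq[i + 1] == second
               for i in range(len(seq) - 1) if seq[i] == first]
    return sum(follows) / len(follows) if follows else 0.0
```

Running `transition_prob` on a generated stream recovers the designed statistics: roughly 1.0 for high temporal similarity pairs and roughly 1/3 for low temporal similarity pairs.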

ACS was supported by an NSF graduate research fellowship. 