Abstract
Linking eye movements to visual perception or to learning has been notoriously difficult: either the visual stimulus is so simplified that it provides no insight into the true nature of learning, or the input is so rich that the learning process becomes intractable. Visual statistical learning (VSL) provides an ideal framework for such studies, since it uses stimuli with precisely controlled statistics and a regular spatial layout. We used the classical VSL paradigm combined with eye tracking and asked whether this controlled implicit learning paradigm allows tracking the contribution and development of eye movements during the learning process. Stimuli were based on 12 simple shapes combined into six base-pairs. From this alphabet, each scene was composed by randomly selecting three of the base-pairs and juxtaposing them on a grid; over 140 such scenes were shown sequentially for 3 sec each on a large 4 × 3 ft screen while the subjects’ eye movements were monitored. Subjects had no task beyond attentively observing the scenes. After practice, subjects were given a test with multiple trials in which they had to choose, based on their judgment of familiarity, between true base-pairs and random shape combinations (foils). Subjects typically became familiar with the base-pairs to different degrees, showing wide variation in their success at choosing the true base-pair over the foil. This distribution of percent-correct values was correlated with various eye-movement measures. We found that both the number of eye fixations and the total fixation time on the constituent shapes differed between highly learned pairs and pairs that were not learned. These results provide a first indication that eye movements are tightly linked to the acquired knowledge of visual scenes not only in highly explicit cognitive tasks, but even in implicit observational tasks.
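The scene-composition procedure can be illustrated with a minimal sketch; the grid size, horizontal pair orientation, and placement rule below are assumptions for illustration, not parameters reported in the study.

```python
# Illustrative sketch of VSL scene composition (assumed: 5x5 grid,
# horizontally oriented base-pairs, random non-overlapping placement).
import random

SHAPES = list(range(12))                                             # 12 simple shapes
BASE_PAIRS = [(SHAPES[i], SHAPES[i + 1]) for i in range(0, 12, 2)]   # 6 fixed base-pairs

GRID_W, GRID_H = 5, 5                                                # assumed grid size

def compose_scene(rng=random):
    """Randomly select three base-pairs and juxtapose them on the grid."""
    scene = {}                                   # (col, row) -> shape id
    for a, b in rng.sample(BASE_PAIRS, 3):
        while True:
            col = rng.randrange(GRID_W - 1)      # leave room for the pair's second shape
            row = rng.randrange(GRID_H)
            if (col, row) not in scene and (col + 1, row) not in scene:
                scene[(col, row)] = a            # pair members always appear together,
                scene[(col + 1, row)] = b        # in the same relative arrangement
                break
    return scene

scenes = [compose_scene() for _ in range(144)]   # over 140 scenes, shown 3 sec each
```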
Meeting abstract presented at VSS 2012