Abstract
The visual world is rich and complex, but it also contains many statistical regularities. Previous studies have shown that our perceptual system can encode surprisingly detailed information about feature distributions. For example, such feature distribution learning has been demonstrated in visual search tasks, where observers learn the distribution from which distractors on consecutive search trials are drawn. Their ability to find the target on a subsequent test trial depended on the distance between the current target feature and the distractor distribution of the preceding trials: response times were slowed when the target feature fell within the previous distractor distribution, and this slowing reflected the shape of that distribution. These studies have typically assessed search performance through manual responses. We suggest, however, that eye movements can provide a complementary, precise, and ecologically valid measure of feature distribution learning. Here, observers were asked to make a saccade towards an oddly colored target among heterogeneous distractors whose colors were drawn from either a Gaussian or a uniform distribution. Preliminary results revealed better search performance on learning trials when the distractor colors were drawn from a Gaussian rather than a uniform distribution. Moreover, saccade latencies on test trials depended on the distance between the current target color and the preceding distractor colors: latencies were slowed when the target color fell within the previous distractor distribution. Saccade latencies did not, however, reflect the precise shape of that distribution. Overall, our results suggest that the characteristics of previous distractors can guide gaze in visual search, but further investigation is needed to determine whether the precise shape of the distractor distribution is exploited.