Mark A. Segraves, Emory Kuo, Sara Caddigan, Emily A. Berthiaume, Konrad P. Kording; Predicting rhesus monkey eye movements during natural-image search. Journal of Vision 2017;17(3):12. doi: 10.1167/17.3.12.
© 2017 Association for Research in Vision and Ophthalmology.
There are three prominent factors that can predict human visual-search behavior in natural scenes: the distinctiveness of a location (salience), similarity to the target (relevance), and features of the environment that predict where the object might be (context). We do not currently know how well these factors predict macaque visual search, which matters because the macaque is arguably the most popular animal model for asking how the brain controls eye movements. Here we trained monkeys to perform the pedestrian search task previously used for human subjects. Salience, relevance, and context models were all predictive of monkey eye fixations and jointly about as precise as for humans. We attempted to disrupt the influence of scene context on search by testing the monkeys with an inverted set of the same images. Surprisingly, the monkeys were able to locate the pedestrian at a rate similar to that for upright images. The best predictions of monkey fixations during search of inverted images were obtained by rotating the model predictions for the original, upright image. The fact that the same models can predict human and monkey search behavior suggests that the monkey is a good model for understanding how the human brain enables natural-scene search.
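The prediction scheme described in the abstract can be sketched in code. The sketch below is an illustrative assumption, not the authors' actual method: it combines three normalized factor maps (salience, relevance, context) with hypothetical equal weights into a single priority map, takes the map's maximum as the predicted fixation, and models the inverted-image condition by rotating the upright prediction 180 degrees, as the abstract reports worked best.

```python
import numpy as np


def combined_priority_map(salience, relevance, context,
                          weights=(1 / 3, 1 / 3, 1 / 3)):
    """Combine three factor maps into one priority map.

    Each map is min-max normalized to [0, 1] and then averaged with
    the given weights. Equal weights are an illustrative assumption;
    the actual study fit its models to fixation data.
    """
    combined = np.zeros_like(np.asarray(salience, dtype=float))
    for w, m in zip(weights, (salience, relevance, context)):
        m = np.asarray(m, dtype=float)
        rng = m.max() - m.min()
        normed = (m - m.min()) / rng if rng > 0 else np.zeros_like(m)
        combined += w * normed
    return combined


def predicted_fixation(priority):
    """Return the (row, col) of the priority map's maximum."""
    return np.unravel_index(np.argmax(priority), priority.shape)


def inverted_image_prediction(upright_priority):
    """Predict fixations for an inverted image by rotating the
    upright priority map 180 degrees, mirroring the abstract's finding."""
    return np.rot90(upright_priority, 2)
```

For example, if the upright priority map peaks at the top-right of a 2x2 grid, the rotated map for the inverted image peaks at the bottom-left, matching where the pedestrian would land after inversion.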