Abstract
The ability of humans to quickly and efficiently categorize natural scenes is often referred to as fast "gist" recognition, which in turn influences subsequent, more detailed analysis of the scene. However, little work has been done to demonstrate the influence of scene category on later stages of processing. Here we test how scene category affects eye movements in an exploration task. Participants freely viewed 198 color photographs of natural scenes from six categories (beaches, city streets, forests, highways, mountains, and offices) for two to eight seconds while their eye movements were recorded. We then attempted to predict scene category from the pattern of eye movements on a trial-by-trial basis, using a procedure similar to the normalized scanpath salience of Peters et al. (2005) with fixation density maps (FDMs) derived from the training data. We predicted scene category correctly for 33.4% of the trials in the test data (chance: 16.7%). The category-specificity of fixated locations may stem from two sources: consistent patterns of salience determined by scene layout, or other, more abstract biases associated with the category label. To test the influence of salience, we repeated the analysis using the average saliency maps (Walther & Koch, 2006) for a given category for training instead of FDMs. In this analysis we predicted scene category with 22.3% accuracy. In a follow-up experiment, we restricted visibility of the image to the central 7 degrees of visual angle (2.4% of the image area) around the current eye position, thereby suppressing global scene processing. Trials were blocked by scene category in order to enhance category-based biases. Accuracy of predicting scene category was 23.0% in this condition. These results suggest that both shared patterns of salience and more abstract, category-based information contribute to the category-specificity of fixations.
Meeting abstract presented at VSS 2012
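The trial-by-trial prediction procedure described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the map resolution, smoothing width, and all function names are assumptions. The core idea follows the normalized scanpath salience logic of Peters et al. (2005): z-score each category's mean fixation density map, score a held-out trial's fixations against each normalized map, and pick the highest-scoring category.

```python
import numpy as np

def fixation_density_map(fixations, shape=(48, 64), sigma=2.0):
    """Build a Gaussian-smoothed fixation density map from (row, col) fixations.
    Map resolution and smoothing width are illustrative choices."""
    fdm = np.zeros(shape)
    for r, c in fixations:
        fdm[r, c] += 1.0
    # Separable Gaussian smoothing via 1-D convolutions
    k = np.arange(-3 * int(sigma), 3 * int(sigma) + 1)
    g = np.exp(-k**2 / (2 * sigma**2))
    g /= g.sum()
    fdm = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 0, fdm)
    fdm = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 1, fdm)
    return fdm

def predict_category(trial_fixations, category_fdms):
    """Score a trial's fixations against each category's z-scored mean FDM
    (as in normalized scanpath salience) and return the best-matching label."""
    scores = {}
    for label, fdm in category_fdms.items():
        z = (fdm - fdm.mean()) / fdm.std()  # zero mean, unit standard deviation
        scores[label] = np.mean([z[r, c] for r, c in trial_fixations])
    return max(scores, key=scores.get)
```

For the salience-based variant, the same scoring would be applied with category-average saliency maps in place of the fixation-derived FDMs.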