Abstract
In the current study, we tested the temporal and spatial properties of a new web-based eye-tracking program called Gazer, developed at the University of Victoria. Using Gazer, we tracked participants’ eye movements during a Where’s Waldo visual search task. On each trial, participants were presented with a Where’s Waldo scene and asked to identify the location of the Waldo target. Once they located Waldo, participants terminated the search by pressing the space bar. They were then shown a 3 × 3 grid of squares and asked to indicate the square in which Waldo had appeared. During the visual search, gaze locations were recorded at a sampling rate of 30 Hz, and timing was locked to the space bar response. To relate gaze to search behaviour, we calculated dwell time as the proportion of gaze time spent on the on-target square relative to the off-target squares. Dwell time was analyzed over the 1000 ms prior to the response, and participants spent significantly more time looking at the on-target square than at the off-target squares (p < .001). At a finer temporal and spatial grain, we computed the average Euclidean distance between the Gazer coordinates and the Waldo target in 20 ms time bins. This analysis revealed that participants began to move their gaze toward Waldo’s position approximately 3000 ms before their space bar response. At the level of individual differences, we found a robust negative correlation between visual search time and accuracy: participants who found the Waldo target quickly were also highly accurate. In conclusion, we believe that Gazer is a viable system for conducting remote eye-tracking research. Our findings indicate that Gazer has sufficient temporal and spatial resolution to investigate the relationship between eye movements and visual search performance.
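To make the two analyses concrete, the sketch below shows one way the dwell-time proportion and the binned gaze-to-target distance could be computed from time-stamped gaze samples. This is a minimal illustration, not the authors’ analysis code: the sample format (timestamp in ms, x and y in pixels), the function names, and the 4000 ms analysis span are assumptions made for the example.

```python
import math

SAMPLE_RATE_HZ = 30   # Gazer's reported sampling rate
BIN_MS = 20           # width of each analysis time bin (ms)
WINDOW_MS = 1000      # dwell-time window before the space bar press (ms)


def dwell_proportion(samples, response_ms, target_rect):
    """Proportion of gaze samples in the final WINDOW_MS that fall inside
    the on-target square. target_rect = (x0, y0, x1, y1) in pixels."""
    window = [s for s in samples
              if response_ms - WINDOW_MS <= s[0] <= response_ms]
    if not window:
        return 0.0
    on_target = sum(
        1 for (_, x, y) in window
        if target_rect[0] <= x <= target_rect[2]
        and target_rect[1] <= y <= target_rect[3]
    )
    return on_target / len(window)


def mean_distance_by_bin(samples, response_ms, target_xy, span_ms=4000):
    """Mean Euclidean distance from gaze to the Waldo target in 20 ms bins,
    time-locked to the space bar response (bin 0 ends at the response)."""
    bins = {}
    for (t, x, y) in samples:
        offset = t - response_ms          # negative = before the response
        if -span_ms <= offset <= 0:
            b = int(offset // BIN_MS)     # 20 ms bin index relative to response
            d = math.hypot(x - target_xy[0], y - target_xy[1])
            bins.setdefault(b, []).append(d)
    return {b: sum(ds) / len(ds) for b, ds in sorted(bins.items())}


# Hypothetical usage: two on-target samples in the last second before a
# response at t = 5000 ms give a dwell proportion of 1.0.
# samples = [(4990, 410, 300), (4970, 408, 298), (3980, 120, 500)]
# dwell_proportion(samples, 5000, (400, 290, 440, 330))
```

Under these assumptions, the dwell-time analysis reduces to counting samples inside the on-target square during the final second, and the distance analysis to averaging gaze-to-target distance within successive 20 ms bins time-locked to the response.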