Duy Nguyen, Evan Palmer; Gaze Pattern Differences Between Objective and Subjective Search of E-Commerce Web Pages. Journal of Vision 2012;12(9):417. doi: 10.1167/12.9.417.
In a classic study, Yarbus (1967) showed that an observer's eye movements vary dramatically depending on the viewing task; that study was among the first to demonstrate that high-level tasks influence eye movement patterns. Here, we ask how a user's goals affect their gaze patterns while browsing e-commerce web pages. In a pilot study, participants rated 60 still images of online shopping web pages on five attribute pairs: poorly designed vs. well designed, simple vs. complex, bright vs. dull, masculine vs. feminine, and attractive vs. unattractive. In the current study, participants viewed the 12 online shopping web pages rated as most gender neutral in the pilot study while we recorded their eye movements during one of three tasks: browsing the page freely, an objective search (e.g., find the laptop), or a subjective search (e.g., find a gift for your uncle). In both search tasks, participants clicked an item or link to end the trial; in the free-browsing task, the trial ended after 15 seconds. Consistent with Yarbus, we found differences in eye movement patterns across the three tasks, demonstrating the influence of top-down goals on gaze behavior. Gaze patterns in the free-browsing and subjective search tasks were similar to each other and differed from those in the objective search task. Specifically, average saccade amplitudes were significantly larger in the objective search condition than in the subjective search condition (p < .05) and marginally larger than in the free-browsing condition (p = .068). We have computed saliency and clutter maps for the 12 web pages in this study and are analyzing the eye movement data with respect to these dimensions.
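The key dependent measure above is average saccade amplitude. As a minimal sketch of how such a measure is commonly derived (this is an illustration, not the authors' analysis pipeline; the function names and the pixels-per-degree calibration are assumptions), amplitude can be taken as the distance between consecutive fixation centroids, converted from pixels to degrees of visual angle:

```python
import math

def saccade_amplitudes_deg(fixations, px_per_deg=35.0):
    """Amplitude of each saccade, in degrees of visual angle, taken as the
    Euclidean distance between consecutive fixation centroids.

    `fixations` is a list of (x, y) fixation positions in screen pixels;
    `px_per_deg` is an assumed display/viewing-distance calibration.
    """
    amps = []
    for (x0, y0), (x1, y1) in zip(fixations, fixations[1:]):
        amps.append(math.hypot(x1 - x0, y1 - y0) / px_per_deg)
    return amps

def mean_saccade_amplitude(fixations, px_per_deg=35.0):
    """Per-trial average amplitude; returns 0.0 if fewer than two fixations."""
    amps = saccade_amplitudes_deg(fixations, px_per_deg)
    return sum(amps) / len(amps) if amps else 0.0
```

With a 35 px/deg calibration, fixations at (0, 0), (35, 0), and (35, 70) yield amplitudes of 1.0 and 2.0 degrees, so the trial mean is 1.5 degrees. Per-condition means of such trial values would then feed the significance tests reported in the abstract.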
Meeting abstract presented at VSS 2012