December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
COCO-CursorSearch: A large-scale cursor movement dataset approximating eye movement in visual search
Author Affiliations
  • Yupei Chen
    The Smith-Kettlewell Eye Research Institute
  • Gregory Zelinsky
    Stony Brook University
Journal of Vision December 2022, Vol.22, 3748. doi:
© ARVO (1962-2015); The Authors (2016-present)

Eye movements are widely used as an overt measure of attention in visual search, and recent work introduced COCO-Search18 to help build models that predict search fixations and scanpaths. But accurate eye-movement recording still requires an eye-tracker, limiting the creation of far larger datasets. Here we introduce COCO-CursorSearch, a dataset paralleling COCO-Search18 (same images and targets) except that search behavior was obtained using a cursor-contingent moving window instead of measured eye movements. A high-resolution central window (3° in radius) was repositioned over an otherwise blurred version of the search image, with target-present judgments requiring alignment of the cursor "fovea" with the target. Our goal is to identify patterns of cursor-search movements that correlate highly with eye movements during search, thereby validating cursor-movement data collected online. We compared COCO-CursorSearch to COCO-Search18 using both spatial and spatio-temporal measures. For target-present trials, the cursor-search density map accounted for 94% of the inter-observer consistency (IOC) of the eye fixation-density map in Correlation Coefficient (CC) and 84% in Normalized Scanpath Saliency (NSS). Moreover, cursor-search scanpaths accounted for more than 88% of the eye scanpath IOC on seven scanpath comparison metrics. For target-absent trials, the cursor-search density map accounted for 85% of the eye fixation IOC in CC and 80% in NSS, and cursor-search scanpaths accounted for roughly 80% or more of the eye scanpath IOC on five scanpath metrics. For both target-present and target-absent conditions, correlations between cursor search and eye search were significant on all applicable metrics across both target categories and trials (all ps < .05). We conclude that cursor-search movements closely approximate eye-fixation search movements on multiple measures, thereby validating the use of a cursor to collect visual search data.
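The cursor-contingent moving-window paradigm described above can be sketched in a few lines. This is a minimal illustration, not the authors' stimulus code: the function name, array-based image representation, and pixel radius are our assumptions (the abstract specifies only a 3°-radius high-resolution window over a blurred image).

```python
import numpy as np

def moving_window(sharp, blurred, cx, cy, radius_px):
    """Composite a high-resolution circular window (the cursor 'fovea')
    centered at (cx, cy) over a blurred copy of the search image.
    Hypothetical sketch; images are 2-D numpy arrays, radius in pixels."""
    h, w = sharp.shape[:2]
    yy, xx = np.ogrid[:h, :w]
    # boolean disk mask around the cursor position
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius_px ** 2
    out = blurred.copy()
    out[mask] = sharp[mask]  # reveal full resolution inside the window
    return out
```

On each cursor movement, the display would be recomposited with the window at the new cursor position, so only the region under the simulated fovea is ever seen at full resolution.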
The results suggest that both measures capture the fundamental constraint of aligning a central “fovea” with a search target.
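The density-map comparisons above use standard saliency metrics. A minimal sketch of CC and NSS, assuming numpy arrays for the density maps and (x, y) pixel coordinates for fixations; implementation details are ours, not the authors':

```python
import numpy as np

def cc(map1, map2):
    """Correlation Coefficient: Pearson r between two density maps."""
    a = (map1 - map1.mean()) / (map1.std() + 1e-8)
    b = (map2 - map2.mean()) / (map2.std() + 1e-8)
    return float((a * b).mean())

def nss(sal_map, fix_points):
    """Normalized Scanpath Saliency: mean z-scored map value at the
    fixated (or cursor-visited) pixel locations."""
    z = (sal_map - sal_map.mean()) / (sal_map.std() + 1e-8)
    return float(np.mean([z[y, x] for x, y in fix_points]))
```

CC compares two smoothed density maps as a whole, while NSS evaluates one map at discrete fixation locations, which is why the two metrics can yield different IOC percentages for the same data.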

