Vision Sciences Society Annual Meeting Abstract  |   August 2009
Eye-blinks and tracking
Author Affiliations
  • Deborah Aks
    Rutgers Center for Cognitive Science, Rutgers University
  • Harry Haladjian
    Rutgers Center for Cognitive Science, Rutgers University
  • Zenon Pylyshyn
    Rutgers Center for Cognitive Science, Rutgers University
  • Alexander Hakkinen
    Rutgers Center for Cognitive Science, Rutgers University
Journal of Vision August 2009, Vol.9, 249. doi:https://doi.org/10.1167/9.8.249
Deborah Aks, Harry Haladjian, Zenon Pylyshyn, Alexander Hakkinen; Eye-blinks and tracking. Journal of Vision 2009;9(8):249. https://doi.org/10.1167/9.8.249.
Abstract

Visual Indexing Theory proposes a referential mechanism that tracks objects in a visual scene without necessarily encoding object properties, as demonstrated through Multiple Object Tracking (MOT) experiments. The encoding of location information during object tracking, however, remains a possible exception. In the current studies, we tracked eye movements during a standard MOT task and employed a blink-contingent methodology in which objects stopped moving or disappeared during eye-blinks. Because these changes were synchronized with eye-blinks, we were able to examine natural, intrinsically generated disruptions of the visual scene without inadvertently cueing the object change.

Experiment 1 examined the effect of changes in object motion that occurred during spontaneous eye-blinks. Subjects performed a standard MOT task (4 targets, 8 non-targets). In half the trials, the objects halted for the duration of every blink; these trials were randomly interleaved with trials in which the objects continued moving. The results indicate that blink-contingent halting of objects produces fewer tracking errors, but this effect diminishes with practice. In addition, fewer fixations were correlated with better tracking performance in the final block.

Experiment 2 tested for location-encoding during MOT when objects disappeared during more natural interruptions (rather than occlusions or abrupt disappearances). We replicated the main features of Keane & Pylyshyn (2006), except that we replaced occlusions with blink-contingent disappearances. A simple sound signaled subjects to blink once during each trial. This voluntary blink triggered a disappearance of the objects (150, 300, 450, or 900 ms), during which they either halted or continued along their trajectories. The results revealed superior tracking performance in the halt conditions, with performance declining in both conditions as disappearance duration increased.

Overall, our results suggest that location information and trajectory extrapolation are not crucial for tracking. When abrupt changes in a scene are visually detected, the most recently sampled location may be retrieved.

Footnotes
 We thank Allan Kugel for his contributions.