Vision Sciences Society Annual Meeting Abstract | September 2005
Collaborative search using shared eye gaze
Author Affiliations
  • Gregory J. Zelinsky
    Department of Psychology, State University of New York at Stony Brook
  • Christopher A. Dickinson
    Department of Psychology, State University of New York at Stony Brook
  • Xin Chen
    Department of Psychology, State University of New York at Stony Brook
  • Mark B. Neider
    Department of Psychology, State University of New York at Stony Brook
  • Susan E. Brennan
    Department of Psychology, State University of New York at Stony Brook
Journal of Vision September 2005, Vol. 5, 700. https://doi.org/10.1167/5.8.700
Abstract

Search need not be solitary. We explored the potential for people to collaborate during search, using only their gaze. Pairs of searchers (A, B), located in different rooms, jointly performed a difficult O in Qs search task. Searchers viewed identical displays and either participant could respond target present (TP) or absent (TA), with that response ending the trial for both. Collaboration was encouraged by instruction and payoff matrix. Both searchers wore ELII eyetrackers, which were interconnected via Ethernet. In the shared-gaze (SG) condition, a 1.7 deg yellow ring representing A's eye position was superimposed over B's search display, and vice versa. Each participant therefore knew where their partner was looking during search by the position of this gaze cursor on their display. In the non-shared gaze (NG) condition, searchers performed the identical task but could no longer see their partner's gaze cursor (i.e., no potential for collaboration). We found that TP RTs averaged 448 ms faster in the SG compared to the NG condition, suggesting a gaze-related benefit from collaboration. No RT differences were found in the TA data. For each trial, we also analyzed: (1) the spatial overlap between A's and B's distributions of fixations by display quadrant, and (2) the proportion of display items fixated by both searchers. If searchers were able to use shared gaze to divide the labor of the task, we would expect minimal overlap in (1) and a small percentage of doubly inspected items in (2). Both predictions were confirmed. Mean quadrant overlap in the SG TP (.21) and TA (.42) conditions was less than half that of the TP (.46) and TA (.81) NG conditions (0 = no overlap, 1 = complete overlap). Similarly, the percentage of items fixated by both searchers was smaller in the SG (10% TP, 29% TA) compared to the NG conditions (16% TP, 55% TA). We conclude that communication is possible using only eye gaze and that this communication can produce collaborative benefits in a search task.
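As a rough illustration of the two per-trial measures described above, the sketch below computes (1) a quadrant-overlap score and (2) the proportion of display items fixated by both searchers. The abstract does not specify the exact formulas, so this assumes quadrant overlap is the histogram intersection of each searcher's fixation proportions across the four display quadrants (0 = no overlap, 1 = complete overlap) and that the doubly-fixated proportion is taken over all display items; the function names, coordinates, and display center are hypothetical.

    # Hypothetical sketch of the two per-trial collaboration measures.
    # Assumes fixations are (x, y) display coordinates; definitions here are
    # plausible reconstructions, not the study's documented analysis code.

    def quadrant(x, y, cx, cy):
        """Map a fixation to one of the four display quadrants (0-3)."""
        return (1 if x >= cx else 0) + (2 if y >= cy else 0)

    def quadrant_overlap(fix_a, fix_b, center=(512, 384)):
        """Histogram-intersection overlap of two searchers' fixation
        distributions by display quadrant (0 = none, 1 = complete)."""
        cx, cy = center
        def proportions(fixations):
            counts = [0, 0, 0, 0]
            for x, y in fixations:
                counts[quadrant(x, y, cx, cy)] += 1
            total = sum(counts) or 1
            return [c / total for c in counts]
        pa, pb = proportions(fix_a), proportions(fix_b)
        return sum(min(a, b) for a, b in zip(pa, pb))

    def doubly_fixated_proportion(items_a, items_b, n_display_items):
        """Proportion of display items fixated by both searchers on a trial."""
        return len(set(items_a) & set(items_b)) / n_display_items

    # Example trial: searcher A stays in the left half, B in the right half.
    fix_a = [(100, 100), (150, 500), (200, 300)]
    fix_b = [(800, 200), (700, 600), (900, 650)]
    print(quadrant_overlap(fix_a, fix_b))                       # 0.0 -> divided labor
    print(doubly_fixated_proportion({1, 2, 3}, {3, 7, 8}, 20))  # 0.05 (1 shared item of 20)

Under these assumed definitions, lower values on both measures on shared-gaze trials would indicate that the searchers divided the display between them rather than re-inspecting each other's territory.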

Zelinsky, G. J., Dickinson, C. A., Chen, X., Neider, M. B., & Brennan, S. E. (2005). Collaborative search using shared eye gaze [Abstract]. Journal of Vision, 5(8):700, 700a, http://journalofvision.org/5/8/700/, doi:10.1167/5.8.700.
Footnotes
This work was supported by NSF grant ITR 0082602 and NIMH grant R01-MH63748.