Abstract
Search need not be solitary. We explored the potential for people to collaborate during search, using only their gaze. Pairs of searchers (A, B), located in different rooms, jointly performed a difficult O-in-Qs search task. Searchers viewed identical displays, and either participant could respond target present (TP) or target absent (TA), with that response ending the trial for both. Collaboration was encouraged by instruction and by a payoff matrix. Both searchers wore ELII eyetrackers, which were interconnected via Ethernet. In the shared-gaze (SG) condition, a 1.7 deg yellow ring representing A's eye position was superimposed over B's search display, and vice versa. Each participant therefore knew, from the position of this gaze cursor on their display, where their partner was looking during search. In the non-shared-gaze (NG) condition, searchers performed the identical task but could no longer see their partner's gaze cursor (i.e., there was no potential for collaboration). We found that TP RTs averaged 448 ms faster in the SG condition than in the NG condition, suggesting a gaze-related benefit from collaboration. No RT differences were found in the TA data. For each trial, we also analyzed: (1) the spatial overlap between A's and B's distributions of fixations by display quadrant, and (2) the proportion of display items fixated by both searchers. If searchers were able to use shared gaze to divide the labor of the task, we would expect minimal overlap in (1) and a small percentage of doubly inspected items in (2). Both predictions were confirmed. Mean quadrant overlap in the SG TP (.21) and TA (.42) conditions was roughly half that of the NG TP (.46) and TA (.81) conditions (0 = no overlap, 1 = complete overlap). Similarly, the percentage of items fixated by both searchers was smaller in the SG conditions (10% TP, 29% TA) than in the NG conditions (16% TP, 55% TA). We conclude that communication is possible using only eye gaze and that this communication can produce collaborative benefits in a search task.
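The abstract does not specify the exact formulas behind the two per-trial measures, so the sketch below is only one plausible reading of them: quadrant overlap is taken as the histogram intersection of the two searchers' fixation proportions over the four display quadrants (0 = no overlap, 1 = complete overlap), and the doubly inspected proportion is taken as the share of fixated items fixated by both searchers. The function names and data layout are hypothetical, chosen for illustration.

```python
# Illustrative sketch only: the paper's exact overlap formulas are not given
# in the abstract. Quadrant overlap here is a histogram intersection of the
# two searchers' fixation proportions over the four quadrants; "doubly
# inspected" is the share of fixated items that both searchers fixated.
from collections import Counter

def quadrant_overlap(fix_quadrants_a, fix_quadrants_b, n_quadrants=4):
    """Overlap between two fixation distributions by quadrant (0 to 1).

    Each argument is a list of quadrant indices (0..3), one per fixation.
    """
    counts_a = Counter(fix_quadrants_a)
    counts_b = Counter(fix_quadrants_b)
    total_a = max(len(fix_quadrants_a), 1)
    total_b = max(len(fix_quadrants_b), 1)
    # Histogram intersection: sum the smaller proportion in each quadrant.
    return sum(
        min(counts_a[q] / total_a, counts_b[q] / total_b)
        for q in range(n_quadrants)
    )

def doubly_inspected_proportion(items_fixated_a, items_fixated_b):
    """Proportion of fixated display items fixated by both searchers.

    Denominator is the set of items fixated by either searcher; this is an
    assumption, since the abstract does not state the denominator used.
    """
    a, b = set(items_fixated_a), set(items_fixated_b)
    fixated_by_either = a | b
    if not fixated_by_either:
        return 0.0
    return len(a & b) / len(fixated_by_either)

# Example trial: A covered quadrants 0 and 1, B covered quadrants 2 and 3.
print(quadrant_overlap([0, 0, 1, 1], [2, 2, 3, 3]))       # -> 0.0
print(doubly_inspected_proportion({1, 2, 3}, {3, 4, 5}))  # -> 0.2
```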
This work was supported by NSF grant ITR 0082602 and NIMH grant R01-MH63748.