Vision Sciences Society Annual Meeting Abstract  |   August 2010
Eye-hand coordination in finding and touching a target among distractors
Author Affiliations
  • Hang Zhang
    Department of Psychology, New York University
    Center for Neural Science, New York University
  • Camille Morvan
    Department of Psychology, New York University
    Center for Neural Science, New York University
  • Louis-Alexandre Etezad-Heydari
    Department of Psychology, New York University
  • Laurence Maloney
    Department of Psychology, New York University
    Center for Neural Science, New York University
Journal of Vision August 2010, Vol.10, 532. doi:10.1167/10.7.532
Abstract

We asked observers to find and touch a target among distractors. Observers earned money by touching the target; earnings decreased linearly with movement time. If observers did not initiate the hand movement until the target was found, they would earn much less than if they attempted to integrate visual search and reach. Two clusters of objects were located to the left and right of the midline of the display, one cluster containing four objects, the other two. Each object was equally likely to be the target. Initially the observer did not know which object was the target but could gain information by searching, and could potentially update the movement trajectory based on information from visual search. The optimal initial movement strategy was to move toward the larger cluster, while the optimal visual search strategy was to search the smaller cluster first, thereby quickly learning which cluster contained the target. We compared observers' initial search/reach to the performance leading to maximum expected gain (MEG). Methods: Objects for the search/reach task were presented on a 32″ ELO touchscreen, located on a virtual arc around a starting position. Eye movements were tracked with an EyeLink II tracker. Before the search/reach task, observers were trained in moving on the touchscreen and in visual search with keypress responses. Five naïve observers participated. Results: For each trial we recorded the direction of the initial movement and the cluster searched first. Two observers correctly searched the cluster with fewer objects first (p < .001) while moving toward the other (p < .01). These observers earned within 9% of MEG. The remaining observers failed to search/reach optimally (p > .05) and received on average 62% of MEG. As observers searched objects, they eliminated possible targets. We will discuss how information accrued in visual search affected the movement trajectory.
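Why searching the smaller cluster first is informative can be illustrated with a back-of-the-envelope calculation. The sketch below is not the authors' model; it assumes serial search, one object inspected per check, perfect identification, and a target uniformly distributed over the six objects (4 + 2, as in the display). Under those assumptions it computes the expected number of checks needed before the observer knows which cluster contains the target, for each search order:

```python
# Hypothetical illustration (not the authors' analysis): expected number
# of serial object checks needed to learn WHICH cluster holds the target.
# Assumes one object per check, perfect identification, and a target
# uniform over all first + second objects.
from fractions import Fraction

def expected_checks_to_know_cluster(first, second):
    """Search the `first` cluster exhaustively, then the `second`.
    Returns E[number of checks until the target's cluster is known]."""
    n = first + second
    e = Fraction(0)
    # Case 1: target is found in the first cluster at check k.
    for k in range(1, first + 1):
        e += Fraction(1, n) * k
    # Case 2: all `first` checks fail, so the target must be in the
    # second cluster -- known after exactly `first` checks.
    e += Fraction(second, n) * first
    return e

small_first = expected_checks_to_know_cluster(2, 4)  # 2-object cluster first
large_first = expected_checks_to_know_cluster(4, 2)  # 4-object cluster first
print(small_first, large_first)  # 11/6 vs 3
```

Under these (assumed) mechanics, starting with the two-object cluster resolves the target's side after about 1.83 checks on average versus 3 checks for the opposite order, which is consistent with the abstract's claim that the smaller cluster should be searched first.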

Zhang, H., Morvan, C., Etezad-Heydari, L.-A., & Maloney, L. (2010). Eye-hand coordination in finding and touching a target among distractors [Abstract]. Journal of Vision, 10(7):532, 532a, http://www.journalofvision.org/content/10/7/532, doi:10.1167/10.7.532. [CrossRef]