Vision Sciences Society Annual Meeting Abstract  |   May 2008
Integration of object-centered and viewer-centered visual information in an open-loop pointing task
Author Affiliations
  • Patrick Byrne
    Centre for Vision Research, York University
  • Smiley Pallan
    Department of Psychology, York University
  • XiaoGang Yan
    Centre for Vision Research, York University
  • Doug Crawford
    Centre for Vision Research, York University, and Department of Psychology, York University
Journal of Vision May 2008, Vol.8, 47. doi:https://doi.org/10.1167/8.6.47
Abstract

Investigations of the relative contributions of object-centered (allocentric) and viewer-centered (egocentric) visual information to motor behaviour have often focused on determining which of these sources of information predominates, and under which circumstances. In contrast, multisensory integration studies have shown that information from different sensory modalities is combined on the basis of reliability estimates for each modality. We sought to determine whether a similar process is applied to egocentric and allocentric visual information in an open-loop pointing task. METHOD: Head-restrained, gaze-fixated subjects were briefly presented with a peripheral visual stimulus consisting of a to-be-remembered yellow dot surrounded by an array of four vibrating blue dots (the allocentric stimulus) situated at the vertices of an invisible square. During a delay period, subjects made controlled eye movements. After a brief reappearance of the allocentric stimulus without the to-be-remembered yellow dot, subjects made a pointing response to indicate the remembered location of the yellow dot. To tease apart the contributions of egocentric and allocentric cues, the allocentric stimulus was shifted randomly by three degrees between presentation and test, introducing a cue conflict. The vibration amplitude of the allocentric stimulus and the eye movement amplitude were each varied randomly between two fixed levels from trial to trial in order to manipulate the reliability of the allocentric and egocentric information, respectively. PREDICTION: Based on multisensory integration findings, we predicted that lower vibration and larger eye movement amplitudes would bias pointing toward a position consistent with the shifted allocentric stimulus, while larger vibration and smaller saccade amplitudes would have the opposite effect. RESULTS: For small eye movement conditions, mean pointing responses were biased as expected according to the vibration amplitude of the allocentric stimulus. However, when eye movement amplitude was large, mean pointing responses fell halfway between the two locations and were indistinguishable between the two vibration amplitude conditions.
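The prediction above follows the standard reliability-weighted (maximum-likelihood) cue-combination rule from the multisensory integration literature: each cue is weighted in inverse proportion to its variance. The sketch below is an illustration of that rule only, not the authors' analysis; the positions and variance values are hypothetical, chosen to mirror the 3-degree allocentric shift described in the method.

```python
def combine_cues(x_allo, var_allo, x_ego, var_ego):
    """Reliability-weighted combination of two position estimates.

    Each cue's weight is proportional to its reliability (inverse
    variance); the combined variance is lower than either cue alone.
    """
    w_allo = (1.0 / var_allo) / (1.0 / var_allo + 1.0 / var_ego)
    w_ego = 1.0 - w_allo
    x_combined = w_allo * x_allo + w_ego * x_ego
    var_combined = 1.0 / (1.0 / var_allo + 1.0 / var_ego)
    return x_combined, var_combined

# Cue conflict: the shifted landmark array signals 3 deg (allocentric),
# while egocentric memory of the target signals 0 deg.
# Low vibration -> reliable allocentric cue -> estimate pulled toward 3 deg.
print(combine_cues(3.0, var_allo=0.5, x_ego=0.0, var_ego=2.0))  # (2.4, 0.4)
# High vibration -> unreliable allocentric cue -> estimate stays near 0 deg.
print(combine_cues(3.0, var_allo=2.0, x_ego=0.0, var_ego=0.5))  # (0.6, 0.4)
```

Swapping which cue has the lower variance reverses the bias, which is the qualitative pattern the prediction tests against the vibration and eye movement manipulations.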

Byrne, P., Pallan, S., Yan, X., & Crawford, D. (2008). Integration of object-centered and viewer-centered visual information in an open-loop pointing task [Abstract]. Journal of Vision, 8(6):47, 47a, http://journalofvision.org/8/6/47/, doi:10.1167/8.6.47.
Footnotes
 We wish to acknowledge financial backing from the Natural Sciences and Engineering Research Council of Canada and the Canadian Institutes of Health Research.