Vision Sciences Society Annual Meeting Abstract | August 2023
Journal of Vision, Volume 23, Issue 9
Open Access
Free Moving Gaze-related Electroencephalography in Mobile Virtual Environments
Author Affiliations & Notes
  • Ying Choon Wu
    UC San Diego
  • Chiyuan Chang
    UC San Diego
  • Weichen Liu
    UC San Diego
  • Cory Stevenson
    National Yang Ming Chiao Tung University
  • Russell Cohen Hoffing
    Army Research Laboratory
  • Steven Thurman
    Army Research Laboratory
  • Tzyy-Ping Jung
    UC San Diego
  • Footnotes
    Acknowledgements: This study was supported by grant #1734883 from the National Science Foundation and by grants #W911NF2120126 and #W911NF2020088 from the Army Research Laboratory.
Journal of Vision August 2023, Vol.23, 5305. doi:https://doi.org/10.1167/jov.23.9.5305
Abstract

In realistic contexts involving natural, unconstrained head and body movement, methods for parsing eye movements and using these segmentations to analyze other modalities of synchronously recorded physiological data are still in their infancy. In this study, we recorded eye and head movements, along with simultaneous electroencephalography (EEG), during a 3D visual oddball task in a virtual environment. Healthy adults evaluated standards and deviants presented to their near or far peripheral field of view either by moving their eyes only (low head movement, HM) or by turning their heads (high HM). Compensatory eye movements, likely related to the vestibulo-ocular reflex, were found to accompany high HM, suggesting that algorithms based purely on the angular velocity of pupil movement may be inadequate for parsing fixations and saccades during head movement, since visual processing likely begins before compensatory eye movement attenuates. To assess the validity of a velocity-based parsing approach, we compared three approaches to computing fixation-related potentials (FRPs) under low and high HM: stimulus-locking, gaze-related fixation-locking, and simple gaze-locking. Under low HM conditions, both gaze-related fixation-locking and simple gaze-locking yielded classic oddball effects within the expected time window of the P300. Further, evidence of sensitivity to the relative frequency of standards versus deviants was detectable from about 300 ms before fixation onset. On high HM trials, gaze-related fixation-locking yielded a more robust P300 effect than simple gaze-locking. This outcome indicates that, despite uncertainty in fixation onset times during head turns, fixation-locking approaches that incorporate gaze information remain viable in paradigms involving head movement.
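To make the two analysis ingredients described above concrete, the sketch below shows (1) a velocity-threshold parser that labels fixations from gaze direction vectors and (2) a fixation-locked epoching step that averages EEG around each fixation onset to form an FRP. This is a minimal illustration, not the authors' pipeline: the thresholds, sampling rates, function names, and synthetic data are assumptions added here for clarity.

```python
import numpy as np

def parse_fixations(gaze_dirs, fs, vel_thresh_deg=30.0, min_fix_dur=0.1):
    """Velocity-threshold parsing of fixations from gaze direction vectors.

    Samples whose angular velocity stays below vel_thresh_deg (deg/s) are
    labeled as fixation; onset times (s) of sufficiently long runs are returned.
    Note: during head turns, compensatory (vestibulo-ocular) eye movements can
    keep gaze stable in the world while eye-in-head velocity is high, which is
    why a purely velocity-based parser can mislabel those samples.
    """
    # Angular velocity between successive unit gaze vectors, in deg/s.
    v = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    cosang = np.clip(np.sum(v[1:] * v[:-1], axis=1), -1.0, 1.0)
    ang_vel = np.degrees(np.arccos(cosang)) * fs
    is_fix = np.concatenate([[False], ang_vel < vel_thresh_deg])

    # Collect runs of fixation-labeled samples lasting at least min_fix_dur.
    onsets, start = [], None
    for i, f in enumerate(is_fix):
        if f and start is None:
            start = i
        elif not f and start is not None:
            if (i - start) / fs >= min_fix_dur:
                onsets.append(start / fs)
            start = None
    if start is not None and (len(is_fix) - start) / fs >= min_fix_dur:
        onsets.append(start / fs)
    return np.array(onsets)

def fixation_locked_erp(eeg, eeg_fs, fixation_onsets, tmin=-0.3, tmax=0.8):
    """Epoch continuous EEG (channels x samples) around each fixation onset,
    baseline-correct on the pre-fixation interval, and average into an FRP."""
    n_pre, n_post = int(-tmin * eeg_fs), int(tmax * eeg_fs)
    epochs = []
    for t in fixation_onsets:
        s = int(t * eeg_fs)
        if s - n_pre >= 0 and s + n_post <= eeg.shape[1]:
            epoch = eeg[:, s - n_pre:s + n_post]
            epoch = epoch - epoch[:, :n_pre].mean(axis=1, keepdims=True)
            epochs.append(epoch)
    return np.mean(epochs, axis=0) if epochs else None

# Toy usage with synthetic data: 600 Hz gaze, 500 Hz EEG, 64 channels, 10 s,
# and a single large gaze shift halfway through the recording.
rng = np.random.default_rng(0)
gaze = np.tile([0.0, 0.0, 1.0], (6000, 1))
gaze[3000:] = [0.3, 0.0, 1.0]
eeg = rng.normal(size=(64, 5000))
onsets = parse_fixations(gaze, fs=600)
frp = fixation_locked_erp(eeg, eeg_fs=500, fixation_onsets=onsets)
```

Under this toy setup the parser returns two fixation onsets (one at the start, one after the gaze shift), and only the second yields a full epoch, so the resulting FRP is a channels-by-samples average spanning roughly 300 ms before to 800 ms after fixation onset.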
