September 2024, Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Integrating Impaired Vision and Hearing to Improve Spatial Localization
Author Affiliations & Notes
  • Yingzi Xiong
    Johns Hopkins University
    University of Minnesota
  • Quan Lei
    Wichita State University
  • Shirin Hassan
    Indiana University
  • Daniel Kersten
    University of Minnesota
  • Gordon Legge
    University of Minnesota
  • Footnotes
    Acknowledgements  NIH R00 EY030145
Journal of Vision September 2024, Vol.24, 675. doi:https://doi.org/10.1167/jov.24.10.675
      Yingzi Xiong, Quan Lei, Shirin Hassan, Daniel Kersten, Gordon Legge; Integrating Impaired Vision and Hearing to Improve Spatial Localization. Journal of Vision 2024;24(10):675. https://doi.org/10.1167/jov.24.10.675.
      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Introduction. Spatial localization, which is critical for safe mobility and social interactions, relies heavily on vision and hearing. When vision and/or hearing impairment occurs, integrating the two senses may maximize the use of the residual sensory input. However, such impairment is often associated with degraded sensory input and unstable sensory status, which may influence the integration process. Here we investigated the integration of vision and hearing in a spatial localization task in individuals with heterogeneous vision and hearing impairment.

Methods. Eighty-five participants completed a spatial localization task: 36 younger and 13 older controls with normal vision and hearing, 10 with hearing impairment only, 13 with vision impairment only, and 13 with dual vision and hearing impairment. Participants verbally reported the directions of visual (200 ms, 3 deg diameter, 90% contrast target), auditory (200 ms pink noise, 200-8000 Hz, 60 dB Hearing Level), or audiovisual targets (presented simultaneously from the same location) across 17 locations spanning 180 degrees in the horizontal plane. Spatial biases (offsets) and uncertainties (variability) were obtained for each location in each condition.

Results. Vision and hearing impairments were each associated with increased biases and uncertainties in unimodal localization, resulting in large variations across locations and individuals. To reconcile these variations, we identified individualized integration zones and segregation zones based on whether the audiovisual discrepancies supported a common cause inference. Across all locations, people with sensory impairment, especially those with dual sensory impairment, showed fewer integration zones than controls. However, the benefit of integration (reduced uncertainty in the bimodal condition) in the integration zones, or the lack thereof in the segregation zones, was consistent across all groups.

Conclusion. Impairments in vision and hearing reduce the likelihood of making a common cause inference while localizing a bimodal target. However, the advantage of integration persists when the criteria for a common cause are satisfied.
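The per-location analysis described above can be sketched in code. The following is a minimal illustration, not the authors' actual analysis pipeline: the function names, the threshold parameter k, and the specific common-cause test (discrepancy small relative to combined unimodal uncertainty) are assumptions; the predicted bimodal uncertainty follows the standard reliability-weighted (optimal) integration formula.

```python
import math
from statistics import mean, stdev

def localization_stats(responses, target):
    """Spatial bias (mean signed error, deg) and uncertainty (SD of
    responses, deg) for one location in one condition."""
    bias = mean(r - target for r in responses)
    uncertainty = stdev(responses)
    return bias, uncertainty

def is_integration_zone(bias_v, sigma_v, bias_a, sigma_a, k=2.0):
    """Heuristic common-cause test (an assumption, not the authors'
    criterion): the audiovisual discrepancy must be small relative to
    the combined unimodal uncertainty."""
    discrepancy = abs(bias_v - bias_a)
    return discrepancy < k * math.sqrt(sigma_v**2 + sigma_a**2)

def predicted_bimodal_sigma(sigma_v, sigma_a):
    """Reliability-weighted integration predicts a bimodal uncertainty
    below either unimodal uncertainty."""
    return math.sqrt((sigma_v**2 * sigma_a**2) / (sigma_v**2 + sigma_a**2))
```

For example, with unimodal uncertainties of 3 and 4 degrees, optimal integration predicts a bimodal uncertainty of 2.4 degrees, i.e. a measurable reduction relative to the better single sense, which is the integration benefit the abstract reports in the integration zones.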
