Journal of Vision
September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2021
How blurry are echoes? Quantifying the spatial resolution of echoic vs. visual object perception
Author Affiliations & Notes
  • Santani Teng
    Smith-Kettlewell Eye Research Institute
  • Michael Ezeana
    University of Central Arkansas
  • Nickolas Paternoster
    University of Central Arkansas
  • Amrita Puri
    University of Central Arkansas
  • Footnotes
    Acknowledgements: This work was supported by an NIH T32 Training Grant (to ST); the Foundation for Ophthalmology Research and Education – International (to ST); a UCA AURS summer research award (to ME, AP); and a UCA Faculty Research Grant (to AP).
Journal of Vision September 2021, Vol. 21, 2352. doi: https://doi.org/10.1167/jov.21.9.2352
Citation: Santani Teng, Michael Ezeana, Nickolas Paternoster, Amrita Puri; How blurry are echoes? Quantifying the spatial resolution of echoic vs. visual object perception. Journal of Vision 2021;21(9):2352. https://doi.org/10.1167/jov.21.9.2352.
Abstract

Recent research has explored the use of active echolocation by blind individuals, who, by generating mouth-clicks, elicit echoes and use them to perceive and interact with their surroundings. In prior work we showed that expert practitioners can distinguish the positions of objects separated by as little as ~1.5°, the approximate threshold of visual letter recognition at 35° retinal eccentricity. They can also echolocate household-sized objects, then distinguish them haptically from a distractor with significantly above-chance accuracy (~60%). Here we investigated whether the spatial resolution of crossmodal echo-haptic object discrimination is similar to that measured for localization. We found that blindfolded sighted participants tested on the same crossmodal match-to-sample design performed similarly, but with greater inter-individual variability. Performance was similar for both common household objects and novel (Lego) objects of arbitrary shape. This suggests that some coarse object information a) is available to both expert blind and novice sighted echolocators, b) transfers from auditory to haptic modalities, c) is not dependent on prior object familiarity, and d) may require a larger angular size than was subtended by our test objects. Thus, we repeated the match-to-sample experiments using stimuli enlarged by 50% along each dimension. Preliminary results do not show improved performance with larger object size; feedback after each trial in future sessions may improve accuracy. Next, we aimed to directly estimate the equivalent visual resolution of echoic object perception. In a pilot experiment, sighted participants examined target objects visually at 35° eccentricity and, subsequently, identified the target haptically. Performance was ~85%, suggesting that haptic recognition is better informed by visual object information at 35° than by object echoes at the scales we tested. Manipulating visual blur to equate visual and echoic performance will reveal more precisely the spatial resolution of echo-based object perception.
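For context on the enlargement manipulation, consider how linear object size translates into subtended visual angle (the sizes and viewing distance below are hypothetical illustrations, not the actual stimulus dimensions): an object of linear size s viewed from distance d subtends a visual angle

\[
\theta = 2\arctan\!\left(\frac{s}{2d}\right).
\]

For example, a hypothetical 20 cm object at 1 m subtends about 11.4°; enlarging each dimension by 50% (to 30 cm) raises this to about 17.1°. Because the arctangent is slightly compressive at these scales, a 50% linear enlargement yields just under a 50% increase in angular size.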
