August 2014
Volume 14, Issue 10
Vision Sciences Society Annual Meeting Abstract
Navigation patterns and spatial perception with and without vision using assistive technology for the blind
Author Affiliations
  • Shachar Maidenbaum
    Hebrew University, Department of Medical Neurobiology
  • Daniel-Robert Chebat
    Hebrew University, Department of Medical Neurobiology
  • Shelly Levy-Tzedek
    Hebrew University, Department of Medical Neurobiology
  • Amir Amedi
    Hebrew University, Department of Medical Neurobiology
Journal of Vision August 2014, Vol.14, 1355. doi:

      Shachar Maidenbaum, Daniel-Robert Chebat, Shelly Levy-Tzedek, Amir Amedi; Navigation patterns and spatial perception with and without vision using assistive technology for the blind. Journal of Vision 2014;14(10):1355.

How does a lack of vision affect the route one takes through an environment? How does this route change when different assistive tools are used? These questions have significant repercussions, as orientation and mobility in unknown places pose one of the main challenges facing the blind. Dedicated programs currently exist for teaching the blind to navigate with the traditional white cane. These programs have been refined over the years and shown to significantly improve mobility, and from this research the characteristic navigation patterns of white-cane users have emerged. Over the past decades, many new devices have been developed for the blind. These devices offer different, and often more, information than the traditional white cane, and may require different navigation patterns and training for optimal use. Additionally, it is unclear how useful some of this added information, such as increased sensing distance, actually is for non-visual navigation. Here, we use a series of virtual environments to explore how navigation differs when using the virtual-EyeCane electronic travel aid (which offers increased distance information), when using a virtual version of the traditional white cane, when using no device at all, and when navigating visually. We show that the characteristics of navigating with the virtual-EyeCane differ from those of white-cane users and of users without an assistive device, and that virtual-EyeCane users complete more levels successfully, take shorter paths, and collide with obstacles less often than users of the white cane or of no device. Finally, we demonstrate that virtual navigation with the virtual-EyeCane follows patterns relatively similar to those of visual navigation.
In conclusion, these results suggest that the navigation patterns learned for the white cane are not necessarily optimal for other devices, and that additional distance information alone is enough to shift spatial perception and navigation patterns away from those customarily used by the blind toward patterns more similar to those of the sighted.

Meeting abstract presented at VSS 2014

