Volume 16, Issue 12
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2016
Object localisation using visual to tactile and visual to auditory sensory substitution
Author Affiliations
  • Dustin Venini
    School of Psychology, The University of Queensland
  • Ernst Ditges
    School of Psychology, The University of Queensland
  • Nicholas Sibbald
    School of Psychology, The University of Queensland
  • Hayley Jach
    School of Psychology, The University of Queensland
  • Stefanie Becker
    School of Psychology, The University of Queensland
Journal of Vision September 2016, Vol.16, 1198. doi:https://doi.org/10.1167/16.12.1198
      © ARVO (1962-2015); The Authors (2016-present)

Abstract

With over 285 million visually impaired people worldwide, there is growing interest in sensory substitution -- a non-invasive technology that replaces information from one sensory modality (e.g., vision) with information delivered through another (e.g., touch). Previous work has focused primarily on how blind or vision-impaired people discriminate between different types of objects using sensory substitution devices (SSDs). Only a fraction of this work has explored whether, and to what extent, SSDs support precise localisation of objects in space; these studies report target location errors of around 8-14 cm. Here we investigated the object localisation ability of visually impaired participants using a visual to auditory SSD (the vOICe) and a custom-built visual to tactile SSD. In three separate conditions, participants had to point to a white disk presented against a black background on a touchscreen. In the first task, the SSD conveyed information only about the location of the disk; in the second task, the participant's hand was displayed in addition to the disk; and in the third task, a white reference border marking the monitor frame was also added to the display. Participants were slightly more accurate overall than in previous studies (< 6 cm error); however, localisation accuracy did not differ significantly across the three conditions. Participants' responses were slower in the "hand" and "reference" conditions, suggesting that the additional information acted like a distractor and made the task more difficult. This result indicates that the processing of otherwise visual information via the auditory and tactile modalities is severely limited, especially when multiple objects are presented in parallel, and that filtering the display down to relevant information will be critical to enhancing the performance of future SSDs.

Meeting abstract presented at VSS 2016
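As background to the method, the short Python sketch below illustrates the column-scan encoding commonly used by visual to auditory SSDs such as the vOICe: the image is swept from left to right, vertical image position maps to pitch, and pixel brightness maps to loudness. It is a minimal illustration only; the function name, frequency range, and timing parameters are assumptions made for this sketch and do not describe the specific device settings used in the study.

# Illustrative sketch of a vOICe-style visual-to-auditory encoding
# (not the study's implementation): the image is scanned column by
# column from left to right, vertical position maps to pitch, and
# pixel brightness maps to loudness.
import numpy as np


def image_to_soundscape(image, duration=1.0, sample_rate=22050,
                        f_min=500.0, f_max=5000.0):
    """Convert a 2-D grayscale image (values 0-1, row 0 = top) into a
    mono audio signal. Each column becomes a short time slice; each row
    contributes a sine tone whose frequency rises towards the top of
    the image and whose amplitude follows pixel brightness."""
    n_rows, n_cols = image.shape
    samples_per_col = int(duration * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    # Top rows get the highest frequencies (log-spaced); the exact
    # mapping is an assumption and varies between devices.
    freqs = np.geomspace(f_max, f_min, n_rows)
    slices = []
    for col in range(n_cols):
        brightness = image[:, col]                      # (n_rows,)
        tones = np.sin(2 * np.pi * freqs[:, None] * t)  # (n_rows, samples)
        slices.append((brightness[:, None] * tones).sum(axis=0))
    signal = np.concatenate(slices)
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal


# Example: a single bright disk on a black background, like the
# localisation target described in the abstract.
img = np.zeros((64, 64))
yy, xx = np.ogrid[:64, :64]
img[(yy - 20) ** 2 + (xx - 45) ** 2 <= 36] = 1.0   # disk of radius 6
audio = image_to_soundscape(img)                    # ~1 s of mono audio

Under this kind of encoding, a single disk reduces to a brief tone burst: its onset time within the sweep signals horizontal position and its pitch signals vertical position, which is the information participants had to translate into a pointing response.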
