Abstract
Background: Echolocation is the ability to perceive the environment by making sonar emissions and listening to the returning echoes. In people, it has been suggested that echolocation may draw not only on auditory but also 'visual' processing (Arnott, Thaler, Milne, Kish, & Goodale, 2013; Thaler, Arnott, & Goodale, 2011; Thaler, Wilson, & Gee, 2014). Here we used an interference paradigm to further explore the interaction between vision and echolocation. Method: Blindfolded, sighted, echo-naive participants used mouth-click-based echolocation to discriminate the sizes of objects. Participants wore black-out goggles fitted with LEDs. The goggles blacked out vision of the environment at all times, but when the LEDs were switched on they also provided task-irrelevant visual input. In a control condition, blindfolded sighted and blind participants performed the same echolocation task whilst wearing electrode patches, which when switched on provided task-irrelevant tactile stimulation (i.e. Transcutaneous Electrical Nerve Stimulation). Results: Participants' echolocation accuracy was significantly reduced in conditions where luminance input was provided, as compared to conditions where it was not. This drop in performance was not observed in the control condition using task-irrelevant tactile stimulation. Discussion: The results suggest that visual, but not tactile, sensory input 'interferes' with echolocation, implying that vision and echolocation may interact at an early sensory stage in the human brain.
Meeting abstract presented at VSS 2016