Abstract
It is common knowledge that animals such as bats and dolphins use echolocation to navigate the environment and/or to locate prey. It is less well known, however, that humans are capable of echolocation as well. Here we present behavioral and fMRI data from two blind individuals (aged 27 and 45 years) who produce mouth-clicks and use click-based echolocation to go about their everyday activities, which include walking through crowded streets in unfamiliar environments, mountain biking, and other spatially demanding activities. Behavioral testing under regular conditions (i.e., in which each person actively produced clicks) showed that both individuals could resolve the angular position of an object placed in front of them with high accuracy (∼2° of auditory angle at a distance of 1.5 meters). This extremely high level of performance is remarkable, but not unexpected, given what they are capable of doing in everyday life. To validate the stimuli we planned to use in fMRI conditions, we took in-ear audio recordings from each individual during active echolocation and played those recordings back through MRI-compatible earphones. In these conditions, both individuals were still able to use echolocation to determine with considerable accuracy the angular position, shape (concave vs. flat), motion (stationary vs. moving), and identity (car vs. tree vs. streetlight) of objects. Importantly, during the recordings, none of the objects emitted any sound; they simply offered a sound-reflecting surface. We conclude that echolocation, during both active production and passive listening, enables our two participants to perform tasks that are typically considered impossible without vision. To investigate the neural substrates of their echolocation abilities, we employed our passive listening paradigm in combination with fMRI (see Abstract ‘Human Echolocation II’).
This research was supported by a grant to MAG from the Canadian Institutes of Health Research.