Abstract
Humans have the remarkable ability to integrate information from different senses in a way that greatly facilitates the detection, localization, and identification of events in the environment. Although much has been learned about how the senses work together to optimally perceive the world around us, knowledge about the impact of auditory impairments on multisensory integration (MSI) is currently scarce. Given the ever-growing prevalence of hearing loss in our society, the goal of the current study was to examine how (simulated) asymmetric conductive hearing loss (AHL) influences unisensory and multisensory spatial localization. We examined the spatial and temporal properties of saccadic orienting behaviour towards auditory, visual, and audiovisual stimuli in normal-hearing participants, both without and with a unilateral earplug. AHL was hypothesized to disrupt sound localization and to cause a spatial conflict between hearing and vision. We expected that this conflict would distort MSI or reduce its benefit. With normal hearing, MSI improved participants’ localization as expected. AHL resulted in large localization errors on auditory trials, but not on audiovisual trials. By comparing multisensory behaviour with the predictions of maximum likelihood estimation, we found that the relative weight on vision greatly increased with AHL. This suggests that participants rapidly adapt their sensory integration strategy after the onset of AHL. Although spatial accuracy can thus be preserved by overweighting visual information, this accuracy comes at the cost of localization speed. Our results indicate that observers dynamically reweight the different senses to deal effectively with an impairment of one of them, in a way that is not fully explained by the statistical model of optimal cue integration. We propose that models of sensory integration should also account for the influence of higher-level cognitive factors on sensory weighting.
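For context, the maximum-likelihood-estimation account of optimal cue integration referred to above is conventionally formalized as a reliability-weighted average of the unisensory estimates. The sketch below uses generic notation and is the standard textbook formulation, not necessarily the exact model fitted in this study:
\[
\hat{S}_{AV} \;=\; w_A\,\hat{S}_A + w_V\,\hat{S}_V,
\qquad
w_V \;=\; \frac{\sigma_A^{2}}{\sigma_A^{2} + \sigma_V^{2}},
\qquad
w_A \;=\; 1 - w_V,
\]
\[
\sigma_{AV}^{2} \;=\; \frac{\sigma_A^{2}\,\sigma_V^{2}}{\sigma_A^{2} + \sigma_V^{2}} \;\leq\; \min\!\left(\sigma_A^{2},\,\sigma_V^{2}\right),
\]
where \(\hat{S}_A\) and \(\hat{S}_V\) are the auditory and visual location estimates with variances \(\sigma_A^{2}\) and \(\sigma_V^{2}\). Under AHL, \(\sigma_A^{2}\) increases, so the visual weight \(w_V\) grows accordingly, which is the qualitative pattern reported above.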