Abstract
Changes in the visual system due to deafness provide information about how multisensory processes feed back to scaffold the development of unisensory systems. One common perspective in the literature is that visual inputs are highly spatial, whereas auditory inputs, in contrast, are highly temporal. A simple multisensory account of sensory reorganization therefore predicts spatial enhancements and temporal deficits within the visual system of deaf individuals. Here I summarize our past and ongoing research, which suggests that evidence for this multisensory scaffolding hypothesis is confounded by language deprivation in many samples: most deaf people are born to nonsigning parents, and deaf children do not have full access to the spoken language around them. By studying visual processing in deaf individuals who are exposed early to a perceivable visual language, such as American Sign Language, we (i) gain a better understanding of the interplay between auditory and visual systems during development, and (ii) accumulate evidence for the importance of early social interaction in the development of higher-order visual abilities. Our data suggest that changes in vision over space are ecologically driven and subject to cognitive control, and that early linguistic interaction is important for the development of sustained attention over time.