Abstract
To what extent is visual detection performance disrupted by auditory attention capture? More specifically, does visual detection performance suffer when an auditory stimulus induces people to direct attention inward to the meaning of that stimulus rather than outward toward the visual display? Studies of visual attention often focus on how individuals allocate their attention to various perceptual sources in the environment. Yet internal processing of information, such as manipulating information in working memory or comprehending speech, is also attentionally demanding. If attention is conceptualized as a general pool of resources that gives processing priority to some sources over others, then internal information processing and visual attention must both draw from this pool. The ability of speech, specifically incongruous or highly meaningful speech, to impact visual attention was investigated in a series of experiments. Experiments 1 and 2 examined how hearing a semantically inconsistent word in a stream of non-sentential, category-consistent words could affect visual detection performance. Experiments 3 and 4 assessed the impact of hearing taboo words on visual detection performance. Experiments 1 and 2 revealed no disruption to visual attention as a result of hearing semantically incongruent information, and in fact trended toward a slight performance boost in the visual task when category-inconsistent words were heard. In Experiments 3 and 4, auditory presentation of taboo words consistently disrupted visual detection performance, delaying responses to visual targets presented immediately after taboo words. Taken together, the current data suggest that when attention is split between processing auditory information and performing a visual task, surprising or highly meaningful auditory information can impact visual attention, and that these effects vary as a function of the nature of that information.
Meeting abstract presented at VSS 2015