Warrick Roseboom, Shin'ya Nishida, Derek H. Arnold; The sliding window of audio–visual simultaneity. Journal of Vision 2009;9(12):4. doi: https://doi.org/10.1167/9.12.4.
© ARVO (1962-2015); The Authors (2016-present)
Humans exist in an environment wherein many unrelated events occur in close spatial and temporal proximity. Audio–visual timing experiments, however, have often examined only isolated pairs of sensory events. We therefore assessed how audio–visual timing perception is shaped by the presence of an additional auditory or visual event. We found that the point of subjective synchrony for a sensory event can be shifted away from other temporally proximate events. These interactions made audio–visual pairs seem unrelated, or asynchronous, at timings at which they had seemed synchronous when presented in isolation. This shows that the interval across which humans are insensitive to audio–visual asynchrony is not fixed but dynamic, shaped by interactions between multiple sensory events. Importantly, we establish that these interactions can enhance the sensitivity of timing judgments. They could therefore help to segregate unrelated sensory events across time. Such effects are likely to be common in the cluttered environments in which humans exist.