David R. Wozny, Ladan Shams; Integration and segregation of visual-tactile-auditory information is Bayes-optimal. Journal of Vision 2006;6(6):176. doi: https://doi.org/10.1167/6.6.176.
At any instant, the human brain is bombarded with sensory information originating from disparate sources. While acquiring multiple sensations tremendously enriches our perception of the environment, it also confronts the nervous system with the critical problem of estimating which sensory signals were caused by the same source and should be integrated, and which were caused by different sources and should be segregated. We investigated how simultaneous tactile, visual, and auditory signals are combined by the human nervous system, examining when and how they are integrated and comparing human observers' judgments with those of a Bayesian ideal observer.

Methods: In each trial, participants were presented with a variable number of flashes, taps, and beeps and were asked to report the number of each. Unimodal, bimodal, and trimodal (visual, auditory, and tactile) trials were interleaved.

Results: As expected, we reproduced both the sound-induced flash illusion (Shams et al., 2000) and the touch-induced flash illusion (Violentyev et al., 2005) previously reported. Smaller tactile-auditory illusory effects were also found, and three-way interactions among the three modalities emerged when the signals across modalities were inconsistent. Importantly, observers' performance in all conditions was highly consistent with that of the Bayesian ideal observer.

Conclusion: These results show that human auditory-visual-tactile perception is Bayes-optimal, and that this optimal and general strategy may be the underlying principle for both integration and segregation of sensory signals across modalities.
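To make the integrate-versus-segregate computation concrete, here is a minimal sketch of the kind of causal inference a Bayesian ideal observer could perform over two noisy signals. It follows the standard causal-inference formulation (a common-cause versus independent-causes comparison with Gaussian likelihoods and a Gaussian prior); the specific parameterization, the zero-mean prior, and all numerical values are illustrative assumptions, not the model or parameters fitted in this study.

```python
import math

def posterior_common_cause(x_v, x_a, sigma_v, sigma_a, sigma_p, p_common):
    """Posterior probability that two noisy sensory measurements (e.g. a
    visual and an auditory estimate of event count or location) arose from
    a single common source. Assumes Gaussian sensory noise (sigma_v,
    sigma_a), a zero-mean Gaussian prior over the source (sigma_p), and a
    prior probability p_common of a common cause. Illustrative sketch only.
    """
    # Likelihood of the measurement pair under a common cause:
    # integrate out the single shared source value s.
    var_c = (sigma_v**2 * sigma_a**2
             + sigma_v**2 * sigma_p**2
             + sigma_a**2 * sigma_p**2)
    like_common = math.exp(
        -((x_v - x_a)**2 * sigma_p**2
          + x_v**2 * sigma_a**2
          + x_a**2 * sigma_v**2) / (2.0 * var_c)
    ) / (2.0 * math.pi * math.sqrt(var_c))

    # Likelihood under independent causes: each measurement has its own
    # source, so the joint likelihood factorizes.
    var_v = sigma_v**2 + sigma_p**2
    var_a = sigma_a**2 + sigma_p**2
    like_indep = (math.exp(-x_v**2 / (2.0 * var_v)) / math.sqrt(2.0 * math.pi * var_v)
                  * math.exp(-x_a**2 / (2.0 * var_a)) / math.sqrt(2.0 * math.pi * var_a))

    # Bayes' rule over the binary cause variable: integrate when this
    # posterior is high, segregate when it is low.
    return (like_common * p_common
            / (like_common * p_common + like_indep * (1.0 - p_common)))
```

With consistent signals the posterior favors a common cause (integration), while a large discrepancy, as in the flash-beep illusion conditions, drives it toward independent causes (segregation); the graded posterior captures the partial, illusion-prone interactions the abstract describes.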