October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract
Multisensory effects on causal perception
Author Affiliations
  • Kayla Soma Tsutsuse
    University of Hawaii at Manoa
  • Jonas Vibell
    University of Hawaii at Manoa
  • Scott Sinnett
    University of Hawaii at Manoa
Journal of Vision October 2020, Vol.20, 1759. doi:https://doi.org/10.1167/jov.20.11.1759
Abstract

Previous research has shown that visual perception is influenced by Newtonian constraints. For instance, Kominsky et al. (2017) showed that humans detect unnatural collision events, in which an object violates Newtonian motion constraints by moving faster after colliding with another object, more quickly than collisions that do not violate those constraints. Their results provide evidence that the perceptual system distinguishes between realistic and unrealistic causal events. However, collisions between two objects are rarely silent in the real world. The present study extends this research by adding a sound at the point of collision between two objects to evaluate how multisensory integration influences the perception of collision events that either follow or violate Newtonian constraints. To accomplish this, participants viewed an array of three simultaneous videos, each depicting two moving objects. Two of the videos showed discs that moved at the same speed in a horizontal back-and-forth motion, while the oddball video showed discs that moved either faster before the collision and slower after it (natural) or slower before the collision and faster after it (unnatural), the latter violating Newtonian motion constraints. Participants were asked to indicate the oddball video via keypress. Results demonstrate that participants were faster and more accurate at identifying natural oddball events when the collision included a sound than when it was silent. However, a similar advantage was found for unnatural events when there was no sound at the collision point compared to when a sound was present. These findings suggest that adding a sound to the unnatural events led perceivers to view them as more realistic, even though they continued to violate the constraints of the physical world. Furthermore, these results provide evidence of the complexity of the interactions that influence the human visual perceptual system and its ability to perceive causal events.
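
The abstract does not include stimulus code, so the following is only an illustrative sketch, in Python, of the speed manipulation it describes: two standard videos in which disc speed does not change at the collision, plus one oddball whose speed either drops after contact (natural) or rises after contact (unnatural), presented with or without a collision sound. The names (CollisionEvent, make_trial), speed values, and units below are hypothetical and are not the authors' materials.

# Hypothetical sketch of the three-video oddball trial structure described above.
from dataclasses import dataclass

@dataclass
class CollisionEvent:
    pre_speed: float    # disc speed before the collision (arbitrary units, hypothetical)
    post_speed: float   # disc speed after the collision
    sound_at_contact: bool

    @property
    def is_natural(self) -> bool:
        # Natural events do not speed up after the collision; unnatural events do,
        # violating the Newtonian constraint described in the abstract.
        return self.post_speed <= self.pre_speed

def make_trial(oddball_natural: bool, with_sound: bool) -> list[CollisionEvent]:
    """Build one trial: two standard videos (no speed change at collision)
    plus one oddball video (speed change at collision)."""
    standard = CollisionEvent(pre_speed=4.0, post_speed=4.0, sound_at_contact=with_sound)
    if oddball_natural:
        # Faster before the collision, slower after it.
        oddball = CollisionEvent(pre_speed=6.0, post_speed=2.0, sound_at_contact=with_sound)
    else:
        # Slower before the collision, faster after it (unnatural).
        oddball = CollisionEvent(pre_speed=2.0, post_speed=6.0, sound_at_contact=with_sound)
    return [standard, standard, oddball]

if __name__ == "__main__":
    # Enumerate the four conditions: natural/unnatural oddball x sound/no sound.
    for natural in (True, False):
        for sound in (True, False):
            odd = make_trial(oddball_natural=natural, with_sound=sound)[-1]
            print(f"oddball natural={odd.is_natural}, sound={sound}, "
                  f"speeds {odd.pre_speed} -> {odd.post_speed}")
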
