September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Online vs. In-Person Investigations of Multisensory Motion Perception
Author Affiliations
  • Kayla Soma Tsutsuse
    University of Hawaii at Manoa
  • Jonas Vibell
    University of Hawaii at Manoa
  • Scott Sinnett
    University of Hawaii at Manoa
Journal of Vision, September 2021, Vol. 21, 2486. https://doi.org/10.1167/jov.21.9.2486

      Kayla Soma Tsutsuse, Jonas Vibell, Scott Sinnett; Online vs. In-Person Investigations of Multisensory Motion Perception. Journal of Vision 2021;21(9):2486. https://doi.org/10.1167/jov.21.9.2486.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Introduction: The global pandemic has required researchers to rely on virtual interactions and online platforms. Previous work has compared performance between in-person and web-based assessments (questionnaires; see Riva, Teruzzi, & Anolli, 2003), but the reliability of conducting psychophysical studies via Zoom has yet to be established. The present study investigated the reliability and validity of experimental results collected over Zoom.

Methods: An audiovisual motion perception paradigm, previously used in-person, was deployed via Zoom. Participants selected an oddball video, in which two moving discs varied in speed, from an array that also contained two videos with constant motion. In each array, one of the videos included a sound at the collision point of the two discs (no sound was presented in the control condition). Previous in-person findings showed that participants reliably detected the oddball video (Kominsky et al., 2017) and that performance was modulated by the sound conditions (Soma Tsutsuse, Vibell, & Sinnett, 2020). This paradigm allows for a direct comparison of reaction time, accuracy, and lag across platforms using time-sensitive videos synced with auditory stimuli.

Results: In-person results showed that participants were significantly more accurate in conditions without sound than in conditions with sound, and exhibited faster reaction times for unnatural targets without sound. There was no significant interaction between motion type and sound. Similar results were observed in the online condition. Importantly, when performance was compared directly between the in-person and online conditions, no significant effect of testing modality was observed; accuracy rates were similar across modalities. Lastly, no differences were found in video presentation times, suggesting that the online platform introduced no additional lag relative to in-person testing.

Conclusion: The absence of lag and the similarity of results across platforms suggest that experimental data can be collected reliably and accurately over Zoom with this paradigm. Furthermore, this virtual replication is a promising indication that online platforms can preserve experimental integrity.
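
For readers interested in how presentation lag of the kind discussed above might be quantified, the following is a minimal sketch of logging intended versus measured video onset times per trial and comparing mean lag between an in-person and an online session. It is not the authors' implementation; all function names, trial counts, and durations are illustrative placeholders.

```python
# Illustrative sketch only: compare per-trial presentation lag between two
# testing modalities (in-person vs. online). Not the authors' actual code.
import statistics
import time

def present_video(duration_s: float) -> float:
    """Stand-in for playing one video; returns the measured playback onset time."""
    onset = time.perf_counter()          # timestamp when playback actually begins
    time.sleep(duration_s)               # placeholder for real video playback
    return onset

def run_trials(n_trials: int, video_duration_s: float = 0.5) -> list[float]:
    """Return the lag (measured minus intended onset) for each trial, in ms."""
    lags_ms = []
    for _ in range(n_trials):
        intended = time.perf_counter()   # time the trial logic requested playback
        measured = present_video(video_duration_s)
        lags_ms.append((measured - intended) * 1000.0)
    return lags_ms

if __name__ == "__main__":
    in_person_lags = run_trials(n_trials=20)   # e.g., local lab session
    online_lags = run_trials(n_trials=20)      # e.g., session run over Zoom
    print(f"In-person mean lag: {statistics.mean(in_person_lags):.2f} ms")
    print(f"Online mean lag:    {statistics.mean(online_lags):.2f} ms")
```

In a real study, the per-trial lags from each modality would be submitted to a statistical comparison (as the abstract reports, no lag differences were found between platforms).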
