September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | August 2017
Touch dominates vision in a shape processing task – a virtual-reality study.
Author Affiliations
  • Hyeokmook Kang
    Cognitive Systems Lab, Korea University
  • Christian Wallraven
    Cognitive Systems Lab, Korea University
Journal of Vision August 2017, Vol. 17, 595. https://doi.org/10.1167/17.10.595
Hyeokmook Kang, Christian Wallraven; Touch dominates vision in a shape processing task – a virtual-reality study. Journal of Vision 2017;17(10):595. https://doi.org/10.1167/17.10.595.
© ARVO (1962-2015); The Authors (2016-present)

Abstract

Although humans are experts at visual shape processing, several recent studies have demonstrated that haptics can also create highly detailed shape representations. Here, we investigate how vision and touch are processed when the two are placed in a potentially noticeable conflict in a shape-similarity judgment task. To have full control over the parameters of the conflict, we use a novel, calibrated virtual-reality setup in which observers see their hand exploring an object while simultaneously touching a physical instantiation of it in the real world. N=18 participants took part in the experiment; in each trial, they explored two objects in succession (one baseline and one test object) and indicated whether the two were the same or different. Stimuli were taken from a parametrized morph-space of novel, three-dimensional objects varying in perceptually equidistant steps. We used two randomly interleaved staircases to identify the morph-parameter difference at which an object would be perceived as the "same" as one of two baseline objects. Importantly, the staircases were run in two conditions: a congruent condition, in which the visually and haptically explored objects were the same, and an incongruent condition, in which the haptic information for the test object was much closer to the baseline object than the visual information. Paired t-tests on the final morph-parameter differences in the congruent and incongruent conditions showed that participants perceived the test object in the incongruent condition to be much closer to the baseline than in the congruent condition (t(17) = 6.72, p < .001), indicating that the haptic input influenced the overall judgment. Surprisingly, 15 out of 18 participants even showed "haptic capture" in this conflict condition, choosing to largely ignore the visual information (which all participants judged to be highly realistic in the virtual-reality display). Our results show that, even for shape processing, haptic information can override visual input in a supra-threshold conflict task.

Meeting abstract presented at VSS 2017
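
As an illustration of the staircase procedure described in the abstract, the Python sketch below simulates two randomly interleaved 1-up/1-down staircases that adjust a morph-parameter difference until a simulated observer judges the test object as the "same" as the baseline. The observer model, step size, starting value, and trial count are assumptions made for this example only; this is not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): two randomly interleaved
# 1-up/1-down staircases adjusting a morph-parameter difference until a
# simulated observer responds "same". All parameters below are hypothetical.
import random

def simulated_same_response(morph_diff, threshold=0.3, noise=0.05):
    """Hypothetical observer: responds 'same' when the noisy perceived
    difference falls below an internal threshold."""
    perceived = morph_diff + random.gauss(0.0, noise)
    return perceived < threshold

def run_interleaved_staircases(n_trials=60, start_diff=1.0, step=0.05):
    # One staircase per baseline object, each tracking its own morph difference.
    diffs = {"baseline_A": start_diff, "baseline_B": start_diff}
    history = {name: [] for name in diffs}
    for _ in range(n_trials):
        name = random.choice(list(diffs))            # random interleaving
        response_same = simulated_same_response(diffs[name])
        history[name].append((diffs[name], response_same))
        if response_same:
            diffs[name] += step                      # make the pair more different
        else:
            diffs[name] = max(0.0, diffs[name] - step)  # make the pair more similar
    return diffs, history

if __name__ == "__main__":
    final_diffs, _ = run_interleaved_staircases()
    for name, diff in final_diffs.items():
        print(f"{name}: final morph-parameter difference ~ {diff:.2f}")
```

Interleaving the two staircases at random, as in the sketch, keeps the observer from tracking a single predictable adaptive sequence; the final morph-parameter differences estimate the point at which test and baseline are perceived as the same.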
