Vision Sciences Society Annual Meeting Abstract  |   September 2021
Distance Influences Affordance Perception of Standonability in Virtual Reality
Author Affiliations & Notes
  • Tyler Surber
    School of Psychology, University of Southern Mississippi, Hattiesburg, MS, USA
  • Alen Hajnal
    School of Psychology, University of Southern Mississippi, Hattiesburg, MS, USA
  • Krisztian Samu
    Department of Mechatronics, Optics and Engineering Informatics, Budapest University of Technology and Economics, Hungary
  • Tyler Overstreet
    School of Psychology, University of Southern Mississippi, Hattiesburg, MS, USA
  • Ashley Funkhouser
    School of Psychology, University of Southern Mississippi, Hattiesburg, MS, USA
  • Gabor Legradi
    College of Osteopathic Medicine, William Carey University, Hattiesburg, MS, USA
  • Footnotes
    Acknowledgements  The research reported in this paper has been supported by the National Research, Development and Innovation Fund (TUDFO/51757/2019-ITM, Thematic Excellence Program).
Journal of Vision September 2021, Vol.21, 1879. doi:https://doi.org/10.1167/jov.21.9.1879
      Tyler Surber, Alen Hajnal, Krisztian Samu, Tyler Overstreet, Ashley Funkhouser, Gabor Legradi; Distance Influences Affordance Perception of Standonability in Virtual Reality. Journal of Vision 2021;21(9):1879. https://doi.org/10.1167/jov.21.9.1879.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Hajnal, Rumble, Shelley-Tremblay, and Liu (2014) discovered that the presence of a flat surface at a steep angle in front of an observer during quiet stance stabilized posture and increased movement complexity. Was this effect due to slant or to distance from the eye? The present study tested both factors in virtual reality using an affordance task. A virtual sloped surface was presented frontally at different geographical slants (0°-90°) and at three egocentric distances (near, mid-range, and far). Participants judged whether the ramp supported standing (affordance judgment) and how steep the slope was (angle judgment). Head movement data and response times were recorded from the Oculus Rift VR headset. The perceived action boundary, computed using probit analysis, was around 30 degrees for far distances and was overestimated for near and mid distances (35 and 33 degrees, respectively). Response time for affordance judgments was longest at the action boundary. For angular judgments, response time was longest at 45 degrees, the arithmetic midpoint of the stimulus range. The mean magnitude and standard deviation of head movements remained constant across slant angles for the far and mid-distance ranges but were lowest around task-relevant transition points for the near distance. Because the responses were too brief for multifractal analysis, effort-to-compress (ETC; see Nagaraj & Balasubramanian, 2017) was computed as a measure of complexity. ETC was minimal at the transition points of the two corresponding tasks for near distances. Spatial proximity may have been a crucial performance factor, as the near range lies within action-relevant distance. The results show that movement complexity deteriorates around task-relevant transition points independently of the mean magnitude and variability of postural sway, potentially signaling task difficulty.
The results demonstrate the importance of movement parameters in specifying perceptual performance in affordance tasks.
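The effort-to-compress measure cited in the abstract can be sketched as follows. This is an illustrative implementation of ETC via non-sequential recursive pair substitution (NSRPS), as defined by Nagaraj and Balasubramanian, not the authors' own analysis code; the function names and the normalization by sequence length are assumptions for the sketch.

```python
from collections import Counter

def _pair_counts(s):
    """Count non-overlapping adjacent pairs, scanning left to right."""
    counts = Counter()
    i = 0
    while i < len(s) - 1:
        counts[(s[i], s[i + 1])] += 1
        # In a run like (a, a, a), skip one element so occurrences don't overlap.
        if s[i] == s[i + 1] and i + 2 < len(s) and s[i + 2] == s[i]:
            i += 2
        else:
            i += 1
    return counts

def _replace(s, pair, new_sym):
    """Substitute non-overlapping occurrences of `pair` with a fresh symbol."""
    out = []
    i = 0
    while i < len(s):
        if i < len(s) - 1 and (s[i], s[i + 1]) == pair:
            out.append(new_sym)
            i += 2
        else:
            out.append(s[i])
            i += 1
    return out

def effort_to_compress(sequence):
    """Normalized ETC: pair-substitution steps needed to reach a
    constant (zero-entropy) sequence, divided by (len - 1)."""
    # Map arbitrary hashable symbols to integers so fresh symbols are easy to mint.
    index = {v: i for i, v in enumerate(dict.fromkeys(sequence))}
    s = [index[v] for v in sequence]
    n = len(s)
    if n < 2:
        return 0.0
    next_sym = len(index)
    steps = 0
    while len(s) > 1 and len(set(s)) > 1:
        # Replace the most frequent adjacent pair with a new symbol.
        pair = max(_pair_counts(s).items(), key=lambda kv: kv[1])[0]
        s = _replace(s, pair, next_sym)
        next_sym += 1
        steps += 1
    return steps / (n - 1)
```

A constant sequence compresses in zero steps (ETC = 0), a periodic one in few, and an irregular one in many, which is why ETC can serve as a complexity index for short time series where multifractal estimates are unreliable.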
