Vision Sciences Society Annual Meeting Abstract  |   August 2004
Making virtual reality “more real” and the perception of potential collisions
Author Affiliations
  • Russell L. Woods
    Schepens Eye Research Institute and Harvard Medical School, Boston, USA
  • Aaron J. Mandel
    Schepens Eye Research Institute, USA
  • James Barabas
    Schepens Eye Research Institute, USA
  • Robert B. Goldstein
    Schepens Eye Research Institute and Harvard Medical School, Boston, USA
  • Eli Peli
    Schepens Eye Research Institute and Harvard Medical School, Boston, USA
Journal of Vision August 2004, Vol.4, 814. doi:10.1167/4.8.814
Abstract

Making the experience in virtual reality closer to the “real world” experience (e.g. actually walking, rather than standing or sitting, in a walking simulator) may affect task performance. An improved experience of “presence” might make performance in virtual reality similar to real-world performance, whereas poor presence or an incorrect rendition might impair performance. We measured perception of a potential collision with stationary obstacles in four experimental situations to compare: standing versus walking; walking with versus without subject speed control; and correct versus incorrect viewpoint. Subjects stood or walked on a treadmill 75 cm in front of a 95-degree-wide screen that displayed a “shopping mall” corridor with a textured floor and shop fronts. Adult-man-size obstacles appeared for 1 second and the subject indicated whether they would collide with the obstacle if they continued on the same path. Data for 14 subjects were analyzed using a signal detection approach that yields each subject's perceived size and decision quality. When standing, subjects had a slightly smaller perceived size (p=0.02) and made slightly better decisions (p=0.03) than when walking. The incorrect viewpoint reduced the quality of decisions, but only slightly (p=0.01). Walking with and without subject speed control produced equivalent performance. Thus, walking in our walking simulator did not improve task performance. Our with-subject-speed-control walking condition required that the subject self-propel the treadmill (i.e. it was not motorized), which might reduce task performance compared to a feedback-controlled motorized system. An incorrect viewpoint (rendition) did reduce task performance, though performance was surprisingly good. Other issues that might affect the experience of presence, including head-tracking and binocular view (stereo cue of flat screen), are under investigation.
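The abstract does not give the details of its signal-detection analysis, so the following is only a sketch of the standard yes/no signal-detection computation on which such an analysis is typically built: sensitivity (d′, here a proxy for decision quality) and response criterion (c, which in a collision-judgment task relates to bias such as a perceived obstacle size) estimated from hit and false-alarm counts. The function name and the log-linear correction are illustrative assumptions, not the authors' actual procedure.

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Estimate sensitivity (d') and criterion (c) from a 2x2 yes/no
    confusion table. Illustrative sketch, not the authors' analysis."""
    # Log-linear correction (add 0.5 per cell) avoids infinite
    # z-scores when a rate would otherwise be exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Example: 50 "collision" trials (40 hits) and 50 "miss" trials
# (10 false alarms) gives moderate sensitivity and no bias.
d, c = dprime_criterion(hits=40, misses=10,
                        false_alarms=10, correct_rejections=40)
```

A higher d′ corresponds to better collision/no-collision discrimination; a nonzero c would indicate a systematic tendency to over- or under-report collisions, which is one way a biased perceived size could manifest.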

Woods, R. L., Mandel, A. J., Barabas, J., Goldstein, R. B., Peli, E. (2004). Making virtual reality “more real” and the perception of potential collisions [Abstract]. Journal of Vision, 4(8): 814, 814a, http://journalofvision.org/4/8/814/, doi:10.1167/4.8.814. [CrossRef]
Footnotes
 Supported in part by NIH grant EY12890.