July 2019
Volume 19, Issue 8
Open Access
OSA Fall Vision Meeting Abstract
Statistical characterization of heading stimuli in natural environments using SLAM
Author Affiliations
  • Christian Sinnott
    Department of Psychology, University of Nevada, Reno
  • Paul MacNeilage
    Department of Psychology, University of Nevada, Reno
Journal of Vision July 2019, Vol.19, 53. doi:https://doi.org/10.1167/19.8.53
      Christian Sinnott, Paul MacNeilage; Statistical characterization of heading stimuli in natural environments using SLAM. Journal of Vision 2019;19(8):53. https://doi.org/10.1167/19.8.53.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Heading is the direction of linear self-motion in head coordinates. It may be estimated based on vestibular signals that provide information about linear acceleration and based on visual optic flow signals that provide information about linear velocity. Prior psychophysical studies have documented significant repulsive biases in perception of both visual and vestibular heading (Cuturi & MacNeilage 2013), meaning that heading azimuth angle is perceived to be more eccentric than the presented stimulus. Theoretical work suggests that such biases may result from a combination of efficient encoding and probabilistic decoding, where both encoding and decoding mechanisms are constrained based on natural stimulus distributions (Wei & Stocker 2015). To our knowledge, these distributions for heading stimuli remain undocumented, so we set out to characterize them. Tracking linear head velocity in natural environments using a head-based system is challenging. Recording of linear head acceleration using an inertial measurement unit (IMU) yields velocity estimates subject to drift, while optic flow analysis of video from a head-mounted camera is subject to ambiguity due to the superposition of linear and angular flow and unknown scene scale. To overcome these limitations, we adopted visual-inertial odometry technology developed for autonomous robots that perform simultaneous localization and mapping (SLAM). Subjects wore a head-mounted device with a calibrated, integrated camera and IMU. The data fusion pipeline yielded robust estimates of linear and angular position (in world-frame coordinates) and velocity (in head-frame coordinates) as subjects moved freely. The distributions of heading azimuth and elevation were peaked near straight ahead, as expected based on natural walking with the head facing forward. Standard deviations of heading azimuth and elevation were 19 deg and 14 deg, respectively, in line with the observed range of angular compensatory head movements. These highly peaked distributions are qualitatively consistent with predictions of repulsive biases based on efficient encoding and probabilistic decoding.
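The azimuth and elevation statistics above are defined with respect to the head-frame linear velocity vector recovered by the visual-inertial pipeline. A minimal sketch of that conversion is below, assuming a conventional x-forward, y-left, z-up head frame; the abstract does not specify the axis convention or the pipeline's internals, so treat this as illustrative only:

```python
import numpy as np

def heading_angles(v_head):
    """Convert a head-frame linear velocity [vx, vy, vz] (x forward,
    y left, z up -- an assumed convention) into heading azimuth and
    elevation in degrees relative to straight ahead."""
    vx, vy, vz = v_head
    # Azimuth: signed angle of the velocity in the horizontal plane.
    azimuth = np.degrees(np.arctan2(vy, vx))
    # Elevation: angle above/below the horizontal plane.
    elevation = np.degrees(np.arctan2(vz, np.hypot(vx, vy)))
    return azimuth, elevation

# Toy example: forward walking with a slight leftward velocity component.
az, el = heading_angles([1.0, 0.1, 0.0])
```

Applied per time sample, this yields the azimuth and elevation distributions whose standard deviations (19 deg and 14 deg) are reported in the abstract.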
