September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2019
iMap4D: an Open Source Toolbox for Statistical Fixation Mapping of Eye-Tracking Data in Virtual Reality
Author Affiliations & Notes
  • Valentina Ticcinelli
    Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Switzerland
  • Peter De Lissa
    Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Switzerland
  • Denis Lalanne
    Human-IST Institute & Department of Informatics, University of Fribourg, Switzerland
  • Sebastien Miellet
    Active Vision Lab, School of Psychology, University of Wollongong, Australia
  • Roberto Caldara
    Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Switzerland
Journal of Vision September 2019, Vol.19, 127c. doi:https://doi.org/10.1167/19.10.127c
      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Today, virtual reality (VR) has become a widespread, easily accessible technology. The intrinsic advantage of running experiments in fully controlled, realistic and engaging scenarios opens up endless possibilities in modern eye-movement research and in the visual sciences. However, the readiness of this technology clashes with the lack of any user-friendly tool to analyze the highly multidimensional VR eye-movement data. In our previous work with iMap4 (Lao et al., 2017), we made it possible to turn sparse 2D fixation data (x, y eye-movement coordinates weighted by fixation duration) into continuous statistical maps and to isolate significant differences between groups and conditions with linear mixed modeling. Here, we developed iMap4D, which performs the same robust data-driven statistical evaluation as iMap4 while also handling two further VR dimensions (the z-coordinate and time), with smooth pursuits on moving objects also included in the map. To estimate average individual fixation maps on the model mesh, for every condition we perform a spatial convolution between the sparse fixation points and a 3D Gaussian kernel. The size of the kernel is scaled to account for objects' apparent size given their position in 3D space. We then consider the continuous hypersurface resulting from the statistical fixation intensity at each vertex of the mesh. As in iMap4, we apply at each vertex a univariate linear mixed model with subject as a random effect. All possible linear contrasts can be performed, and their statistical significance can be assessed by a spatial cluster test based on bootstrapping. To the best of our knowledge, iMap4D is the first free, open-source MATLAB toolbox for the statistical fixation mapping of 4D eye-movement data, and we believe this toolbox could pave the way for a growing number of vision science studies exploiting ground-breaking VR technologies.
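The mesh-smoothing step described above can be sketched as follows. This is a minimal illustration in Python, not the toolbox's actual MATLAB implementation: the function name, parameters, and the linear distance-scaling rule for the kernel are assumptions introduced for clarity only.

```python
import numpy as np

def fixation_map_3d(vertices, fixations, durations, viewer_pos, base_sigma=0.05):
    """Smooth sparse fixation points into a continuous intensity map on a mesh.

    vertices   : (V, 3) array of mesh vertex coordinates
    fixations  : (F, 3) array of fixation landing points in the same 3D space
    durations  : (F,) array of fixation durations, used as weights
    viewer_pos : (3,) viewer position; the kernel is widened with viewing
                 distance so that smoothing tracks the object's apparent size
    base_sigma : kernel width at unit viewing distance (hypothetical default)
    """
    intensity = np.zeros(len(vertices))
    for point, dur in zip(fixations, durations):
        # Scale the Gaussian kernel with viewing distance (assumption: linear
        # scaling; iMap4D's actual scaling rule may differ).
        sigma = base_sigma * np.linalg.norm(point - viewer_pos)
        # Duration-weighted Gaussian contribution of this fixation to every vertex.
        sq_dist = np.sum((vertices - point) ** 2, axis=1)
        intensity += dur * np.exp(-sq_dist / (2.0 * sigma ** 2))
    return intensity
```

The resulting per-vertex intensity values form the hypersurface on which, per the abstract, a univariate linear mixed model (subject as random effect) is then fitted at each vertex.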

Acknowledgement: This study was supported by the Swiss National Science Foundation (100019_156541). 