Abstract
Nowadays, virtual reality (VR) has become a widespread, easily accessible technology. The intrinsic advantage of running experiments in fully controlled, realistic, and engaging scenarios opens up endless possibilities for modern eye-movement research and visual science. However, the readiness of this technology clashes with the lack of user-friendly tools to analyze the highly multidimensional eye-movement data that VR generates. In our previous work with iMap4 (Lao et al., 2017), we made it possible to turn sparse 2D fixation data (x, y eye-movement coordinates weighted by fixation duration) into continuous statistical maps and to isolate significant differences between groups and conditions with linear mixed modeling. Here, we developed iMap4D, which performs the same robust, data-driven statistical evaluation as iMap4 while also handling two further VR dimensions (the z-coordinate and time); smooth pursuit on moving objects is also included in the map. To estimate average individual fixation maps on the model mesh, for every condition we perform a spatial convolution between the sparse fixation points and a 3D Gaussian kernel. The size of the kernel is scaled to account for the apparent size of objects given their position in 3D space. We then consider the continuous hypersurface resulting from the statistical fixation intensity at each vertex of the mesh. As in iMap4, we fit a univariate linear mixed model at each vertex, with subject as a random effect. All possible linear contrasts can be performed, and their statistical significance can be assessed with a spatial cluster test based on bootstrapping. To the best of our knowledge, iMap4D is the first free, open-source MATLAB toolbox for the statistical fixation mapping of 4D eye-movement data, and we believe this toolbox could pave the way for many more vision science studies exploiting ground-breaking VR technology.
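To make the two core analysis steps concrete (duration-weighted 3D Gaussian smoothing onto the mesh, followed by a per-vertex linear mixed model), the sketch below gives a minimal conceptual illustration in Python rather than MATLAB. It is not the iMap4D implementation: the function names, the per-fixation kernel widths, and the long-format table with vertex, subject, condition, and intensity columns are our own illustrative assumptions.

import numpy as np
import statsmodels.formula.api as smf

def fixation_intensity(vertices, fixations, durations, sigmas):
    """Duration-weighted 3D Gaussian smoothing of sparse fixations onto
    the mesh vertices (one map per subject and condition).

    vertices  : (V, 3) array of mesh vertex coordinates
    fixations : (F, 3) array of fixation landing points in 3D space
    durations : (F,)   fixation durations used as weights
    sigmas    : (F,)   kernel width per fixation, e.g. scaled with viewing
                       distance to account for the object's apparent size
    """
    intensity = np.zeros(len(vertices))
    for fix, dur, sigma in zip(fixations, durations, sigmas):
        sq_dist = np.sum((vertices - fix) ** 2, axis=1)  # squared distance to every vertex
        intensity += dur * np.exp(-sq_dist / (2.0 * sigma ** 2))
    return intensity

def per_vertex_lmm(long_table):
    """Fit intensity ~ condition with subject as a random intercept,
    independently at every vertex of the mesh (hypothetical column names)."""
    results = {}
    for vertex_id, df in long_table.groupby("vertex"):
        model = smf.mixedlm("intensity ~ condition", df, groups=df["subject"])
        results[vertex_id] = model.fit()
    return results

The significance of linear contrasts on the resulting per-vertex models would then be assessed with a spatial cluster test based on bootstrapping, which is not sketched here.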
Acknowledgement: This study was supported by the Swiss National Science Foundation (100019_156541).