Abstract
A quantitative methodology for comparing tracked eye movements, or scanpaths, is essential for evaluating attentional viewing patterns generated by humans or by artificial models of human visual attention. The string-editing scanpath comparison methodology, developed by Privitera and Stark, is one well-known and useful strategy for comparing the position and order of fixations made over a visual field, yielding correlation measures of fixation position and of fixation order (order correlations are usually much lower than position correlations). This strategy has previously been employed to evaluate human and artificial (modeled) scanpaths over 2D images.
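As a minimal sketch of the string-editing idea (not the authors' exact implementation): fixations are quantized into labeled regions of the visual field, so each scanpath becomes a character string, and two strings are compared by edit (Levenshtein) distance normalized to a 0..1 similarity. The region-label strings below are hypothetical.

```python
# Sketch of string-edit scanpath comparison in the spirit of
# Privitera and Stark. Assumes fixations have already been quantized
# into region labels (the strings below are hypothetical examples).

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # delete all of a's prefix
    for j in range(n + 1):
        dp[0][j] = j          # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]

def sequence_similarity(a: str, b: str) -> float:
    """Edit distance normalized to a similarity: 1.0 for identical strings."""
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

human = "abcda"   # hypothetical human scanpath over labeled regions
model = "abdca"   # hypothetical model scanpath over the same regions
print(sequence_similarity(human, model))   # partial agreement
print(sequence_similarity(human, human))   # identical scanpaths give 1.0
```

Note that identical scanpath strings necessarily score 1.0 under this normalization, which is the sanity check described for the extended 3D algorithm below.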
We have successfully adapted this strategy to examine the correlation between human and artificial 2 1/2D scanpaths in Virtual Reality (VR), i.e., 2D scanpaths obtained while head position was held stationary. Artificial scanpaths, generated by a computer model of visual attention used to guide real-time ray-tracing in VR, surprisingly resulted in very low correlations with human viewing patterns. On further examination, human eye movements were found to be mostly restricted to the central 30 deg of the Head-Mounted Display, while algorithmic "fixations" were dispersed uniformly over the entire viewing area.
We have recently extended our scanpath comparison methodology to 3D eye movements measured in Virtual Reality with no restriction on head movement. The algorithm yields correlations of 1.0 for identical scanpaths, confirming its correctness in these cases. The technique is applicable to the quantitative comparison of novice and expert scanpaths in a Virtual Reality visual inspection training simulator.