Abstract
We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influenced eye and head movements. Both the eyes and head were tracked while observers viewed natural scenes in a virtual reality (VR) environment. Consistent with previous work, we found a bias for saccades aligned with the image horizon in landscape images and, unexpectedly, in fractal images as well. Interestingly, when viewing landscapes, but not fractals, observers rotated their heads to match the image rotation, thereby enabling saccades to be made in cardinal rather than oblique directions. This clear distinction between how the eyes and head respond to image content suggests that they may be subserved by different control strategies. We discuss our findings in relation to current theories of attentional control and consider how insights from VR might inform past and future eye-tracking studies.