There were two interesting situations that moderated the effects of gaze upon steering. Firstly, steering bias did not increase linearly with fixation eccentricity; instead, the most eccentric fixation points produced biases similar to those produced by fixation points of intermediate eccentricity. This pattern could be interpreted as saturation of steering bias, similar to that reported by Readinger, Chatziastros, Cunningham, Bülthoff, and Cutting (2002), who studied the impact of directed gaze on a straight road; however, the lateral fixation eccentricities used in our conditions were much smaller than those used previously. Secondly, on the narrower roads, where the peripheral view of the road edges was strongest, we saw only weak changes in steering bias based on fixation location. A two-stage model of steering (e.g., Land, 1998; Land & Horwood, 1995) would suggest that the near road edges provide feedback information about position in lane, whereas a more distant point on the road provides feedforward information about upcoming road curvature. It seems that a weighted combination of these two types of information can describe our data well. For example, wide roads provided the weakest feedback information and tight bends provided the strongest feedforward information, and as a result gaze fixation had the largest influence when steering along wide roads with tight bends. On narrow, gentle bends the balance of feedback and feedforward information was reversed, and the influence of gaze fixation on steering was at its weakest. The idea that a feedback signal from peripheral road information can inform steering control is not incompatible with the Wilkie and Wann model, since that model relies on a variety of sources that can be both retinal and extra-retinal in nature (Wilkie et al., 2008).
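One way to make this weighted-combination account concrete, offered purely as an illustrative sketch rather than the model fitted in this study, is a simple control law in which steering output combines a feedforward term driven by a distant point on the road with a feedback term driven by the near road edges; the gains $k_{\mathrm{ff}}$ and $k_{\mathrm{fb}}$ and the specific error signals $\theta_{\mathrm{far}}$ and $\theta_{\mathrm{near}}$ are assumptions introduced here for exposition only:

$$\dot{\delta}(t) = k_{\mathrm{ff}}\,\dot{\theta}_{\mathrm{far}}(t) + k_{\mathrm{fb}}\,\theta_{\mathrm{near}}(t)$$

Here $\dot{\delta}$ is the rate of change of steering, $\dot{\theta}_{\mathrm{far}}$ is the rotation of a distant road point (carrying feedforward information about upcoming curvature), and $\theta_{\mathrm{near}}$ indexes lateral position in the lane as signaled by the near road edges (feedback). On this reading, widening the road weakens the feedback term and tightening the bend increases reliance on the feedforward term, so a gaze-induced bias in the far signal has its greatest leverage on wide roads with tight bends, consistent with the pattern described above.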
Recent neuroimaging evidence supports a distinction between the processing requirements for feedforward and feedback information when steering, with distinct parietal regions being implicated for each type of information (Field, Wilkie, & Wann, 2007). Feedforward information can also be functionally divided, since the control of eye movements during steering selectively activates the parietal eye fields (PEFs), whereas processing “future path” information independent of eye movements activates a region anterior to the PEFs, tentatively labeled the parietal path area (Field et al., 2007). A clear direction for future research is to determine how a number of sources of peripheral information can be integrated with visual information that is modulated by gaze direction, and how these processes are supported by parietal brain regions.