Heading from optic flow, bearing angle, and splay angle information can all be used for lane keeping on a straight path. Here we investigated the relative contributions of these three visual cues to accurate lane-keeping control in a novel way. The displays simulated observers steering a vehicle down a straight path defined by a pair of posts (providing bearing angles only) or a segment of lane edges (providing both bearing and splay angles) at a fixed viewing distance, over a ground that contained no flow, sparse flow, or dense flow. Observers used a joystick to control the vehicle's lateral movement to stay in the center of the lane while facing random perturbations to both the vehicle's lateral position and its orientation. The lateral position perturbation affected the use of both splay and bearing angle cues, whereas the vehicle orientation perturbation affected only the use of bearing angles. We found that performance improved as more flow information was added to the scene, regardless of the availability of bearing or splay angle information. In the presence of splay angles, observers largely ignored bearing angles and relied mainly on splay angles for lane keeping.

The bearing angle (*B*) is given by

$$B = \arctan\!\left(\frac{X}{D\cos\theta} - \tan\theta\right), \tag{1}$$

where *X* is the vehicle's lateral distance from the left or right lane edge, *D* is the viewing distance (i.e., the distance along the reference direction) of a point on the left or right lane edge that the observer attends to, and *θ* is the angle between the vehicle orientation and the path (see Appendix 1 for the derivation of Equation 1). With two lane edges defining the road, there are left and right bearing angles. For regular lane keeping on a straight path when the vehicle is oriented along the path (i.e., *θ* = 0), observers can maintain travel in the center of a lane by keeping the left and right bearing angles equal (Figure 1a). However, when the vehicle orientation deviates from the path, equalizing the left and right bearing angles does not help one stay in the center of the lane, because a clockwise vehicle rotation normally increases the left but decreases the right bearing angle, whereas a counterclockwise vehicle rotation increases the right but decreases the left bearing angle (Figure 1b). As also shown by Equation 1, when *θ* is small, the bearing angle is inversely related to distance: the further away the reference point on the lane edge, the smaller the bearing angle.
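As a sanity check on this geometry, here is a minimal Python sketch. It assumes the bearing relation *B* = arctan(*X*/(*D* cos *θ*) − tan *θ*), with *θ* measured toward the attended edge; the lane half-width and viewing distance values are illustrative, not from the experiment:

```python
import math

def bearing_angle(X, D, theta):
    # Bearing angle (rad) to a lane-edge point at lateral distance X and
    # viewing distance D, for a vehicle rotated by theta toward that edge.
    return math.atan(X / (D * math.cos(theta)) - math.tan(theta))

X, D = 1.5, 10.0  # illustrative half lane width (m) and viewing distance (m)

# Vehicle centered and oriented along the path: left and right bearings match.
left = bearing_angle(X, D, 0.0)
right = bearing_angle(X, D, 0.0)
assert abs(left - right) < 1e-12

# A 5 deg clockwise rotation is toward the right edge (+theta) and away
# from the left edge (-theta): the right bearing shrinks, the left grows.
theta = math.radians(5)
right_rot = bearing_angle(X, D, theta)
left_rot = bearing_angle(X, D, -theta)
assert right_rot < right < left_rot
```

This reproduces the point made above: equalizing left and right bearing angles identifies the lane center only when the vehicle is oriented along the path.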

The splay angle (*S*), the angle between the optical projection of the lane edge and a vertical line in the image plane (Figure 2a), is given by

$$S = \arctan\!\left(\frac{X}{H}\right), \tag{2}$$

where *H* is the observer's eye height (see Appendix 2 for the derivation of Equation 2). Similar to bearing angle, the two lane edges provide a left and a right splay angle, corresponding to the angle between the optical projection of the left or right lane edge and a vertical line on the image plane. Unlike bearing angle, observers can keep the left and right splay angles equal to stay in the center of a lane regardless of the vehicle orientation, because a deviation of the vehicle orientation from the path direction simply shifts the positions of the lane edges on the screen and enlarges the left and right splay angles by the same amount (Figure 2b). For a rotation angle (*θ*) less than 25°, the change in the magnitude of the splay angles is barely noticeable (<10%, Figure 2c). Furthermore, in contrast to bearing angle, because the optical projection of any two points on a lane edge has the same orientation relative to a vertical line in the image plane and thus provides the same splay angle (Figure 2a), splay angle is a property of the image plane, independent of distance.

The change in bearing angle due to the perturbation to the vehicle's lateral position (i.e., the partial derivative of Equation 1 with respect to *X*, see Appendix 1) decreases with viewing distance, whereas the change in bearing angle due to the perturbation to the vehicle orientation (i.e., the partial derivative of Equation 1 with respect to *θ*, see Appendix 1) increases with distance. Thus, if participants respond to the change in bearing angle to stay in the center of the lane, as reported by Beall and Loomis (1996), their control response to the lateral position perturbation should decrease with viewing distance, whereas their control response to the vehicle orientation perturbation should increase with viewing distance. Because the control response to the lateral position perturbation decreases the vehicle's deviation from the center of the lane but the control response to the vehicle orientation perturbation increases that deviation, the overall performance error should be smallest for the near viewing distance, followed by the medium and then the far viewing distance.

The input perturbation (*I*) to the vehicle's lateral position (*u*) and its orientation (*q*) had the following form as a function of time (*t*):

$$I(t) = D\sum_{i=1}^{7} a_i \sin(2\pi\omega_i t + \rho_i).$$

Two different sets of seven integer values of *k* were used to produce two different sets of seven non-harmonic frequencies (*ω*_{i} = *k*_{i}/90 Hz) for the perturbations to the vehicle's lateral position (*u*) and its orientation (*q*). Table 1 lists the values of *a*, *k*, and *ω* used for *u* and *q*. The magnitude scaling factor *D* was set to 0.2 m for the lateral position perturbation and 2.3° for the vehicle orientation perturbation, respectively. The phase offset of each sine component (*ρ*_{i}) was randomly varied from −*π* to *π*. The use of two different sets of harmonically independent sums of sines for the perturbations to the vehicle's lateral position and its orientation made the position and orientation perturbations unrelated to each other and made them appear pseudorandom. The average magnitude of the uncorrected input perturbation to the vehicle's lateral position and its orientation was 0.62 m (peak: 2.09 m) and 6.75° (peak: 25.03°), respectively. The joystick controlled the lateral movement of the vehicle: the joystick displacement was proportional to the vehicle's lateral velocity, while the vehicle's overall speed remained constant at 5 m/s. The joystick position was sampled at 60 Hz (i.e., every frame of the display).

| *i* | *a*_{i} | *k*_{i} (*u*) | *ω*_{i} (Hz) (*u*) | *k*_{i} (*q*) | *ω*_{i} (Hz) (*q*) |
|---|---|---|---|---|---|
| 1 | 2 | 9 | 0.1 | 10 | 0.11 |
| 2 | 2 | 13 | 0.14 | 14 | 0.16 |
| 3 | 2 | 22 | 0.24 | 24 | 0.27 |
| 4 | 0.2 | 37 | 0.41 | 38 | 0.42 |
| 5 | 0.2 | 67 | 0.74 | 69 | 0.77 |
| 6 | 0.2 | 115 | 1.28 | 118 | 1.31 |
| 7 | 0.2 | 197 | 2.19 | 199 | 2.21 |
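The perturbation construction can be sketched in Python. This is a sketch under assumptions: it takes the sum-of-sines form literally, with overall gain *D* and the amplitude and frequency values from Table 1, and it does not reproduce how the perturbation was applied to the vehicle dynamics; the function and variable names are illustrative:

```python
import math
import random

# Amplitudes and frequency integers from Table 1.
A = [2, 2, 2, 0.2, 0.2, 0.2, 0.2]
K_U = [9, 13, 22, 37, 67, 115, 197]    # lateral position perturbation (u)
K_Q = [10, 14, 24, 38, 69, 118, 199]   # vehicle orientation perturbation (q)

def perturbation(t, D, a, k, phases, trial_len=90.0):
    """I(t) = D * sum_i a_i * sin(2*pi*(k_i/trial_len)*t + rho_i)."""
    return D * sum(ai * math.sin(2 * math.pi * (ki / trial_len) * t + ri)
                   for ai, ki, ri in zip(a, k, phases))

rng = random.Random(0)  # fixed seed so this sketch is reproducible
rho_u = [rng.uniform(-math.pi, math.pi) for _ in K_U]

# Lateral position perturbation: D = 0.2 m, sampled at 60 Hz for 90 s.
u = [perturbation(i / 60.0, 0.2, A, K_U, rho_u) for i in range(5400)]
peak = max(abs(x) for x in u)
```

Because *k*_{i}/90 Hz yields an integer number of cycles over a 90 s trial, every component completes whole periods within the trial, and the two *k* sets share no common harmonics, which is what makes the two perturbations mutually uncorrelated.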

To quantify participants' control response to the lateral position (*u*) or the vehicle orientation (*q*) perturbation, we computed the control power (*P*) correlated with each of the two input perturbations:

$$P = \frac{\sum_{i=1}^{7}\left|C(\omega_i)\right|^2}{\sum_{i=1}^{n/2}\left|C(i)\right|^2} \times 100\%,$$

where *C*(*i*) are the coefficients of the Discrete Fourier Transform of the control output (*δ*), *n* is the total number of frames in each trial (60 Hz × 90 s = 5400), and *C*(*ω*_{i}) are the coefficients at the input lateral position or vehicle orientation perturbation frequencies *ω*_{i}. To examine how the RMS error, the control power correlated with the lateral position perturbation (*P*_{δu}), and the control power correlated with the vehicle orientation perturbation (*P*_{δq}) change with viewing distance and display condition, we conducted a 3 (viewing distance) × 3 (display type) repeated-measures ANOVA on each of these three measurements. For any violation of the sphericity assumption, the degrees of freedom were adjusted using conservative Greenhouse–Geisser estimates. The pattern of data from the experienced participant was similar to that of the remaining seven naïve participants, so the data from all participants were analyzed together.
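The control-power measure can be sketched in pure Python. A naive DFT is used here for clarity (a real analysis would use an FFT), and the 9 s "trial" below is shortened from 90 s purely for speed; the signal is synthetic and illustrative:

```python
import cmath
import math

def control_power(delta, freqs, fs=60.0):
    """Percentage of the control output's power concentrated at the given
    perturbation frequencies (Hz), via a discrete Fourier transform."""
    n = len(delta)
    # DFT coefficients C(k) for the non-negative-frequency half.
    C = [sum(delta[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n // 2 + 1)]
    power = [abs(c) ** 2 for c in C]
    total = sum(power[1:])                      # exclude the DC component
    bins = [round(f * n / fs) for f in freqs]   # frequency (Hz) -> DFT bin
    return 100.0 * sum(power[b] for b in bins) / total

# Illustrative 9 s "trial" at 60 Hz: a 1/3 Hz target component plus a
# half-amplitude 1 Hz component (both complete whole cycles in 9 s).
fs, n = 60.0, 540
sig = [math.sin(2 * math.pi * (1 / 3) * t / fs)
       + 0.5 * math.sin(2 * math.pi * 1.0 * t / fs) for t in range(n)]
p = control_power(sig, [1 / 3])  # → 80.0 (power ratio 1 : 0.25)
```

With amplitudes 1 and 0.5, the power at the target frequency is 1/(1 + 0.25) = 80% of the total, so `p` comes out at 80.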

A repeated-measures ANOVA on the RMS error shows that both main effects of viewing distance and display type are significant (*F*(1.02, 7.11) = 95.48, *p* < 0.0001 and *F*(1.07, 7.51) = 27.15, *p* < 0.001, respectively), as well as the interaction effect of viewing distance and display type (*F*(1.12, 7.83) = 21.72, *p* < 0.01). For all three display conditions, the RMS error is smallest at the near viewing distance and increases with distance, indicating that participants relied on the bearing angles provided by the two posts for lane keeping. However, the increase of the RMS error with viewing distance is largest for the empty ground display, followed by the sparse and then the dense random-dot ground display, indicating the effect of optic flow information on lane-keeping control.

The control power correlated with the lateral position perturbation (*P*_{δu}), averaged across eight participants, is plotted as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on *P*_{δu} also shows that both main effects of viewing distance and display type are significant (*F*(1.08, 7.58) = 45.22, *p* < 0.001 and *F*(2, 14) = 52.45, *p* < 0.00001, respectively), as well as the interaction effect of viewing distance and display type (*F*(1.63, 11.39) = 9.29, *p* < 0.01). For all three display conditions, in contrast to the RMS error, *P*_{δu} decreases with viewing distance, indicating that participants' control response to the lateral position perturbation decreases with viewing distance. This is again consistent with the use of the bearing angle strategy, as the lateral position perturbation causes a larger change in bearing angle at a near than at a far viewing distance. The overall increase of *P*_{δu} from the empty ground to the sparse and then to the dense random-dot ground display, and the relatively shallow slope of *P*_{δu} against distance for the two random-dot ground displays, support the claim that with added optic flow information, participants relied more on their perceived heading from optic flow for lane keeping.

The control power correlated with the vehicle orientation perturbation (*P*_{δq}) is plotted as a function of viewing distance for the three display conditions. Note that the nature of the control task is to respond to the lateral position perturbation while ignoring the vehicle orientation perturbation; thus, good performance should be highly correlated with the lateral position perturbation (i.e., high *P*_{δu} values) but not with the vehicle orientation perturbation (i.e., low *P*_{δq} values). A repeated-measures ANOVA on *P*_{δq} once again shows that both main effects of viewing distance and display type are significant (*F*(2, 14) = 35.1, *p* < 0.00001 and *F*(2, 14) = 47.41, *p* < 0.00001, respectively), as well as the interaction effect of viewing distance and display type (*F*(4, 28) = 16.37, *p* < 0.00001). For all three display conditions, *P*_{δq} increases with viewing distance, indicating that the control response to the vehicle orientation perturbation increases with distance. This agrees with the prediction that, opposite to the lateral position perturbation, the vehicle orientation perturbation causes a larger change in bearing angle at a far than at a near distance. The overall decrease of *P*_{δq} from the empty ground to the sparse and then to the dense random-dot ground display, as well as the smaller increase of *P*_{δq} with distance for the two random-dot ground displays, provide further supporting evidence for the use of the heading strategy in lane keeping with enriched optic flow displays.

Note that the *Y*-axis scale in Figure 7a is smaller than that in Figure 5a. Thus, on average, the RMS errors were smaller than those from Experiment 1, especially for the empty ground display at the far viewing distance. A repeated-measures ANOVA on the RMS error reveals that only the main effect of display type is significant (*F*(1.14, 7.95) = 26.12, *p* < 0.001). Newman–Keuls tests show that the overall RMS error for the empty ground display (mean: 0.72 m) is significantly larger than that for the sparse (0.55 m, *p* < 0.001) and the dense random-dot ground displays (0.51 m, *p* < 0.001), and the overall RMS errors for the two random-dot ground displays are not significantly different from each other. While the overall small RMS error and the lack of a viewing distance effect are consistent with the use of the splay angle cue in lane keeping, the decrease of the RMS error from the empty ground display to the two random-dot ground displays confirms the role of added optic flow information in lane-keeping control.

The control power correlated with the lateral position perturbation (*P*_{δu}), averaged across eight participants, is plotted as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on *P*_{δu} also shows that only the main effect of display type is significant (*F*(2, 14) = 56.23, *p* ≪ 0.001). Newman–Keuls tests show that the overall *P*_{δu} for the empty ground display (mean: 34.49%) is significantly smaller than that for the sparse (48.77%, *p* < 0.001) and the dense random-dot ground displays (53.38%, *p* < 0.001), and the overall *P*_{δu} for the sparse random-dot ground display is significantly smaller than that for the dense random-dot ground display (*p* < 0.05). Again, while the lack of a viewing distance effect on *P*_{δu} supports the use of splay angles to stay in the center of the lane, the systematic increase of *P*_{δu} with added flow mirrors the trend of the RMS error data and confirms that with an enriched optic flow display, participants relied more on their perceived heading from optic flow for lane keeping.

The control power correlated with the vehicle orientation perturbation (*P*_{δq}) is plotted as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on *P*_{δq} shows that both the main effect of display type and the interaction effect of display type and viewing distance are significant (*F*(1.06, 7.42) = 6.46, *p* < 0.05 and *F*(4, 28) = 4.66, *p* < 0.01, respectively). Newman–Keuls tests show that *P*_{δq} for the empty ground display is significantly larger than that for the two random-dot ground displays at the near and medium viewing distances (*p* < 0.01 for all cases) but not at the far viewing distance. Consistent with the use of splay angles, even for the empty ground display, participants' control response to the vehicle orientation perturbation was low (*P*_{δq} < 30%) and not affected by viewing distance. Adding more optic flow information to the display led participants to respond to the vehicle orientation perturbation even less, especially at the near and medium viewing distances.

When the vehicle is rotated at an angle *θ* with respect to the path due to the orientation perturbation, the left and right bearing angles (*B*_{L} and *B*_{R}) to reference points *A*′ and *B*′ on the lane edges (Figure 1b) are, respectively, given by

$$B_L = \arctan\!\left(\frac{X_L}{D\cos\theta} + \tan\theta\right), \qquad B_R = \arctan\!\left(\frac{X_R}{D\cos\theta} - \tan\theta\right),$$

where *X*_{L} and *X*_{R} are the vehicle's lateral distances from the left and right lane edges and *D* is the viewing distance of the reference point. The change in bearing angle (*B*) due to the perturbations to the vehicle's lateral position and its orientation is thus the partial derivative of *B* with respect to *X* and *θ*, respectively:

$$\frac{\partial B}{\partial X} = \frac{1}{D\cos\theta\left[1 + \left(\frac{X}{D\cos\theta} - \tan\theta\right)^2\right]}, \qquad \frac{\partial B}{\partial \theta} = \frac{\frac{X\tan\theta}{D\cos\theta} - \sec^2\theta}{1 + \left(\frac{X}{D\cos\theta} - \tan\theta\right)^2},$$

which for small *θ* reduce to

$$\frac{\partial B}{\partial X} \approx \frac{D}{D^2 + X^2}, \qquad \left|\frac{\partial B}{\partial \theta}\right| \approx \frac{D^2}{D^2 + X^2}.$$

For the ranges of lateral positions and vehicle orientations (*X* and *θ* values) used in the study, ∂*B*/∂*X* decreases with the viewing distance *D*, whereas |∂*B*/∂*θ*| increases with *D*.

Let *XYZ* represent the world coordinate system, and *X*_{eye}*Y*_{eye}*Z*_{eye} represent the viewer's eye-centered coordinate system as depicted in Figure 2a. Each point in the world has an associated homogeneous position vector, *P* = (*X*, *Y*, *Z*, 1)^{T}. For a known eye position in the world, *O* = (*O*_{X}, *O*_{Y}, *O*_{Z})^{T}, the position of the same surface point in the eye-centered coordinate system, *P*_{eye} = (*X*_{eye}, *Y*_{eye}, *Z*_{eye}, 1)^{T}, is

$$P_{eye} = \begin{bmatrix} R & -RO \\ \mathbf{0}^T & 1 \end{bmatrix} P,$$

where *R* is a 3 × 3 rotation matrix defined by the orientation of the eye-centered coordinate system relative to the world. Under perspective projection, this surface point projects to a point in the image plane, *p* = (*x*, *y*)^{T}, with

$$x = f\,\frac{X_{eye}}{Z_{eye}}, \qquad y = f\,\frac{Y_{eye}}{Z_{eye}},$$

where *f* is the focal length of the eye.

Let *P*_{1} = (−*X*_{R}, 0, *Z*_{1}, 1)^{T} and *P*_{2} = (−*X*_{R}, 0, *Z*_{2}, 1)^{T} be two points on a right lane marker in the world, and *O* = (0, *H*, 0)^{T} the coordinates of the viewer's eye position in the world (Figure 2a). Assuming that the perturbation to the vehicle orientation (i.e., the simulated observer viewing direction through the windshield) rotates the *Z*_{eye}-axis clockwise at angle *θ* with respect to the *Z*-axis in the world, the rotation matrix *R* is given by

$$R = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}.$$

The eye-centered coordinates *P*_{1eye} = (*X*_{1eye}, *Y*_{1eye}, *Z*_{1eye}, 1)^{T} and *P*_{2eye} = (*X*_{2eye}, *Y*_{2eye}, *Z*_{2eye}, 1)^{T} are then

$$X_{i\,eye} = -X_R\cos\theta - Z_i\sin\theta, \quad Y_{i\,eye} = -H, \quad Z_{i\,eye} = Z_i\cos\theta - X_R\sin\theta, \quad i = 1, 2.$$

These two points project to *p*_{1} = (*x*_{1}, *y*_{1})^{T} and *p*_{2} = (*x*_{2}, *y*_{2})^{T} in the image plane, with

$$x_i = f\,\frac{X_{i\,eye}}{Z_{i\,eye}}, \qquad y_i = f\,\frac{Y_{i\,eye}}{Z_{i\,eye}}, \qquad i = 1, 2.$$

The right splay angle (*S*_{R}) is then given by the orientation of the projected segment *p*_{1}*p*_{2} relative to a vertical line in the image plane:

$$S_R = \arctan\!\left(\frac{x_2 - x_1}{y_2 - y_1}\right),$$

which for *θ* = 0 reduces to *S*_{R} = arctan(*X*_{R}/*H*), i.e., Equation 2.
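This projection computation can be verified numerically. The sketch below assumes the rotation convention and point coordinates used in this appendix (eye at height *H*, lane marker at lateral distance *X*_{R}, two marker points at illustrative depths); it checks both the reduction to Equation 2 at *θ* = 0 and the small change in splay for a 25° rotation:

```python
import math

def splay_angle(X_R, H, theta, Z1=5.0, Z2=50.0, f=1.0):
    """Right splay angle via perspective projection of two lane-marker
    points, with the viewing direction rotated by theta."""
    def project(Z):
        # Eye-centered coordinates after rotating the Z_eye-axis by theta.
        Xe = -X_R * math.cos(theta) - Z * math.sin(theta)
        Ye = -H
        Ze = Z * math.cos(theta) - X_R * math.sin(theta)
        return f * Xe / Ze, f * Ye / Ze
    x1, y1 = project(Z1)
    x2, y2 = project(Z2)
    # Orientation of the projected segment relative to vertical.
    return math.atan((x2 - x1) / (y2 - y1))

X_R, H = 1.5, 1.15  # illustrative lane-edge distance (m) and eye height (m)

s0 = splay_angle(X_R, H, 0.0)
assert abs(s0 - math.atan(X_R / H)) < 1e-9   # reduces to Equation 2

s25 = splay_angle(X_R, H, math.radians(25))
assert abs(s25 - s0) / abs(s0) < 0.10        # <10% change for a 25 deg rotation
```

Because the lane edge is a straight line in the world, its projection is a straight image line, so the computed angle does not depend on which two depths *Z*_{1} and *Z*_{2} are chosen, which is the distance-independence property noted in the main text.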

*Vision Research*, 36, 431–443.

Beall, A. C., & Loomis, J. M. (1996). Visual control of steering without course information. *Perception*, 25, 481–494.

*Center of outflow is not used to control locomotion* (Technical Report CBR TR 95-5). Cambridge, MA: Cambridge Basic Research.

*The use of splay angle and optical flow in steering a central path* (Technical Report 72). Tübingen, Germany: Max Planck Institute for Biological Cybernetics.

*Human Factors*, 20, 691–707.

*Psychological Science*, 13, 272–278.

*The perception of the visual world*. Boston: Houghton Mifflin.

*The ecological approach to visual perception*. Boston: Houghton Mifflin.

*Journal of the Optical Society of America A*, 16, 2079–2091.

*Vision Research*, 42, 1619–1626.

*Perception*, 30, 811–818.

*Journal of Vision*, 9(1):11, 1–11, http://www.journalofvision.org/content/9/1/11, doi:10.1167/9.1.11.

*Journal of Vision*, 10(4):24, 1–24, http://www.journalofvision.org/content/10/4/24, doi:10.1167/10.4.24.

*Vision and action* (pp. 163–180). Cambridge, New York: Cambridge University Press.

*Journal of Vision*, 9(3):29, 1–14, http://www.journalofvision.org/content/9/3/29, doi:10.1167/9.3.29.

*Journal of Vision*, 6(9):2, 874–881, http://www.journalofvision.org/content/6/9/2, doi:10.1167/6.9.2.

*Vision Research*, 40, 3873–3894.

*Vision Research*, 44, 1879–1889.

*Human Factors*, 15, 421–430.

*Psychological Review*, 101, 329–335.

*Science*, 215, 194–196.

*Current Biology*, 8, 1191–1194.

*Perception*, 33, 1233–1248.

*Vision Research*, 37, 573–590.

*Trends in Cognitive Sciences*, 4, 319–324.

*Ecological Psychology*, 10, 177–219.

*Nature Neuroscience*, 4, 213–216.

*Automatica*, 6, 87–98.

*Highway [Transportation] Research Record*, 364, 1–15.

*Current Biology*, 12, 2014–2017.

*Journal of Experimental Psychology: Human Perception and Performance*, 29, 363–378.

*Human Factors*, 3, 222–228.

*Perception and control of self-motion* (pp. 101–126). Hillsdale, NJ: Lawrence Erlbaum.