Research Article  |   September 2010
Relative contributions of optic flow, bearing, and splay angle information to lane keeping
Li Li, Jing Chen
Journal of Vision September 2010, Vol.10, 16. doi:https://doi.org/10.1167/10.11.16
Abstract

Heading from optic flow, bearing, and splay angle information can all be used for lane keeping on a straight path. Here we investigated the relative contributions of these three visual cues to accurate lane-keeping control in a novel way. The displays simulated observers steering a vehicle down a straight path defined by a pair of posts (providing bearing angles only) or a segment of lane edges (providing bearing and splay angles) at a fixed viewing distance, and the ground contained no flow, sparse flow, or dense flow. Observers used a joystick to control the vehicle's lateral movement to stay in the center of the lane while facing random perturbations to both the vehicle's lateral position and orientation. The lateral position perturbation affected the use of both splay and bearing angle cues, but the vehicle orientation perturbation only affected the use of bearing angles. We found that performance improved as more flow information was added to the scene regardless of the availability of bearing or splay angle information. In the presence of splay angles, observers would ignore bearing and rely mainly on splay angles for lane keeping.

Introduction
Accurate and efficient control of self-motion is vital for human survival. A self-motion control task that we commonly experience in our daily life is lane keeping, i.e., we follow a preexisting path defined by lane markers or road edges and keep a safe distance from them. A natural question arises as to what visual cues we use for lane keeping. Theoretically, there are at least three types of visual information that humans can use for lane keeping on a straight path. The first one is one's direction of self-motion (heading) specified by optic flow, the global image motion of the environment projected on the retina when one moves in the world. When one travels on a straight path without eye, head, or body rotation, the focus of expansion (FOE) in the resulting radial flow pattern indicates one's heading (Gibson, 1950, 1979). When one travels on a curved path or rotates one's head or eyes, the flow pattern is not radial anymore as the rotation shifts the FOE away from the heading direction (Regan & Beverley, 1982). In this case, although some behavioral studies have reported that observers need extra-retinal information to remove the rotational component in the flow field for accurate heading perception (e.g., Banks, Ehrlich, Backus, & Crowell, 1996), other studies have shown that observers can use the global motion and motion parallax information in optic flow to estimate heading within 2° of visual angle (e.g., Grigo & Lappe, 1999; Li, Sweet, & Stone, 2006; Li & Warren, 2000; Stone & Perrone, 1997). Lane keeping can then be achieved by keeping the perceived heading centered on the path (Warren, 1998). 
When traveling along a straight path, the lane edges provide two other visual cues for lane keeping: bearing and splay angle. Bearing angle refers to the direction of a reference point on the lane edge, measured from the observer, with respect to a reference direction such as a north–south line or meridian (Beall & Loomis, 1996). When there is no visual cue indicating an external reference direction, the observer naturally uses the vehicle orientation (i.e., the simulated observer viewing direction through the windshield) as the reference direction, in which case bearing angle (B) is given by 
B = \arctan\left( \frac{X}{D\cos\theta} \pm \tan\theta \right),    (1)
where X is the vehicle's lateral distance from the left or right lane edge, D is the viewing distance (i.e., the distance along the reference direction) of a point on the left or right lane edge that the observer attends to, and θ is the angle between the vehicle orientation and the path (see Appendix A for the derivation of Equation 1). With two lane edges defining the road, there are left and right bearing angles. For regular lane keeping on a straight path when the vehicle is oriented along the path (i.e., θ = 0), observers can stay in the center of a lane by adopting the strategy of keeping the left and right bearing angles equal (Figure 1a). However, when the vehicle orientation deviates from the path, equalizing the left and right bearing angles would not help the observer stay in the center of the lane, as a clockwise vehicle rotation normally increases the left but decreases the right bearing angle, whereas a counterclockwise vehicle rotation increases the right but decreases the left bearing angle (Figure 1b). As also shown by Equation 1, when θ is small, the bearing angle is inversely related to distance, i.e., the further away the reference point on the lane edge, the smaller the bearing angle. 
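To make the geometry concrete, the following minimal sketch (ours, not code from the study; the lane width and viewing distance are taken from the Methods, and the function name is our own) evaluates Equation 1 for a vehicle centered in the lane and shows that a clockwise rotation makes the left and right bearing angles unequal even though the lateral position has not changed.

```python
import numpy as np

def bearing_angle(x, d, theta, side):
    """Equation 1: bearing angle (rad) to a point on a lane edge.

    x     : lateral distance from the vehicle to that lane edge (m)
    d     : viewing distance along the vehicle orientation (m)
    theta : rotation of the vehicle orientation away from the path (rad)
    side  : +1 for the left edge, -1 for the right edge (sign of the tan term)
    """
    return np.arctan(x / (d * np.cos(theta)) + side * np.tan(theta))

lane_width = 2.52                    # m, lane width used in the experiments
x_left = x_right = lane_width / 2    # vehicle centered in the lane
d = 9.18                             # medium viewing distance (m)

for theta_deg in (0.0, 25.0):
    theta = np.radians(theta_deg)
    b_left = np.degrees(bearing_angle(x_left, d, theta, +1))
    b_right = np.degrees(bearing_angle(x_right, d, theta, -1))
    print(f"theta = {theta_deg:4.1f} deg:  B_L = {b_left:6.2f} deg,  B_R = {b_right:6.2f} deg")
# With theta = 0 the two bearing angles are equal; with theta = 25 deg they differ,
# even though the vehicle is still exactly at the lane center.
```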
Figure 1
An illustration of the use of bearing angles in lane keeping. (a) The vehicle is oriented along the straight path. The bearing angles to reference points A and B on the left and right lane edges are given by B_L = tan⁻¹(X_L/D) and B_R = tan⁻¹(X_R/D). When B_L = B_R, the vehicle is at the center of the lane. (b) The vehicle orientation is now rotated θ = 25° clockwise with respect to the path. The bearing angles to reference points A′ and B′ are now given by B′_L = tan⁻¹(X_L/(D cos 25°) + tan 25°) and B′_R = tan⁻¹(X_R/(D cos 25°) − tan 25°). Even when the vehicle is still at the center of the lane, B′_L ≠ B′_R.
A third cue provided by lane edges that can be used for lane keeping is splay angle (S), the angle between the optical projection of the lane edge and a vertical line in the image plane (Figure 2a), given by 
S = \arctan\left( \frac{X}{H\cos\theta} \right),    (2)
where H is the observer's eye height (see Appendix B for the derivation of Equation 2). Similar to bearing angle, the two lane edges provide a left and a right splay angle corresponding to the angle between the optical projection of the left and right lane edges and a vertical line on the image plane. Unlike bearing angle, observers can keep the left and right splay angles equal to stay in the center of a lane regardless of the vehicle orientation, as the deviation of the vehicle orientation from the path direction simply shifts the positions of lane edges on the screen and enlarges the left and right splay angles by the same amount (Figure 2b). For a rotation angle (θ) less than 25°, the change in the magnitude of splay angles is barely noticeable (<10%, Figure 2c). Furthermore, in contrast to bearing angle, as the orientation of the optical projection of any two points on the lane edge relative to a vertical line in the image plane provides the same splay angle (Figure 2a), splay angle is a property of the image plane, independent of distance. 
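A parallel sketch (again ours, using the simulated eye height and lane width from the Methods) evaluates Equation 2 and shows that the same 25° rotation leaves the left and right splay angles equal and changes their magnitude by less than 10%.

```python
import numpy as np

def splay_angle(x, h, theta):
    """Equation 2: splay angle (rad) of a lane edge at lateral distance x (m),
    for eye height h (m) and vehicle rotation theta (rad)."""
    return np.arctan(x / (h * np.cos(theta)))

h = 1.51          # simulated eye height (m)
x = 2.52 / 2      # lateral distance to each lane edge with the vehicle centered (m)

s0 = np.degrees(splay_angle(x, h, np.radians(0.0)))
s25 = np.degrees(splay_angle(x, h, np.radians(25.0)))
print(f"S at theta =  0 deg: {s0:.2f} deg")
print(f"S at theta = 25 deg: {s25:.2f} deg  ({100 * (s25 / s0 - 1):.1f}% change)")
# Both edges are scaled by the same factor 1/cos(theta), so S_L remains equal to S_R
# and equalizing the two splay angles keeps the vehicle centered regardless of theta.
```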
Figure 2
An illustration of the use of splay angles in lane keeping. (a) Splay angle (S) illustrated as the orientation of the optical projection of two points on a lane edge relative to a vertical line in the image plane. (b) The vehicle is oriented along the straight path, and the red dashed line is a vertical line in the image plane. The splay angles of the left and right lane edges are given by S_L = tan⁻¹(X_L/H) and S_R = tan⁻¹(X_R/H). When S_L = S_R, the vehicle is at the center of the lane. (c) The vehicle orientation is now rotated θ = 25° clockwise with respect to the path. The splay angles of the left and right lane edges are now given by S′_L = tan⁻¹(X_L/(H cos 25°)) and S′_R = tan⁻¹(X_R/(H cos 25°)). The vehicle is still at the center of the lane, and S′_L = S′_R.
The use of heading from optic flow, bearing, or splay angle information in real-life lane keeping on a straight path is often redundant and predicts the same control behavior. Much research has been conducted to find out which strategy humans actually depend on for lane keeping. Early human factors studies on driving in both the real world and driving simulators reported that human operators use both heading from optic flow and the vehicle's lateral position (which defines bearing and splay angles) for lane keeping (McLean & Hoffmann, 1973; Weir & Wojcik, 1971; Wohl, 1961). For example, Weir and McRuer (1970) found that drivers controlled their heading most of the time and controlled their lateral position error from the center of the lane in an intermittent, trimming manner when it became excessive. 
Several later studies, however, have reported that people may not use heading from optic flow for lane keeping. By oscillating the FOE in optic flow to shift the perceived heading direction, Beusmans (1995) found that there was no correlation between the movement of the FOE and participants' lane-keeping control performance, suggesting that people do not use their perceived heading from optic flow for lane keeping. More importantly, Beall and Loomis (1996) found that splay angle information provided by lane edges alone was sufficient for participants to stay in the center of the lane when steering down a straight path while facing lateral crosswind perturbation. Adding global optic flow to the display did not further improve participants' lane-keeping performance. The importance of road edges in steering control was also reported for driving along a curving path, in which case the driver's gaze tended to be directed at the tangent point of the inside edge of the road (Kandil, Rotter, & Lappe, 2009, 2010; Land & Lee, 1994). 
However, later studies have challenged the idea that people rely exclusively on splay angles when they are available for lane keeping. For example, Chatziastros, Wallis, and Bülthoff (1999) examined lane keeping between two walls and found that simulated unequal velocities on the left and right wall surfaces induced a lateral displacement of the vehicle from the center of the lane even when splay angles were provided by road edges. Similarly, Duchon and Warren (2002) asked participants to steer down a corridor and found that, apart from splay angles, equating the speed of optic flow in the left and right lateral fields of view contributed to maintaining a centered position in the road. 
Despite the above evidence showing that equating the speed of the left and right parts of the global flow field has an effect on lane keeping, it still remains a question whether heading from optic flow is used for lane keeping. Beall and Loomis (1996) used a small field of view (about 20°H × 15°V) and the display contained about 100 dots uniformly distributed on a ground plane in the depth range of about 9–4800 m. The small number of dots and their uniform distribution on the ground resulted in sparse dot motion in the foreground of the ground plane. The small field of view and the sparse dot motion especially in the foreground could have led to insufficient optic flow information for participants to accurately perceive their heading during a crosswind lateral perturbation and could thus be the reason why they failed to observe any effect of optic flow on lane keeping. In addition, Beall and Loomis found that participants reacted to the change in bearing angle to equalize the left and right bearing angles to stay in the center of the lane when the display did not contain splay angle information. Given that they used a segment of lane edges to provide splay angle information, and the lane edge segment provided not only splay but also bearing angles, it is unclear whether participants in their study used bearing angles for lane keeping in the presence of splay angles, as participants could be using both cues for accurate lane-keeping performance. 
In this study, by using a large field of view (110°H × 94°V) and displays that contained an increasing amount of global flow and motion parallax information for accurate heading perception during rotation (Li, Chen, & Peng, 2009; Li & Warren, 2000, 2004), we reexamined the role of optic flow information in lane keeping. Furthermore, by taking advantage of the fact that the use of bearing but not splay angle information for lane keeping is affected by perturbing the vehicle orientation (i.e., the simulated observer viewing direction) relative to the path, we for the first time separated the use of these two strategies and investigated their relative contributions to accurate lane-keeping performance. Specifically, we presented participants with visual displays simulating an observer steering a vehicle down a straight path defined by either a pair of posts (providing bearing angles only) or a segment of lane edges (providing bearing and splay angles) at a fixed viewing distance. The ground plane contained no flow, sparse flow, or dense flow information. Observers used a joystick to control the vehicle's lateral movement to stay in the center of the lane while facing random perturbations to both the vehicle's lateral position in the lane and its orientation relative to the path in the frequency range of 0.1–2.21 Hz. The lateral position perturbations affected the use of both splay and bearing angles for lane keeping, but the vehicle orientation perturbations affected the use of bearing angles only. In Experiment 1, we first measured participants' baseline performance on displays containing bearing angle information alone and then examined how added optic flow information affected lane keeping. In Experiment 2, we added splay angle information to the display and examined whether optic flow and bearing angles still affected lane keeping in the presence of splay angles provided by lane edges. 
Experiment 1
In this experiment, by systematically varying optic flow and bearing angle information in the display, we measured lane-keeping performance with bearing angle information alone and the effect of added optic flow information on lane keeping. To remove splay angle information in the display, as in the study by Beall and Loomis (1996), we used a pair of vertical posts to define the path. We tested three types of visual displays that provided an increasing amount of global flow and motion parallax information in optic flow: an empty ground, a sparse random-dot ground, and a dense random-dot ground (Figure 3). To vary the effectiveness of bearing angle information, we placed the two posts at three viewing distances (3.48, 9.18, and 36.7 m, Figure 4). 
Figure 3
 
Display conditions in the study. (a) Empty ground. (b) Sparse random-dot ground. (c) Dense random-dot ground.
Figure 4
 
The lane edges (the dashed lines, invisible in the experimental displays) depicted by a pair of vertical posts placed at (a) 3.48 m, (b) 9.18 m, and (c) 36.7 m along the simulated observer viewing direction (parallel to the path in this figure), corresponding to the near, medium, and far viewing distances.
Within the range of the perturbation magnitudes used in this study, the change in bearing angle due to the perturbation to the vehicle's lateral position (i.e., the partial derivative of Equation 1 with respect to X; see Appendix A) decreases with viewing distance, whereas the change in bearing angle due to the perturbation to the vehicle orientation (i.e., the partial derivative of Equation 1 with respect to θ; see Appendix A) increases with distance. Thus, if participants respond to the change in bearing angle to stay in the center of the lane as reported by Beall and Loomis (1996), their control response to the lateral position perturbation should decrease with viewing distance, whereas their control response to the vehicle orientation perturbation should increase with viewing distance. As the control response to the lateral position perturbation decreases the vehicle's deviation from the center of the lane but the control response to the vehicle orientation perturbation increases the deviation, the overall performance error should be the smallest for the near, followed by the medium and the far viewing distances. 
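To illustrate these predictions numerically (our sketch; the lane half-width and the 0.2 m and 2.3° perturbation magnitudes are the values reported in the Methods), the change in bearing angle produced by a small lateral shift and by a small rotation can be computed from Equation 1 at each of the three viewing distances.

```python
import numpy as np

def bearing(x, d, theta):
    """Equation 1 ('+' branch): bearing angle (deg) to a point on a lane edge."""
    return np.degrees(np.arctan(x / (d * np.cos(theta)) + np.tan(theta)))

x = 2.52 / 2                          # lateral distance to a lane edge, vehicle centered (m)
dx, dtheta = 0.2, np.radians(2.3)     # magnitudes of the two perturbations (see Methods)

for d in (3.48, 9.18, 36.7):          # near, medium, and far viewing distances (m)
    db_pos = bearing(x + dx, d, 0.0) - bearing(x, d, 0.0)   # effect of a lateral shift
    db_rot = bearing(x, d, dtheta) - bearing(x, d, 0.0)     # effect of a rotation
    print(f"D = {d:5.2f} m:  dB from 0.2 m shift = {db_pos:4.2f} deg,"
          f"  dB from 2.3 deg rotation = {db_rot:4.2f} deg")
# The lateral shift changes the bearing angle most at the near distance, whereas the
# rotation changes it most at the far distance, as the text predicts.
```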
Meanwhile, if participants use their perceived heading from optic flow for lane keeping, their control performance accuracy should improve as more optic flow information is added to the display. Due to the fact that the lateral position perturbation shifts their heading away from the center of the lane, participants should respond to the lateral position perturbation more when more optic flow information is added to the display to increase their heading estimation accuracy. In contrast, as the vehicle orientation perturbation does not move heading away from the center of the lane, we expect that participants' control response to the vehicle orientation perturbation should decrease and be less affected by viewing distance with added optic flow information. 
Methods
Participants
Eight students (two females and six males; seven naïve to the purpose of the experiment) between the ages of 21 and 30 at the University of Hong Kong participated in the experiment. All of them had normal or corrected-to-normal vision. 
Visual stimuli and control
The display simulated an observer steering a vehicle down a straight lane (width: 2.52 m) at a slow driving speed of 5 m/s over a ground plane (depth range: 1.4–100 m). During a trial, the vehicle's lateral position as well as its orientation was perturbed by the sum of seven harmonically independent sinusoids, and participants were asked to use a joystick (B&G Systems, JF3) to control the vehicle's lateral movement to stay in the center of the lane. The input perturbation (I) to the vehicle's lateral position (u) and its orientation (q) had the following form as a function of time (t): 
I(t) = D \sum_{i=1}^{7} a_i \sin\left( \frac{2\pi k_i}{90}\, t + \rho_i \right).    (3)
Different values of k were used to produce two different sets of seven non-harmonic frequencies (ω_i = k_i/90 Hz) for the perturbations to the vehicle's lateral position (u) and its orientation (q). Table 1 lists the values of a, k, and ω used for u and q. D was set to 0.2 m for the lateral position perturbation and 2.3° for the vehicle orientation perturbation. The phase offset of each sine component (ρ_i) was randomly varied from −π to π. The use of two different sets of harmonically independent sums of sines for the perturbations to the vehicle's lateral position and its orientation made the position and orientation perturbations unrelated to each other and made them appear pseudorandom. The average magnitude of the uncorrected input perturbation to the vehicle's lateral position and its orientation was 0.62 m (peak: 2.09 m) and 6.75° (peak: 25.03°), respectively. The joystick controlled the lateral movement of the vehicle, i.e., the joystick displacement was proportional to the vehicle's lateral velocity while the vehicle's overall speed remained constant at 5 m/s. The joystick position was sampled at 60 Hz (i.e., every frame of the display). 
Table 1
 
Magnitudes and frequencies of the seven harmonically independent sinusoids in the input perturbations to the vehicle's lateral position (u) and its orientation (q).
i    a_i    Lateral position (u)        Vehicle orientation (q)
            k_i      ω_i (Hz)           k_i      ω_i (Hz)
1    2      9        0.1                10       0.11
2    2      13       0.14               14       0.16
3    2      22       0.24               24       0.27
4    0.2    37       0.41               38       0.42
5    0.2    67       0.74               69       0.77
6    0.2    115      1.28               118      1.31
7    0.2    197      2.19               199      2.21
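As an illustration only (our sketch, not the experiment code; the random seed, the frame loop, and the reported checks are ours, and the sketch does not reproduce how the perturbation was injected into the vehicle dynamics), the two perturbation inputs can be generated from Equation 3 and Table 1 as follows.

```python
import numpy as np

FRAME_RATE = 60            # Hz, display frame rate
DURATION = 95.0            # s, trial length

# Table 1: amplitudes a_i and integer frequency indices k_i (omega_i = k_i / 90 Hz)
a   = np.array([2, 2, 2, 0.2, 0.2, 0.2, 0.2])
k_u = np.array([9, 13, 22, 37, 67, 115, 197])    # lateral position perturbation (u)
k_q = np.array([10, 14, 24, 38, 69, 118, 199])   # vehicle orientation perturbation (q)

def sum_of_sines(k, D, rng):
    """Equation 3: I(t) = D * sum_i a_i * sin(2*pi*k_i/90 * t + rho_i), rho_i ~ U(-pi, pi)."""
    t = np.arange(0.0, DURATION, 1.0 / FRAME_RATE)
    rho = rng.uniform(-np.pi, np.pi, size=len(k))
    return np.sum(D * a[:, None] * np.sin(2.0 * np.pi * k[:, None] / 90.0 * t + rho[:, None]),
                  axis=0)

rng = np.random.default_rng(1)
u_pert = sum_of_sines(k_u, D=0.2, rng=rng)   # lateral position input (m)
q_pert = sum_of_sines(k_q, D=2.3, rng=rng)   # vehicle orientation input (deg)

print("u frequencies (Hz):", np.round(k_u / 90.0, 2))
print("q frequencies (Hz):", np.round(k_q / 90.0, 2))
print("shared frequencies:", set(k_u) & set(k_q))   # empty set: the two inputs never overlap
```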
Three display conditions providing an increasing amount of optic flow information were tested: (a) Empty ground—the ground plane was filled with solid gray color thus providing no global flow information (Figure 3a); (b) sparse random-dot ground—the ground was composed of 100 white dots (0.5° in diameter, luminance contrast +99%) that were uniformly distributed on the ground plane. This display provided sparse global motion and motion parallax information in optic flow (Figure 3b); and (c) dense random-dot ground—the ground was composed of 300 white dots (Figure 3c). Dots were placed on the ground such that about the same number of dots at each distance in depth was displayed on each frame. This was to ensure that nearby parts of the ground were not too sparsely covered with dots. This display thus provided more global motion and motion parallax information than did the sparse random-dot ground display due to the increased number of dots and foreground motion. For both the sparse and dense random-dot ground displays, the number of visible dots per frame and the dot density distribution in depth were kept constant throughout the trial. The background sky was black in all three displays. 
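One simple way to obtain a roughly constant number of dots per depth interval, as described above for the dense random-dot ground (our sketch; the study does not report its exact dot-placement procedure, so the sampling scheme below is an assumption), is to sample dot depth uniformly and then spread each dot laterally across the ground visible at that depth.

```python
import numpy as np

rng = np.random.default_rng(2)
N_DOTS = 300                        # dense random-dot ground
DEPTH_RANGE = (1.4, 100.0)          # m
HALF_FOV = np.radians(110 / 2)      # horizontal half field of view

# Sampling depth uniformly gives about the same number of dots in every depth
# interval; sampling uniformly over ground *area* would instead leave the
# foreground sparsely covered, which is what the placement rule avoids.
z = rng.uniform(*DEPTH_RANGE, N_DOTS)
x = rng.uniform(-1.0, 1.0, N_DOTS) * z * np.tan(HALF_FOV)   # lateral spread grows with depth

counts, _ = np.histogram(z, bins=10, range=DEPTH_RANGE)
print("dots per depth bin:", counts)   # roughly equal counts across bins
```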
In this experiment, the lane edges were depicted by a pair of red posts (1.5°H × 10.2°V) to ensure that participants could only rely on the bearing angles provided by the two posts for lane keeping. The posts were placed at a fixed viewing distance of 3.48, 9.18, or 36.7 m along the simulated observer viewing direction (i.e., the vehicle orientation), corresponding to the near, medium, and far viewing distances, respectively (Figure 4). These three viewing distances were chosen to maximize the change in bearing angle caused by the two perturbations at different viewing distances. The posts moved with the simulated vehicle/observer movement to maintain a constant viewing distance throughout the trial (Figures 1a and 1b), therefore the posts did not expand. 
The visual stimuli were generated on a Dell Precision Workstation 670n with an NVIDIA Quadro FX 1800 graphics card at a frame rate of 60 Hz. They were rear-projected on a large screen (110°H × 94°V) with an Epson EMP-9300 LCD projector (native resolution: 1400 × 1050 pixels, refresh rate: 60 Hz) in a light-excluded viewing booth. The screen edges were covered in matte black cloth to minimize the availability of an artificial frame of reference. Participants viewed the visual stimuli monocularly with their dominant eye from a chin rest. They thus could not move their head but could move their eyes during the course of a trial. The simulated eye height in the display was 1.51 m, corresponding to the average eye height of participants sitting on a high chair 0.56 m away from the screen. 
Procedure
Participants pulled the trigger of the joystick to start each trial. They were instructed to imagine looking through the windshield of a vehicle that was traveling on a straight path while facing crosswind perturbations to both the vehicle's lateral position in the lane and its orientation (i.e., their simulated viewing direction through the windshield) relative to the path. The vehicle initially moved according to the sum-of-sines perturbation input, but its lateral deviation from the center of the lane was reduced as the participant moved the joystick leftward and rightward to control the vehicle's lateral movement and remain centered in the lane. The participant did not have control of the vehicle orientation (i.e., their simulated viewing direction through the windshield) and was asked to ignore the perturbation to the vehicle orientation as much as possible. The duration of each trial was 95 s. 
A 3 (viewing distance) × 3 (display type) within-subject design was used in this experiment. Three experimental sessions were run with each session containing 18 randomized trials (2 trials × 3 display types × 3 viewing distances). To ensure participants understood the task and became familiar with the joystick control dynamics, they received practice trials on a textured-ground display with lane markers spanning the whole depth range of the ground (1.4–100 m) before the experiment commenced. They first received practice trials containing the perturbation to the vehicle's lateral position in the lane only. The practice continued until their performance appeared stable, which usually required six trials. They then performed two practice trials with both the lateral position and the vehicle orientation perturbations. In total, the experiment lasted about 2 h. 
Data analysis
Time series of the vehicle's lateral deviation from the center of the lane (defined as lateral position error), the joystick control output, the input lateral position, and the vehicle orientation perturbations were recorded. We analyzed the data beginning 5 s after the start of the trial to ensure that we skipped the initial transient response. Total performance error was measured as the root mean square (RMS) of the time series of the recorded lateral position error. To describe the extent to which participants responded to the lateral position (u) or the vehicle orientation (q) perturbation, we computed the control power (P) correlated with each of the two input perturbations 
P = \frac{2 \sum_{i=1}^{7} |C(\omega_i)|^2}{\sum_{i=1}^{n} |C(i)|^2},    (4)
where C(i) are the coefficients of the Discrete Fourier Transform of the control output (δ), n is the total number of frames analyzed in each trial (60 Hz × 90 s = 5400), and C(ω_i) are the coefficients at the input lateral position or vehicle orientation perturbation frequencies ω_i. To examine how the RMS error, the control power correlated with the lateral position perturbation (P_δu), and the control power correlated with the vehicle orientation perturbation (P_δq) change with viewing distance and display condition, we conducted a 3 (viewing distance) × 3 (display type) repeated-measures ANOVA on each of these three measurements. For any violation of the sphericity assumption, the degrees of freedom were adjusted using conservative Greenhouse–Geisser estimates. The pattern of data from the experienced participant was similar to that of the remaining seven naïve participants, so the data from all participants were analyzed together. 
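In practice, Equation 4 can be computed from a recorded joystick time series roughly as follows (our sketch; the synthetic control signal, the variable names, and the assumption that the DFT is taken over the 90 s of analyzed data are ours).

```python
import numpy as np

FRAME_RATE = 60
N = FRAME_RATE * 90                      # frames analyzed per trial (first 5 s discarded)

def control_power(delta, input_freqs_hz):
    """Equation 4: fraction of control-output power at the input perturbation frequencies."""
    C = np.fft.fft(delta)                              # two-sided DFT, n coefficients
    freqs = np.fft.fftfreq(len(delta), d=1.0 / FRAME_RATE)
    total = np.sum(np.abs(C) ** 2)                     # denominator: sum over all n coefficients
    at_inputs = sum(np.abs(C[np.argmin(np.abs(freqs - f))]) ** 2 for f in input_freqs_hz)
    return 2.0 * at_inputs / total                     # factor 2: matching negative-frequency bins

# Synthetic example: a control output that tracks the u frequencies plus broadband noise.
t = np.arange(N) / FRAME_RATE
u_freqs = np.array([9, 13, 22, 37, 67, 115, 197]) / 90.0
rng = np.random.default_rng(3)
delta = sum(np.sin(2 * np.pi * f * t + rng.uniform(-np.pi, np.pi)) for f in u_freqs)
delta = delta + 0.5 * rng.standard_normal(N)
print(f"P_du = {100 * control_power(delta, u_freqs):.1f}%")   # most power falls at the u frequencies
```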
Results and discussion
Figure 5a plots the mean RMS error averaged across eight participants as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on the RMS error reveals that both main effects of viewing distance and display type are significant (F(1.02,7.11) = 95.48, p < 0.0001 and F(1.07,7.51) = 27.15, p < 0.001, respectively), as well as the interaction effect of viewing distance and display type (F(1.12,7.83) = 21.72, p < 0.01). For all three display conditions, the RMS error is smallest at the near viewing distance and increases with distance, indicating that participants relied on bearing angles provided by the two posts for lane keeping. However, the increase of the RMS error with viewing distance is the largest for the empty ground display, followed by the sparse and then the dense random-dot ground display, indicating the effect of optic flow information on lane-keeping control. 
Figure 5
(a) Mean RMS error, (b) mean P_δu, and (c) mean P_δq as a function of viewing distance for the three display conditions in Experiment 1. Error bars represent SEs across eight participants.
Figure 5b plots the mean control power correlated with the lateral position perturbation (P_δu) averaged across eight participants as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on P_δu also shows that both main effects of viewing distance and display type are significant (F(1.08,7.58) = 45.22, p < 0.001 and F(2,14) = 52.45, p < 0.00001, respectively), as well as the interaction effect of viewing distance and display type (F(1.63,11.39) = 9.29, p < 0.01). For all three display conditions, in contrast to the RMS error, P_δu decreases with viewing distance, indicating that participants' control response to the lateral position perturbation decreases with viewing distance. This is again consistent with the use of the bearing angle strategy as the lateral position perturbation causes a larger change in bearing angle at a near than far viewing distance. The overall increase of P_δu from the empty ground to the sparse and then to the dense random-dot ground display and the relatively shallow slope of P_δu against distance for the two random-dot ground displays support the claim that with added optic flow information, participants relied more on their perceived heading from optic flow for lane keeping. 
Lastly, Figure 5c plots the mean control power correlated with the vehicle orientation perturbation (P_δq) as a function of viewing distance for the three display conditions. Note that the nature of the control task is to respond to the lateral position perturbation while ignoring the vehicle orientation perturbation; thus good performance should be highly correlated with the lateral position perturbation (i.e., high P_δu values) but not the vehicle orientation perturbation (i.e., low P_δq values). A repeated-measures ANOVA on P_δq once again shows that both main effects of viewing distance and display type are significant (F(2,14) = 35.1, p < 0.00001 and F(2,14) = 47.41, p < 0.00001, respectively), as well as the interaction effect of viewing distance and display type (F(4,28) = 16.37, p < 0.00001). For all three display conditions, P_δq increases with viewing distance, indicating that the control response to the vehicle orientation perturbation increases with distance. This agrees with the prediction that, opposite to the lateral position perturbation, the vehicle orientation perturbation causes a larger change in bearing angle at a far than near distance. The overall decrease of P_δq from the empty ground to the sparse and then to the dense random-dot ground display as well as the smaller increase of P_δq with distance for the two random-dot ground displays provide further supporting evidence of the use of the heading strategy in lane keeping with enriched optic flow displays. 
Combining the RMS error and the control power correlation data, the above results show that both bearing angle and optic flow information are used for lane keeping. As more optic flow information is added to the display, participants start relying more on their perceived heading from optic flow for lane keeping. The largest effect of optic flow information on lane keeping at the far viewing distance is consistent with the fact that the use of heading from optic flow for lane keeping is not affected by viewing distance but the use of bearing angles is. At the near viewing distance of 3.5 m, bearing angle information itself can lead to reasonably accurate lane-keeping control. In the next experiment, we examine whether bearing angle and optic flow information are still used for lane keeping in the presence of splay angle information. 
Experiment 2
In this experiment, we added splay angle information to the display by placing a segment of lane edges on the ground to define the straight path. The midpoint of the lane edge segment was placed at the same three viewing distances (3.48, 9.18, and 36.7 m, Figure 6) as the pair of posts in Experiment 1. The projected vertical extent (3.1°V) of the lane edge segment on the image plane was kept constant to ensure that the perceptual salience of splay angle information did not change across the three viewing distances. 
Figure 6
 
The path (the dashed lines, invisible in the experimental displays) depicted by a segment of lane edges placed at (a) 3.48 m, (b) 9.18 m, and (c) 36.7 m along the simulated observer viewing direction (parallel to the path in this figure), corresponding to the near, medium, and far viewing distances.
If participants rely solely on splay angles when they are available for lane keeping as reported by Beall and Loomis (1996), as they are equally salient across the three viewing distances, we expect that the overall performance error would be small and similar at all three viewing distances for all display conditions. Furthermore, because the perturbation to the vehicle's lateral position in the lane causes the same amount of change in splay angle at different viewing distances, participants should respond to the lateral position perturbation similarly at all three viewing distances. Lastly, due to the fact that the vehicle orientation perturbation affects the use of the bearing but not splay angle information for lane keeping, participants should not respond much to the vehicle orientation perturbation. 
In contrast, if participants use their perceived heading from optic flow in addition to splay angle information for lane keeping, we expect that their control performance would improve as more optic flow information is added to the display. If bearing angles are also used, we expect that the performance error would be smaller than in Experiment 1 but would still increase with viewing distance, especially for the empty ground display containing no flow information. In addition, participants would respond to the lateral position and the vehicle orientation perturbations in a similar way as they did in Experiment 1. 
Methods
Participants
The same eight participants from Experiment 1 participated in the experiment. 
Visual stimuli and procedure
The visual stimuli were the same as those in Experiment 1 except that the traveling path was depicted by a segment of red lane markers on the ground (3.1°V). The midpoint of the lane marker segment was placed at 3.48 m, 9.18 m, or 36.7 m along the simulated observer viewing direction (i.e., the vehicle orientation), corresponding to the near, medium, and far viewing distances as in Experiment 1 (Figure 6). The lane marker segment moved with the simulated vehicle/observer movement to maintain a constant size and viewing distance throughout the trial. 
As in Experiment 1, a 3 (viewing distance) × 3 (display type) within-subject design was used. Three experimental sessions were run with each session containing 18 randomized trials (2 trials × 3 display types × 3 viewing distances). The experiment lasted about 2 h. 
Results and discussion
Figure 7a plots the mean RMS error averaged across eight participants as a function of viewing distance for the three display conditions. Note that the Y-axis scale in Figure 7a is smaller than that in Figure 5a. Thus, on average, the RMS errors were smaller than those from Experiment 1, especially for the empty ground display at the far viewing distance. A repeated-measures ANOVA on the RMS error reveals that only the main effect of display type is significant (F(1.14,7.95) = 26.12, p < 0.001). Newman–Keuls tests show that the overall RMS error for the empty ground display (mean: 0.72 m) is significantly larger than that for the sparse (0.55 m, p < 0.001) and the dense random-dot ground displays (0.51 m, p < 0.001), and the overall RMS errors for the two random-dot ground displays are not significantly different from each other. While the overall small RMS error and the lack of a viewing distance effect are consistent with the use of the splay angle cue in lane keeping, the decrease of the RMS error from the empty ground display to the two random-dot ground displays confirms the role of added optic flow information in lane-keeping control. 
Figure 7
(a) Mean RMS error, (b) mean P_δu, and (c) mean P_δq as a function of viewing distance for the three display conditions in Experiment 2. Error bars represent SEs across eight participants. Note that the Y-axis scale in (a) is smaller than that in Figure 5a.
For participants' control response to the input lateral position perturbation, Figure 7b plots the mean P_δu averaged across eight participants as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on P_δu also shows that only the main effect of display type is significant (F(2,14) = 56.23, p ≪ 0.001). Newman–Keuls tests show that the overall P_δu for the empty ground display (mean: 34.49%) is significantly smaller than that for the sparse (48.77%, p < 0.001) and the dense random-dot ground displays (53.38%, p < 0.001), and the overall P_δu for the sparse random-dot ground display is significantly smaller than that for the dense random-dot ground display (p < 0.05). Again, while the lack of a viewing distance effect on P_δu supports the use of splay angles to stay in the center of the lane, the systematic increase of P_δu mirrors the trend of the RMS error data and confirms that with enriched optic flow displays, participants relied more on their perceived heading from optic flow for lane keeping. 
For participants' control response to the input vehicle orientation perturbation, Figure 7c plots the mean P_δq as a function of viewing distance for the three display conditions. A repeated-measures ANOVA on P_δq shows that both the main effect of display type and the interaction effect of display type and viewing distance are significant (F(1.06,7.42) = 6.46, p < 0.05 and F(4,28) = 4.66, p < 0.01, respectively). Newman–Keuls tests show that P_δq for the empty ground display is significantly larger than that for the two random-dot ground displays at the near and medium viewing distances (p < 0.01 for all cases) but not at the far viewing distance. Consistent with the use of splay angles, even for the empty ground display, participants' control response to the vehicle orientation perturbation was low (P_δq < 30%) and not affected by viewing distance. Adding more optic flow information to the display led participants to respond to the vehicle orientation perturbation even less, especially at the near and medium viewing distances. 
To sum up, the lack of the viewing distance effect on all three performance measurements indicates that although the lane markers provided both bearing and splay angle information, participants ignored bearing but relied on splay angles for lane keeping. However, participants did not exclusively rely on splay angle information to stay in the center of the lane. The effect of added optic flow information in the sparse and dense random-dot ground displays on lane keeping is consistently observed on all three performance measurements, indicating that even when the display contains splay angles provided by lane edges, enriched optic flow displays can still further improve the control performance in many aspects. 
General discussion
Combining the results from the two experiments, we find that lane-keeping control improves as more optic flow information is added to the scene regardless of the availability of bearing or splay angle information. As the lateral position perturbation affects the use of both bearing and splay angle cues but the vehicle orientation perturbation affects only the use of bearing angles for lane keeping, by analyzing the extent to which participants respond to these two perturbations, we separate the use of these two cues simultaneously provided by lane edges. We find that in the presence of splay angles, observers tend to ignore bearing and rely mainly on splay angles for lane keeping. This is likely due to the fact that splay angles are a more robust source of information for lane keeping than bearing angles, as they are independent of viewing distance and not affected by vehicle rotation. 
Although optic flow normally accompanies locomotion, many studies have challenged the claim (see Gibson, 1950, 1979) that humans use heading specified by information in optic flow for the control of self-motion (e.g., Beall & Loomis, 1996; Harris & Bonas, 2002; Nakayama, 1994; Rushton, Harris, Lloyd, & Wann, 1998; Wann & Land, 2000), and many other studies show that humans use both optic flow and egocentric direction cues to control self-motion (e.g., Harris & Carré, 2001; Warren, Kay, Zosh, Duchon, & Sahuc, 2001; Wilkie & Wann, 2002, 2003). Our findings are consistent with the latter studies in showing that participants use information from the flow field for lane keeping, and the accuracy of their control performance improves as more flow information is added to the scene. This is at odds with what was reported by Beall and Loomis (1996) that when splay angles were available, participants ignored the global flow field of dot motion on the ground and relied solely on splay angles for lane keeping. 
We surmise that the different findings reported by Beall and Loomis (1996) might be due to the following reasons. First, Beall and Loomis placed 100 dots on the ground in the depth range of about 9 to 4800 m. During a trial, dots that passed below the lower edge of the screen were placed back at 0.85° below the horizon on the screen. This resulted in almost all the dots clustering near the horizon and very few dots in the foreground, which significantly reduced the amount of global motion parallax information in the display and could have made heading estimation more difficult. Second, the field of view (FOV) used in their study was about 20°H × 15°V while it was 110°H × 94°V in the current study. A large FOV has been shown to help perception and control of self-motion (see Wolpert, 1990, for a review) and is essential for providing sufficient motion parallax information in the flow field for accurate heading perception during rotation (Grigo & Lappe, 1999; Li et al., 2009; Li & Warren, 2000, 2004). Lastly, in their study, the input perturbation was a crosswind force (thus affecting the vehicle's lateral acceleration) and the perturbation frequency ranged from 0.02 to 0.5 Hz. Given that the temporal integration of a sum-of-sines perturbation function attenuates the high-frequency components, and the vehicle's heading is specified by the temporal integration of the vehicle's lateral acceleration, the perturbation to the vehicle's heading in their study contained mainly low-frequency components. In contrast, in our current study, the input perturbation affected the vehicle's lateral position and the perturbation frequency ranged from 0.1 to 2.19 Hz. Given that the temporal derivative of a sum-of-sines perturbation function magnifies the high-frequency components, and the vehicle's heading is specified by the temporal derivative of the vehicle's lateral position, the perturbation to the vehicle's heading in our study contained largely high-frequency components. As a result, the movement of heading away from the center of the lane might be more obvious, which led to more use of the heading strategy. 
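The frequency-weighting argument can be made explicit with a single sinusoidal component (a standard identity, not a result from either study):

\[
\frac{d}{dt}\, a\sin(2\pi f t) = 2\pi f\, a\cos(2\pi f t), \qquad
\int a\sin(2\pi f t)\, dt = -\frac{a}{2\pi f}\cos(2\pi f t) + \mathrm{const.}
\]

Differentiating a position perturbation therefore scales each component's amplitude by 2πf, magnifying the high-frequency terms that specify heading in the current study, whereas integrating an acceleration perturbation scales each component by 1/(2πf), attenuating the high-frequency terms in the heading perturbation of Beall and Loomis (1996).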
It is also possible that the effect of optic flow on lane keeping we observed in the current study is caused by the limited depth range of the lane markers in the visual stimuli, which is different from what we normally experience in real-life driving, when the lane markers span the whole depth range of the ground plane (as shown by the white dashed lines in Figure 6a). However, we believe that this possibility is unlikely for the following reasons. First, splay angle is given by the orientation of the optical projection of any two points on a lane edge relative to a vertical line in the image plane, independent of distance, as shown by Equation 2. Accordingly, the use of splay angles provided by the lane edges for lane keeping should not depend on the depth range of the lane markers. Second, the vertical visual angle of the lane marker segment used in the study by Beall and Loomis (1996) was 1.15°, and they did not find a contribution of optic flow to lane keeping. In contrast, the vertical visual angle of the lane marker segment used in the current study was 3.1° (about 2.7 times larger), yet we found significant improvement in control performance as dense flow information was added to the scene. This is inconsistent with the interpretation that the significant contribution of optic flow observed in the current study is due to a limited depth range across which the lane markers were visible. Last, in a pilot experiment we conducted in which the lane markers spanned the depth range of 3.2–100 m and the vehicle's lateral position was perturbed, we found that for 14 observers, lane-keeping control performance (measured by the RMS error) was better on a random-dot ground than on an empty ground display. All of these observations support the claim that the effect of optic flow on lane keeping on a straight path is not affected by the depth range of visible lane markers. 
Several researchers have proposed that drivers take a parallel two-level approach for driving: (1) a guidance level using information to plan and execute the steering trajectory, and (2) a stabilization level using information to keep the vehicle's current position within the road boundaries (Donges, 1978; Land, 1998; Salvucci & Gray, 2004). The findings from the current study shed light on the specific sources of information people use for lane keeping on a straight path at these two levels. As heading specified by information in optic flow informs people of their travel direction, it can be used to plan the steering trajectory on a straight path. However, heading does not inform people about their positions relative to the lane edges, thus for immediate stabilization to stay within the lane boundaries, people need to use splay angles to eliminate any deviation from the center of the lane. 
The findings from the current study have practical implications for the design of modern highways and driving simulators. As both optic flow and splay angle information are used for lane keeping, highway and driving-simulator interface designers should provide drivers with multiple sources of information (such as reference objects on the roadsides, road lights, and road signs) in addition to road edges to support accurate lane-keeping control and help prevent traffic accidents. 
Appendix A
Bearing angle calculation
Assuming that the vehicle is rotated clockwise by angle θ with respect to the path due to the orientation perturbation, the left and right bearing angles (B_L and B_R) to reference points A′ and B′ on the lane edges (Figure 1b) are, respectively, given by 
B_L = \arctan\left( \frac{|A'G|}{D} \right) \quad \text{and} \quad B_R = \arctan\left( \frac{|GB'|}{D} \right),    (A1)
where 
|A'G| = |A'C| + |CG| = \frac{X_L}{\cos\theta} + D\tan\theta, \qquad |GB'| = |CB'| - |CG| = \frac{X_R}{\cos\theta} - D\tan\theta.    (A2)
Combining Equations A1 and A2, we get 
B = \arctan\left( \frac{X/\cos\theta \pm D\tan\theta}{D} \right) = \arctan\left( \frac{X}{D\cos\theta} \pm \tan\theta \right).    (A3)
 
The change of bearing angle (B) due to the perturbations to the vehicle's lateral position and its orientation is thus the partial derivative of B with respect to X and θ, respectively, 
\frac{\partial B}{\partial X} = \frac{1}{D\cos\theta \left( 1 + \left( \frac{X}{D\cos\theta} \pm \tan\theta \right)^2 \right)},    (A4)
and 
\frac{\partial B}{\partial \theta} = \frac{\frac{X\sin\theta}{D\cos^2\theta} \pm 1 \pm \tan^2\theta}{1 + \left( \frac{X}{D\cos\theta} \pm \tan\theta \right)^2}.    (A5)
 
Within the range of the lateral position and orientation perturbation magnitudes (i.e., the range of X and θ values) used in the study, ∂B/∂X decreases with viewing distance D and ∂B/∂θ increases with D.
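As a quick sanity check on Equations A4 and A5 (our sketch; the test values are arbitrary), the analytic partial derivatives can be compared against central finite differences of Equation A3:

```python
import numpy as np

def bearing(x, d, theta, sign=+1.0):
    """Equation A3, with the '+' or '-' branch selected by sign."""
    return np.arctan(x / (d * np.cos(theta)) + sign * np.tan(theta))

def dB_dX(x, d, theta, sign=+1.0):
    """Equation A4."""
    u = x / (d * np.cos(theta)) + sign * np.tan(theta)
    return 1.0 / (d * np.cos(theta) * (1.0 + u ** 2))

def dB_dtheta(x, d, theta, sign=+1.0):
    """Equation A5."""
    u = x / (d * np.cos(theta)) + sign * np.tan(theta)
    return (x * np.sin(theta) / (d * np.cos(theta) ** 2)
            + sign * (1.0 + np.tan(theta) ** 2)) / (1.0 + u ** 2)

x, d, theta, eps = 1.26, 9.18, np.radians(10.0), 1e-6
num_dX = (bearing(x + eps, d, theta) - bearing(x - eps, d, theta)) / (2 * eps)
num_dth = (bearing(x, d, theta + eps) - bearing(x, d, theta - eps)) / (2 * eps)
print(np.isclose(num_dX, dB_dX(x, d, theta)),
      np.isclose(num_dth, dB_dtheta(x, d, theta)))   # True True
```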
Appendix B
Splay angle calculation
Given that splay angle is the angle between the optical projection of a lane edge and any vertical line in the image plane, we first describe how to calculate the projected position in the image plane of a surface point in the world. Let XYZ represent the world coordinate system and X_eye Y_eye Z_eye represent the viewer's eye-centered coordinate system, as depicted in Figure 2a. Each point in the world has an associated homogeneous position vector, P = (X, Y, Z, 1)^T. For a known eye position in the world, O = (O_X, O_Y, O_Z)^T, the position of the same surface point in the eye-centered coordinate system, P_eye = (X_eye, Y_eye, Z_eye, 1)^T, is 
P_{eye} = \begin{bmatrix} R & -R O \\ 0 & 1 \end{bmatrix} P = \begin{bmatrix} X_{eye} \\ Y_{eye} \\ Z_{eye} \\ 1 \end{bmatrix},    (B1)
where R is a 3 × 3 rotation matrix defined by the orientation of the eye-centered coordinate system relative to the world. Under perspective projection, this surface point projects to a point in the image plane, p = (x, y)^T, with 

x = \frac{f X_{eye}}{Z_{eye}}, \qquad y = \frac{f Y_{eye}}{Z_{eye}},    (B2)
where f is the focal length of the eye. 
Now we describe how to calculate splay angle. Let P_1 = (−X_R, 0, Z_1, 1)^T and P_2 = (−X_R, 0, Z_2, 1)^T be two points on the right lane marker in the world, and O = (0, H, 0)^T the coordinates of the viewer's eye position in the world (Figure 2a). Assuming that the perturbation to the vehicle orientation (i.e., the simulated observer viewing direction through the windshield) rotates the Z_eye-axis clockwise by angle θ with respect to the Z-axis in the world, the rotation matrix R is given by 
R = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}.    (B3)
The positions of the two points on the lane marker in the eye-centered coordinate system, P_1eye = (X_1eye, Y_1eye, Z_1eye, 1)^T and P_2eye = (X_2eye, Y_2eye, Z_2eye, 1)^T, are 
P_{1eye} = \begin{bmatrix} R & -R O \\ 0 & 1 \end{bmatrix} P_1 = \begin{bmatrix} -X_R\cos\theta - Z_1\sin\theta \\ -H \\ Z_1\cos\theta - X_R\sin\theta \\ 1 \end{bmatrix} = \begin{bmatrix} X_{1eye} \\ Y_{1eye} \\ Z_{1eye} \\ 1 \end{bmatrix},    (B4)
and 
P_{2eye} = \begin{bmatrix} R & -R O \\ 0 & 1 \end{bmatrix} P_2 = \begin{bmatrix} -X_R\cos\theta - Z_2\sin\theta \\ -H \\ Z_2\cos\theta - X_R\sin\theta \\ 1 \end{bmatrix} = \begin{bmatrix} X_{2eye} \\ Y_{2eye} \\ Z_{2eye} \\ 1 \end{bmatrix}.    (B5)
Under perspective projection, these two points project to points p_1 = (x_1, y_1)^T and p_2 = (x_2, y_2)^T in the image plane, with 
x_1 = \frac{f X_{1eye}}{Z_{1eye}} = \frac{f(-X_R\cos\theta - Z_1\sin\theta)}{Z_1\cos\theta - X_R\sin\theta}, \qquad y_1 = \frac{f Y_{1eye}}{Z_{1eye}} = \frac{-f H}{Z_1\cos\theta - X_R\sin\theta},    (B6)
and 
x_2 = \frac{f X_{2eye}}{Z_{2eye}} = \frac{f(-X_R\cos\theta - Z_2\sin\theta)}{Z_2\cos\theta - X_R\sin\theta}, \qquad y_2 = \frac{f Y_{2eye}}{Z_{2eye}} = \frac{-f H}{Z_2\cos\theta - X_R\sin\theta}.    (B7)
The splay angle provided by the right lane marker (S_R) is then given by the orientation of the line segment p_1 p_2 relative to a vertical line in the image plane 
S_R = \arctan\left( \frac{|x_1 - x_2|}{|y_1 - y_2|} \right) = \arctan\left( \frac{X_R}{H\cos\theta} \right).    (B8)
Similarly, the splay angle provided by the left lane marker is 
S_L = \arctan\left( \frac{X_L}{H\cos\theta} \right).    (B9)
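The projection pipeline in Equations B1–B8 can be checked numerically with a short sketch (ours; the specific values of X_R, H, θ, and the two depths are arbitrary, and the sign convention of the rotation matrix follows Equation B3 as reconstructed above):

```python
import numpy as np

def world_to_image(p_world, eye, theta, f=1.0):
    """Equations B1-B2: rotate into the eye-centered frame, then project."""
    R = np.array([[np.cos(theta), 0.0, -np.sin(theta)],
                  [0.0,           1.0,  0.0],
                  [np.sin(theta), 0.0,  np.cos(theta)]])
    x_eye, y_eye, z_eye = R @ (np.asarray(p_world) - np.asarray(eye))
    return f * x_eye / z_eye, f * y_eye / z_eye

X_R, H, theta = 1.26, 1.51, np.radians(25.0)
eye = (0.0, H, 0.0)
p1 = world_to_image((-X_R, 0.0, 5.0),  eye, theta)   # two points on the right lane edge
p2 = world_to_image((-X_R, 0.0, 20.0), eye, theta)

splay_from_image = np.degrees(np.arctan(abs(p1[0] - p2[0]) / abs(p1[1] - p2[1])))
splay_from_eq    = np.degrees(np.arctan(X_R / (H * np.cos(theta))))
print(f"splay from projected points: {splay_from_image:.3f} deg")
print(f"splay from Equation B8:      {splay_from_eq:.3f} deg")   # the two values agree
```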
 
Acknowledgments
This study was supported by a grant from the Research Grants Council of Hong Kong (HKU 7471/06H) to L. Li. We thank Shuda Li for his assistance in programming, and Jack Loomis, Diederick Niehorster, and two anonymous reviewers for their helpful suggestions. 
Commercial relationships: none. 
Corresponding author: Li Li. 
Email: lili@hku.hk. 
Address: Department of Psychology, The University of Hong Kong, Pokfulam, Hong Kong, China SAR. 
References
Banks M. S. Ehrlich S. M. Backus B. T. Crowell J. A. (1996). Estimating heading during real and simulated eye movements. Vision Research, 36, 431–443.
Beall A. C. Loomis J. M. (1996). Visual control of steering without course information. Perception, 25, 481–494.
Beusmans J. (1995). Center of outflow is not used to control locomotion (Technical Report CBR TR 95-5). Cambridge, MA: Cambridge Basic Research.
Chatziastros A. Wallis G. M. Bülthoff H. H. (1999). The use of splay angle and optical flow in steering a central path (Technical Report 72). Tübingen, Germany: Max Planck Institute for Biological Cybernetics.
Donges E. (1978). A two level model of driver steering behaviour. Human Factors, 20, 691–707.
Duchon A. P. Warren W. H. (2002). A visual equalization strategy for locomotor control: Of honeybees, humans, and robots. Psychological Science, 13, 272–278.
Gibson J. J. (1950). The perception of the visual world. Boston: Houghton Mifflin.
Gibson J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
Grigo A. Lappe M. (1999). Dynamical use of different sources of information in heading judgments from retinal flow. Journal of the Optical Society of America A, 16, 2079–2091.
Harris J. M. Bonas W. (2002). Optic flow and scene structure do not always contribute to the control of human walking. Vision Research, 42, 1619–1626.
Harris M. G. Carré G. (2001). Is optic flow used to guide walking while wearing a displacing prism? Perception, 30, 811–818.
Kandil F. I. Rotter A. Lappe M. (2009). Driving is smoother and more stable when using the tangent point. Journal of Vision, 9, (1):11, 1–11, http://www.journalofvision.org/content/9/1/11, doi:10.1167/9.1.11.
Kandil F. I. Rotter A. Lappe M. (2010). Car drivers attend to different gaze targets when negotiating closed vs open bends. Journal of Vision, 10, (4):24, 1–24, http://www.journalofvision.org/content/10/4/24, doi:10.1167/10.4.24.
Land M. F. (1998). The visual control of steering. In Harris L. R. Jenkin M. (Eds.), Vision and action (pp. 163–180). Cambridge, New York: Cambridge University Press.
Land M. F. Lee D. N. (1994). Where we look when we steer. Nature, 369, 742–744.
Li L. Chen J. Peng X. (2009). Influence of visual path information on human heading perception during rotation. Journal of Vision, 9, (3):29, 1–14, http://www.journalofvision.org/content/9/3/29, doi:10.1167/9.3.29.
Li L. Sweet B. T. Stone L. S. (2006). Humans can perceive heading without visual path information. Journal of Vision, 6, (9):2, 874–881, http://www.journalofvision.org/content/6/9/2, doi:10.1167/6.9.2.
Li L. Warren W. H. (2000). Perception of heading during rotation: Sufficiency of dense motion parallax and reference objects. Vision Research, 40, 3873–3894.
Li L. Warren W. H. (2004). Path perception during rotation: Influence of instructions, depth range, and dot density. Vision Research, 44, 1879–1889.
McLean J. R. Hoffmann E. R. (1973). The effects of restricted preview on driver steering control and performance. Human Factors, 15, 421–430.
Nakayama K. (1994). James Gibson—An appreciation. Psychological Review, 101, 329–335.
Regan D. Beverley K. I. (1982). How do we avoid confounding the direction we are looking and the direction we are moving? Science, 215, 194–196.
Rushton S. K. Harris J. M. Lloyd M. Wann J. P. (1998). Guidance of locomotion on foot uses perceived target location rather than optic flow. Current Biology, 8, 1191–1194.
Salvucci D. D. Gray R. (2004). A two-point visual control model of steering. Perception, 33, 1233–1248.
Stone L. S. Perrone J. A. (1997). Human heading estimation during visually simulated curvilinear motion. Vision Research, 37, 573–590.
Wann J. Land M. (2000). Steering with or without the flow: Is the retrieval of heading necessary? Trends in Cognitive Sciences, 4, 319–324.
Warren W. H. (1998). Visually controlled locomotion: 40 years later. Ecological Psychology, 10, 177–219.
Warren W. H. Kay B. A. Zosh W. D. Duchon A. P. Sahuc S. (2001). Optic flow is used to control human walking. Nature Neuroscience, 4, 213–216.
Weir D. H. McRuer D. T. (1970). Dynamics of driver vehicle steering control. Automatica, 6, 87–98.
Weir D. H. Wojcik C. K. (1971). Simulator studies of the driver's dynamic response in steering control tasks. Highway [Transportation] Research Record, 364, 1–15.
Wilkie R. M. Wann J. P. (2002). Driving as night falls: The contribution of retinal flow and visual direction to the control of steering. Current Biology, 12, 2014–2017.
Wilkie R. M. Wann J. P. (2003). Controlling steering and judging heading: Retinal flow, visual direction, and extra-retinal information. Journal of Experimental Psychology: Human Perception and Performance, 29, 363–378.
Wohl J. G. (1961). Man–machine steering dynamics. Human Factors, 3, 222–228.
Wolpert L. (1990). Field-of-view information for self-motion perception. In Warren R. Wertheim A. H. (Eds.), Perception and control of self-motion (pp. 101–126). Hillsdale, NJ: Lawrence Erlbaum.
Figure 1
 
An illustration of the use of bearing angles in lane keeping. (a) The vehicle is oriented along the straight path. The bearing angles to reference points A and B on the left and right lane edges are given by B_L = tan⁻¹(X_L/D) and B_R = tan⁻¹(X_R/D). When B_L = B_R, the vehicle is at the center of the lane. (b) The vehicle orientation is now rotated θ = 25° clockwise with respect to the path. The bearing angles to reference points A′ and B′ are now given by B′_L = tan⁻¹(X_L/(D cos 25°) + tan 25°) and B′_R = tan⁻¹(X_R/(D cos 25°) − tan 25°). Even when the vehicle is still at the center of the lane, B′_L ≠ B′_R.
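To make the bearing-angle geometry concrete, the following minimal Python sketch implements the formulas from Figure 1. It is illustrative only and not code from the study: the function name bearing_angles and the 1.5-m lateral distances in the example calls are assumptions; the 36.7-m value mirrors the far viewing distance used in the displays.

import math

def bearing_angles(X_L, X_R, D, theta_deg=0.0):
    """Bearing angles (deg) to reference points on the left and right lane edges,
    measured from the vehicle's viewing direction.

    X_L, X_R : lateral distances (m) from the vehicle to the left/right lane edges
    D        : viewing distance (m) along the viewing direction to the reference points
    theta_deg: vehicle yaw (deg) clockwise relative to the path (0 = aligned with path)
    """
    th = math.radians(theta_deg)
    # Formulas from Figure 1: yaw shifts the reference points along the lane edges.
    B_L = math.degrees(math.atan(X_L / (D * math.cos(th)) + math.tan(th)))
    B_R = math.degrees(math.atan(X_R / (D * math.cos(th)) - math.tan(th)))
    return B_L, B_R

# Vehicle centered between lane edges 1.5 m to each side, far viewing distance (36.7 m).
print(bearing_angles(1.5, 1.5, 36.7, theta_deg=0))   # equal bearings when aligned with the path
print(bearing_angles(1.5, 1.5, 36.7, theta_deg=25))  # unequal bearings, although still centered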
Figure 2
 
An illustration of the use of splay angles in lane keeping. (a) Splay angle (S) illustrated as the orientation of the optical projection of two points on a lane edge relative to a vertical line in the image plane. (b) The vehicle is oriented along the straight path, and the red dashed line is a vertical line in the image plane. The splay angles of the left and right lane edges are given by S_L = tan⁻¹(X_L/H) and S_R = tan⁻¹(X_R/H). When S_L = S_R, the vehicle is at the center of the lane. (c) The vehicle orientation is now rotated θ = 25° clockwise with respect to the path. The splay angles of the left and right lane edges are now given by S′_L = tan⁻¹(X_L/(H cos 25°)) and S′_R = tan⁻¹(X_R/(H cos 25°)). The vehicle is still at the center of the lane, and S′_L = S′_R.
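A parallel sketch for the splay-angle formulas in Figure 2 (again illustrative rather than the study's code; the 1.15-m eye height in the example is an assumed value) shows why a pure yaw rotation rescales both splay angles equally and therefore cannot, on its own, break the S_L = S_R symmetry of a centered vehicle.

import math

def splay_angles(X_L, X_R, H, theta_deg=0.0):
    """Splay angles (deg) of the left and right lane edges relative to vertical
    in the image plane.

    X_L, X_R : lateral distances (m) from the vehicle to the left/right lane edges
    H        : eye height above the ground (m)
    theta_deg: vehicle yaw (deg) relative to the path
    """
    th = math.radians(theta_deg)
    # Formulas from Figure 2: yaw scales both splay angles by 1/cos(theta),
    # so it cannot by itself make them unequal.
    S_L = math.degrees(math.atan(X_L / (H * math.cos(th))))
    S_R = math.degrees(math.atan(X_R / (H * math.cos(th))))
    return S_L, S_R

# Centered vehicle with an assumed 1.15 m eye height and 1.5 m lateral distances.
print(splay_angles(1.5, 1.5, 1.15, theta_deg=0))   # equal splay angles
print(splay_angles(1.5, 1.5, 1.15, theta_deg=25))  # still equal: splay is unaffected by yaw alone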
Figure 3
 
Display conditions in the study. (a) Empty ground. (b) Sparse random-dot ground. (c) Dense random-dot ground.
Figure 4
 
The lane edges (the dashed lines, invisible in the experimental displays) depicted by a pair of vertical posts placed at (a) 3.48 m, (b) 9.18 m, and (c) 36.7 m along the simulated observer viewing direction (parallel to the path in this figure), corresponding to the near, medium, and far viewing distances.
Figure 5
 
(a) Mean RMS error, (b) mean P_δu, and (c) mean P_δq as a function of viewing distance for the three display conditions in Experiment 1. Error bars represent SEs across eight participants.
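The precise definitions of P_δu and P_δq are given in the Methods rather than in this caption. Assuming they quantify how much of the recorded response power falls at the u-perturbation versus the q-perturbation frequencies listed in Table 1, a frequency-specific power estimate could be sketched along the following lines; the sampling rate, the stand-in signal, and the function name are all hypothetical.

import numpy as np

def power_at_frequencies(signal, fs, freqs_hz, tol=0.02):
    """Sum the FFT power of `signal` (sampled at fs Hz) in narrow bands
    around the given perturbation frequencies."""
    n = len(signal)
    spec = np.fft.rfft(signal - np.mean(signal))
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    power = np.abs(spec) ** 2 / n
    return sum(power[np.abs(f - f0) <= tol].sum() for f0 in freqs_hz)

# Hypothetical 90-s recording sampled at 60 Hz.
fs, T = 60.0, 90.0
t = np.arange(0, T, 1.0 / fs)
u_freqs = [0.1, 0.14, 0.24, 0.41, 0.74, 1.28, 2.19]   # u-perturbation frequencies (Table 1)
q_freqs = [0.11, 0.16, 0.27, 0.42, 0.77, 1.31, 2.21]  # q-perturbation frequencies (Table 1)
response = np.sin(2 * np.pi * 0.24 * t)               # stand-in for recorded data
P_u = power_at_frequencies(response, fs, u_freqs)
P_q = power_at_frequencies(response, fs, q_freqs)
print(P_u / (P_u + P_q))                              # fraction of power attributable to u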
Figure 6
 
The path (the dashed lines, invisible in the experimental displays) depicted by a segment of lane edges placed at (a) 3.48 m, (b) 9.18 m, and (c) 36.7 m along the simulated observer viewing direction (parallel to the path in this figure), corresponding to the near, medium, and far viewing distances.
Figure 7
 
(a) Mean RMS error, (b) mean P_δu, and (c) mean P_δq as a function of viewing distance for the three display conditions in Experiment 2. Error bars represent SEs across eight participants. Note that the Y-axis scale in (a) is smaller than that in Figure 5a.
Table 1
 
Magnitudes and frequencies of the seven harmonically independent sinusoids in the input perturbations to the vehicle's lateral position (u) and its orientation (q).
i    a_i    Lateral position (u)         Vehicle orientation (q)
            k_i      ω_i (Hz)            k_i      ω_i (Hz)
1    2      9        0.1                 10       0.11
2    2      13       0.14                14       0.16
3    2      22       0.24                24       0.27
4    0.2    37       0.41                38       0.42
5    0.2    67       0.74                69       0.77
6    0.2    115      1.28                118      1.31
7    0.2    197      2.19                199      2.21
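The frequencies in Table 1 are consistent with each sinusoid completing k_i cycles over a 90-s trial (ω_i = k_i/90 Hz). A minimal Python sketch of how such a sum-of-sinusoids perturbation could be generated follows; the 90-s duration, 60-Hz update rate, random phases, amplitude units, and function name are assumptions for illustration and may differ from the study's actual implementation.

import numpy as np

def sum_of_sines(t, amplitudes, cycles, T, rng):
    """Sum of harmonically independent sinusoids: a_i * sin(2*pi*k_i*t/T + phase_i)."""
    phases = rng.uniform(0, 2 * np.pi, size=len(cycles))   # assumed random phases per sinusoid
    return sum(a * np.sin(2 * np.pi * k * t / T + p)
               for a, k, p in zip(amplitudes, cycles, phases))

T = 90.0                                  # trial duration (s) implied by omega_i = k_i / 90
fs = 60.0                                 # assumed display update rate (Hz)
t = np.arange(0, T, 1.0 / fs)
a = [2, 2, 2, 0.2, 0.2, 0.2, 0.2]         # magnitudes a_i from Table 1 (units per the Methods)
k_u = [9, 13, 22, 37, 67, 115, 197]       # cycles per trial for lateral position (u)
k_q = [10, 14, 24, 38, 69, 118, 199]      # cycles per trial for vehicle orientation (q)

rng = np.random.default_rng(0)
u_perturbation = sum_of_sines(t, a, k_u, T, rng)   # perturbs the vehicle's lateral position
q_perturbation = sum_of_sines(t, a, k_q, T, rng)   # perturbs the vehicle's orientation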