Research Article | July 2006
Bayesian models of binocular 3-D motion perception
Martin Lages
Journal of Vision July 2006, Vol. 6, No. 4, Article 14. doi:10.1167/6.4.14
Abstract

Psychophysical studies on three-dimensional (3-D) motion perception have shown that perceived trajectory angles of a small target traveling in depth are systematically biased. Here, predictions from Bayesian models, which extend existing models of motion-first and stereo-first processing, are investigated. These statistical models are based on stochastic representations of monocular velocity and binocular disparity input in a binocular viewing geometry. The assumption of noise in these inputs together with a plausible prior for 3-D motion leads to testable predictions of perceived trajectory angle and velocity. Results from two experiments are reported, suggesting that disparity rather than motion processing introduces perceptual bias.

Introduction
Observers have to infer trajectory and velocity of three-dimensional (3-D) object motion from 2-D projections into the left and right eyes. Under natural viewing conditions, there are many depth cues to 3-D motion, but in a sparse environment, only binocular depth cues may be available. In the following psychophysical experiments, interocular velocity difference and disparity change are investigated because motion and disparity detectors feature early and prominently in the processing of visual information (Howard & Rogers, 2002). 
This is reflected in current models for the perception of motion in depth. (1) The motion-first model postulates monocular motion processing followed by stereo processing (Lu & Sperling, 1995; Regan, Beverley, & Cynader, 1979). In this model, monocular motion is independently detected in the left and right eyes before motion in depth is established. (2) The stereo-first model assumes disparity encoding followed by binocular motion processing (Cumming & Parker, 1994). This model first extracts binocular disparities and then computes change of disparity over time. Note that tracking of spatial location is also required to recover a 3-D motion trajectory. (3) Finally, the stereo–motion model suggests joint encoding of binocular disparity and interocular delay (Carney, Paradiso, & Freeman, 1989; Morgan & Fahle, 2000; Qian & Andersen, 1997). 
Psychophysical evidence based on detection and discrimination thresholds has been inconclusive, supporting interocular velocity difference (Brooks, 2002; Portfors-Yeomans & Regan, 1996; Shioiri, Saisho, & Yaguchi, 2000), changing disparity (Cumming & Parker, 1994; Tyler, 1971), or both (Brooks & Stone, 2004) as possible inputs to 3-D motion perception. However, detection and discrimination thresholds cannot reveal bias in 3-D motion perception, and therefore, accuracy rather than precision of observers' perception was measured in recent psychophysical studies (Harris & Dean, 2003; Welchman, Tuck, & Harris, 2004). Using the method of adjustment, observers reported perceived trajectory angle of a previously seen stimulus moving in depth. The results indicate overestimation of trajectory angle for a range of trajectories. It was suggested that observers exploit cyclopean azimuth α by using the endpoint of motion relative to a fixation point straight ahead (Harris & Drga, 2005). Although appealing in its simplicity, this heuristic requires additional depth cues to solve the inverse problem and to recover 3-D motion. 
A constantly moving target changes position in x and z over time. If the eyes remain verged and accommodated on a binocular fixation point straight ahead, then motion information is projected onto the retinae of the left and right eyes as illustrated in Figure 1. The projection angles onto the retinae depend on the trajectory and velocity of the motion stimulus, as well as on viewing distance D and interpupillary distance IPD. The average of the left and right angles approximates the visual angle in cyclopean view, and their difference defines binocular horizontal disparity. If the projection angles are interpreted as angular velocities, then their difference also defines the interocular velocity difference. Although motion and disparity input share the same geometry and are mathematically equivalent, processing of these inputs may be subject to different noise, uncertainty, and bias. 
Figure 1
 
Binocular viewing geometry in top view. If the two eyes are verged on a fixation point at viewing distance D with vergence angle β0, then projections of a moving target (arrow) with angle αL in the left eye and αR in the right eye constrain motion of the target in xz space. The intersection of constraints (IOC) determines stimulus trajectory β and radius r (see Appendix A).
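As a concrete illustration of this geometry, the sketch below (my own, using the paper's nominal viewing parameters; not code from the study) computes the left- and right-eye projection angles for a target displaced in depth, along with the cyclopean angle and disparity they define:

```python
import math

# Eyes sit at x = +/- IPD/2, z = 0; the fixation point is straight
# ahead at (0, D). All lengths in millimeters.
IPD = 65.0     # interpupillary distance (paper: 6.5 cm)
D = 1140.0     # viewing distance (paper: 114 cm)

def projection_angles(ex, ez, ipd=IPD, d=D):
    """Angles (rad) of a target at (ex, ez) in the left and right eyes,
    measured relative to each eye's line of sight to the fixation point."""
    eye_l, eye_r = -ipd / 2.0, ipd / 2.0
    a_l = math.atan2(ex - eye_l, ez) - math.atan2(0.0 - eye_l, d)
    a_r = math.atan2(ex - eye_r, ez) - math.atan2(0.0 - eye_r, d)
    return a_l, a_r

# A target 33.3 mm nearer than fixation, on the median plane (x = 0):
a_l, a_r = projection_angles(0.0, D - 33.3)
disparity_arcmin = math.degrees(a_l - a_r) * 60.0
cyclopean_deg = math.degrees((a_l + a_r) / 2.0)
# The angle difference gives a crossed disparity of roughly 5.9 arcmin
# (the largest magnitude used in the experiments); the average, i.e.,
# the cyclopean azimuth, is zero for a target on the median plane.
```

The same pair of angles, read as angular velocities over the stimulus duration, gives the interocular velocity difference.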
In the following, the motion-first and stereo-first models are extended to uncertainty models in a Bayesian framework. Predictions of these models are derived for 3-D motion trajectories over the full range of 360 deg. They are tested under experimental conditions that try to minimize the influence of monocular depth cues and response heuristics. 
Bayesian models
Some promising Bayesian models have been developed in vision (cf. Knill & Richards, 1996). Most notably, Weiss, Simoncelli, and Adelson (2002) combined motion constraints of local motion detectors with a Gaussian prior for slow motion to predict perceived motion direction and velocity of luminance-defined objects in 2-D space. With this elegant approach, they could explain a variety of 2-D motion illusions. 
Most objects in natural environments are stationary. If one assumes that objects move slowly on any trajectory in xz space, then a symmetric bivariate Gaussian probability distribution centered on the starting point of a target, as shown in Figure 2, provides a plausible prior for 3-D motion in xz space. Symmetric perspective projections of this world prior into the left and right eyes give rise to marginal Gaussian distributions defining velocity priors centered on zero velocity. Similarly, the difference of the marginal distributions in the left and right eyes defines a prior for disparity (change) centered on zero disparity. Thus, the same 3-D motion prior in the world results in Gaussian velocity and disparity priors depending on the processing of signals. 
Figure 2
 
Illustration of a symmetric bivariate Gaussian prior for 3-D motion. Symmetric perspective projections into the left and right eyes give rise to marginal Gaussian distributions defining velocity priors and disparity prior centered on zero velocity and disparity, respectively.
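The claim that a single world prior induces zero-centered velocity and disparity priors can be checked numerically. The following sketch (my own; the prior spread SIGMA is an arbitrary assumption, not a value from the paper) samples the bivariate Gaussian in xz space and projects it into the two eyes:

```python
import numpy as np

rng = np.random.default_rng(1)
IPD, D = 65.0, 1140.0       # mm, as in the experiments
SIGMA = 20.0                # assumed spread of the 3-D motion prior (mm)

# Displacements from the starting point under the world prior.
x = rng.normal(0.0, SIGMA, 200_000)
z = rng.normal(0.0, SIGMA, 200_000)

# Projection of each displacement into the left and right eyes (rad),
# relative to each eye's line of sight to the starting point.
a_l = np.arctan2(x + IPD / 2, D + z) - np.arctan2(IPD / 2, D)
a_r = np.arctan2(x - IPD / 2, D + z) - np.arctan2(-IPD / 2, D)

vel_prior_mean = (a_l + a_r).mean() / 2    # cyclopean velocity prior
disp_prior_mean = (a_l - a_r).mean()       # implied disparity prior
# Both marginal distributions are centered on (approximately) zero: the
# same world prior yields zero-centered velocity and disparity priors,
# differing only in their variances.
```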
The binocular viewing geometry also suggests constraints for stimulus trajectory and velocity in both eyes. The intersection of visual constraint lines in xz space determines trajectory angle and velocity of a target moving in depth (see Appendix A). Under certainty, these constraint lines intersect at the same point, whether they are based on angular velocities or binocular disparities. If noise or uncertainty is introduced together with different priors for motion or disparity processing, then the intersection of constraints falls on different points, as illustrated in Figure 3. 
Figure 3
 
Illustration of predicted trajectory and velocity for the (A) motion-first and (B) stereo-first Bayesian models in top view. The constraint lines and IOC (gray) shift as a consequence of biased velocity (blue) or disparity (red) processing (see Appendices B and C for details).
There are several potential sources of uncertainty and noise in binocular motion processing. For example, small moving targets in a sparse 3-D environment offer limited motion and disparity input, thereby introducing a degree of uncertainty in the observer. Mini-saccades during fixation and early noise in the encoding system are other possible sources (Hogervorst & Eagle, 1998). 
First, assume that noise is present in the activation of monocular motion detectors optimally tuned to different velocities (see Appendix B). The representation of angular velocity in each eye is therefore not exact but subject to noise (Ascher & Grzywacz, 2000). The corresponding likelihood distributions for angular velocity in the left and right eyes are conveniently expressed as Gaussian distributions with equal variance centered on the true angular velocity of the stimulus in each eye. Each likelihood distribution is then combined with the velocity prior in each eye. Velocity priors favoring slow and smooth motion have been suggested in the context of 2-D motion (Ullman & Yuille, 1989; Weiss et al., 2002). 
Alternatively, internal noise may be introduced by the activation of binocular disparity detectors tuned to different disparities (see Appendix C). The likelihood distribution for disparity (change) is conveniently expressed as a Gaussian distribution centered on the true disparity (change) of the stimulus. This representation of disparity is combined with the disparity prior favoring zero disparity. A similar disparity prior has been suggested in the context of sustained and transient stereo images (Read, 2002a, 2002b). 
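Both pathways rest on the same Gaussian-Gaussian combination: a likelihood centered on the true input multiplied by a zero-centered prior. A minimal sketch of the resulting posterior-mean (shrinkage) estimator, assuming conjugate Gaussians as described above:

```python
# For a likelihood N(m, sl^2) centered on the true input m (an angular
# velocity or a disparity) and a prior N(0, sp^2), the posterior is
# Gaussian with mean m * sp^2 / (sl^2 + sp^2) = m / (1 + (sl/sp)^2).
def posterior_mean(m, ratio):
    """Posterior-mean estimate, with ratio = likelihood sd / prior sd."""
    return m / (1.0 + ratio ** 2)

# With no uncertainty (ratio -> 0) the estimate is veridical; at a
# ratio of 1.0 the estimated velocity or disparity is halved -- the
# bias that shifts the constraint lines in Figure 3.
```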
Following Bayes's rule, likelihoods and priors are combined to establish a posterior distribution for each model and trajectory. Applying a simple decision rule to the posterior distributions provides a posteriori estimates of angular velocity and disparity. These estimates correspond to biased constraint lines, and their intersection point determines a predicted trajectory angle and radial distance in xz space. 
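The intersection-of-constraints step can be sketched as follows (my reconstruction from the geometry of Figure 1, not the paper's Appendix A): each a posteriori angle defines a ray from one eye, and solving for the rays' intersection recovers the endpoint of motion, from which trajectory angle and radial distance follow.

```python
import math

IPD, D = 65.0, 1140.0   # mm, as in the experiments

def forward(ex, ez):
    """Projection angles of endpoint (ex, ez), relative to each eye's
    line of sight to the starting point at (0, D)."""
    al = math.atan2(ex + IPD / 2, ez) - math.atan2(IPD / 2, D)
    ar = math.atan2(ex - IPD / 2, ez) - math.atan2(-IPD / 2, D)
    return al, ar

def intersect(al, ar):
    """Endpoint (x, z) from left/right angle estimates by intersecting
    the two constraint rays."""
    # Absolute ray directions from each eye (angle from the z-axis).
    phi_l = math.atan2(IPD / 2, D) + al
    phi_r = math.atan2(-IPD / 2, D) + ar
    # Solve eye_l + t_l*(sin, cos)(phi_l) = eye_r + t_r*(sin, cos)(phi_r).
    det = math.sin(phi_l) * math.cos(phi_r) - math.cos(phi_l) * math.sin(phi_r)
    t_l = IPD * math.cos(phi_r) / det
    return -IPD / 2 + t_l * math.sin(phi_l), t_l * math.cos(phi_l)

# Unbiased angles recover the endpoint exactly; shrunken (biased)
# estimates recover a displaced endpoint, as illustrated in Figure 3.
al, ar = forward(20.0, 1110.0)   # hypothetical endpoint: 20 mm right, 30 mm near
x, z = intersect(al, ar)         # recovers (20.0, 1110.0)
```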
In Figure 4, predictions over the full range of stimulus trajectories are derived for the two alternative Bayesian models of binocular 3-D motion perception. Predictions are based on motion trajectories between 0 and 360 deg with a radius of 3.3 cm at a viewing distance of 114 cm and an interpupillary distance of 6.5 cm. Uncertainty is modeled by the ratio of likelihood and prior standard deviation varying from 0.1 to 1.9 in steps of 0.2. Predictions are initially veridical, but with increasing uncertainty, they approximate a shrinking isotropic circle for the motion-first Bayesian model and an ellipse for the stereo-first Bayesian model. In Figure 4A, the unbiased prediction of trajectory angles or isotropic shape is the result of multiplying left and right angular velocities by the same factor (see also Equation B5). In Figure 4B, the increasingly elliptical shape is the consequence of stronger bias for large disparities near the z-axis (0 and 180 deg) and weaker bias for small disparities near the x-axis (90 and 270 deg). 
Figure 4
 
Predictions of trajectory angle and velocity in polar coordinates for the (A) motion-first and (B) stereo-first Bayesian models. Points denote model predictions for a target moving in three dimensions on trajectory angles of 10 to 350 deg, in steps of 20 deg. Uncertainty is modeled by the ratio of standard deviation for likelihood and prior, ranging from 0.1 to 1.9, in steps of 0.2. At a viewing distance of 1,140 mm, predictions are initially veridical, describing a circle of radius 33 mm, but with increasing uncertainty, they approximate a shrinking circle for the motion-first Bayesian model (blue) and a laterally oriented ellipse for the stereo-first Bayesian model (red; see text for explanation).
General methods
Tasks were programmed in MATLAB using the Psychophysics Toolbox extensions (Brainard, 1997; Pelli, 1997) and run on a Macintosh G4 computer with a 21-in. Sony GDM-F500R cathode-ray tube flat screen monitor. The monitor was calibrated for luminance using a Minolta photometer (Cambridge Research Systems). Stimuli were presented in a split-screen Wheatstone configuration at a viewing distance of 1,140 mm with a frame rate of 120 Hz. Stimuli were shown at 50% Michelson contrast using a ray-tracing technique to compensate for vergence and projection on a flat screen. Observers were seated in front of haploscopic mirrors with their head supported by a chin rest and a headrest. The experimental room remained dark with lights switched off during testing. 
All observers had normal or corrected-to-normal visual acuity and were screened for stereo deficiencies. Before testing, observers attended a training block where they received feedback on trajectory angle and radial distance of each previously seen target motion. Observers who placed more than 25% of trajectory angles in the wrong quadrant were excluded. Informed written consent was obtained from naive observers before participation in experiments. 
The method of adjustment was used to assess perceived trajectory angle and radial distance of the stimulus in xz space after each stimulus presentation. As illustrated in Figure 5, observers saw a line probe inside a circle with a diameter of 88.8 mm, representing motion trajectory in top view. In four blocks of trials, they adjusted either orientation or length of the line probe. Observers could make coarse or fine adjustments by pressing corresponding keys on a keyboard. When observers adjusted line length, the probe was set to an orientation that corresponded to the correct trajectory angle. Once an adjustment was completed, observers pressed a key to confirm their setting and to continue with the next trial. 
Figure 5
 
Illustration of adjustment methods. In separate blocks of trials, observers rotated a black line inside a circle or adjusted its length to indicate perceived trajectory and radial distance in top view.
Ideally, observers would have made both adjustments after a single stimulus presentation, but pilot studies suggested that a dual task was too demanding for most naive observers. 
Experiment 1: Variation of 3-D stimulus trajectory
Using the method of adjustment, we measured perceived trajectory angle and velocity of a small target stimulus in separate blocks of trials. Motion-first and stereo-first Bayesian models were fitted to individual data sets, and results are compared. 
Methods
Observers
Four unpaid undergraduate students from the University of Glasgow took part. They were naive as to the aim of the experiment. 
Stimuli and procedure
Stimuli were presented to the left and right eyes using the split-screen Wheatstone configuration. Participants viewed a single Gaussian dot subtending less than 4.4 arcmin, which was initially positioned 0.5 deg above a fixation cross. The fixation cross, flanked by nonius lines above and below, subtended 15.5 arcmin. The target dot moved 33.3 mm at 0.05 m/s on each of the 18 trajectories in xz space for 666 ms. Note that monocular image velocity changed due to the projections of the animated 3-D stimulus on a flat screen. 
On each trial, observers verged on the fixation cross before they initiated motion of the target by key-press. Trajectory angle of motion ranged between 10 and 350 deg in steps of 20 deg. In a 2 × 2 block design, each observer judged stimulus trajectories either to the front or to the back and reported either perceived trajectory angle or radial distance. In each block, adjustments to nine different trajectories were repeated in 10 randomly intermixed trials. 
Results and discussion
In Figure 6, individual data sets are depicted in polar plots where orientation of 0 deg indicates motion to the front; 90 deg, motion to the left; 180 deg, motion to the back; and 270 deg, motion to the right. Adjustments for each trajectory angle were averaged across 10 trials. 
Figure 6
 
Results from four observers in Experiment 1. Polar plots of perceived trajectory angles and radial distances (black) and best fitting motion-first (blue) and stereo-first (red) Bayesian models. Black data points denote average adjustments to stimulus trajectories between 10 and 350 deg, in steps of 20 deg, as well as average radial distance; filled data points correspond to trajectories at 90 and 270 deg.
For each observer, average adjustments (black points) systematically deviate from veridical, that is, a circle with a 33.3-mm radius and trajectories from 10 to 350 deg in steps of 20 deg. Individual data averaged over 10 trials are noisy but appear more elliptical than circular. Trajectory angle was overestimated near the median plane and underestimated near the frontal plane, whereas radial distance was overestimated near the frontal plane and underestimated near the median plane. 
Parametric fits to individual data are provided for the motion-first (blue) and the stereo-first (red) Bayesian model. The ratio of standard deviations for likelihood and prior was estimated (cf. Hürlimann, Kiper, & Carandini, 2002) together with radius for each model. Estimated ratios σv/σ for angular velocity and σd/σ for disparity are presented together with estimated radius rv and rd in Table 1. These two parameters were fitted in the maximum likelihood sense for each observer and model. 
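A toy version of such a two-parameter fit can be sketched as follows. This is my simplified stand-in, not the paper's procedure: it uses a small-angle approximation in which the stereo-first model shrinks only the depth component of the endpoint by 1/(1 + ratio^2), and a grid search over summed squared error in place of the maximum likelihood fit; the synthetic data and noise level are invented for illustration.

```python
import numpy as np

def predict(angles_deg, ratio, radius):
    """Stereo-first predictions under a small-angle approximation."""
    b = np.radians(angles_deg)                   # 0 deg = motion to the front
    x = radius * np.sin(b)                       # lateral component (unbiased)
    z = radius * np.cos(b) / (1.0 + ratio ** 2)  # depth component (shrunk)
    return x, z

# Synthetic "adjustments" from a known uncertainty ratio and radius.
rng = np.random.default_rng(7)
angles = np.arange(10, 360, 20)
true_ratio, true_radius = 1.2, 33.3
ox, oz = predict(angles, true_ratio, true_radius)
ox = ox + rng.normal(0, 0.5, ox.size)            # simulated setting noise (mm)
oz = oz + rng.normal(0, 0.5, oz.size)

# Grid search over the uncertainty ratio and radius.
best = None
for s in np.arange(0.0, 2.01, 0.02):
    for r in np.arange(20.0, 50.01, 0.2):
        px, pz = predict(angles, s, r)
        sse = np.sum((px - ox) ** 2 + (pz - oz) ** 2)
        if best is None or sse < best[0]:
            best = (sse, s, r)
_, fit_ratio, fit_radius = best                  # close to 1.2 and 33.3
```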
Table 1
 
Results from four observers in Experiment 1. Parameter estimates of uncertainty and radius, as well as of goodness of fit for motion-first and stereo-first Bayesian models. Note. *p < .05.
Observer    Motion-first                    Stereo-first
            σv/σ     rv       χ²(2)         σd/σ     rd      χ²(2)
E.M.        2.41     134.9    6.03*         1.44     36.1    2.04
K.K.        0.00     23.3     6.53*         1.00     32.2    1.98
L.B.        1.04     53.5     1.60          0.80     34.0    0.96
S.Y.        2.91     147.1    7.32*         1.51     28.3    1.13
As shown in Table 1, the stereo-first model gives better fits, and parameter estimates assume more plausible values because the ratio of standard deviations varies around 1.0 and estimates of radius are close to the true value of 33.3 mm. For the motion-first model, the ratio of standard deviations indicates no uncertainty for Observer K.K. and radius is overestimated by a factor of 4 in Observers E.M. and S.Y. 
Despite the wide range of trajectories in Experiment 1, there is concern that results were affected by response heuristics. The presentation of motion to the front and back in separate blocks of trials may have encouraged observers to judge trajectory by comparing angular velocities or cyclopean azimuth of the motion endpoint across trials (Harris & Drga, 2005). Monocular image velocity and cyclopean azimuth of motion endpoint are confounded with trajectory angle (Brooks & Stone, 2004). All observers assigned a crossed disparity to 90 deg trajectory, an uncrossed disparity to 270 deg trajectory, and relatively constant depths to motion trajectories in the same block. The assignment of stimulus motion to depth planes that correspond with the presentation of stimulus trajectories to the front and back in separate blocks could be the result of sequential dependencies between trials (cf. Lages & Treisman, 1998). 
Experiment 2: Variation of 3-D stimulus trajectory and velocity
The same method and apparatus as in Experiment 1 were employed unless mentioned otherwise. In each block of trials, motion targets traveled on 36 trajectory angles at three different velocities in randomly intermixed trials. These changes in stimulus and design were introduced to eliminate response heuristics of observers who may rely on cues such as monocular velocity or endpoint of motion when adjusting trajectory angle and radial distance. 
Methods
Observers
Four observers took part. A.G., R.G., and S.S. were naive as to the aim of the experiment, whereas M.L. is the author. 
Stimuli and procedure
To increase binocular input and to reduce uncertainty, each observer viewed three Gaussian blobs presented above and below the fixation cross surrounded by a rectangular fusion lock at fixation depth. Each dot subtended less than 4.4 arcmin at 27.7, 38.8, and 50 arcmin above and below fixation. In randomly intermixed trials, the dots moved 16.6, 25, or 33.3 mm in xz space for 833 ms (0.02, 0.03, or 0.04 m/s) on 36 trajectories. 
On each trial, observers verged on the fixation cross before initiating motion of the target dots by key-press. Trajectory angle ranged between 0 and 350 deg in steps of 10 deg. Each observer attended a total of 8 separate blocks (2 tasks × 4 repetitions), each comprising 108 trials (3 velocities × 36 trajectories). In each block of trials, observers judged either motion trajectory or radial distance. Adjustments to 36 trajectories and 3 velocities were repeated four times in randomly intermixed trials. 
Results and discussion
Individual adjustments of radial distance were first standardized to 1.0 and then averaged across 12 trials. As shown in Figure 7, all observers deviate systematically from veridical, that is, a circle with equidistant trajectories and constant radius. Average adjustments (black points) appear elliptical rather than circular. Maximum likelihood parametric fits to individual data are provided for the motion-first (blue) and stereo-first (red) Bayesian models. 
Figure 7
 
Results from four observers in Experiment 2. Polar plots for perceived trajectory angle and radial distance standardized to radius 1.0 (black) and best fitting motion-first (blue) and stereo-first (red) Bayesian models. Black data points denote average adjustments to stimulus trajectories between 0 and 350 deg in steps of 10 deg, as well as average standardized radial distance; filled data points correspond to cardinal trajectories.
As in Experiment 1, standard deviation of the likelihood distribution for angular velocity or disparity together with radius served as free parameters. The two parameters were fitted, and results are shown for each observer and model in Table 2. The stereo-first Bayesian model gives better fits, and parameter estimates assume more plausible values for each observer, confirming that disparity processing determines perceived trajectory angle and velocity. However, radial adjustments for approaching and receding motion at 0 and 180 deg appear to be a special case. Reduced variability of trajectory adjustments also suggests that opposite motion directions in the left and right eyes may have helped observers identify these trajectory angles. 
Table 2
 
Results from four observers in Experiment 2. Data were first standardized to radius 1.0 and then averaged across 12 trials. Parameter estimates and goodness of fit are reported for motion-first and stereo-first Bayesian models.
Observer    Radius (r)    Motion-first                  Stereo-first
                          σv/σ     rv      χ²(2)        σd/σ     rd      χ²(2)
A.G.        1.0           6.86     46.8    1.39         0.79     1.21    0.46
M.L.        1.0           0.00     1.24    1.20         0.72     1.47    0.61
R.G.        1.0           1.23     3.15    1.02         0.83     1.52    0.21
S.S.        1.0           0.00     1.36    0.61         0.74     1.60    0.22
In a more detailed analysis, adjustments of trajectory and radial distance were averaged across trials with the same stimulus velocity only. Two-parameter models were fitted to data of each observer and stimulus velocity, and results are shown in Table 3. Again, the stereo-first Bayesian model gives better fits, and parameter values assume more plausible values. As illustrated in Figure 8, fits of the stereo-first model for each stimulus velocity suggest that an increase in stimulus velocity systematically raises uncertainty. This trend appears in all four observers except for Observer M.L. in the 16.6-mm condition. 
Table 3
 
Results from four observers and three stimulus velocities (radius in millimeters) in Experiment 2. Parameter estimates and goodness of fit for motion-first and stereo-first Bayesian models are shown. Note. *p < .05.
Observer    Radius (r)    Motion-first                  Stereo-first
                          σv/σ     rv       χ²(2)       σd/σ     rd      χ²(2)
A.G.        16.6          0.00     17.6     2.40        0.32     17.7    2.29
            25.0          0.00     20.6     6.63*       0.81     27.7    2.12
            33.3          0.00     25.2     17.2*       1.01     32.9    5.79
M.L.        16.6          0.00     26.3     2.89        0.56     28.2    1.92
            25.0          0.00     30.8     4.07        0.49     34.3    3.09
            33.3          0.00     34.4     9.24*       0.77     39.5    3.00
R.G.        16.6          1.83     113.5    1.40        0.48     28.8    1.32
            25.0          2.84     274.3    3.40        0.87     35.4    0.58
            33.3          2.22     213.1    8.97*       1.00     43.2    1.45
S.S.        16.6          0.00     22.2     2.31        0.00     22.2    2.31
            25.0          0.00     36.1     1.45        0.58     39.3    1.14
            33.3          0.82     64.5     1.74        0.64     44.5    1.35
Figure 8
 
Results from four observers in Experiment 2. Polar plots for perceived trajectory angle and radial distance for stimulus velocities of 0.02 m/s or 16.6 mm (blue), 0.03 m/s or 25.0 mm (magenta), and 0.04 m/s or 33.3 mm (red) and best fitting stereo-first Bayesian models. Filled data points correspond to cardinal stimulus trajectories. With increasing stimulus velocity, estimates of radius and uncertainty increase, and model fits assume a more compressed elliptical shape (see Table 3 for details of both model fits and Supplements for separate plots of perceived vs. physical trajectory angle and radial distance).
General discussion
Rendering 3-D motion in a stereoscopic setup is difficult and can introduce various artifacts and cue conflicts. In the present experiments, constant size and blur of the target stimuli moving in depth could have influenced perceived depth (Watt, Akeley, Ernst, & Banks, 2005). On the other hand, cue conflicts arising from looming and accommodative cues are too small to account for the substantial and systematic bias found for small blurred targets that move up to ±3.3 cm in depth at a viewing distance of 114 cm. Using LEDs moving in real depth, Harris and Dean (2003) and Welchman et al. (2004) reported systematic overestimation of stimulus trajectories near the median plane, confirming the perceptual bias for trajectory angle. 
In the two experiments reported here, angular velocities ranged from 0.03 to 2 deg/s and disparities ranged from 0.0 to 5.87 arcmin. It seems reasonable to assume that velocity-tuned and disparity-tuned channels have relatively constant bandwidth for this limited range of monocular velocities and binocular disparities. This was modeled by a single parameter estimate of the likelihood standard deviation for all trajectories. However, velocity discrimination thresholds tend to be lower for low velocities (e.g., McKee & Nakayama, 1984), and disparity discrimination thresholds tend to decrease close to zero disparity (e.g., Farell, Li, & McKee, 2004). Hence, it can be argued that uncertainty, modeled by the spread of the likelihoods, may be linearly related to monocular image velocities and binocular disparity for different trajectories. 
Within a Bayesian framework, only the posterior is directly observable, whereas likelihoods and priors need to be estimated. Thus, just-noticeable differences in a standard discrimination task do not necessarily inform about the spread of likelihood distributions because reference and test stimuli are both exposed to a velocity or disparity prior (see however Stocker & Simoncelli, 2006). Nevertheless, if one assumes that standard deviations of likelihoods vary linearly with monocular image velocity and with binocular disparity, then predictions of the motion-first model are no longer isotropic but are compressed along the x-axis, whereas predictions of the stereo-first model are further compressed along the z-axis (see Supplements). When a linear relationship between input values and likelihoods is introduced, then the model predictions strengthen the claim that disparity processing is the source of the bias. 
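Under that assumption the shrinkage becomes signal dependent. A minimal sketch (my own, with the likelihood standard deviation taken to be proportional to the input magnitude via an assumed constant c, not a value from the paper):

```python
def shrink(m, c, prior_sd=1.0):
    """Posterior mean when the likelihood sd grows linearly as c*|m|."""
    ratio = c * abs(m) / prior_sd
    return m / (1.0 + ratio ** 2)

# With a fixed ratio, all inputs are shrunk by the same fraction; with
# a linearly growing likelihood sd, larger inputs are shrunk
# proportionally more, e.g. shrink(1.0, 1.0) keeps 50% of the input
# while shrink(2.0, 1.0) keeps only 20%. For the stereo-first model
# this compresses predictions further along the z-axis, where
# disparities are largest.
```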
There is a third option for the encoding of 3-D motion that needs to be considered. A stereo–motion model usually refers to joint encoding of binocular disparity and interocular delay (Carney et al., 1989; Morgan & Fahle, 2000; Qian & Andersen, 1997). In neurophysiological studies, it was shown that a number of binocular complex cells in cats (Anzai, Ohzawa, & Freeman, 2001) and cells in V1 and MT of monkey (Pack, Born, & Livingstone, 2003) are tuned to interocular spatial–temporal shifts, but the significance of these findings has been questioned (Read & Cumming, 2005a). 
In the present framework, it is difficult to test the possibility of joint velocity and disparity encoding, but a combined input of velocity and disparity is easily implemented in a Bayesian model with three parameters (see Appendix D). Uncertainty in velocity and disparity processing was combined, and both uncertainty parameters were estimated together with a parameter for perceived radius. The maximum likelihood fits of this stereo–motion model were almost identical to the stereo-first model fits with no significant improvement in goodness of fit for any of the individual data sets (results not shown). 
Conclusions
Bayesian model fits to perceived trajectory angle and radial distance promote the idea that bias in 3-D motion perception is introduced by disparity processing. This confirms previous findings in psychophysical studies that used different stimuli and methods (e.g., Cumming & Parker, 1994; Lages, Mamassian, & Graf, 2003). It is possible, however, that interocular velocity difference or optic flow contributes to 3-D motion perception, especially when stimuli are large and move on or near the median plane (Brooks & Stone, 2004). 
In the stereo-first Bayesian model, disparity estimates are derived from the endpoint of stimulus motion rather than integrated over time. As a consequence, the stereo-first Bayesian model may be interpreted as (1) temporal integration of biased disparities or (2) biased temporal integration of disparity. The latter interpretation seems more plausible because uncertainty estimates increased systematically with stimulus velocity as reported in Table 3 of Experiment 2. If 3-D motion perception is based on velocity-tuned processing, a small change of stimulus velocity should have very little effect on uncertainty. Disparity-tuned processing, on the other hand, may well increase uncertainty levels for faster stimuli due to temporal limits of disparity integration (Read & Cumming, 2005b; Tyler, 1971) in a transient stereo system (e.g., Edwards & Schor, 1999). 
In addition, when tested with the same method and procedure as in Experiment 2, observers showed reduced bias in their adjustments of trajectory angle and radius when a static target stimulus appeared at the endpoint of each motion trajectory. 
One of the main goals of visual processing is to segregate and identify objects in space and time. With increasing proximity or size of a moving object, monocular motion detectors signal a wider range of velocities. As a consequence, a system that processes motion input first needs to establish correspondence between different monocular motions before it can build a global 3-D motion percept. Computationally, it appears more parsimonious to solve the correspondence problem for disparity at successive points in time before deriving a 3-D motion percept. This argument also applies to joint motion and disparity encoding because the visual system would require a large number of detectors specifically tuned to all combinations of interocular spatial and temporal offsets to capture objects moving in three dimensions. 
It is concluded that under the present experimental conditions, perceptual bias in 3-D motion trajectory and velocity is most likely the result of limited temporal integration for disparity change. This points to stereo-first or stereo–motion processing with negligible motion bias but rules out a motion-first mechanism that relies on interocular velocity difference only. 
Appendix
Details of viewing geometry and intersection of constraints (IOC) are provided in Appendix A. The three Bayesian models are described in Appendices B, C, and D. 
A. Viewing geometry and IOC
An approximation of trajectory angle β between an object's direction of motion and the median plane is usually given by the following expression:

\tan\beta \approx \frac{(\dot{\alpha}_R + \dot{\alpha}_L)/2}{\dot{\alpha}_R - \dot{\alpha}_L} \times \frac{IPD}{D} = \frac{\dot{\alpha}}{\dot{\delta}} \times \frac{IPD}{D},    (A1)

where \dot{\alpha}_L = d\alpha_L(t)/dt denotes the angular velocity in the left eye, \dot{\alpha}_R = d\alpha_R(t)/dt the angular velocity in the right eye, \dot{\alpha} = d\alpha(t)/dt the angular velocity of the cyclopean (fused) image, and \dot{\delta} = d\delta(t)/dt the rate of disparity change. The two expressions on the right-hand side are mathematically equivalent. If the cyclopean point on the Vieth–Müller circle is approximated by the midpoint of the IPD between the eyes, then this introduces a computational error at shorter viewing distances. 
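The cyclopean-velocity form of Equation A1 can be checked numerically; the following Python sketch is illustrative only (function and variable names are not from the paper):

```python
import math

def traj_angle_approx(adot_L, adot_R, ipd, dist):
    """Approximate trajectory angle beta (deg) from monocular angular
    velocities via Equation A1: cyclopean angular velocity divided by the
    rate of disparity change, scaled by IPD/D."""
    adot = (adot_R + adot_L) / 2.0  # cyclopean angular velocity
    ddot = adot_R - adot_L          # rate of disparity change (must be nonzero)
    return math.degrees(math.atan(adot / ddot * ipd / dist))
```

For equal and opposite monocular velocities the cyclopean velocity is zero, so the predicted trajectory lies on the median plane (β = 0).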
In an alternative solution, the IOC was determined in Cartesian coordinates with the origin O located halfway along the IPD, as shown in Figure 1. Angles α_L and α_R measure deviation from the visual axes in the left and right eyes. The constraint lines in the left and right eyes intersect the x-axis at −IPD/2 and +IPD/2 with slopes tan(α′_L) and tan(α′_R), where angles α′_L and α′_R are measured counterclockwise and clockwise from the x-axis, respectively. They are adjusted for vergence angle β_0, so that α′_L = 90° − β_0 + α_L and α′_R = 90° − β_0 − α_R. 
The constraint lines for the left and right eyes,

z = \tan(\alpha'_L) \times \left(\frac{IPD}{2} + x\right), \quad z = \tan(\alpha'_R) \times \left(\frac{IPD}{2} - x\right),    (A2)

are equated and solved for x and z:

x = \frac{IPD}{2} \times \frac{\sin(\alpha'_R - \alpha'_L)}{\sin(\alpha'_R + \alpha'_L)}, \quad z = IPD \times \frac{\sin(\alpha'_L)\,\sin(\alpha'_R)}{\sin(\alpha'_R + \alpha'_L)}.    (A3)
From the coordinates of the intersection point, the clockwise trajectory angle β is then estimated by

\beta = \arctan\!\left(\frac{x}{z - D}\right) \ \text{for } z < D, \qquad \beta = \arctan\!\left(\frac{x}{z - D}\right) + 180^\circ \ \text{for } z \geq D,    (A4)

and radial distance r by

r = \sqrt{x^2 + (z - D)^2}.    (A5)
 
This geometric solution requires knowledge about IPD and D. It is used to first determine constraint lines and IOCs and then estimate trajectories and radii for the Bayesian models. 
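The geometric solution of Equations A2–A5 can be sketched in a few lines. The Python function below is a hypothetical implementation (names, length units, and the degree convention are assumptions, not the paper's code):

```python
import math

def ioc_trajectory(alpha_L, alpha_R, beta0, ipd, dist):
    """Intersect the two constraint lines (Equations A2-A3) and return the
    clockwise trajectory angle beta in degrees (Equation A4) and the radial
    distance r from fixation (Equation A5).
    alpha_L, alpha_R: deviations from the visual axes (deg);
    beta0: vergence angle (deg); ipd, dist: same length units."""
    # Adjust for vergence so angles are measured from the x-axis
    aL = math.radians(90.0 - beta0 + alpha_L)
    aR = math.radians(90.0 - beta0 - alpha_R)
    s = math.sin(aR + aL)
    x = ipd / 2.0 * math.sin(aR - aL) / s            # Equation A3
    z = ipd * math.sin(aL) * math.sin(aR) / s
    beta = math.degrees(math.atan(x / (z - dist)))   # Equation A4
    if z >= dist:
        beta += 180.0
    r = math.hypot(x, z - dist)                      # Equation A5
    return beta, r
```

Because each constraint line passes through the target, the intersection recovers the target's position in xz space exactly, so a point constructed from known image angles maps back to its true trajectory angle and radius.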
B. Motion-first Bayesian model
Following Bayes's rule, the likelihoods and priors of a scene S and image I are combined to produce a posterior for each model (e.g., Knill & Richards, 1996):

p(S|I) \propto p(I|S) \times p(S).    (B1)
 
In this framework, perceived angular velocity under motion-first processing is described as a product of likelihood and prior for the left and right eyes:

p(v_L|\beta) \propto p(\beta|v_L)\,p(v_L), \quad p(v_R|\beta) \propto p(\beta|v_R)\,p(v_R).    (B2)
 
The likelihood for the left eye is modeled as a Gaussian distribution of angular velocities centered on the true angular velocity, with d\alpha_L(t)/dt abbreviated as \dot{\alpha}_L. The standard deviation σ_v of the likelihood distribution is left as a free parameter:

p(\beta|v_L;\sigma_v) = \frac{1}{\sqrt{2\pi\sigma_v^2}} \exp\!\left[-\frac{(v_L - \dot{\alpha}_L)^2}{2\sigma_v^2}\right].    (B3)

The likelihood for the right eye is modeled accordingly. The preference or prior for slow motion is modeled as a Gaussian distribution centered on zero velocity with fixed standard deviation σ:

p(v_L;\sigma) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left[-\frac{v_L^2}{2\sigma^2}\right].    (B4)
 
The product of the Gaussian likelihood distribution with prior G(0, σ) gives the posterior distribution, which assigns a probability to each possible angular velocity, taking into account both prior and trajectory. Through differentiation, the maximum a posteriori (MAP) estimates of angular velocity are found for the left and right eyes:

\hat{\alpha}_L = \frac{(1/\sigma_v^2)\,\dot{\alpha}_L}{1/\sigma_v^2 + 1/\sigma^2} = \frac{\dot{\alpha}_L}{1 + (\sigma_v/\sigma)^2}, \quad \hat{\alpha}_R = \frac{(1/\sigma_v^2)\,\dot{\alpha}_R}{1/\sigma_v^2 + 1/\sigma^2} = \frac{\dot{\alpha}_R}{1 + (\sigma_v/\sigma)^2}.    (B5)
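Equation B5 is a standard shrinkage rule: the posterior mode is a precision-weighted average of the measured velocity and the prior mean of zero. A minimal sketch (the function name is hypothetical):

```python
def map_velocity(v_true, sigma_v, sigma_prior):
    """MAP estimate of angular velocity (Equation B5): a Gaussian likelihood
    centred on v_true (sd sigma_v) combined with a zero-mean slow-motion
    prior (sd sigma_prior) shrinks the estimate toward zero."""
    return v_true / (1.0 + (sigma_v / sigma_prior) ** 2)
```

When likelihood and prior are equally uncertain (σ_v = σ), the estimate is halved; as σ_v approaches 0, the measurement dominates and the estimate becomes veridical.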
 
For a given stimulus, the true trajectory β and radius r are inserted in Equations A4 and A5. These equations are solved for the coordinates of the IOC before Equation A3 is solved for true angular velocities in the left and right eyes. Bias in angular velocities is introduced according to Equation B5. These estimates are then inserted into Equation A3 to determine the new coordinates of the biased IOC. Inserting these coordinates in Equations A4 and A5 provides the predicted trajectory and radius for a given stimulus. 
The ratio σ_v/σ (cf. Hürlimann, Kiper, & Carandini, 2002) in Equation B5 was estimated for all trajectories using the Nelder–Mead downhill simplex method (fmins in MATLAB, The MathWorks) with σ held constant. A parameter for radius was also included to compensate for overestimation of radial distance, which is typical for the representation of moving targets (Assad & Maunsell, 1995; Freyd & Finke, 1984). 
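The fitting step can be illustrated with synthetic data; in the sketch below, a coarse grid search stands in for the simplex method, and the data and helper names are invented for the example:

```python
def predict(true_velocities, ratio):
    # Biased velocity estimates (Equation B5) for a candidate sigma_v/sigma
    return [v / (1.0 + ratio ** 2) for v in true_velocities]

def fit_ratio(true_velocities, observed, candidates):
    """Pick the ratio minimizing summed squared error between model
    predictions and observed settings (grid-search stand-in for a
    simplex search such as MATLAB's fmins)."""
    def sse(ratio):
        return sum((p - o) ** 2
                   for p, o in zip(predict(true_velocities, ratio), observed))
    return min(candidates, key=sse)

# Synthetic "observed" settings generated with a true ratio of 0.5
true_v = [1.0, 2.0, 3.0]
observed = [v / (1.0 + 0.5 ** 2) for v in true_v]
best = fit_ratio(true_v, observed, [i / 100.0 for i in range(200)])
```

The recovered `best` equals the generating ratio because the grid contains it exactly; a simplex search would instead refine a continuous estimate.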
C. Stereo-first Bayesian model
Similarly, the stereo-first Bayesian model describes perceived binocular disparity (change) as the product of likelihood and prior:

p(d|\beta) \propto p(\beta|d) \times p(d).    (C1)
 
The likelihood for binocular disparity (change) is modeled as a Gaussian distribution centered on the true disparity (change) δ measured at the endpoint of stimulus motion. The standard deviation σ_d of the distribution is left as a free parameter:

p(\beta|d;\sigma_d) = \frac{1}{\sqrt{2\pi\sigma_d^2}} \exp\!\left[-\frac{(d - \delta)^2}{2\sigma_d^2}\right].    (C2)
The preference or prior for small disparity (change) is modeled as a Gaussian distribution centered on zero disparity:

p(d;\sigma) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left[-\frac{d^2}{2\sigma^2}\right].    (C3)
 
The MAP estimate for disparity is then given by

\hat{\delta} = \frac{(1/\sigma_d^2)\,\delta}{1/\sigma_d^2 + 1/\sigma^2} = \frac{\delta}{1 + (\sigma_d/\sigma)^2}.    (C4)
 
Changing disparity information needs to be coupled with spatial location to recover 3-D motion. The cyclopean azimuth α is approximated by (α_L + α_R)/2, and disparity constraints are estimated relative to angle α:

\hat{\alpha}_L = \alpha - \hat{\delta}/2, \quad \hat{\alpha}_R = \alpha + \hat{\delta}/2.    (C5)
 
Predicted trajectories and velocities were calculated in the same way as for the motion-first model: Bias in disparity was introduced according to Equation C4, and biased angles were found using Equation C5. The ratio σ_d/σ was estimated using a constant σ for all trajectory angles. As before, an additional parameter for radius was introduced to compensate for individual overestimation of radial distance. 
Note that disparity estimates are not integrated over time. Instead, the endpoint of stimulus motion was used to derive trajectory angle and radius. Consequently, the stereo-first model and its prior can be interpreted as temporal integration of biased disparities or alternatively biased temporal integration of disparities. 
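The stereo-first prediction step (Equations C4 and C5) reduces to a few lines; the Python sketch below is a hypothetical implementation, not the paper's code:

```python
def stereo_first_angles(alpha, delta, sigma_d, sigma_prior):
    """Shrink the measured disparity toward zero (Equation C4) and couple it
    with the cyclopean azimuth alpha to obtain biased left- and right-eye
    directions (Equation C5). All quantities in degrees."""
    delta_hat = delta / (1.0 + (sigma_d / sigma_prior) ** 2)  # MAP disparity
    return alpha - delta_hat / 2.0, alpha + delta_hat / 2.0
```

The biased angles can then be fed through Equations A3–A5 to obtain the predicted trajectory and radius, mirroring the procedure described for the motion-first model.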
D. Stereo-motion Bayesian model
If one estimates the cyclopean azimuth \hat{\alpha} by (\hat{\alpha}_L + \hat{\alpha}_R)/2 and inserts the velocity estimates from Equation B5 into Equation C5, then velocity and disparity input are combined in a single Bayesian model with three parameters:

\hat{\alpha}_L = \hat{\alpha} - \hat{\delta}/2, \quad \hat{\alpha}_R = \hat{\alpha} + \hat{\delta}/2.    (D1)
Maximum likelihood fits and parameters for this combined Bayesian model are not reported because model fits to each individual data set were almost identical to the stereo-first Bayesian model and because goodness of fit was only marginally better. 
Acknowledgments
I would like to thank Pascal Mamassian, Larry Maloney, and two anonymous reviewers for helpful comments, and Pamela Hunter and Katherine McArthur for collecting data. Parts of this research were supported by the Royal Society of London (UK) and presented at VSS 2005 and ECVP 2005. 
Commercial relationships: none. 
Corresponding author: Martin Lages. 
Email: m.lages@psy.gla.ac.uk. 
Address: 58 Hillhead Street, Glasgow, G12 8QB, UK. 
References
Anzai, A., Ohzawa, I., & Freeman, R. D. (2001). Joint-encoding of motion and depth by visual cortical neurons: Neural basis of the Pulfrich effect. Nature Neuroscience, 4, 513–518.
Ascher, D., & Grzywacz, N. M. (2000). A Bayesian model for the measurement of visual velocity. Vision Research, 40, 3427–3434.
Assad, J. A., & Maunsell, J. H. (1995). Neuronal correlates of inferred motion in primate posterior parietal cortex. Nature, 373, 518–521.
Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 433–436.
Brooks, K. R. (2002). Monocular motion adaptation affects the perceived trajectory of stereomotion. Journal of Experimental Psychology: Human Perception and Performance, 28, 1470–1482.
Brooks, K. R., & Stone, L. S. (2004). Stereomotion speed perception: Contributions from both changing disparity and interocular velocity difference over a range of relative disparities. Journal of Vision, 4(12), 1061–1079, http://journalofvision.org/4/12/6/, doi:10.1167/4.12.6.
Carney, T., Paradiso, M. A., & Freeman, R. D. (1989). A physiological correlate of the Pulfrich effect in cortical neurons of the cat. Vision Research, 29, 155–165.
Cumming, B. G., & Parker, A. J. (1994). Binocular mechanisms for detecting motion-in-depth. Vision Research, 34, 483–495.
Edwards, M., & Schor, C. M. (1999). Depth aliasing by the transient stereo-system. Vision Research, 39, 4333–4340.
Farell, B., Li, S., & McKee, S. P. (2004). Coarse scales, fine scales, and their interactions in stereo vision. Journal of Vision, 4(6), 488–499, http://journalofvision.org/4/6/8/, doi:10.1167/4.6.8.
Freyd, J. J., & Finke, R. A. (1984). Representational momentum. Journal of Experimental Psychology: Learning, Memory, and Cognition, 10, 126–132.
Harris, J. M., & Dean, P. J. A. (2003). Accuracy and precision of binocular 3-D motion perception. Journal of Experimental Psychology: Human Perception and Performance, 29, 869–881.
Harris, J. M., & Drga, V. F. (2005). Using visual direction in three-dimensional motion perception. Nature Neuroscience, 8, 229–233.
Hogervorst, M. A., & Eagle, R. A. (1998). Biases in three-dimensional structure-from-motion arise from noise in the early visual system. Proceedings: Biological Sciences/The Royal Society, 265, 1587–1593.
Howard, I. P., & Rogers, B. J. (2002). Seeing in depth: Vol. 2. Depth perception. Ontario: I Porteous.
Hürlimann, F., Kiper, D. C., & Carandini, M. (2002). Testing the Bayesian model of perceived speed. Vision Research, 42, 2253–2257.
Knill, D. C., & Richards, W. (1996). Perception as Bayesian inference. Cambridge, UK: Cambridge University Press.
Lages, M., Mamassian, P., & Graf, E. W. (2003). Spatial and temporal tuning of motion in depth. Vision Research, 43, 2861–2873.
Lages, M., & Treisman, M. (1998). Spatial frequency discrimination: Visual long-term memory or criterion setting? Vision Research, 38, 557–572.
Lu, Z. L., & Sperling, G. (1995). The functional architecture of human visual motion perception. Vision Research, 35, 2697–2722.
McKee, S. P., & Nakayama, K. (1984). The detection of motion in the peripheral visual field. Vision Research, 24, 25–32.
Morgan, M. J., & Fahle, M. (2000). Motion-stereo mechanisms sensitive to inter-ocular phase. Vision Research, 40, 1667–1675.
Pack, C. C., Born, R. T., & Livingstone, M. S. (2003). Two-dimensional substructure of stereo and motion interactions in macaque visual cortex. Neuron, 37, 525–535.
Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.
Portfors-Yeomans, C. V., & Regan, D. (1996). Cyclopean discrimination thresholds for the direction and speed of motion in depth. Vision Research, 36, 3265–3279.
Qian, N., & Andersen, R. A. (1997). A physiological model for motion-stereo integration and a unified explanation of Pulfrich-like phenomena. Vision Research, 37, 1683–1698.
Read, J. C. (2002a). A Bayesian approach to the stereo correspondence problem. Neural Computation, 14, 1371–1392.
Read, J. C. (2002b). A Bayesian model of stereopsis depth and motion direction discrimination. Biological Cybernetics, 86, 117–136.
Read, J. C., & Cumming, B. G. (2005a). Effect of interocular delay on disparity-selective V1 neurons: Relationship to stereoacuity and the Pulfrich effect. Journal of Neurophysiology, 94, 1541–1553.
Read, J. C., & Cumming, B. G. (2005b). The stroboscopic Pulfrich effect is not evidence for the joint encoding of motion and depth. Journal of Vision, 5(5), 417–434, http://journalofvision.org/5/5/3/, doi:10.1167/5.5.3.
Regan, D., Beverley, K. J., & Cynader, M. (1979). Stereoscopic subsystems for position in depth and for motion in depth. Proceedings of the Royal Society of London: Series B, 204, 485–501.
Shioiri, S., Saisho, H., & Yaguchi, H. (2000). Motion in depth based on inter-ocular velocity differences. Vision Research, 40, 2565–2572.
Stocker, A. A., & Simoncelli, E. P. (2006). Noise characteristics and prior expectations in human visual speed perception. Nature Neuroscience, 9, 578–585.
Tyler, C. W. (1971). Stereoscopic depth movement: Two eyes less sensitive than one. Science, 174, 958–961.
Ullman, S., & Yuille, A. (1989). Rigidity and smoothness of motion. In S. Ullman & W. Richards (Eds.), Image understanding. Norwood, NJ: Ablex Publishing Corporation.
Watt, S. J., Akeley, K., Ernst, M. O., & Banks, M. S. (2005). Focus cues affect perceived depth. Journal of Vision, 5(10), 834–862, http://journalofvision.org/5/10/7/, doi:10.1167/5.10.7.
Weiss, Y., Simoncelli, E. P., & Adelson, E. H. (2002). Motion illusions as optimal percepts. Nature Neuroscience, 5, 598–604.
Welchman, A. E., Tuck, V. L., & Harris, J. M. (2004). Human observers are biased in judging the angular approach of a projectile. Vision Research, 44, 2027–2042.
Figure 1
 
Binocular viewing geometry in top view. If the two eyes are verged on a fixation point at viewing distance D with vergence angle β 0, then projections of a moving target (arrow) with angle α L in the left eye and α R in the right eye constrain motion of the target in xz space. The intersection of constraints (IOC) determines stimulus trajectory β and radius r (see Appendix A).
Figure 2
 
Illustration of a symmetric bivariate Gaussian prior for 3-D motion. Symmetric perspective projections into the left and right eyes give rise to marginal Gaussian distributions defining velocity priors and disparity prior centered on zero velocity and disparity, respectively.
Figure 3
 
Illustration of predicted trajectory and velocity for the (A) motion-first and (B) stereo-first Bayesian models in top view. The constraint lines and IOC (gray) shift as a consequence of biased velocity (blue) or disparity (red) processing (see Appendices B and C for details).
Figure 4
 
Predictions of trajectory angle and velocity in polar coordinates for the (A) motion-first and (B) stereo-first Bayesian models. Points denote model predictions for a target moving in three dimensions on trajectory angles of 10 to 350 deg, in steps of 20 deg. Uncertainty is modeled by the ratio of standard deviation for likelihood and prior, ranging from 0.1 to 1.9, in steps of 0.2. At a viewing distance of 1,140 mm, predictions are initially veridical, describing a circle of radius 33 mm, but with increasing uncertainty, they approximate a shrinking circle for the motion-first Bayesian model (blue) and a laterally oriented ellipse for the stereo-first Bayesian model (red; see text for explanation).
Figure 5
 
Illustration of adjustment methods. In separate blocks of trials, observers rotated a black line inside a circle or adjusted its length to indicate perceived trajectory and radial distance in top view.
Figure 6
 
Results from four observers in Experiment 1. Polar plots of perceived trajectory angles and radial distances (black) and best fitting motion-first (blue) and stereo-first (red) Bayesian models. Black data points denote average adjustments to stimulus trajectories between 10 and 350 deg, in steps of 20 deg, as well as average radial distance; filled data points correspond to trajectories at 90 and 270 deg.
Figure 7
 
Results from four observers in Experiment 2. Polar plots for perceived trajectory angle and radial distance standardized to radius 1.0 (black) and best fitting motion-first (blue) and stereo-first (red) Bayesian models. Black data points denote average adjustments to stimulus trajectories between 0 and 350 deg in steps of 10 deg, as well as average standardized radial distance; filled data points correspond to cardinal trajectories.
Figure 8
 
Results from four observers in Experiment 2. Polar plots for perceived trajectory angle and radial distance for stimulus velocities of 0.02 m/s or 16.6 mm (blue), 0.03 m/s or 25.0 mm (magenta), and 0.04 m/s or 33.3 mm (red) and best fitting stereo-first Bayesian models. Filled data points correspond to cardinal stimulus trajectories. With increasing stimulus velocity, estimates of radius and uncertainty increase, and model fits assume a more compressed elliptical shape (see Table 3 for details of both model fits and Supplements for separate plots of perceived vs. physical trajectory angle and radial distance).
Table 1
 
Results from four observers in Experiment 1. Parameter estimates of uncertainty and radius, as well as goodness of fit, for motion-first and stereo-first Bayesian models.

Observer | Motion-first: σv/σ | rv | χ²(2) | Stereo-first: σd/σ | rd | χ²(2)
E.M. | 2.41 | 134.9 | 6.03* | 1.44 | 36.1 | 2.04
K.K. | 0.00 | 23.3 | 6.53* | 1.00 | 32.2 | 1.98
L.B. | 1.04 | 53.5 | 1.60 | 0.80 | 34.0 | 0.96
S.Y. | 2.91 | 147.1 | 7.32* | 1.51 | 28.3 | 1.13

Note. *p < .05.
Table 2
 
Results from four observers in Experiment 2. Data were first standardized to radius 1.0 and then averaged across 12 trials. Parameter estimates and goodness of fit are reported for motion-first and stereo-first Bayesian models.

Observer | Radius (r) | Motion-first: σv/σ | rv | χ²(2) | Stereo-first: σd/σ | rd | χ²(2)
A.G. | 1.0 | 6.86 | 46.8 | 1.39 | 0.79 | 1.21 | 0.46
M.L. | 1.0 | 0.00 | 1.24 | 1.20 | 0.72 | 1.47 | 0.61
R.G. | 1.0 | 1.23 | 3.15 | 1.02 | 0.83 | 1.52 | 0.21
S.S. | 1.0 | 0.00 | 1.36 | 0.61 | 0.74 | 1.60 | 0.22
Table 3
 
Results from four observers and three stimulus velocities (radius in millimeters) in Experiment 2. Parameter estimates and goodness of fit for motion-first and stereo-first Bayesian models are shown.

Observer | Radius (r) | Motion-first: σv/σ | rv | χ²(2) | Stereo-first: σd/σ | rd | χ²(2)
A.G. | 16.6 | 0.00 | 17.6 | 2.40 | 0.32 | 17.7 | 2.29
A.G. | 25.0 | 0.00 | 20.6 | 6.63* | 0.81 | 27.7 | 2.12
A.G. | 33.3 | 0.00 | 25.2 | 17.2* | 1.01 | 32.9 | 5.79
M.L. | 16.6 | 0.00 | 26.3 | 2.89 | 0.56 | 28.2 | 1.92
M.L. | 25.0 | 0.00 | 30.8 | 4.07 | 0.49 | 34.3 | 3.09
M.L. | 33.3 | 0.00 | 34.4 | 9.24* | 0.77 | 39.5 | 3.00
R.G. | 16.6 | 1.83 | 113.5 | 1.40 | 0.48 | 28.8 | 1.32
R.G. | 25.0 | 2.84 | 274.3 | 3.40 | 0.87 | 35.4 | 0.58
R.G. | 33.3 | 2.22 | 213.1 | 8.97* | 1.00 | 43.2 | 1.45
S.S. | 16.6 | 0.00 | 22.2 | 2.31 | 0.00 | 22.2 | 2.31
S.S. | 25.0 | 0.00 | 36.1 | 1.45 | 0.58 | 39.3 | 1.14
S.S. | 33.3 | 0.82 | 64.5 | 1.74 | 0.64 | 44.5 | 1.35

Note. *p < .05.