**Sensory systems are faced with an essentially infinite number of possible environmental events but have limited processing resources. Given this challenge, it makes sense to allocate those resources to prioritize the discrimination of events with the most behavioral relevance. Here, we asked whether such relevance is reflected in the processing and perception of motion. We compared human performance on a rapid motion direction discrimination task under monocular and binocular viewing. In particular, we determined sensitivity and bias for a binocular motion-in-depth (three-dimensional; 3D) stimulus and for its constituent monocular (two-dimensional; 2D) signals over a broad range of speeds. Consistent with prior work, we found that binocular 3D sensitivity was lower than monocular sensitivity at all speeds. Although overall sensitivity was worse for 3D discrimination, we found that the transformation from 2D to 3D motion processing also incorporated a pattern of potentially advantageous biases. One such bias is a criterion shift that occurs at the level of 3D motion processing and results in an increased hit rate for motion toward the head. We also observed increased sensitivity for 3D motion trajectories presented on crossed rather than uncrossed disparity pedestals, privileging motion trajectories closer to the observer. We used these measurements to determine the range of real-world trajectories for which rapid 3D motion discrimination is most useful. These results suggest that the neural mechanisms that underlie motion perception privilege behaviorally relevant motion, and they provide insight into the nature of human motion sensitivity in the real world.**

cd/m². The maximum luminance (“white”) was 13.8 cd/m², and the minimum luminance (“black”) was 0.03 cd/m². Vergence stability was facilitated by a 1/*f* noise background that covered the entire display with the exception of two stimulus apertures. A fixation dot and nonius lines were presented in the center of the display to help monitor vergence, and observers were instructed to maintain fixation at all times.

**Figure 1**


*g*(*x*)) was parameterized as a scaled skew-Gaussian distribution:

*g*(*x*) = 2*α* exp(−(*x* − *μ*)² / 2*σ*²) Φ(*β*(*x* − *μ*)/*σ*),

where Φ is the standard normal cumulative distribution function; *μ* and *σ* are the mean and standard deviation, respectively; *α* is a scale term that determines the amplitude of the function; *β* is the skewness; and *x* is the logarithm (base 10) of the monocular retinal speed. For 2D accuracy data, the skew term was fixed at zero in order to achieve a stable fit. Criterion data were fit with least squares linear regression.
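As a concrete illustration, a fit of this form can be sketched with SciPy's `curve_fit`. This is a minimal sketch, not the authors' analysis code: the function below is one standard way to write a scaled skew-Gaussian (the 2Φ(·) factor reduces to 1 when the skew *β* is zero, recovering an ordinary Gaussian of amplitude *α*), and the data values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def skew_gaussian(x, mu, sigma, alpha, beta):
    """Scaled skew-Gaussian: amplitude alpha at x = mu when beta = 0;
    beta skews the curve left or right."""
    z = (x - mu) / sigma
    return alpha * np.exp(-0.5 * z**2) * 2 * norm.cdf(beta * z)

# Hypothetical sensitivity (d') values at several log10 retinal speeds.
log_speed = np.log10([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
dprime = np.array([0.6, 1.2, 1.9, 2.4, 2.2, 1.5, 0.8])

# Fit mu, sigma, alpha, beta; for 2D accuracy data one would instead
# fix beta = 0, as described in the text.
params, _ = curve_fit(skew_gaussian, log_speed, dprime,
                      p0=[log_speed.mean(), 0.5, 2.0, 0.0])
mu, sigma, alpha, beta = params
```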

The *x*-axis is parallel to the interocular axis and positive to the right. The *y*-axis is orthogonal to the *x*-axis in the plane of the forehead and positive toward the top of the head. The *z*-axis is positive in front of the observer. If the observer has an interocular separation of 2*a*, then the right and left eyes are located on the *x*-axis with coordinates (*a*, 0, 0) and (−*a*, 0, 0), respectively.

*P*(*t*) = [*X*(*t*), *Y*(*t*), *Z*(*t*)] is the position of an object in this coordinate system as a function of time (*t*). For simplicity, we consider only objects in the midsagittal plane: *X*(*t*) = 0 for all *t*. The object's azimuth (Θ(*t*), the angle relative to the *z*-axis in the *xz*-plane) on the right eye's retina is

Θ(*t*) = arctan(*a*/*Z*(*t*)).

The corresponding retinal angular velocity can be approximated to first order around a time *t*_{0} as

*δ* ≈ −(*a*/*b*²)*v*,

where *δ*, *v*, and *b* correspond to *d*Θ(*t*)/*dt*, *dZ*(*t*)/*dt*, and *Z*(*t*), respectively, each evaluated at *t* = *t*_{0}. These are the horizontal retinal velocity in the right eye, the object velocity in depth, and the object distance, respectively. Recall that this calculation yields the retinal velocity in the right eye. If we consider only motion directly toward and away from the observer in the midsagittal plane, then the angular velocity in the left eye will be equal in magnitude (speed) but opposite in sign. Thus, we can generically describe the monocular horizontal retinal speed *ω* cast by an object with world speed in depth *s* = |*v*| as

*ω* = *a·s*/*b*²,

which we evaluated for a range of world speeds (*s*) and distances (*b*). We assume an interocular distance of 6.4 cm (*a* = 3.2) and convert *ω* from radians per unit time to degrees per second.
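Numerically, the small-angle relation *ω* ≈ *a·s*/*b*² is straightforward to evaluate. A minimal sketch (function and variable names are ours, not from the paper), assuming distances and speeds in centimeters and the 6.4 cm interocular distance stated above:

```python
import math

def retinal_speed_deg_per_s(s_cm_per_s, b_cm, a_cm=3.2):
    """Monocular horizontal retinal speed (deg/s) for an object at
    distance b moving in depth at speed s: omega ≈ a*s/b**2 rad/s,
    valid when b is much larger than a."""
    omega_rad = a_cm * s_cm_per_s / b_cm**2
    return math.degrees(omega_rad)

# e.g. an object 100 cm away moving in depth at 50 cm/s casts a
# monocular retinal speed of roughly 0.92 deg/s in each eye.
print(retinal_speed_deg_per_s(50.0, 100.0))
```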

2D sensitivity (*d*′) exceeded 3D sensitivity for all tested speeds, but the peak sensitivity for both tasks occurred at similar speeds (5.5°/s for 2D and 3.3°/s for 3D). The 3D condition data plotted here are for the middle disparity pedestal only (near and far pedestal data are shown in Figure 3). Note that the positions of the moving dots updated and “wrapped” within the aperture in both the 2D and 3D conditions, potentially making both tasks more difficult at fast speeds. The reduction in 3D sensitivity relative to 2D is generally consistent with the idea that 3D motion direction is computed by comparing the 2D signals between the two eyes and that this additional computation results in a lower signal-to-noise ratio (Katz et al., 2015; Tyler, 1971). Using a strict test for statistical differences (nonoverlap of the 95% confidence intervals), we found that the amplitude, mean, and standard deviation were greater for the 2D motion curve than for the 3D curve (Table 1). The ratio of the sensitivity values (yellow line) followed a shallow U shape, showing that the discriminability of 3D motion relative to 2D motion is best and nearly constant for midrange speeds but worsens for relatively fast and slow speeds. Plotting the same results in terms of percent correct responses revealed that performance was well above chance for all conditions (Figure 2b). This is impressive given the brief presentation intervals and wide range of speeds. Discrimination of 2D motion was also above threshold (75%) for all but the slowest speed, and discrimination of 3D motion was above threshold for seven of the 10 speeds tested.
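The sensitivity (*d*′) and criterion values discussed here follow the standard equal-variance signal-detection definitions and can be computed from hit and false-alarm rates. A sketch with hypothetical rates (the numbers are illustrative, not the paper's data):

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Equal-variance signal detection: d' = z(H) - z(FA),
    criterion c = -(z(H) + z(FA)) / 2."""
    z = NormalDist().inv_cdf
    zh, zf = z(hit_rate), z(fa_rate)
    return zh - zf, -(zh + zf) / 2

# Hypothetical rates of "toward" responses on toward vs. away trials.
d, c = dprime_and_criterion(0.85, 0.25)
```

A negative criterion here corresponds to a liberal bias toward one response, analogous to the increased hit rate for motion toward the head described in this study.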

**Figure 2**


**Figure 3**


**Table 1**

We determined the distances (*b*) and speed in depth (*s*) in the world that could generate the retinal speeds (*ω*) used in our experiments, assuming the scenario illustrated in Figure 4a (see Materials and methods). Note that our experiment required observers to make direction judgments from very brief stimulus presentations (200 ms or less), so these predictions are relevant for making rapid discriminations.

**Figure 4**


An object moving at a constant speed in depth (*s*) will have a different retinal speed as its distance (*b*) changes. Indeed, an object moving toward an observer at a constant velocity will project to an accelerating retinal speed (as *b* decreases, *ω* increases). An object moving away from an observer at a constant velocity will project to a decelerating retinal speed. It then follows that stimuli with constant retinal velocities are consistent with objects that are decelerating as they approach an observer or accelerating as they recede. In future work, it would be interesting to examine whether 2D and 3D motion sensitivities are differentially affected by changes in velocity.

*Perception*, 9(3), 317–325.

*Journal of Physiology*, 211(3), 599–622.

*Annual Review of Neuroscience*, 28, 157–189.

*Spatial Vision*, 10(4), 433–436.

*Vision Research*, 51(13), 1431–1456.

*Perception*, 24(1), 105–114.

*Vision Research*, 34(4), 483–495.

*Journal of Neuroscience*, 34(47), 15522–15533.

*Journal of Neurophysiology*, 104(5), 2886–2899.

*Vision Research*, 28(12), 1323–1335.

*Perception*, 22(9), 1013–1023.

*Attention, Perception, & Psychophysics*, 77(5), 1685–1696.

*Spatial Vision*, 21(6), 531–547.

*Spatial Vision*, 18(4), 399–411.

*Vision Research*, 41(19), 2457–2473.

*Seeing in depth, Volume 2: Depth perception*. Ontario, Canada: I Porteous.

*Perception*, 5(2), 129–141.

*Journal of Neuroscience*, 35(28), 10212–10216.

*Perception*, 36, 1.

*Current Biology*, 19(13), 1118–1122.

*Psychological Science*, 19(7), 686–692.

*Spatial Vision*, 10(4), 437–442.

*Vision Research*, 36(20), 3265–3279.

*Vision Research*, 34(8), 1029–1037.

*Vision Research*, 33(16), 2359–2360.

*Nature Neuroscience*, 12(8), 1050–1055.

*Journal of Neuroscience*, 34(47), 15508–15521.

*Science*, 136(3520), 982–983.

*Science Advances*, 1(4), e1400254.

*Science*, 174(4012), 958–961.

*Treatise on physiological optics*. New York, NY: Dover. (Original work published 1867)

*Journal of Physiology*, 242(3), 827–841.