Abstract
Motion-in-depth perception relies on multiple sensory cues. Previous work has quantified the contribution of binocular motion cues, i.e. interocular velocity differences and changing disparities over time, as well as monocular motion cues, i.e. size and density changes. However, even when these cues are presented in concert, observers systematically misreport the direction of motion-in-depth stimuli. Here we considered the potential role of small involuntary head movements, i.e. head jitter, in motion-in-depth perception. We first measured head jitter under fixating but head-free viewing conditions. Spectral densities for all three head movement axes exhibited a pink (1/f) noise pattern, consistent with random drift rather than voluntary control. Head translations and rotations averaged ~12 mm/s and ~2.5 deg/s, respectively. While small, the resulting retinal motion signals were above perceptual threshold. We subsequently investigated the impact of head jitter on motion-in-depth perception using virtual reality. Observers reported the motion-in-depth of a 3D target under head-free viewing while head tracking was on, off, or delayed either randomly or uniformly. Providing head-jitter-induced retinal signals ("on") increased sensitivity and reduced bias in motion-in-depth perception. Increasing random variability in head-movement-to-photon latency ("delayed") reduced sensitivity and produced biases comparable to those observed when head tracking was turned off altogether. Furthermore, uniform delays in motion-to-photon latency also reduced performance. Thus, the retinal signals produced by head jitter enhanced motion-in-depth perception, provided that they were (1) consistent and (2) low-latency. These results suggest that the head-restrained viewing typical of psychophysical experiments eliminates cues critical to motion-in-depth perception and leads to underestimates of perceptual sensitivity.
More broadly, beyond the well-established roles of monocular and binocular motion cues, other cues rarely considered in traditional motion perception experiments, such as accommodative blur and lighting, may play critical roles in the accurate perception of motion-in-depth.