Abstract
It has been suggested that whenever an object moves, the successive changes in its position over time generate a ‘motion streak’ that can activate local orientation-selective neurons aligned along the axis of motion. These streaks could help resolve the inherent directional ambiguity (the aperture problem) of local motion sensors responding to moving objects. Indeed, Geisler (1999; Nature, 400, 65–69) showed that 1-D spatial noise masks oriented parallel to a moving object’s trajectory elevate detection thresholds more than orthogonally oriented masks. We sought to quantify systematically the orientation tuning and bandwidth of this masking phenomenon. In a 2IFC motion-detection task, observers (N = 4) judged which of two 450 ms intervals, each containing spatially 1-D dynamic noise (0.1 contrast), also contained a superimposed moving Gaussian blob (SD = 0.067°). Contrast detection thresholds (75% correct) for the blob were measured with the method of constant stimuli at each of 7 mask orientations (0–90° relative to the motion axis). Performance was measured at speeds of 5.4 and 10.6°/s. In agreement with previous studies, observers’ thresholds were ~20% higher when the noise mask was oriented parallel to the motion axis (0°) than when it was orthogonal (90°). However, the threshold-versus-mask-orientation function was non-monotonic: peak masking typically occurred 5–10° away from the axis of motion. Testing with different motion directions showed that this phenomenon was not an oblique effect arising from an anisotropy in orientation encoding. Additionally, the tuning of the masking function was broader at the slower object speed. These results are not readily explicable in terms of current models of motion streak encoding and place important constraints on the nature of the putative interactions between local orientation detectors and motion mechanisms in human vision.
Meeting abstract presented at VSS 2013
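The 75%-correct threshold estimation described in the abstract (method of constant stimuli in a 2IFC task) can be illustrated with a minimal sketch. This is not the authors' analysis code: the Weibull psychometric form, the grid-search fitting, and all contrast and proportion-correct values below are assumptions for illustration only.

```python
import numpy as np

# Hypothetical method-of-constant-stimuli data: proportion correct at each
# fixed contrast level in a 2IFC task (chance performance = 0.5).
contrasts = np.array([0.01, 0.02, 0.04, 0.08, 0.16])
p_correct = np.array([0.52, 0.58, 0.71, 0.88, 0.97])  # made-up values

# Weibull psychometric function floored at chance (0.5) for 2IFC:
# p(c) = 0.5 + 0.5 * (1 - exp(-(c/alpha)**beta))
def weibull(c, alpha, beta):
    return 0.5 + 0.5 * (1.0 - np.exp(-(c / alpha) ** beta))

# Simple grid search over (alpha, beta), minimizing squared error;
# a real analysis would more likely use a maximum-likelihood fit.
alphas = np.linspace(0.01, 0.2, 200)
betas = np.linspace(0.5, 5.0, 100)
best = min(
    (np.sum((weibull(contrasts, a, b) - p_correct) ** 2), a, b)
    for a in alphas for b in betas
)
_, alpha, beta = best

# Invert the fitted function to find the contrast yielding 75% correct.
threshold_75 = alpha * (-np.log(1.0 - (0.75 - 0.5) / 0.5)) ** (1.0 / beta)
print(f"75%-correct contrast threshold: {threshold_75:.3f}")
```

In the experiment this fit would be repeated per observer for each of the 7 mask orientations and both speeds, yielding the threshold-versus-mask-orientation functions reported above.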