Abstract
Problem. Motion transparency is the perception of motion in more than one direction at the same spatial location, as in random dot kinematograms (RDKs) in which groups of dots move differently. Direction or speed differences between the dot groups control the appearance of transparency. How do factors such as stimulus size or motion coherence affect the perception of transparency via spatial integration?
Methods. We propose a computational model of visual motion detection and integration. Model V1 detects local motions, which are spatially integrated in model MT. We define velocity-sensitive MT cells with local on-center/off-surround selectivity in direction and speed, with parameters fitted to monkey MT data (Treue et al., Nature Neurosci., 3, 2000). Model MST integrates signals from model MT to achieve selectivity for motion patterns. Top-down signals from MST to MT and from MT to V1 disambiguate and stabilize local motion estimates. Model area LIP temporally integrates motion signals and applies a threshold to simulate a decision.
Results. In computer simulations of two 2AFC experiments, we measured the model's perceptual thresholds for transparency. Two “bull's eye” configurations of RDKs appear on either side of a simulated “fixation point”. One central disk contains transparent motion (an overlay of clockwise and counterclockwise rotations) surrounded by an annulus of random flicker; the other contains only opaque motion, with a comparable annulus. In one set of simulations, the radii of the central disks were varied and detection rates were calculated. A second experiment varied the motion coherence of dots in disks of constant radius. Results from both experiments indicate that a minimum amount of spatial integration of motion directions and speeds is necessary to distinguish motion transparency from opaque motion, by disambiguating local motions in MT and global motion patterns in MST.
The model's predictions are testable by human psychophysics.
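As a minimal illustrative sketch (not the fitted model), the on-center/off-surround direction selectivity of a velocity-sensitive model MT cell can be expressed as a difference of Gaussians over direction, combined with log-Gaussian speed tuning. All parameter values and the function name below are hypothetical placeholders, not the values fitted to the monkey MT data:

```python
import numpy as np

def mt_response(stim_dir, stim_speed, pref_dir=0.0, pref_speed=4.0,
                sigma_center=30.0, sigma_surround=90.0, k_surround=0.5):
    """Illustrative MT cell tuning: on-center/off-surround in direction
    (degrees), log-Gaussian tuning in speed (deg/s). Parameters are
    placeholders, not the fitted values from the model."""
    # Circular direction difference, wrapped to [-180, 180)
    d = (stim_dir - pref_dir + 180.0) % 360.0 - 180.0
    # Difference of Gaussians: narrow excitatory center minus broad
    # inhibitory surround yields suppression for nearby directions
    center = np.exp(-d**2 / (2.0 * sigma_center**2))
    surround = np.exp(-d**2 / (2.0 * sigma_surround**2))
    dir_tuning = center - k_surround * surround
    # Log-Gaussian speed tuning, peaked at the preferred speed
    speed_tuning = np.exp(-np.log(stim_speed / pref_speed)**2 / (2.0 * 0.5**2))
    return dir_tuning * speed_tuning
```

With these placeholder parameters, the response is maximal at the preferred direction and speed, and becomes negative (suppressive) for directions moderately offset from the preferred one, which is the kind of center-surround interaction that lets overlapping opposite motions (transparency) be distinguished from a single opaque motion.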
Supported in part by CELEST (NSF SBE-0354378 and OMA-0835976) and the EU 7th Framework Programme, ICT project no. 215866 (SEARISE).