Abstract
Synchrony is an important factor in the binding of visual information and a basic parameter in time perception. Here I report a temporal asymmetry in synchrony-based perceptual grouping, which supports the notion that synchrony between visual stimuli is mediated by spatial interactions between feature detectors. The stimulus display was a square array of four local patterns, each composed of overlapping vertical and horizontal Gabor signals (6 c/deg) whose relative contrasts were sinusoidally modulated out of phase at a given temporal frequency, producing an orientation alternation. One of the four elements (the target) alternated its orientation with a phase shifted by phi (−180 to 180 deg) relative to the others. While steadily fixating the center of the array, observers were asked to detect the target element (i.e., asynchrony detection; 4AFC). No motion was seen between elements. The limits of lagged (phi > 0) and advanced (phi < 0) phase differences were measured for various alternation frequencies (TF > 4 Hz) and various distances between elements (inter-element distance, IED, 0.3–1.5 deg). The results showed that both phase limits increased as TF and IED increased, and that the advanced phase limits were significantly lower than the lagged ones. Thus a target whose orientation changed earlier than the others was easier to detect than one whose orientation changed later by the same amount. Moreover, the difference between the two phase limits increased in proportion to the IED. This temporal asymmetry, which scaled with spatial distance, can be explained by assuming that visual synchrony is computed via spatial interactions between local feature detectors, interactions that inevitably involve mutual delays of signals propagating at a finite velocity. A simulation with a simple four-unit network gave a quantitative prediction of the results with estimated propagation velocities of 60–130 deg/sec.
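The contrast modulation of a single element can be sketched as follows. This is a minimal illustration rather than the actual stimulus code: the counterphase sinusoidal modulation, the temporal frequency, and the target's phase offset phi come from the abstract, whereas the 0–1 contrast range and the example values (8 Hz, 40 deg) are assumptions.

```python
import numpy as np

def element_contrasts(t, tf_hz, phi_deg=0.0):
    """Relative contrasts of the vertical and horizontal Gabor components of
    one element at times t (s). phi_deg follows the abstract's convention:
    phi > 0 lags (orientation alternates later), phi < 0 is advanced (earlier)."""
    phi = np.deg2rad(phi_deg)
    m = np.sin(2 * np.pi * tf_hz * np.asarray(t) - phi)  # sinusoidal modulator
    c_vertical = 0.5 * (1.0 + m)     # assumed 0..1 contrast range
    c_horizontal = 0.5 * (1.0 - m)   # modulated out of phase with the vertical one
    return c_vertical, c_horizontal

# Example: 8 Hz alternation; reference elements at phi = 0 and a target
# advanced by 40 deg (phi = -40), i.e., its orientation alternates earlier.
t = np.linspace(0.0, 0.5, 1000)
ref_v, ref_h = element_contrasts(t, tf_hz=8.0, phi_deg=0.0)
tgt_v, tgt_h = element_contrasts(t, tf_hz=8.0, phi_deg=-40.0)
```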
Note: supported by JSPS (#7971) and Tohoku University
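The proposed explanation can be illustrated with a deliberately simplified delay model; this is not the author's four-unit network. The sketch only assumes that the asynchrony between target and reference is evaluated after the reference signal has travelled the inter-element distance at a finite velocity, and that the target is detected when the effective phase difference exceeds a fixed threshold theta (a hypothetical parameter, as are all numeric values below). Under these assumptions the lagged and advanced limits become asymmetric, and their difference grows linearly with IED, matching the pattern reported above.

```python
import numpy as np

def phase_limits(tf_hz, ied_deg, velocity_deg_per_s, theta_deg):
    """Predicted lagged (phi > 0) and advanced (phi < 0) phase limits, in deg,
    under the one-sided delayed-comparison assumption."""
    delay_s = ied_deg / velocity_deg_per_s   # propagation delay over the IED
    delta = 360.0 * tf_hz * delay_s          # that delay expressed as phase (deg)
    lagged_limit = theta_deg + delta         # target changing later: harder to detect
    advanced_limit = -(theta_deg - delta)    # target changing earlier: easier to detect
    return lagged_limit, advanced_limit

def velocity_from_limit_difference(tf_hz, ied_deg, limit_difference_deg):
    """Invert the model: estimate the propagation velocity (deg/s) from the
    measured difference between the lagged and |advanced| phase limits."""
    return 720.0 * tf_hz * ied_deg / limit_difference_deg

# Example with assumed values: 8 Hz alternation, 1.0 deg separation,
# 100 deg/s propagation velocity, 60 deg detection threshold.
lag, adv = phase_limits(tf_hz=8.0, ied_deg=1.0, velocity_deg_per_s=100.0, theta_deg=60.0)
print(f"lagged limit {lag:.1f} deg, advanced limit {adv:.1f} deg")
# The difference (lag - |adv|) grows linearly with IED, mirroring the reported
# proportional increase, and recovers the assumed velocity when inverted:
print(velocity_from_limit_difference(8.0, 1.0, lag - abs(adv)))
```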