Abstract
It is known that synchronous and asynchronous changes of visual attribute values at different spatial locations are effective cues for perceptual grouping and segmentation. To examine whether changes in any visual attribute produce similar perceptual grouping/segmentation effects, and how this phenomenon relates to the perceptual asynchrony between different attributes, we measured segregation performance under single-attribute and double-attribute conditions for three visual attributes: luminance, color, and motion direction. Our stimulus consisted of an array of 16×16 elements divided into four quadrants. Each element was a Gaussian blob for luminance or color changes, and a Gabor patch for direction changes. The temporal patterns of stimulus change, A and B, were repetitive alternations at a given temporal frequency (varied between 1 and 10 Hz) with a 180-deg phase shift between A and B. One of the four quadrants (the target) followed temporal pattern A, and the other three followed B. The observer’s task was to detect the target quadrant (4AFC). In the single-attribute conditions, all elements changed in the same attribute. In the double-attribute conditions, the elements in two diagonally opposite quadrants changed in one attribute, while the rest changed in another. The results of the single-attribute conditions showed that the temporal frequency limit for luminance was 8 Hz or higher, while that for color was only slightly lower. When luminance and color were paired in a double-attribute condition, the temporal limit was reduced to 4-5 Hz relative to the single-attribute conditions. In contrast, segmentation based on asynchronous motion-direction changes was very difficult, both in the single-attribute condition and in the double-attribute conditions paired with luminance or color. Our results indicate that asynchrony-based segregation performance depends strongly on which visual attribute conveys the temporal pattern information.
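The alternation scheme described in the abstract can be sketched in code (a minimal illustration only, not the authors' stimulus software; the function names, sampling rate, and square-wave parameterization are assumptions for clarity):

```python
import numpy as np

def temporal_patterns(freq_hz, duration_s=1.0, sample_rate=240):
    """Square-wave alternation between two attribute states (coded 0/1)
    at freq_hz; pattern B is pattern A with a 180-deg phase shift,
    i.e., the two patterns are in counterphase at every moment."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    # Two state flips per cycle give one full alternation per 1/freq_hz s.
    pattern_a = (np.floor(2 * freq_hz * t) % 2).astype(int)
    pattern_b = 1 - pattern_a  # 180-deg phase shift
    return pattern_a, pattern_b

def quadrant_assignment(target):
    """Only the target quadrant (0-3) follows pattern A; the other
    three follow pattern B, as in the 4AFC task described above."""
    return ["A" if q == target else "B" for q in range(4)]
```

At every sample the two patterns are in opposite states, so detecting the target amounts to finding the quadrant whose changes are out of step with the rest.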