August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Temporal limits of visual segmentation based on temporal asynchrony in luminance, color, motion direction, and their mixtures
Author Affiliations & Notes
  • Yen-Ju Chen
    Graduate School of Informatics, Kyoto University, Japan
  • Shin’ya Nishida
    Graduate School of Informatics, Kyoto University, Japan
    Human Information Science Laboratory, NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, Japan
  • Footnotes
    Acknowledgements  This study was supported by MEXT/JSPS KAKENHI (Japan) 20H00603, 20H05605
Journal of Vision August 2023, Vol.23, 4864. doi:https://doi.org/10.1167/jov.23.9.4864
      Yen-Ju Chen, Shin’ya Nishida; Temporal limits of visual segmentation based on temporal asynchrony in luminance, color, motion direction, and their mixtures. Journal of Vision 2023;23(9):4864. https://doi.org/10.1167/jov.23.9.4864.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Synchronous and asynchronous changes in visual attribute values at different spatial locations are known to be effective cues for perceptual grouping and segmentation. To test whether changes in any visual attribute produce similar grouping/segmentation effects, and how this phenomenon relates to the perceptual asynchrony between different attributes, we measured segregation performance under single-attribute and dual-attribute conditions for three visual attributes: luminance, color, and motion direction. The stimulus was an array of 16×16 elements divided into four quadrants. Each element was a Gaussian blob for luminance or color changes, and a Gabor patch for direction changes. The two temporal patterns of stimulus change, A and B, were repetitive alternations at a given temporal frequency (variable between 1 and 10 Hz), with a 180-deg phase shift between A and B. One of the four quadrants (the target) followed temporal pattern A, and the other three followed B. The observer’s task was to detect the target quadrant (4AFC). In the single-attribute conditions, all elements changed in the same attribute. In the dual-attribute conditions, the elements in two diagonally opposite quadrants changed in one attribute, while the rest changed in another attribute. The single-attribute results showed that the temporal frequency limit for luminance was 8 Hz or higher, and that for color was only slightly lower. When luminance and color were paired in a dual-attribute condition, the temporal limit was reduced (4-5 Hz) relative to the single-attribute conditions. In contrast, segmentation based on asynchronous motion-direction changes was very difficult both in the single-attribute condition and in the dual-attribute conditions paired with luminance or color. Our results indicate that the performance of asynchrony-based segregation depends strongly on which visual attribute conveys the temporal pattern information.
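The stimulus design described above can be sketched in a few lines of code. The sketch below is illustrative only: the abstract does not specify the waveform of the alternation or the display refresh rate, so a square-wave alternation sampled at an assumed 120 Hz is used here; function names and parameters are hypothetical.

```python
import numpy as np

def antiphase_patterns(freq_hz, duration_s=1.0, sample_rate=120):
    """Generate the two temporal patterns A and B: repetitive alternation
    between two attribute states at freq_hz, with B shifted 180 deg
    relative to A (assumed square wave; waveform not given in the abstract)."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    phase = 2 * np.pi * freq_hz * t
    a = (np.sin(phase) >= 0).astype(int)  # pattern A: state 0/1 over time
    b = 1 - a                             # pattern B: exact antiphase of A
    return a, b

def quadrant_patterns(target, freq_hz, **kwargs):
    """Assign pattern A to the target quadrant (0-3) and pattern B
    to the other three, as in the 4AFC segmentation task."""
    a, b = antiphase_patterns(freq_hz, **kwargs)
    return [a if q == target else b for q in range(4)]
```

For a dual-attribute condition, the same temporal patterns would drive one attribute (e.g., luminance) in two diagonally opposite quadrants and a different attribute (e.g., color) in the other two.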
