Abstract
Background: Color matching in both perception and memory is reported to exhibit color-specific patterns of bias and variability, which have been accounted for by a model in which color categories influence color selections (Bae et al., 2015). If robust, these patterns may prove a useful tool for understanding how prior information, such as color categories, influences visually guided behavior when memory makes sensory information increasingly noisy. Methods: We completed two studies. The first (CRT) was a nearly exact replication of the task in Bae et al. (2015); the second (RealWorld) was a conceptual replication using real objects. After viewing a test color, observers in both studies selected its match from a hue circle presented either simultaneously (perception) or after a 2-second delay (memory). Across trials, test stimuli spanned the hue circle (CRT: 180 2D patches; RealWorld: Farnsworth-Munsell 100-hue chips). From pooled observer responses, we estimated the bias and variability of choices for each test color. Observers' color-category estimates were obtained in separate experiments. Results: Reliable patterns of bias and variability were present in both studies. Bias and variability of perceptual matches correlated strongly with bias and variability of memory matches in both the CRT and RealWorld studies. In addition, observer-defined color categories made sense of the color-specific patterns of bias in the CRT study: matches to colors near category boundaries were biased toward the nearest category center. Although the RealWorld data were internally reliable, an effect of categories was not clear for real-world stimuli. Conclusions: Color-specific patterns in the bias and variability of color matches were robust across experiments; however, only in the CRT study were the patterns consistent with categories influencing visually guided behavior when sensory information is uncertain.
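The abstract does not specify how bias and variability were computed from the pooled responses; a minimal sketch of one standard approach for responses on a hue circle (signed circular mean error as bias, circular standard deviation as variability) is shown below. The function name and the example data are illustrative assumptions, not the authors' analysis code.

    import numpy as np

    def circular_match_stats(test_hue_deg, response_hues_deg):
        """Bias and variability of pooled matches to one test color.

        Errors are signed angular differences on the 360-deg hue circle;
        bias is the circular mean error and variability the circular SD.
        """
        errors = np.deg2rad(
            (np.asarray(response_hues_deg, dtype=float) - test_hue_deg + 180) % 360 - 180
        )
        # Mean resultant vector of the error distribution
        c, s = np.mean(np.cos(errors)), np.mean(np.sin(errors))
        r = np.hypot(c, s)
        bias = np.rad2deg(np.arctan2(s, c))                 # signed bias (deg)
        variability = np.rad2deg(np.sqrt(-2 * np.log(r)))   # circular SD (deg)
        return bias, variability

    # Hypothetical example: pooled responses to a 30-deg test hue
    bias, sd = circular_match_stats(30, [25, 33, 40, 28, 36])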
Meeting abstract presented at VSS 2017