Abstract
Arranging samples according to color seems natural; the task is even used as a clinical test of brain function (the Farnsworth-Munsell 100 Hue Test). Yet the neural basis that governs color relationships remains elusive. Color is often described by three orthogonal parameters (hue, value, chroma), but this organization is not uniform: equal distances in the space do not correspond to equal perceptual differences, so the distance between one pair of colors does not predict the relationship between another pair separated by the same distance. There is no consensus on a space that captures uniform perceptual relationships, and spaces that attempt to do so empirically, such as Munsell, have an asymmetric organization that is not formally defined in terms of photoreceptor responses. Here we tackle color geometry with a new approach. Using magnetoencephalography (MEG), we measured brain responses to colored spirals defined by the intermediate directions in cone-opponent color space, at two luminance levels (eight colors total). Color could be decoded from neural activity patterns with high accuracy, and hue and luminance could be decoded independently of each other. Using classifiers trained on these patterns, we determined similarity relationships among the colors. We discovered two hue-luminance interactions that reflect nonlinearities in the neural representation of color space: an asymmetry in the representation of the daylight axis (blue, orange) versus the anti-daylight axis (green, pink), and an asymmetry in the representation of warm (pink, yellow) versus cool (blue, green) colors. Taken together, the results uncover cortical transformations of retinal color signals that give rise to color relationships and show why color cannot be represented in a perceptually uniform (linear) color space.
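The abstract describes deriving similarity relationships among colors from classifiers trained on neural activity patterns. The sketch below is a minimal illustration of that general idea, not the authors' pipeline: it assumes a generic pairwise-decoding analysis in which cross-validated classification accuracy between every pair of the eight color conditions is collected into a dissimilarity matrix. The data array, trial and sensor counts, and classifier choice are placeholders; a real analysis would use epoched MEG recordings.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_conditions = 8   # 4 hues x 2 luminance levels, as in the abstract
n_trials = 40      # hypothetical trials per condition
n_sensors = 306    # assumed MEG sensor count (illustrative)

# Placeholder data: (condition, trial, sensor) response patterns at one time point.
# Real analyses would substitute measured MEG sensor amplitudes here.
X_all = rng.normal(size=(n_conditions, n_trials, n_sensors))

# Pairwise cross-validated decoding accuracy for every pair of conditions.
# Higher accuracy implies more distinct neural patterns, i.e. less similar colors.
dissimilarity = np.zeros((n_conditions, n_conditions))
for i, j in combinations(range(n_conditions), 2):
    X = np.vstack([X_all[i], X_all[j]])
    y = np.concatenate([np.zeros(n_trials), np.ones(n_trials)])
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    dissimilarity[i, j] = dissimilarity[j, i] = acc

print(np.round(dissimilarity, 2))
```

A matrix of this kind could, for example, be visualized with multidimensional scaling to examine how hue and luminance organize the decoded color relationships; the specific similarity analysis used in the study is described in its methods, not here.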