We investigate the spatial correlations of orientation and color information in natural images. We find that the correlation of orientation information falls off rapidly with increasing distance, while color information is more highly correlated over longer distances. We show that orientation and color information are statistically independent in natural images and that the spatial correlation of jointly encoded orientation and color information decays faster than that of color alone. Our findings suggest that: (a) orientation and color information should be processed in separate channels and (b) the organization of cortical color and orientation selectivity at low spatial frequencies is a reflection of the cortical adaptation to the statistical structure of the visual world. These findings are in agreement with biological observations, as form and color are thought to be represented by different classes of neurons in the primary visual cortex, and the receptive fields of color-selective neurons are larger than those of orientation-selective neurons. The agreement between our findings and biological observations supports the ecological theory of perception.

*s*(*k*) ∼ *k*^{−1}, imply that nearby photoreceptors are highly redundant (Ruderman, 1997; Ruderman & Bialek, 1994), and therefore their information need not be transmitted in full. Similarly, the lagged-X cells of the geniculate are optimal filters for removing temporal redundancies (Dan, Atick, & Reid, 1996). Srinivasan, Laughlin, and Dubs (1982) discussed redundancy removal by predictive coding, implemented by the center–surround antagonism of receptive fields. Similarly, Buchsbaum and Gottschalk (1983) derived the receptive field structure of retinal ganglion cells from optimality considerations.
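To make the predictive-coding idea concrete, here is a small numerical sketch (our illustration, not code from any of the cited studies): a spatially correlated one-dimensional signal stands in for neighboring photoreceptor responses, and each sample is predicted from the mean of its two neighbors, so only a low-variance residual would need to be transmitted.

```python
import random

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

rng = random.Random(0)

# Correlated 1D "retinal image": an AR(1) process stands in for the
# strong spatial correlations of natural inputs.
signal = [0.0]
for _ in range(9999):
    signal.append(0.95 * signal[-1] + rng.gauss(0, 1))

# Center-surround-like predictor: each sample is predicted by the mean
# of its two neighbors; only the prediction error (residual) is kept.
residual = [signal[i] - (signal[i - 1] + signal[i + 1]) / 2
            for i in range(1, len(signal) - 1)]
```

On this signal the residual variance is a small fraction of the raw signal variance, which is the sense in which predictive coding removes redundancy before transmission.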

*L***a***b** coordinates using the D65 white point reference recommended in the ITU BT.709 standard, applying the equations given in Wyszecki and Stiles (1982, p. 166). For notational convenience, we use (*L, a, b*) to denote CIE *L***a***b**. We also performed our analysis using a calibrated LMS space, as provided by the creators of the image database (Olmos & Kingdom, 2004). The results we obtained were virtually identical for the two color spaces.
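As a concrete illustration of this conversion, here is a minimal sRGB-to-CIE *L***a***b** sketch. It assumes the standard published BT.709 RGB-to-XYZ matrix and D65 white point values; the paper itself uses the equations of Wyszecki and Stiles (1982), so treat this as a generic sketch rather than the exact pipeline.

```python
import math

# D65 reference white in XYZ (Y normalized to 1).
WHITE_D65 = (0.95047, 1.0, 1.08883)

# BT.709 linear-RGB -> XYZ matrix.
RGB_TO_XYZ = (
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
)

def srgb_to_lab(r, g, b):
    """Convert an sRGB triple in [0, 1] to CIE L*a*b*."""
    # Undo the sRGB transfer function to get linear-light RGB.
    lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
           for c in (r, g, b)]
    xyz = [sum(m * c for m, c in zip(row, lin)) for row in RGB_TO_XYZ]
    # CIE f(t) nonlinearity with the standard linear segment near zero.
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip(xyz, WHITE_D65))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

For a media white input (1, 1, 1), the conversion returns *L* ≈ 100 with *a* and *b* near zero, as expected for a neutral color under the D65 reference.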

*coherence* is a measure of how strongly anisotropic the image edges are within a local neighborhood. The coherence can be interpreted as the magnitude of an orientation vector whose direction is given by the angle of orientation.

*N* of the image is given by

*π*/2, since the gradient vector is perpendicular to the direction of anisotropy.

*W* of prescribed size around point

*g*_{1} for estimating the orientation angle in Equation 1, and *g*_{2} for estimating the coherence in Equation 2. The values of *g*_{1} and *g*_{2} are described in Figure 1 and range from small to large spatial scales as indicated.
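A minimal structure-tensor sketch of these orientation and coherence measures is given below. It uses a plain box window in place of the Gaussian filters *g*_{1} and *g*_{2}, an illustrative simplification rather than the paper's exact method; the direction of anisotropy is perpendicular to the dominant gradient direction, hence the *π*/2 offset noted in the text.

```python
import math

def orientation_and_coherence(img):
    """img: 2D list of floats, treated as a single neighborhood W.
    Returns (angle, coherence) from the averaged gradient outer product."""
    h, w = len(img), len(img[0])
    jxx = jxy = jyy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0  # central differences
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            jxx += gx * gx
            jxy += gx * gy
            jyy += gy * gy
    # Orientation of the dominant gradient direction; the edge (anisotropy)
    # direction is this angle plus pi/2.
    angle = 0.5 * math.atan2(2 * jxy, jxx - jyy)
    trace = jxx + jyy
    # Coherence: normalized eigenvalue difference of the structure tensor,
    # 1 for perfectly oriented structure, 0 for isotropic structure.
    coherence = math.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2) / trace if trace else 0.0
    return angle, coherence
```

On a horizontal luminance ramp (gradient purely along *x*), the estimator returns an angle of 0 and a coherence of 1, as it should for perfectly oriented structure.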

*σ* is the variance, defined as

_{ OC}, formed by concatenating the vectors Ψ_{ o} and Ψ_{ c}; the concatenated vector Ψ_{ OC} thus jointly encodes the orientation and color information.

*t*-test of the distributions for color and orientation that are summarized in Figure 2. For each displacement, the autocorrelation values for color over all the samples produce a first distribution, and those for orientation produce a second distribution. We used the MATLAB function *ttest2* for this computation, with data generated from the small-scale filters. The result rejected the null hypothesis that the data in these two distributions are independent random samples from normal distributions with equal means but unknown variances. The *p* value associated with the test was less than 10^{−10} for each positive pixel distance.

*θ*_{ e} with each of the color (*L, a, b*) values, giving rise to three distributions, (*θ*_{ e}, *L*), (*θ*_{ e}, *a*), and (*θ*_{ e}, *b*). The plots in Figure 5 clearly indicate that the pairwise product of the marginal probability density functions is equal to the joint probability density functions *P*(*θ*_{ e}, *L*), *P*(*θ*_{ e}, *a*), and *P*(*θ*_{ e}, *b*). Thus, *P*(*θ*_{ e}, *L*) = *P*(*θ*_{ e})*P*(*L*), where *P*(*θ*_{ e}) is the probability density function of *θ*_{ e} and *P*(*L*) is the probability density function of luminance, *L*.

*θ*_{ e}, *L*), (*θ*_{ e}, *a*), and (*θ*_{ e}, *b*), respectively. The mutual information between two variables *A* and *B* is defined in terms of their entropies, *H*(*A*) and *H*(*B*), as follows. Let *A* possess *N* finite states, {*a*_{1}, *a*_{2}, …, *a*_{ N}}. The entropy *H*(*A*) is given by

*H*(*A*) = −∑_{ i=1}^{ N} *P*(*a*_{ i}) log_{2} *P*(*a*_{ i}).

*H*(*B*) is similarly defined, where *B* possesses *M* finite states. The joint entropy *H*(*A, B*) is defined by

*H*(*A, B*) = −∑_{ i=1}^{ N} ∑_{ j=1}^{ M} *P*(*a*_{ i}, *b*_{ j}) log_{2} *P*(*a*_{ i}, *b*_{ j}).

The mutual information, *MI*(*A, B*), is defined by

*MI*(*A, B*) = *H*(*A*) + *H*(*B*) − *H*(*A, B*),

which equals zero when *A* and *B* are statistically independent.

*MI*(*θ*_{ e}, *L*) = 0.0022 bits, *MI*(*θ*_{ e}, *a*) = 0.0019 bits, and *MI*(*θ*_{ e}, *b*) = 0.0029 bits. We also measured the mutual information between *θ*_{ e} and the hue angle *θ*_{ c} = arctan(*b*/*a*), obtaining *MI*(*θ*_{ e}, *θ*_{ c}) = 0.0028 bits. Since these pairwise mutual information measures are close to zero, we conclude that orientation is independent of the (*L, a, b*) color components. This observation suggests that an efficient scheme for representing natural images is to carry color and orientation information in separate pathways.