Abstract
Earlier we developed a model that determined whether two pixels fell on the same surface based on the luminance and color differences between them: when these differences are small, the two pixels probably fall on the same surface; when they are larger, the two pixels probably fall on different surfaces (Fine et al., 2003). Here, we extended this model to create a neurophysiologically plausible Bayesian model that estimates the structure in a given patch based on color and luminance information across the entire patch.
Observers rated whether there was “stuff” (structure) within 2-degree natural scene patches. These patches were convolved with a set of luminance, red-green and blue-yellow Gabors varying in spatial frequency, orientation and phase. Our Bayesian model calculated the probability of structure in any patch given the energy of that patch across the Gabor set. We found that the model's estimate of whether there was structure in each patch was correlated with the structure ratings made by human observers.
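The pipeline described above, filtering a patch with a Gabor set, pooling the filter energies, and mapping them to a probability of structure, can be sketched as follows. This is a minimal illustration, not the published model: the Gabor parameterization, the log-energy pooling, and the logistic read-out (`p_structure`, with hypothetical weights `w` and `b`) are assumptions standing in for the Bayesian computation the abstract refers to.

```python
import numpy as np

def gabor(size, sf, theta, phase, sigma):
    """One Gabor filter: sf in cycles/patch, orientation theta, phase in radians."""
    ax = np.arange(size) - size / 2
    x, y = np.meshgrid(ax, ax)
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * sf * xr / size + phase)

def patch_energy(patch, filters):
    """Squared response of the patch to each filter in the set."""
    responses = np.array([np.sum(patch * f) for f in filters])
    return responses**2

def p_structure(energies, w, b):
    """Hypothetical logistic read-out from pooled log-energy to P(structure)."""
    return 1.0 / (1.0 + np.exp(-(w * np.log1p(energies.sum()) + b)))

# Filter set varying in spatial frequency, orientation, and phase (toy values).
size = 32
filters = [gabor(size, sf, th, ph, sigma=size / 4)
           for sf in (2.0, 4.0)
           for th in (0.0, np.pi / 2)
           for ph in (0.0, np.pi / 2)]

flat = np.zeros((size, size))                       # uniform patch: no structure
ax = np.arange(size)
grating = np.tile(np.cos(2 * np.pi * 4.0 * ax / size), (size, 1))  # oriented structure

p_flat = p_structure(patch_energy(flat, filters), w=1.0, b=-3.0)
p_struct = p_structure(patch_energy(grating, filters), w=1.0, b=-3.0)
```

With these toy weights the structured grating yields a much higher probability of "stuff" than the uniform patch; in the actual study the analogous read-out was fit against observers' structure ratings.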
We are currently examining whether the model's estimates of structure correlate with the firing responses of simple and complex cells in macaque V1, using data for luminance and color noise stimuli from Horwitz et al. (2005).