Abstract
Edge integration algorithms model lightness computation as a process that occurs in at least two stages. First, spatial filters extract information about the magnitude and sign of local contrast in the image. Then, the filter outputs are combined across space to assign lightness values to every point in the original image. A series of experiments conducted in our laboratory has sought to characterize the minimum set of principles required for an edge integration algorithm to model lightness and brightness matches carried out with simple stimuli comprising disks or squares surrounded by one or more rings or bands. To a first approximation, our data are well accounted for by a modified Retinex algorithm in which the logarithms of the local edge ratios are spatially summed, with the ratios of edges close to the test region given larger weights than those of more distant edges. Here we provide evidence for an additional process whereby edges interact as a function of their spatial separation to modulate the contrast gains applied to the log luminance ratios prior to edge integration. A computational model incorporating this feature is shown through computer simulations to account for observed changes in the exponent of Stevens' brightness law as a function of surround width, as well as for a previously unreported and surprising asymmetry in the direction of the gain control applied to increments and decrements: the presence of other nearby edges increases the brightness exponent for increments but decreases it for decrements.
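As a rough illustration of the distance-weighted edge integration idea summarized above, the following Python sketch sums log luminance ratios across successive edges, with weights that fall off with an edge's distance from the test region. The function name, the geometric weighting scheme, and the decay parameter are assumptions chosen for illustration only; they are not the specific weighting function or gain-control mechanism fitted in the present study.

```python
import numpy as np

def integrated_log_lightness(luminances, weight_decay=0.6):
    """Sketch of distance-weighted edge integration.

    `luminances` lists region luminances ordered outward from the test
    region, e.g. [disk, ring, background].  Each adjacent pair defines
    an edge; log luminance ratios across edges are summed with weights
    that decrease with the edge's distance rank from the test region.
    """
    luminances = np.asarray(luminances, dtype=float)
    # Log luminance ratio across each edge, moving outward:
    # log(L_inner / L_outer) for successive region pairs.
    log_ratios = np.log(luminances[:-1] / luminances[1:])
    # Nearer edges get larger weights; an assumed geometric fall-off.
    weights = weight_decay ** np.arange(len(log_ratios))
    return np.sum(weights * log_ratios)

# Example: a decremental disk (20 cd/m^2) inside a brighter ring (80)
# on a dimmer background (40).  Both the disk/ring and ring/background
# edges contribute, but the nearer disk/ring edge dominates the sum.
print(integrated_log_lightness([20.0, 80.0, 40.0]))
```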