Abstract
Sensitivity to texture contrasts often improves as the disparate region is moved from fixation into the periphery. Because performance is non-optimal at fixation, this phenomenon is referred to as the Central Performance Drop (CPD). The CPD has been explained as a mismatch between the scale of the texture and the scale of the available texture segmentation mechanisms: at fixation the available mechanisms are too small to support segmentation, whereas at some point in the periphery the match is optimal. However, one would expect the range of available scales in the periphery to be a subset of the range available at fixation. One resolution to this apparent conundrum suggests that the responses of uninformative high-frequency mechanisms at fixation dilute the information available in more informative low-frequency mechanisms. This has been termed Cross-Frequency Interference (CFI). Gurnsey et al. (1996, JEP:HPP) reasoned that when high frequencies are removed through low-pass filtering, the result should be a release from inhibition and improved performance at the fovea; this result was not obtained. Following the same logic, Morikawa (2000, Vision Research) found that low-pass filtering improved foveal performance. However, the filtered stimuli were contrast enhanced, thus rendering the results ambiguous. Carrasco et al. (in press, Perception & Psychophysics) found a release from inhibition (attenuation of the CPD) following adaptation to a high-frequency grating. However, the adapting stimulus had the same orientation as the background texture (but not the disparate texture), suggesting that adaptation functioned to enhance the relative salience of the disparate texture. We reran each of these three experiments with appropriate modifications. In contrast to Gurnsey et al. (1996), we found clear evidence for CFI when textures were low-pass filtered. We found no evidence that adaptation enhanced the salience of the disparate texture. We conclude that there is a role for CFI in the CPD.