Gary Lupyan; Semantic effects on color afterimages. Journal of Vision 2013;13(9):466. doi: 10.1167/13.9.466.
To what degree is what we see influenced by what we know? In a series of simple behavioral studies utilizing a powerful color afterimage illusion, I show that knowledge of typical object colors strongly influences the colors people see. Participants were tested on a version of the ‘Spanish Castle Illusion’ in which adapting to a chroma-inverted image renders a subsequently presented grayscale image in vivid color until subjects move their eyes (e.g., www.johnsadowski.com/big_spanish_castle.php). Participants adjusted the hue of the grayscale image displayed post-adaptation until it looked subjectively achromatic, enabling accurate measurement of afterimage strength—e.g., a blue adaptor would induce a yellow afterimage requiring the addition of blue to offset it back to grayscale. Reliable afterimages were produced in all cases, but adapting to objects with intrinsic colors (e.g., a pumpkin) led to stronger afterimages than adapting to identically colored objects without intrinsic colors (e.g., an orange car). Semantic effects on perceived afterimages were greatly exaggerated when full-color scenes were used. Adapting to a castle scene containing intrinsically colored components (grass, sky, etc.) produced afterimages that were 2.4 times stronger than those induced after adapting to a bookcase with arbitrarily colored books. These between-image differences disappeared if the adaptor induced an atypically colored scene or if upside-down images were used, as expected if decreases in familiarity reduce top-down predictive signals. The present results are consistent with recent findings showing that color afterimage signals originate from ganglion cell rebounds, which are then modulated at the cortical level just as other retinal inputs are. This work goes beyond earlier demonstrations of effects of color memory on perception by showing that induced retinal activity is interpreted differently depending on the ecological likelihood of the resulting percept.
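The chroma-inverted adaptor at the heart of the illusion replaces each pixel's hue with its complement while leaving lightness intact, so that adaptation drives an afterimage of the original color. A minimal per-pixel sketch of this operation, using Python's standard-library `colorsys` module (the abstract does not specify how the stimuli were generated, so the exact color space and rotation here are an assumption for illustration):

```python
import colorsys

def chroma_invert(rgb):
    """Rotate a pixel's hue by 180 degrees while preserving lightness
    and saturation. rgb is a tuple of floats in [0, 1]. Applying this
    to every pixel yields a complementary-color ("chroma-inverted")
    adaptor; this is an illustrative sketch, not the authors' code.
    """
    h, l, s = colorsys.rgb_to_hls(*rgb)
    return colorsys.hls_to_rgb((h + 0.5) % 1.0, l, s)

# A pure blue pixel maps to its complement, yellow, so adapting to the
# inverted image biases the visual system toward seeing blue afterward.
print(chroma_invert((0.0, 0.0, 1.0)))
```

Running `chroma_invert` over a full image (e.g., via Pillow's per-pixel access) would produce an adaptor like the one on the linked Spanish Castle Illusion page.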
In line with predictive-coding models of vision, retinal inputs that conflict with prior knowledge may lead to cortical discounting.
Meeting abstract presented at VSS 2013