Juraj Mesik, Akshay Patke, Stephen Engel; Repeatedly adapting to orientation ensembles does not change contrast adaptation dynamics. Journal of Vision 2015;15(12):36. doi: 10.1167/15.12.36.
Contrast adaptation adjusts sensitivity to match the statistical structure of the world. Does the visual system adapt faster to statistics with which it has prior experience? To answer this question, we repeatedly adapted subjects (n=6), over the course of 4 days, to three distinct sets of contrast statistics, and measured the growth and decay of the tilt-aftereffect. Subjects viewed 2 min sequences composed of rapidly presented sinusoidal gratings of 12 different orientations (33 ms/grating, 85% contrast). The sequences were either uniform across orientation, or biased, with one orientation much more likely than the others (66% vs 3% probability). Biased sequences of this kind have been shown to induce contrast adaptation to the high-probability orientation. “Adapter” and unbiased “test” sequences were presented simultaneously on opposite sides of fixation at 2.5 deg eccentricity. To measure the tilt-aftereffect, every 1 s a pair of 2.5 deg, 50% contrast gratings was inserted in the sequences for 250 ms. Using a mouse, subjects adjusted the tilt of the grating in the test sequence to match its apparent orientation in the adapter sequence. The test grating was oriented 13, 15, or 17 deg away from the biased orientation, offsets at which robust tilt-aftereffects are observable. Each day, subjects completed 4 task runs, each containing 5 blocks that alternated between biased adapters and unbiased “baseline” adapters. The biased adapters induced a strong tilt-aftereffect (~2 deg on average). However, adaptation dynamics did not change across the 4 days: growth and decay time constants and peak tilt-aftereffect levels showed no reliable change (p > 0.05). Our results suggest that the visual system cannot learn to adapt more quickly to statistical regularities in low-level visual features, at least over the timescales tested here.
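The biased adapter sequences described above can be sketched as a simple weighted sampler over orientations. This is an illustrative reconstruction only: the frame count, the 15 deg orientation spacing, and the function name are assumptions, not details taken from the study; only the 12-orientation, 66%-vs-~3% probability structure comes from the abstract.

```python
import random

def make_adapter_sequence(orientations, biased_ori=None, n_frames=3636, p_bias=0.66):
    """Return one orientation per 33 ms frame (~2 min at 3636 frames).

    With biased_ori=None the sequence is uniform over all orientations
    (the "baseline" adapter); otherwise biased_ori appears with
    probability p_bias and the remaining orientations share the rest
    (~3% each for 12 orientations), as in the biased adapter.
    """
    if biased_ori is None:
        weights = [1.0] * len(orientations)
    else:
        n_others = len(orientations) - 1
        weights = [p_bias if o == biased_ori else (1.0 - p_bias) / n_others
                   for o in orientations]
    return random.choices(orientations, weights=weights, k=n_frames)

# Hypothetical set of 12 orientations spaced 15 deg apart (an assumption;
# the abstract does not state the spacing).
oris = list(range(0, 180, 15))
biased_seq = make_adapter_sequence(oris, biased_ori=45)
baseline_seq = make_adapter_sequence(oris)
```

Sampling frames independently like this reproduces the stated marginal probabilities; any temporal ordering constraints the actual experiment used are not modeled here.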
Meeting abstract presented at VSS 2015