Katherine E.M. Tregillus, Michael A. Webster, Gerrit W. Maus; Adaptation to a field of distributed temporal frequencies results in a reduction of the perceived mean flicker rate. Journal of Vision 2019;19(8):117. doi: https://doi.org/10.1167/19.8.117.
Previous research has shown that perceived flicker tends to slow over time, but does this adaptation effect extend to dynamic scenes with differing rates of change? To test this, participants adapted to a 5×5 array of squares, each sinusoidally modulating from black to white. The flicker rates of the squares were probabilistically distributed around a given mean frequency, each with a randomized phase. We tested adaptation at 5 mean frequencies: 0.2, 0.9, 1.6, 2.3, and 3 Hz. The adapting array was presented left or right of fixation for an initial 30 sec. period, with 5 sec. top-ups between trials. The test display consisted of two arrays of squares on either side of fixation: a matching array, whose mean frequency was updated throughout the trial, and the adapting array, which remained constant. Using a staircase procedure, participants indicated which array appeared to flicker “faster” on average. Prior to adaptation, participants closely matched the mean flicker rates of the adapting and test arrays. Following adaptation, the perceived mean flicker rate slowed significantly, with greater adaptation at higher mean frequencies. In a second experiment, participants approximated the mean flicker rate of the array by adjusting the frequency of a single reference square; they tended to overestimate the array’s average flicker rate. Our results indicate that while mean temporal frequencies may not be perceived accurately, adaptation to the temporal properties of a scene affects perception of its overall rate of change.
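The stimulus described above can be sketched in a few lines. This is a minimal illustration only: the abstract says the per-square rates were "probabilistically distributed around a given mean frequency" without naming the distribution, so a normal distribution and the `spread_hz` parameter are assumptions, as are the function names.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_flicker_array(mean_hz, n=5, spread_hz=0.3):
    """Sketch of an n x n flicker array: per-square frequencies drawn
    around mean_hz (normal distribution is an assumption; the abstract
    only says 'probabilistically distributed'), each square with a
    randomized phase."""
    freqs = rng.normal(mean_hz, spread_hz, size=(n, n))
    freqs = np.clip(freqs, 0.05, None)  # keep all rates positive
    phases = rng.uniform(0, 2 * np.pi, size=(n, n))
    return freqs, phases

def luminance(freqs, phases, t):
    """Sinusoidal black-to-white modulation, scaled to 0..1,
    evaluated at time t (seconds)."""
    return 0.5 + 0.5 * np.sin(2 * np.pi * freqs * t + phases)

# One of the tested mean frequencies (1.6 Hz)
freqs, phases = make_flicker_array(1.6)
lum = luminance(freqs, phases, t=0.5)
```

In an experiment, `luminance` would be evaluated once per video frame; the matching array would be regenerated with a new mean frequency on each staircase step while the adapting array keeps its original frequencies.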