Thomy H. Nilsson; Psychophysical visual memory data and their neural net replications indicate sensory-like activity is released from storage. Journal of Vision 2006;6(6):31. doi: https://doi.org/10.1167/6.6.31.
Memory matches of the hue of monochromatic light and the orientation of single lines and gratings were obtained after delays of up to 24 seconds, using an iterative procedure with one-second stimulus presentations to control memory time. As memory time increased, the average hue or angle of the matches shifted minimally, but their standard deviations increased as a negative exponential function with a half-life of 20–30 seconds. Memory difference thresholds as a function of wavelength and angle resembled the sensory difference thresholds. As memory time increased, the memory discrimination functions did not decay into randomness; rather, their shape became more exaggerated over time. This suggests that memory matching involves comparing a sensory response with activity from memory that itself resembles a sensory response. To determine whether the exaggeration of the memory discrimination functions could arise from an increase in noise with time in memory, a neural network model of the memory matching task was constructed in Quattro Pro using macro functions, random number generators, and a time-varying correlation matrix. The model demonstrates that the obtained memory discrimination functions can be replicated simply by adding noise to a stored transformation of the sensory response. This suggests that no matter how sensory information is actually stored, its retrieval produces activity similar to the original visual response.
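The model's core idea, that retrieval adds time-growing noise to a stored transformation of the sensory response, can be sketched as follows (in Python rather than the original Quattro Pro spreadsheet; the channel tuning curves, the 25-second half-life, and the noise scale are illustrative assumptions, not the paper's actual parameters):

```python
import math
import random

def sensory_response(stimulus, channels=8, spacing=45.0, width=30.0):
    """Responses of broadly tuned channels (e.g. hue-selective units)
    to a scalar stimulus such as a hue angle in degrees.
    Channel count, spacing, and tuning width are illustrative."""
    return [math.exp(-((stimulus - c * spacing) / width) ** 2)
            for c in range(channels)]

def recall(stored, t, half_life=25.0, sigma0=0.05, rng=None):
    """Retrieve the stored response with additive Gaussian noise whose
    amplitude grows as fidelity decays exponentially with the
    stated 20-30 s half-life (25 s assumed here)."""
    rng = rng or random.Random()
    fidelity = 0.5 ** (t / half_life)   # exponential memory decay
    sigma = sigma0 / fidelity           # noise grows with memory time
    return [r + rng.gauss(0.0, sigma) for r in stored]

def best_match(recalled, candidates):
    """Memory matching: choose the candidate stimulus whose fresh
    sensory response is closest to the noisy recalled activity."""
    def dist(s):
        return sum((a - b) ** 2
                   for a, b in zip(recalled, sensory_response(s)))
    return min(candidates, key=dist)
```

Running many matching trials with this sketch at, say, t = 1 s versus t = 24 s yields match distributions that stay centered near the original stimulus while their spread grows with delay, mirroring the stable means and rising standard deviations reported above.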