Abstract
In change-detection experiments with simple objects, performance declines gradually as the retention interval increases. We considered two classes of explanations for this decline. First, it is possible that the representations of the objects decay, becoming less accurate over time. Second, it is possible that representations remain accurate but have a limited lifetime, extinguishing completely after a variable period of time. To distinguish between these possibilities, we factorially manipulated the magnitude of the change and the retention interval in a color change-detection experiment. On each trial, subjects viewed a 100-ms sample array consisting of 3 colored squares, followed by a retention interval of 1–5 seconds and then a test array. The test array was either identical to the sample array or differed in the color of one item, and subjects indicated whether a change was present. The magnitude of the color change could be small, intermediate, or large. When the change magnitude is large, it should still be possible for subjects to detect the change even if the representation has begun to decay. When the change magnitude is small, however, the progressively less accurate representations produced by increases in the retention interval should lead to progressively greater impairments in change-detection performance. Thus, the decay hypothesis predicts a larger effect of retention interval for small change magnitudes than for large change magnitudes. In contrast, the limited-lifetime hypothesis predicts no interaction between change magnitude and retention interval: a representation either survives the interval intact or is lost entirely, so when it is lost, detection fails regardless of how large the change is. Preliminary data were consistent with the limited-lifetime hypothesis: The effect of retention interval was not systematically larger for small change magnitudes than for large change magnitudes.
Acknowledgments
This research was made possible by grants R01 MH63001 and R01 MH65034 from the National Institute of Mental Health.