Abstract
Visual short-term memory (VSTM) is famously capacity-limited, but the nature of this capacity limit is heavily disputed. One class of theories postulates that a fixed number of "objects" (usually 4) can be remembered. A second class suggests that capacity reflects a resource divided among multiple remembered entities; this account, however, does not specify what the resource is. Here we apply information theory to VSTM to derive a measure of this capacity-limiting resource: information, quantified in bits. We presented subjects with arrays of colors or orientations and asked them to remember one, two, or four items. Subjects then made settings to match the color or orientation of remembered items, and the average error of these settings was computed. From this measure we could define the probability of the true item value given a single setting from the subject, P(T|S). The uncertainty in this distribution is quantified by its Shannon entropy in bits, H(T|S). This can be compared with the entropy of the prior distribution, H(T): the uncertainty about the true item value before any setting is seen. The difference between these entropies, H(T) − H(T|S), is the mutual information between a subject's setting and the true value of a particular item; it corresponds to the amount of information gained from a single setting. We find that when subjects must remember more items, their errors grow and the information conveyed by any one setting decreases greatly. However, when this per-item information is multiplied by the number of items remembered, the result is an approximately constant number of bits (roughly 3). Thus, the formal definition of information is a good candidate for the resource that limits VSTM capacity.
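The abstract does not give the analysis procedure in detail; the following is a minimal sketch of how the mutual information per setting might be estimated, assuming a uniform prior over a circular feature space and that P(T|S) can be summarized by the distribution of setting errors. The function name, bin count, and the error standard deviations for each set size are hypothetical, chosen only to illustrate the calculation, not taken from the reported data.

```python
import numpy as np

def info_per_setting_bits(errors_deg, n_bins=180):
    """Mutual information (bits) between the true item value and one setting,
    assuming a uniform prior over the circular feature space and that
    P(T|S) depends only on the setting error (true value minus setting)."""
    # Prior entropy H(T): uniform over the discretized circle.
    h_prior = np.log2(n_bins)
    # Empirical error distribution as an estimate of P(T | S).
    counts, _ = np.histogram(errors_deg, bins=n_bins, range=(-180.0, 180.0))
    p = counts / counts.sum()
    p = p[p > 0]                        # empty bins contribute nothing
    h_post = -np.sum(p * np.log2(p))    # conditional entropy H(T | S)
    return h_prior - h_post             # information gained per setting

# Hypothetical error spreads for set sizes 1, 2, and 4 (illustrative only,
# not the study's data): per-item information falls as set size grows,
# while per-item information times set size stays roughly constant.
rng = np.random.default_rng(0)
for set_size, sd in [(1, 11.0), (2, 31.0), (4, 52.0)]:
    errors = rng.normal(0.0, sd, size=20000)
    errors = (errors + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    bits = info_per_setting_bits(errors)
    print(f"set size {set_size}: {bits:.2f} bits/item, "
          f"{set_size * bits:.2f} bits total")
```

Under the uniform-prior assumption H(T) is simply log2 of the number of bins, so the estimate depends on the chosen discretization; the authors' actual estimator may differ from this sketch.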