Brad Wyble, Howard Bowman; A neural network account of binding discrete items into working memory using a distributed pool of flexible resources. Journal of Vision 2006;6(6):33. doi: https://doi.org/10.1167/6.6.33.
It has recently been disputed whether working memory has a fixed capacity of 4 objects. Alvarez & Cavanagh (2004), for example, demonstrate that complex items appear to consume more resources than simple ones, reducing storage capacity. However, existing models of working memory have difficulty describing a system with a feature-based capacity limit spread over multiple items.
We propose a neural network model of working memory that combines slot-based theories of discrete objects (Luck & Vogel, 1997) with signal detection accounts exhibiting increasing interference as memory load increases (Wilken & Ma, 2004).
In an effort to elaborate the STST account of the attentional blink (Wyble & Bowman, 2005), we model the binding of types (e.g. visual features) to tokens (Kanwisher, 1987) without resorting to synaptic modification or temporal synchrony. A single pool of binding nodes maintains multiple working memory traces. Encoding an object uses fixed weights to rapidly activate a fraction of the pool representing that object's features. This portion of the pool is self-sustaining and can be used to reconstruct the features of the object. The model uses distributed representations to avoid the combinatorial explosion inherent in encoding arbitrary combinations of features, and the pool can uniquely encode multiple copies of the same item.
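The abstract does not give implementation details, but the mechanism it describes can be illustrated in a minimal sketch. The version below assumes binary nodes, fixed random type-to-pool weights, and a random gating subset per token; the pool size, weight fan-out, and exact-match retrieval rule are all illustrative assumptions, not the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

POOL = 256   # binding-pool nodes (size is an assumption)
TYPES = 16   # feature ("type") nodes
K = 40       # pool nodes reached by each type's fixed weights

# Fixed (never-modified) binary weights from each type into the pool.
W = np.zeros((TYPES, POOL), dtype=bool)
for t in range(TYPES):
    W[t, rng.choice(POOL, size=K, replace=False)] = True

def new_token():
    """Each token 'owns' a random half of the pool (a gating assumption)."""
    return rng.random(POOL) < 0.5

def encode(pool, token, types):
    """Activate the token-gated pool nodes projected to by the object's
    features; these nodes are treated as self-sustaining (they stay on)."""
    trace = token & np.any(W[types], axis=0)
    return pool | trace

def retrieve(pool, token):
    """Reconstruct an object's features: a type is read out when every
    pool node it shares with this token's gate is still active."""
    active = pool & token
    hits = (W & active).sum(axis=1)
    sizes = (W & token).sum(axis=1)
    return sorted(np.flatnonzero((hits == sizes) & (sizes > 0)))

pool = np.zeros(POOL, dtype=bool)
tok_a, tok_b = new_token(), new_token()
pool = encode(pool, tok_a, [2, 9])   # object A: features 2 and 9
pool = encode(pool, tok_b, [9, 13])  # object B: reuses feature 9
print(retrieve(pool, tok_a), retrieve(pool, tok_b))
```

Because each token gates its own random subset of the shared pool, two tokens can store the same feature (here, feature 9) without their traces merging, which is how a single pool can hold multiple copies of one item.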
Complex objects are harder to encode and therefore require a larger portion of the pool, reducing the observed capacity of working memory. Our model suggests that pairs of feature dimensions that can be bound into single objects without capacity cost have separate binding pools, insulating them from mutual interference.
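The capacity claim reduces to simple arithmetic over a shared resource. The numbers below are purely illustrative assumptions, not the model's fitted values; they only show how a fixed pool with per-item footprints yields a lower item count for complex stimuli.

```python
# Toy capacity arithmetic over a shared binding pool (illustrative numbers).
POOL = 256              # binding nodes available in one pool
SIMPLE_FOOTPRINT = 32   # pool nodes consumed by a simple item (assumed)
COMPLEX_FOOTPRINT = 96  # complex items claim a larger share (assumed)

print(POOL // SIMPLE_FOOTPRINT)   # simple items that fit: 8
print(POOL // COMPLEX_FOOTPRINT)  # complex items that fit: 2
```

Giving separably bindable feature dimensions their own pools, as the abstract proposes, means each dimension draws on its own budget, so loading one dimension does not shrink the capacity of the other.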