Finally, the existing model implementations in either framework can account only for purely perceptual effects. Real-world sensory estimation tasks, however, commonly place demands on both perceptual and memory processing: To identify previously seen surfaces in a new scene (e.g., when looking for a lost item of clothing), one must both discount illumination variation (such as the shadows in Figure 1a) to estimate reflectance and employ working memory to compare current percepts with the memorized surfaces.
In the lightness perception literature, perceptual errors have been used to drive theory; the dependence of perceived lightness on depth relationships, for example, demonstrates that local contrast is not the sole determinant of lightness. A separate visual memory literature has revealed that working memory processing increases the sensory noise of a representation (Pasternak & Greenlee, 2005), leading to errors in memory-dependent estimation (Ashourian & Loewenstein, 2011; Jazayeri & Shadlen, 2010; Olkkonen, McCarthy, & Allred, 2014). Accumulating evidence from behavioral and neurophysiological studies further shows that the neural processes underlying perception and working memory are closely related
(Ester, Serences, & Awh, 2009; Harrison & Tong, 2009; Kang, Hong, Blake, & Woodman, 2011; Magnussen & Greenlee, 1999; Pearson & Brascamp, 2008; Serences, Ester, Vogel, & Awh, 2009; Supèr, Spekreijse, & Lamme, 2001). Although a small number of studies have considered some aspects of memory in a color constancy task (Allen, Beilock, & Shevell, 2011; Jin & Shevell, 1996; Ling & Hurlbert, 2008; Uchikawa, Kuriki, & Tone, 1998), the independence of perceptual and memory demands in perceptual constancy has not been characterized.