Abstract
The internal representations of three-dimensional objects in visual memory are only partially understood. Previous research suggests that 3D object perception is viewpoint dependent and that the visual system stores viewpoint perspectives in a biased manner. The aim of this project was to obtain detailed estimates of the distributions of 3D object views in shared human memory. We devised a novel experimental paradigm based on crowdsourcing and transmission chains to investigate memory biases for the 3D orientation of objects. In a transmission chain, a subject's reconstruction of the remembered view of an object becomes the stimulus for the next subject, a process analogous to the 'telephone game'. Using a specialized crowdsourcing platform, we generated large-scale transmission chains over Amazon Mechanical Turk (AMT) probing the remembered 3D view of a set of common 3D objects and shapes. We found that memory tends to be biased toward orthogonal, diagrammatic perspectives of these objects, and that these biases are strongest for side views, as well as for top or bottom views in a small set of bilaterally symmetric objects. Finally, we found that views sampled from the modes of the recovered distributions were easier to categorize in a recognition task. Our approach reveals nuanced structure in shared memory biases that has eluded previous experimental approaches. Our crowdsourcing platform also provides a general framework for curating different network structures over AMT in which group estimates, rather than individual responses, can be transmitted through a chain.
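To make the transmission-chain paradigm concrete, the sketch below shows the core iterative process: each participant's remembered reconstruction of a 3D orientation becomes the stimulus for the next participant. This is a minimal conceptual illustration only, not the authors' platform code; the function names (`run_chain`, `noisy_memory_trial`) and the noise model are hypothetical stand-ins for a crowdsourced trial.

```python
# Conceptual sketch of a single transmission chain ("telephone game"):
# the output of one trial is the input stimulus for the next trial.
# `noisy_memory_trial` is a hypothetical placeholder for a human response.

import random
from typing import Callable, List, Tuple

Orientation = Tuple[float, float, float]  # (azimuth, elevation, roll) in degrees

def run_chain(
    initial_view: Orientation,
    trial: Callable[[Orientation], Orientation],
    chain_length: int,
) -> List[Orientation]:
    """Iterate the chain: each reconstruction becomes the next stimulus."""
    views = [initial_view]
    for _ in range(chain_length):
        reconstructed = trial(views[-1])  # participant's remembered view
        views.append(reconstructed)       # shown to the next participant
    return views

def noisy_memory_trial(view: Orientation) -> Orientation:
    """Placeholder trial: the presented view plus Gaussian reconstruction noise.
    In the real paradigm this response would come from an AMT participant."""
    azimuth, elevation, roll = view
    return (
        azimuth + random.gauss(0.0, 5.0),
        elevation + random.gauss(0.0, 5.0),
        roll + random.gauss(0.0, 5.0),
    )

if __name__ == "__main__":
    chain = run_chain((37.0, 12.0, 0.0), noisy_memory_trial, chain_length=20)
    print(chain[-1])  # later views drift toward whatever the trial process favors
```

Under this framing, systematic memory biases (e.g., toward orthogonal side views) act like attractors: repeated transmission amplifies small per-subject biases until the chain's distribution of views concentrates around them, which is what the large-scale AMT chains are designed to reveal.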