Abstract
Images of objects are commonly used as proxies to understand the organization of conceptual knowledge in the human brain. However, recent studies from our laboratory have highlighted differences between images and real objects at the level of neural representations, as well as in their contribution to memory, attention, and decision-making. Asking an observer to make judgments about the similarities among a set of objects can provide unique insights into the nature of the underlying representations of those objects in human cortex (Mur et al., 2013). Here, we used inverse multidimensional scaling (Kriegeskorte and Mur, 2012) to investigate whether the subjective properties that observers use to characterize objects during free sorting depend on display format. Observers arranged 21 different objects so that the distances between them reflected their perceived dissimilarities. Critically, one group of participants sorted 2-D images of the objects on a computer monitor using a mouse drag-and-drop action; another group manually sorted objects presented in augmented reality (AR); the remaining group manually sorted real-world exemplars. Participants were free to use any dimension they liked to group the items. By correlating models based on the various sorting criteria with the dissimilarity matrices obtained from the behavioral arrangements, we identified the properties that observers used to separate the items in each format. We found that object representations depended on the format in which the objects were displayed. 2-D images of objects were sorted primarily with respect to the conceptual property of typical location. AR objects were sorted according to their physical size and weight, but less so according to conceptual properties. Real objects, unlike 2-D images and AR stimuli, were sorted with respect to both their conceptual (typical location) and physical (size, weight) properties. Real-world objects are thus coded in a richer, more multidimensional property space than computerized images.
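The analysis implied here is representational similarity analysis on behavioral data: pairwise on-screen distances from the arrangement task are taken as a behavioral dissimilarity matrix, which is then rank-correlated with candidate model matrices built from each sorting criterion (typical location, size, weight, etc.). The sketch below illustrates that comparison in simplified form, using a single arrangement rather than the full inverse-MDS procedure (which combines multiple partial arrangements), and with placeholder coordinates and an invented "typical location" labeling; none of the values are from the study.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical 2-D arrangement coordinates for the 21 objects; in the real
# task these would be the positions where an observer placed each item.
coords = rng.random((21, 2))

# Behavioral dissimilarity vector: pairwise Euclidean distances between the
# placed items, interpreted as perceived dissimilarities (inverse-MDS logic).
behav_rdm = pdist(coords, metric="euclidean")

# Candidate model for one sorting criterion, e.g. "typical location":
# 0 if two objects share a typical location, 1 otherwise (labels invented here).
location_labels = rng.integers(0, 4, size=21)
model_rdm = pdist(location_labels[:, None],
                  metric=lambda a, b: float(a[0] != b[0]))

# Rank-correlate model and behavior; Spearman is common because only the
# ordinal structure of the dissimilarities is assumed to be meaningful.
rho, p = spearmanr(behav_rdm, model_rdm)
print(f"typical-location model: Spearman rho = {rho:.2f}, p = {p:.3f}")
```

Repeating the correlation for each candidate model (size difference, weight difference, and so on) within each display format would yield the per-format property profiles summarized in the abstract.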
Acknowledgement: NIH grant R01EY026701 awarded to J.C.S.