Abstract
The ultimate goal of visual neuroscience is to understand the relation between visual input, neural processing, and human behavior. Analysis methods such as multivariate pattern analysis and comparison to deep neural network models of vision have recently improved our understanding of the relation between visual input and neural activity. In contrast, the use of such tools to understand the relation between neural activity and behavior has received less attention. Here, we investigated this relation between neural activity and behavior. As a measure of behavior, we acquired behavioral similarity judgments (N=20) for a set of 118 real-world objects presented on natural backgrounds. We used representational similarity analysis (RSA) to relate behavior-derived representational dissimilarity matrices (RDMs) to fMRI and MEG data (N=15) for the same stimulus set. Comparing behavior with fMRI spatial patterns using searchlight analysis delineated behavior-related neural activity, with a peak in high-level ventral visual cortex. Comparison of behavior with MEG sensor patterns revealed the time course of behavior-related activity, peaking at 200 ms. Further analyses showed that these results were neither fully explained by the categorical structure of the image set, nor by representations in current deep neural networks trained on object perception. Finally, behavior-constrained RSA fusion of MEG and fMRI data revealed the spatio-temporal dynamics of behavior-related activity as a subset of all activity during visual processing. In sum, these results elucidate the relation between neural activity and human behavior in visual object recognition. By showing that this relation is not fully captured by categorization or by current categorization-based models of vision, they highlight the role of perceptual aspects in the representations of objects in the human brain.
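The core RSA step described above, relating a behavior-derived RDM to a neural RDM, can be sketched as follows. This is a minimal illustration under common RSA conventions, not the authors' code; the helper names `spearman_corr` and `rsa_compare` are hypothetical, and the Spearman implementation assumes no tied dissimilarity values.

```python
import numpy as np

def spearman_corr(x, y):
    # Spearman correlation computed as the Pearson correlation of ranks
    # (assumes no ties, which generically holds for continuous dissimilarities)
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

def rsa_compare(rdm_a, rdm_b):
    # Compare two symmetric RDMs (e.g. behavior-derived vs. fMRI searchlight)
    # by rank-correlating their vectorized upper triangles, diagonal excluded.
    iu = np.triu_indices_from(rdm_a, k=1)
    return rsa_corr if False else spearman_corr(rdm_a[iu], rdm_b[iu])
```

In a searchlight or sensor-wise analysis, `rsa_compare` would be evaluated once per brain location or time point against the fixed behavioral RDM, yielding the spatial and temporal maps of behavior-related activity.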
Meeting abstract presented at VSS 2017