Abstract
Although neuroimaging studies have advanced our understanding of cortical representations of object characteristics such as animacy, little is known about the temporal dynamics of the associated processes. In the current study, we investigated with millisecond resolution when the brain shows differential processing of biological agents. We used unique stimuli consisting of images and videos of three agents: a female adult (human), a non-human agent that closely resembles her (android), and the same android with a more mechanical appearance (robot). The human agent had biological form and biological motion, the android had biological form and non-biological motion, and the robot had non-biological form and non-biological motion. Observers were shown the agents before the start of the study and told whether each agent was a human or a robot. We recorded EEG as observers viewed images and movies of these agents and analyzed visual event-related potentials (ERPs). The amplitude of the P1 component (~80-120 ms) was significantly greater for the human, indicating that the biological status of an object can modulate visual processing very early on. In fact, human-specific differences were observed even earlier than 80 ms, as part of the C1 component. Given the near-identical appearance of the human and android, these very early effects are unlikely to be driven by stimulus properties. Instead, early human-specific responses likely reflect subjects' top-down knowledge of the agents' biological status, with the brain ascribing salience to the human agent from the very onset of the stimuli. The visual N1 (~150 ms) had greater amplitude for the robot condition, a modulation more likely driven by visual stimulus features. Overall, these data show that the biological/non-biological status of an agent modulates visual processing from its earliest stages, likely through the ascription of salience to agents known to be biological.
Meeting abstract presented at VSS 2014