Abstract
Social communication relies on the rapid and accurate integration of numerous auditory and visual signals. One cortical region that figures prominently in communication is the frontal lobe. In addition to neuropsychological and neuroimaging data confirming the essential role of the human inferior frontal lobe in language, studies have shown that single cells in the prefrontal cortex (PFC) of non-human primates respond to faces, while more recent studies have focused on vocalization-responsive neurons in adjacent parts of the primate PFC. To understand how audio-visual information is integrated in the frontal lobes during social communication, we recorded from single cells in the ventrolateral prefrontal cortex (VLPFC) while presenting vocalizations and corresponding facial gestures to awake, behaving monkeys. The stimuli consisted of short video clips of conspecific macaques vocalizing, which were deconstructed into their audio and visual components and presented separately and simultaneously. The single units we encountered responded robustly to faces, to vocalizations, and to combined face-vocalization stimuli. Multisensory neurons represented approximately one-third of the task-responsive population and exhibited significantly enhanced or suppressed responses to bimodal stimuli. In some cells, pairing face stimuli with non-matching (incongruent) vocalizations produced significant changes in neuronal firing relative to the congruent pairing. Moreover, altering the temporal onset of the auditory-visual stimuli also significantly changed neuronal responses when the onset of the auditory stimulus preceded that of the visual motion stimulus. Our results suggest that VLPFC neurons integrate communication-relevant auditory and visual information. Analysis of multisensory processing in the VLPFC of non-human primates may help us understand social communication in the human brain, which critically depends on the integration of multiple types of sensory information.