September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Detecting Emotional Facial Expressions in the Peripheral Visual Field: Psychophysical and Electrophysiological Evidence
Author Affiliations
  • Andrew Mienaltowski
    Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University
  • Hayley Lambert
    Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University
  • Connor Rogers
    Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University
  • Brittany Groh
    Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University
  • J. Farley Norman
    Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University
Journal of Vision August 2017, Vol.17, 822. doi:10.1167/17.10.822
Abstract

Emotion detection requires one to recognize the presence of facial cues that signal a target's specific emotional state. The salience of facial cues is influenced by where in the visual field a facial stimulus is presented. Emotion detection should be better in the center of a participant's visual field than in the periphery because retinal cone density peaks at the fovea. The current study examined younger adults' ability to detect emotion in facial stimuli presented briefly at one of five horizontal locations on a display: -20, -10, 0, +10, and +20 degrees from center. Participants completed two emotion detection tasks, detecting angry and happy expressions among neutral ones in separate blocks. Overall, there were 960 trials (480 per task, 96 at each location). Concurrently, visually-evoked potentials were recorded using a 128-channel high-density electrode array and were time-locked to the visual onset of the facial stimuli. The psychophysical data showed superior detection of happy relative to angry expressions at each stimulus location, with peripheral detection dropping off more sharply for angry than for happy expressions. Analyses also revealed that stimulus location impacted the visually-evoked N170 measured over occipito-temporal electrodes. Peak N170 amplitude was greater for centrally presented faces than for peripherally presented ones. Additionally, larger-amplitude N170s were elicited in the hemisphere contralateral to the visual field in which the stimuli were presented. Follow-up analyses exploring the impact of expressive intensity on emotion detection demonstrated greater differences in detection performance between lower- and higher-intensity angry expressions than between lower- and higher-intensity happy expressions. Additionally, the difference in peak N170 amplitude between emotional and neutral faces was significant for high-intensity emotional expressions but not for less intense emotional expressions.

Meeting abstract presented at VSS 2017
