September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Dynamics of Face Perception: Unraveling the Role of Eyes and Mouth in Neural Processing
Author Affiliations & Notes
  • Yasemin Gunindi
    Sabanci University
  • Çiçek Güney
    Sabanci University
  • Huseyin Ozkan
    Sabanci University
  • Nihan Alp
    Sabanci University
  • Footnotes
    Acknowledgements: This work was funded by TUBITAK 1001 (122K922).
Journal of Vision September 2024, Vol.24, 1180. doi:https://doi.org/10.1167/jov.24.10.1180
Abstract

Faces, as crucial conveyors of social information, are often studied using static images or dynamic videos to analyze their emotional aspects, overlooking the significance of part-based dynamic information in neutral faces. Hence, the extent to which part-based dynamic information, derived primarily from the eyes and mouth, contributes to dynamic face perception remains elusive. In this study, using neutral dynamic face stimuli, we investigate how the brain processes part-based information during dynamic face perception, with specific emphasis on its ability to discriminate between forward and backward face videos over time. Participants fixated on a central cross while watching 3-second grayscale muted videos of individuals speaking in a neutral state. We manipulated face orientation (right-side-up, upside-down) and the presence of eye blinks (with/without blink), and asked participants to report the temporal order of each dynamic face video as forward or backward. The eyes and mouth were contrast-modulated at 6 and 7.5 Hz, respectively, and steady-state visual evoked potentials were recorded from 64 EEG channels. Behavioral results (d' > 0) indicate that participants performed the task well. EEG results reveal an orientation effect consistent with the literature. The topographic map of neural responses indicates a central-occipital focus for the eyes and a lateral-occipital focus for the mouth. The overall neural response shows a bias toward the mouth when the face is right-side-up. Neural responses to the eyes tend to be differentially elevated relative to the mouth when a blink is present. In intact faces (forward, right-side-up), the blink information counteracts the mouth bias, whereas the most-distorted case (backward, upside-down) requires a separate search for a cue (the blink) in the eyes, suppressing the response to the mouth. Overall, this study significantly contributes to our understanding of dynamic face perception, emphasizing the role of dynamic part-based information, particularly eye and mouth movements.
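
For readers unfamiliar with the frequency-tagging method referenced above, the sketch below illustrates the two quantitative ingredients of the abstract: reading the steady-state response amplitude at each tagged frequency (6 Hz for the eyes, 7.5 Hz for the mouth) from the Fourier spectrum of an EEG segment, and computing behavioral sensitivity d' for the forward/backward judgment. This is a minimal illustration, not the authors' pipeline: the sampling rate, window length, synthetic single-channel signal, trial counts, and all variable names are assumptions, and the d' formula shown is the standard signal-detection one, which the abstract does not spell out.

```python
import numpy as np
from scipy.stats import norm

# --- Hypothetical example; not the authors' analysis pipeline ---

fs = 500       # assumed EEG sampling rate in Hz (not stated in the abstract)
win = 2.0      # 2-s analysis window -> 0.5 Hz resolution, so 6 and 7.5 Hz fall on exact FFT bins
n = int(fs * win)
t = np.arange(n) / fs

# Synthetic single-channel EEG containing tagged responses at 6 Hz (eyes)
# and 7.5 Hz (mouth) buried in noise.
rng = np.random.default_rng(0)
eeg = (0.8 * np.sin(2 * np.pi * 6.0 * t)
       + 0.5 * np.sin(2 * np.pi * 7.5 * t)
       + rng.normal(0.0, 1.0, n))

# Single-sided amplitude spectrum of the segment.
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
amp = 2.0 / n * np.abs(np.fft.rfft(eeg))

def tagged_amplitude(f_tag):
    """SSVEP amplitude at the FFT bin closest to the tagged frequency."""
    return amp[np.argmin(np.abs(freqs - f_tag))]

print("eyes  (6.0 Hz):", tagged_amplitude(6.0))
print("mouth (7.5 Hz):", tagged_amplitude(7.5))

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Standard signal-detection d' = z(hit rate) - z(false-alarm rate)."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical response counts for the forward/backward judgment in one condition.
print("d':", d_prime(hits=40, misses=10, false_alarms=15, correct_rejections=35))
```

In the actual study, such tagged amplitudes would presumably be computed per condition across the 64-channel montage to yield the topographic maps described above; that step, along with preprocessing and statistics, is beyond what the abstract specifies.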
