Vision Sciences Society Annual Meeting Abstract | September 2015
Relating orientation tuning and feature utilization during facial expression recognition
Author Affiliations
  • Justin Duncan
    Département de Psychologie et Psychoéducation, Université du Québec en Outaouais; Département de Psychologie, Université du Québec à Montréal
  • Charlène Cobarro
    Département de Psychologie et Psychoéducation, Université du Québec en Outaouais
  • Frédéric Gosselin
    Département de Psychologie et Psychoéducation, Université du Québec en Outaouais; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal
  • Caroline Blais
    Département de Psychologie et Psychoéducation, Université du Québec en Outaouais; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal
  • Daniel Fiset
    Département de Psychologie et Psychoéducation, Université du Québec en Outaouais; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal
Journal of Vision, September 2015, Vol. 15, 164. https://doi.org/10.1167/15.12.164
Abstract

Facial expression recognition correlates with the utilization of horizontal information (Huynh & Balas, 2014) and with the utilization of local information (e.g., the mouth; Blais et al., 2012). However, the link between these two aspects of visual processing remains elusive. Our aim was to provide a first examination of this link and to refine current knowledge of orientation tuning. Twenty participants each completed 1,400 trials of a facial expression recognition task with seven expressions. Stimuli were seventy face pictures (10 identities; face width = 4°) depicting the six basic emotions plus neutrality. Images were randomly filtered in the orientation domain (orientation bubbles) and presented for 150 ms. We developed this method to allow precise measurement of orientation utilization and to exclude adaptation to predefined filters as a possible confound. Classification images (CIs) were derived as a weighted sum of orientation samples, using Z-transformed accuracy scores as weights. A Pixel Test (Chauvin et al., 2005) was applied to the Z-transformed CIs (ZCIs) to establish statistical significance. Horizontal information significantly correlated with performance (Z_obs > Z_crit = 1.89) for anger [0°-6°; 173°-180°], sadness [0°-3°; 176°-180°], disgust [0°-9°; 175°-180°], fear [0°-10°; 176°-180°], happiness [0°-8°; 175°-180°], and neutrality [0°-10°; 175°-180°]. For surprise, vertical/oblique information [64°-76°] was significant. Participants also completed the same task with spatial location randomly sampled instead (location bubbles; Gosselin & Schyns, 2001). The link between orientation tuning and location tuning was examined by regressing the orientation ZCIs against the location ZCIs. Interestingly, more horizontally tuned individuals relied significantly (Z_obs > Z_crit = 3.57) more on diagnostic locations (the eyebrow junction for anger, the left eye and mouth for fear, and the mouth for all other expressions) than less tuned individuals. Differences were not reliable at other angles. Our results therefore imply that horizontally tuned individuals are more efficient at extracting local diagnostic information from faces. Implications for face processing will be discussed.
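
For concreteness, the following is a minimal Python sketch of the weighted-sum classification-image computation described above, run on simulated data. It is not the authors' implementation: the 1° orientation bins, the array shapes, the toy simulated observer, and the function name orientation_ci are all assumptions for illustration, and the Pixel Test is approximated by comparing the ZCI directly against the reported criterion (the actual Pixel Test of Chauvin et al., 2005, corrects for multiple comparisons across the classification image).

    # Illustrative sketch only; not the authors' "orientation bubbles" code.
    import numpy as np

    N_ORIENTATIONS = 180            # assumed: 1-degree bins covering 0-179 deg
    rng = np.random.default_rng(0)

    def orientation_ci(samples, accuracies):
        """Classification image as a weighted sum of orientation samples,
        with Z-transformed accuracy scores as weights (per the abstract).

        samples    : (n_trials, N_ORIENTATIONS) array; each row is the random
                     orientation filter shown on that trial.
        accuracies : (n_trials,) array of 0/1 response accuracies.
        """
        z_acc = (accuracies - accuracies.mean()) / accuracies.std()
        ci = z_acc @ samples        # weighted sum across trials
        # Z-transform the CI itself (ZCI) before thresholding
        return (ci - ci.mean()) / ci.std()

    # Toy observer: simulate 1,400 trials whose accuracy depends on how much
    # near-horizontal (close to 0/180 deg) energy the random filter passed.
    trials = rng.random((1400, N_ORIENTATIONS))
    horiz = trials[:, :10].mean(axis=1) + trials[:, -10:].mean(axis=1)
    acc = (horiz + 0.3 * rng.standard_normal(1400) > np.median(horiz)).astype(float)

    zci = orientation_ci(trials, acc)
    Z_CRIT = 1.89                   # criterion reported in the abstract
    print("significant orientations (deg):", np.flatnonzero(zci > Z_CRIT))

Under these assumptions, the printed orientations cluster near 0° and 180° because that is where the simulated observer's accuracy covaries with the filter, mirroring the horizontal tuning reported for most expressions.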

Meeting abstract presented at VSS 2015
