September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Eye Left the Right Face: The Impact of Central Attentional Resource Modulation on Visual Strategies During Facial Expression Categorization
Author Affiliations
  • Justin Duncan
    Université du Québec en Outaouais
    Université du Québec à Montréal
  • Gabrielle Dugas
    Université du Québec en Outaouais
  • Benoit Brisson
    Université du Québec à Trois-Rivières
  • Caroline Blais
    Université du Québec en Outaouais
  • Daniel Fiset
    Université du Québec en Outaouais
Journal of Vision August 2017, Vol.17, 831. doi:https://doi.org/10.1167/17.10.831
Abstract

The categorization of facial expressions is impaired when central attentional resources are shared with an overlapping task (Tomasik et al., 2009). Using the psychological refractory period (PRP) dual-task paradigm, we tested whether the unavailability of central resources prevents the use of normal visual strategies. Twenty subjects took part in the study. In the first task (T1), they categorized a sound (150 ms) as either low (200 Hz or 400 Hz) or high (800 Hz or 1,600 Hz) in frequency. In the second task (T2), participants categorized the facial expressions of anger, disgust, fear, happiness, sadness, and surprise taken from the Karolinska face database (Lundqvist, Flykt, & Öhman, 1998). External facial cues were hidden with an oval that blended into the background. Faces were sampled with Bubbles (Gosselin & Schyns, 2001) and presented for 150 ms. T1 and T2 presentation was separated by a stimulus onset asynchrony (SOA) of either 300 ms (central resource overlap) or 1,000 ms (no overlap). Participants were instructed to respond as rapidly and as accurately as possible to both tasks, and not to wait for T2 onset before responding to T1. We performed a linear regression of the Bubbles coordinates on T2 performance. Statistical significance was determined with the Stat4CI toolbox (Chauvin et al., 2005). The categorization of angry, sad, fearful, and surprised expressions correlated strongly with use of both eyes and the mouth, at both short and long SOAs (Z > 3.4, p < .05). Use of the left eye, however, was significantly reduced at the short relative to the long SOA (Z > 2, k > 2,347 pixels, p < .05). Interestingly, whereas participants showed a bias favoring the left side of the face at the long SOA, they favored the right side at the short SOA. Participants always fixated the center of the face stimuli. Thus, these results may hint at hemispheric differences in sensitivity to the modulation of central attentional resources.
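The classification-image analysis described above (regressing Bubbles coordinates on performance, then z-scoring the resulting map) can be sketched as follows. This is a minimal toy illustration on fully simulated data, not the authors' pipeline: the image size, bubble parameters, "diagnostic" region, and accuracy model are all invented for the example, and the cluster test of Stat4CI is replaced here by a simple pixelwise z-map.

```python
# Toy sketch of a Bubbles-style classification image (Gosselin & Schyns, 2001).
# All data are simulated; parameters and the diagnostic region are assumptions
# for illustration only.
import numpy as np

rng = np.random.default_rng(0)
H, W, n_trials = 32, 32, 500

def bubble_mask(n_bubbles=5, sigma=2.0):
    """One trial's revelation mask: randomly placed Gaussian 'bubbles'."""
    ys, xs = np.mgrid[0:H, 0:W]
    mask = np.zeros((H, W))
    for _ in range(n_bubbles):
        cy, cx = rng.uniform(0, H), rng.uniform(0, W)
        mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

masks = np.array([bubble_mask() for _ in range(n_trials)])

# Simulate accuracy: trials revealing a hypothetical diagnostic region
# (standing in for, e.g., the left eye) succeed more often.
diag = np.zeros((H, W))
diag[10:16, 6:12] = 1.0
signal = (masks * diag).sum(axis=(1, 2))
accuracy = (signal + rng.normal(0, 1, n_trials) > np.median(signal)).astype(float)

# Classification image: per-pixel regression of performance on the masks
# (difference-of-means form, proportional to the least-squares slope).
z_acc = (accuracy - accuracy.mean()) / accuracy.std()
ci = np.tensordot(z_acc, masks, axes=1) / n_trials

# Z-score the map; in the actual study, significant clusters were then
# identified with the Stat4CI toolbox (Chauvin et al., 2005).
z = (ci - ci.mean()) / ci.std()
print("classification image shape:", z.shape)
```

Pixels inside the simulated diagnostic region end up with higher z-values than the background, which is the logic by which the study identifies which facial features (left eye, right eye, mouth) drive categorization at each SOA.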

Meeting abstract presented at VSS 2017
