December 2022 | Volume 22, Issue 14 | Open Access
Vision Sciences Society Annual Meeting Abstract
The Impact of Deafness on the Use of Information During Facial Emotion Discrimination
Author Affiliations
  • Catherine Landry
    cerebrum, Département de Psychologie, Université de Montréal, Canada
  • Justine Lévesque
    Département de Psychologie, Université de Montréal, Canada
  • Marie-Ève Doucet
    Département de Psychologie, Université de Montréal, Canada
  • Nicolas Dupuis-Roy
    Département de Psychologie, Université de Montréal, Canada
  • Frédéric Gosselin
    cerebrum, Département de Psychologie, Université de Montréal, Canada
  • Hugo Théoret
    cerebrum, Département de Psychologie, Université de Montréal, Canada
  • François Champoux
    École d’orthophonie et d’audiologie, Université de Montréal, Canada
  • Franco Lepore
    cerebrum, Département de Psychologie, Université de Montréal, Canada
Journal of Vision December 2022, Vol.22, 4144. doi:https://doi.org/10.1167/jov.22.14.4144
© ARVO (1962-2015); The Authors (2016-present)
Abstract

The integration of visual and auditory cues facilitates the detection of emotions in our interactions with others. Deafness, a form of sensory deprivation, can be compensated for through sign language, lip reading and oral communication. How the diagnostic features used to recognize facial expressions of emotion vary with the preferred means of communication and with other factors related to deafness (e.g. use of a cochlear implant, CI) remains unknown. In the present study, we used the Bubbles technique (Gosselin & Schyns, 2001) to examine whether severe-to-profound bilateral deafness leads to differences in the facial information used for emotion decoding. Twenty-two deaf individuals (oral deaf, signers, and CI users), matched in age and sex with 22 hearing controls, completed 3,072 trials of a fear-versus-happiness discrimination task. On each trial, a face image was randomly selected from a set of four (two happy), mirror-reversed with a probability of 0.5, and sampled with randomly located Gaussian apertures at five spatial scales (see Adolphs et al., 2005). The number of apertures was adjusted on each trial to maintain a 75% correct rate. For each participant, multiple linear regressions were performed on the locations of the Gaussian apertures and the accuracy scores. The individual planes of regression coefficients were then combined within subject groups (oral deaf, signers, CI users, controls). Finally, to reveal group differences in information use, we computed all pairwise contrasts between these group regression-coefficient planes and applied a statistical threshold (Chauvin et al., 2005). The deaf participants who used sign language differed from the other groups in relying mostly on the eye region to discriminate emotions, whereas hearing controls, CI users and the oral deaf showed similar patterns, with the mouth as the salient area. Facial information processing is therefore influenced by auditory experience and by the visual strategies used in communication.
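The core of the Bubbles analysis described above — sample a stimulus through randomly located Gaussian apertures, then regress trial-by-trial accuracy on the per-pixel aperture values to obtain a classification image — can be sketched as follows. This is a minimal toy simulation in Python, not the authors' pipeline: the face stimuli, the five spatial scales, the per-trial aperture-count staircase, and the Chauvin et al. (2005) statistical test are all omitted, and the "diagnostic region" (standing in for the eye region) is simulated.

```python
import numpy as np

rng = np.random.default_rng(0)


def bubbles_mask(shape, n_bubbles, sigma, rng):
    """Sum of randomly located Gaussian apertures at one spatial scale."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.uniform(0, h), rng.uniform(0, w)
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)


# Simulated observer: more likely to respond correctly when the
# (hypothetical) diagnostic region near the top of the image is revealed.
shape = (32, 32)
n_trials = 2000
masks = np.stack([bubbles_mask(shape, 15, 3.0, rng) for _ in range(n_trials)])

diagnostic = np.zeros(shape)
diagnostic[6:12, 8:24] = 1.0  # assumed "eye region" for the simulation
signal = (masks * diagnostic).sum(axis=(1, 2))
p_correct = 0.5 + 0.5 * (signal - signal.min()) / (np.ptp(signal) + 1e-9)
accuracy = (rng.uniform(size=n_trials) < p_correct).astype(float)

# Classification image: per-pixel cross-covariance between mask values and
# accuracy (proportional to the regression coefficients when pixels are
# treated independently, as is standard in Bubbles analyses).
X = masks.reshape(n_trials, -1)
X = X - X.mean(axis=0)
y = accuracy - accuracy.mean()
beta = (X.T @ y) / n_trials
ci = beta.reshape(shape)

# Z-scoring the plane is the usual precursor to thresholding it
# (the actual threshold would come from Chauvin et al.'s pixel test).
z = (ci - ci.mean()) / ci.std()
```

In the study itself, one such regression-coefficient plane is computed per participant, the planes are averaged within each group (oral deaf, signers, CI users, controls), and pairwise group contrasts of the planes are then thresholded to localize where information use differs.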
