September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Lateralisation and binding of dynamic facial features
Author Affiliations
  • Ben Brown
    School of Psychology, University of Nottingham
  • Vanessa Enahoro
    School of Psychology, University of Nottingham
  • Alan Johnston
    School of Psychology, University of Nottingham
Journal of Vision August 2017, Vol.17, 1028.
Ben Brown, Vanessa Enahoro, Alan Johnston; Lateralisation and binding of dynamic facial features. Journal of Vision 2017;17(10):1028.


Faces transmit a complex stream of social information via coordinated global motion. Their encoding is predominantly right-hemisphere lateralised and entails the binding of features into a holistic representation (Ramon & Rossion, 2011, Brain Cogn., 78, 7-13). Given recent evidence of timing-dependent interactions between moving facial features (Cook et al., 2015, Psychol. Sci., 26, 512-517; Iwasaki & Noguchi, 2016, Sci. Rep., 6, 22049), we asked whether dynamic binding is similarly lateralised. Twenty-two participants viewed pairs of animated facial avatars whose eyebrows oscillated vertically while their mouths opened and closed, and had to detect which face (2IFC) contained misaligned eyebrow movement. Stimuli were presented in either the left or right visual hemifield on alternating trials, and fixation was enforced using an eye tracker. Mouth movement of both faces opposed, trailed, matched or led eyebrow movement (0, 90, 180, or 270 degrees of phase angle) in separate blocks. Stimuli were presented for 3 seconds, either upright or inverted, in separate blocks. Faces subtended 5 DVA (width), with eccentricity 3.125 DVA (fixation to face centre) and feature oscillation at 1.5 Hz. Each participant provided 40 trials per condition, from which we calculated percentage correct. A 4×2×2 repeated-measures ANOVA showed a significant interaction between hemifield and eyebrow-mouth relative phase (F(3, 63) = 4.31, p = .01), driving a significant main effect of phase (F(3, 63) = 4.05, p = .01). Collapsing over synchronous (0° and 180°) and asynchronous (90° and 270°) motion revealed a performance cost of synchronicity specific to the right hemifield (right hemifield: 58.85% correct synchronous vs. 63.38% asynchronous; left hemifield: 62.56% vs. 63.36%). We found no inversion effect, however (F(1, 21) = 2.58, p = .12).
We conclude that the left hemisphere appears susceptible to interference from coincident feature motion, which may indicate involvement of left-hemisphere mechanisms in the analysis of facial speech.
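The phase manipulation described above can be sketched numerically. This is a minimal illustration, not the authors' stimulus code: it assumes sinusoidal feature trajectories (the abstract specifies 1.5 Hz oscillation but not the exact waveform) and arbitrary amplitude units.

```python
import math

def feature_offsets(t, phase_deg, freq_hz=1.5, amplitude=1.0):
    """Hypothetical eyebrow and mouth positions at time t (seconds).

    The mouth oscillates at the same frequency as the eyebrows but is
    shifted by phase_deg (0, 90, 180, or 270 in the experiment).
    Amplitude units are arbitrary; the abstract does not specify them.
    """
    brow = amplitude * math.sin(2 * math.pi * freq_hz * t)
    mouth = amplitude * math.sin(2 * math.pi * freq_hz * t
                                + math.radians(phase_deg))
    return brow, mouth

# Phase 0: mouth movement matches the eyebrows exactly.
# Phase 180: mouth movement exactly opposes the eyebrows.
brow, mouth = feature_offsets(0.1, 180)
```

Under this reading, the 0° and 180° conditions are the "synchronous" cases (features reverse direction at the same moments), while 90° and 270° place the direction reversals of one feature at the other's movement extremes.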

Meeting abstract presented at VSS 2017

