August 2014
Volume 14, Issue 10
Vision Sciences Society Annual Meeting Abstract | August 2014
Implicit detection of asynchronous audiovisual speech by eye movements
Author Affiliations
  • Tim J. Smith
    Department of Psychological Sciences, Birkbeck, University of London
  • Jonathan Batten
    Department of Psychological Sciences, Birkbeck, University of London
  • Rachael Bedford
    Department of Biostatistics, Institute of Psychiatry, King's College London
Journal of Vision August 2014, Vol.14, 440. doi:10.1167/14.10.440

Tim J. Smith, Jonathan Batten, Rachael Bedford; Implicit detection of asynchronous audiovisual speech by eye movements. Journal of Vision 2014;14(10):440. doi: 10.1167/14.10.440.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

When watching TV or film, poorly synchronised audio and video can "pop-out", drawing attention to the speaking actor's mouth. Previous studies of explicit lipsynch detection have shown that participants are quite poor at detecting asynchronies of less than 100ms in complex speech. Can the attentional capture of viewer gaze by subtle lipsynch error be used as a measure of implicit asynchrony detection? Participant eye movements were recorded while they watched a series of brief (40 x ~9s) close-up videos of people speaking to the camera. Participants were given the cover task of answering infrequent questions about the video. Each video was repeated under seven asynchrony conditions: -150ms (audio first), -100ms, -50ms, 0ms, +50ms (video first), +100ms, +150ms. After twenty videos had been shown once in each synch condition (Block 1), the same twenty videos were shown with a large asynchrony, -300ms (the Cue block). The remaining twenty videos were then shown once in each synch condition (Block 3). Analysis of participant eye movements during the Cue block revealed a significant bias towards the mouth and away from the eyes (the default viewing position), and this bias did not differ between participants who reported seeing lipsynch error (i.e. explicit detection) at the end of the experiment and those who did not. After the cue, gaze was significantly shifted towards the mouth, and this was most pronounced for synch conditions in which the video was first (+50 and +100ms). These results indicate that asynchronous audiovisual speech captures overt attention even when the degree of asynchrony is sub-threshold for explicit awareness, and that this capture may be used as a measure of implicit asynchrony detection.

Meeting abstract presented at VSS 2014
