September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Look at me when I'm talking to you! Sound influences gaze behaviour in a 'split-screen' film
Author Affiliations
  • Jonathan Batten
    Psychological Sciences, Birkbeck, University of London
  • Jennifer Haensel
    Psychological Sciences, Birkbeck, University of London
  • Tim Smith
    Psychological Sciences, Birkbeck, University of London
Journal of Vision August 2017, Vol.17, 194. doi:

Viewing a dynamic audiovisual scene poses inherent challenges for where and when gaze is allocated, because the competing sensory information is transient. The applied craft of film production has developed intuitive solutions for guiding viewers' gaze through visual and sound editing techniques; for example, sound designers believe that increasing the loudness of dialogue relative to background ambient noise orients a viewer's attention to the speaking character. A fundamental assumption of these techniques is that a viewer's gaze is attracted to audiovisual elements in a scene and, inversely, less attracted to visual events without sound. Empirical evidence on viewing behaviour in dynamic scenes has predominantly focused on visual features; the role of sound as an influence on viewers' gaze is less clear. This study utilised a found experiment, Mike Figgis's experimental feature film Timecode (2000), which presents four continuous perspectives of interrelated events in a 2x2 split-screen, with each quadrant having an isolatable sound mix. We investigated the influence of sound on gaze behaviour in a 4 minute 50 second excerpt by manipulating the presence of sound across the four quadrants one at a time, with abrupt sound cuts shifting the sound 16 times (each quadrant represented four times). Forty-eight participants free-viewed the clip whilst being eye-tracked (sound order was counterbalanced across participants). Sound representation in a quadrant significantly increased the proportion of gaze to that region. Gaze was also influenced by time: later sound representations of a quadrant received a significantly higher proportion of gaze than earlier ones. Fixation durations in sound-represented quadrants were significantly longer than those in visual-only quadrants. Auditory and visual salience values are also considered as predictors of gaze between the quadrants.
These preliminary results suggest that dynamic scene viewing behaviour is significantly influenced by the inclusion of corresponding sound.

Meeting abstract presented at VSS 2017

