December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract  |   December 2022
Recovery of vivid 3D percepts from 125 years of historical motion film and animation clips, part 2 - 3D movies
Author Affiliations
  • David Peterzell
    Fielding Graduate University
    JFK School of Psychology (National University)
  • Christopher Tyler
    Smith-Kettlewell Eye Research Institute
    City University of London
Journal of Vision December 2022, Vol.22, 3446. doi:https://doi.org/10.1167/jov.22.14.3446
      David Peterzell, Christopher Tyler; Recovery of vivid 3D percepts from 125 years of historical motion film and animation clips, part 2 - 3D movies. Journal of Vision 2022;22(14):3446. https://doi.org/10.1167/jov.22.14.3446.

      Download citation file:


      © ARVO (1962-2015); The Authors (2016-present)

      ×
  • Supplements
Abstract

Stereoscopy involves presenting two differentially offset images separately to the left and right eyes. This 2D image information is combined binocularly in the brain to generate 3D depth perception. Last year, static image pairs were used to introduce and demonstrate the ability to recover and perceive dynamic 3D structure from 2D movies. This year, 3D movies generated from 2D movies demonstrate the same phenomenon, created by introducing a delay between identical tracking shots presented to the left and right eyes. As before, the stereoscopic displays were generated from a variety of lateral and arc tracking movies, including: 1) dolly shots, 2) lateral shots taken while driving, flying, boating, traveling by rail, and orbiting planets, 3) 'bullet time'/Matrix time-sequence shots, and 4) animations based on 3D models. These were scenes from classic motion pictures, or archival footage of significant historical and strategic events. This year's demonstrations include new 3D versions of the first known tracking shots (Venice, 1896), the Hindenburg's last moments (1937), 'bullet time' animation from Speed Racer/MachGoGogo (1966), The Beatles rehearsals (1969), and scenes generated while orbiting the moon and distant planets. The method enables viewers to (1) see historically or strategically important scenes in 3D; (2) infer depth structure and estimate distance when static monocular cues to depth are sparse or non-existent; and (3) break static forms of camouflage. The method holds the potential to quantify real and perceived depth from motion parallax in historical and contemporary popular movie sequences. Moreover, we demonstrate that binocular disparity sequences derived from dolly-arc tracking shots rotating around a subject can generate robust 3D perception, despite the common practice of avoiding such binocular convergence in the stereoscopy field (e.g., Gao et al. 2018, PLoS One).
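The core conversion step described above can be sketched in a few lines: for a lateral tracking shot, the right-eye stream is simply the same footage delayed by a few frames, so the camera's own motion supplies the binocular offset. The function name, frame representation, and default delay below are illustrative assumptions, not the authors' implementation.

```python
def delayed_stereo_pairs(frames, delay=3):
    """Pair each frame with the frame `delay` steps earlier.

    frames: sequence of frames from a single lateral tracking shot
            (any per-frame object: image arrays, file paths, etc.).
    delay:  inter-ocular delay in frames; in practice its sign and
            magnitude would be tuned to the camera's speed and direction.
    Returns a list of (left_eye, right_eye) tuples.
    """
    return [(frames[i], frames[i - delay]) for i in range(delay, len(frames))]

# Example with placeholder frame labels:
clip = [f"frame{i}" for i in range(6)]
pairs = delayed_stereo_pairs(clip, delay=2)
# pairs[0] == ("frame2", "frame0")
```

Presenting each tuple dichoptically (left image to the left eye, right to the right) yields the time-shifted stereo pairs the abstract describes; arc or dolly shots would use the same pairing, with the delay controlling effective camera baseline.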
