Abstract
Few stimuli can captivate human attention like a movie; young children and adults alike are drawn to the screen. The properties of film can engage viewers' attention for hours, but what is it about a movie that makes people stop and watch?
Recent research into Hollywood film has revealed a number of trends: the average shot length (ASL) in popular film has decreased while the overall running time of films has remained roughly the same. Our current research into this change began by collecting and analyzing a database of over 150 popular Hollywood films released between 1935 and 2005. Using a mixture of algorithmic cut detection and human confirmation, we were able to locate shot transitions accurately in every film. Power and autocorrelation analyses show not only that shots are becoming shorter over time, but that the temporal pattern of shot lengths is approaching a 1/f distribution. This pattern closely resembles the endogenous rhythms found in human reaction times and is thought to reflect temporal fluctuations in attention. We propose that film has evolved to interface with the rhythms of human attention and, by extension, the temporal structure of the world.
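As a minimal sketch of this kind of analysis (not the authors' actual pipeline), shot boundaries can be approximated by thresholding frame-to-frame intensity differences, and the resulting shot-length series can then be tested for 1/f structure by fitting the slope of its log-log power spectrum. The function names, the difference threshold, and the assumption of grayscale frames stored as NumPy arrays are all illustrative choices, not details from the study.

```python
import numpy as np

def detect_cuts(frames, threshold=40.0):
    """Flag a cut wherever the mean absolute difference between
    consecutive grayscale frames exceeds a threshold (illustrative;
    real cut detectors must also handle fades and dissolves)."""
    cuts = []
    for i in range(1, len(frames)):
        diff = np.mean(np.abs(frames[i].astype(float) - frames[i - 1].astype(float)))
        if diff > threshold:
            cuts.append(i)
    return cuts

def shot_lengths(cuts, n_frames):
    """Convert cut indices into a sequence of shot lengths (in frames)."""
    boundaries = [0] + list(cuts) + [n_frames]
    return np.diff(boundaries)

def spectral_slope(series):
    """Estimate the power-spectrum exponent of a shot-length series
    on log-log axes; a slope near -1 indicates 1/f-like structure."""
    series = np.asarray(series, dtype=float)
    series = series - series.mean()
    spectrum = np.abs(np.fft.rfft(series)) ** 2
    freqs = np.fft.rfftfreq(len(series))
    # Skip the zero-frequency bin before taking logs.
    slope, _ = np.polyfit(np.log(freqs[1:]), np.log(spectrum[1:]), 1)
    return slope
```

Under this sketch, a shot-length series whose fitted slope is near -1 would match the 1/f claim, whereas an uncorrelated (white-noise) series would yield a slope near 0.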
In addition to a changing shot-length distribution, the correlation between neighboring frames has decreased over the past 70 years. This can be explained as a gradual increase in visual activity: motion of the scene in front of the camera or movement of the camera itself. The change is most evident in the modern Action and Adventure genres, including recent “queasy-cam” films such as Cloverfield and The Bourne Ultimatum. These films, among others, illustrate just how fast-moving, and thus uncorrelated, frames in film have become. These trends raise questions about where the limits of visual attention lie.
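One simple way to quantify this frame-to-frame correlation, again as an illustrative sketch assuming grayscale frames as NumPy arrays rather than the study's actual measure, is the mean Pearson correlation between the pixel intensities of consecutive frames; fast motion or a shaking camera drives this value down.

```python
import numpy as np

def neighboring_frame_correlation(frames):
    """Mean Pearson correlation between the pixel intensities of
    consecutive frames; values near 1 indicate static imagery, while
    rapid scene motion or camera movement lowers the correlation."""
    corrs = []
    for i in range(1, len(frames)):
        a = frames[i - 1].astype(float).ravel()
        b = frames[i].astype(float).ravel()
        corrs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(corrs))
```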