September 2011
Volume 11, Issue 11
Vision Sciences Society Annual Meeting Abstract | September 2011
Consistent frequency-based sound matches to natural visual scenes
Author Affiliations
  • Aleksandra Sherman
    Department of Psychology, Northwestern University, USA
  • Marcia Grabowecky
    Department of Psychology, Northwestern University, USA
    Interdepartmental Neuroscience Program, Northwestern University, USA
  • Satoru Suzuki
    Department of Psychology, Northwestern University, USA
    Interdepartmental Neuroscience Program, Northwestern University, USA
Journal of Vision September 2011, Vol.11, 795. doi:10.1167/11.11.795
Abstract

We previously demonstrated a consistent relationship between visual spatial frequency and auditory amplitude-modulation (AM) frequency, in which Gabors of 0.5–8 cycles/degree (c/d) were linearly matched to auditory AM frequencies of 1–12 Hz (Guzman et al., VSS 2009). Here, we investigated whether similar crossmodal associations occur for natural scenes, which contain a broad mixture of spatial-frequency components. We asked whether people consistently match specific auditory AM frequencies to photographed scenes and, if so, how these crossmodal matches relate to the dominant spatial-frequency components and to subjective impressions (dense, stimulating) of the scenes. We found that eighteen observers matched specific auditory AM frequencies to 26 scenes from diverse categories (nature, urban, indoor) with surprising consistency. We applied a 2D Fourier transform to each scene to determine the contrast energy in 12 spatial-frequency bins ranging from 0.05 to 12.8 c/d. Interestingly, scenes with higher contrast energy in the mid-spatial-frequency range (0.5–1.25 c/d) were matched to faster AM frequencies, whereas other spatial-frequency components did not contribute to the AM-frequency matches. Inspection of our images suggests that scenes with stronger mid-spatial-frequency components contain numerous object boundaries. Thus, the results suggest a crossmodal association between the visual coding of multiple object boundaries and the auditory coding of AM frequency. Furthermore, a multiple regression of AM-frequency matches on subjective scene ratings (obtained after the experiment) indicates that dense (vs. sparse) and stimulating (vs. calm) ratings independently contribute to faster AM-frequency matches.
Based on the spatial-frequency analysis and the subjective ratings, our results demonstrate an association between visual object density and faster auditory AM frequencies in scene perception, and show that visual features conveying stimulating content additionally contribute to faster AM-frequency matches.
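The spatial-frequency analysis described above (a 2D Fourier transform summarized as contrast energy in 12 bins spanning 0.05–12.8 c/d) can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: the pixel pitch (deg_per_pixel), the log-spaced bin edges, and the random placeholder "scene" are all assumptions for the sake of a runnable example.

```python
import numpy as np

def contrast_energy_by_sf(image, deg_per_pixel, bin_edges):
    """Sum 2D Fourier power into radial spatial-frequency bins (cycles/degree)."""
    img = image - image.mean()  # remove the DC component so bins reflect contrast energy
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    # Frequency axes in cycles/degree: sample spacing d is given in degrees/pixel
    fy = np.fft.fftshift(np.fft.fftfreq(h, d=deg_per_pixel))
    fx = np.fft.fftshift(np.fft.fftfreq(w, d=deg_per_pixel))
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)  # c/d of each coefficient
    # Sum power within each radial (annular) spatial-frequency bin
    return np.array([power[(radius >= lo) & (radius < hi)].sum()
                     for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

# 12 log-spaced bins spanning 0.05-12.8 c/d, matching the range in the abstract
edges = np.geomspace(0.05, 12.8, 13)

# Placeholder "scene": a random image standing in for a photographed scene
rng = np.random.default_rng(0)
scene = rng.standard_normal((256, 256))

# Assumed pixel pitch of 0.05 deg/pixel (Nyquist limit of 10 c/d per axis)
energy = contrast_energy_by_sf(scene, deg_per_pixel=0.05, bin_edges=edges)
```

Under this scheme, the abstract's key predictor would be the energy in the bins covering roughly 0.5–1.25 c/d, which could then be related to each scene's matched AM frequency.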

NIH R01 EY018197, NSF BCS 0643191. 