August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Using Online Testing to Measure Spatial Frequency and Orientation Tuning in Face Processing
Author Affiliations & Notes
  • Francis Gingras
    Université du Québec à Montréal
    Université du Québec en Outaouais
  • Justin Duncan
    Université du Québec en Outaouais
  • Frédéric Gosselin
    Université de Montréal
  • Daniel Fiset
    Université du Québec en Outaouais
  • Caroline Blais
    Université du Québec en Outaouais
  • Footnotes
    Acknowledgements  Canada Research Chair in cognitive and social vision (# 950-232282); Natural Sciences and Engineering Research Council of Canada (RGPIN-2019-06201)
Journal of Vision August 2023, Vol.23, 5028.

      Francis Gingras, Justin Duncan, Frédéric Gosselin, Daniel Fiset, Caroline Blais; Using Online Testing to Measure Spatial Frequency and Orientation Tuning in Face Processing. Journal of Vision 2023;23(9):5028.


      © ARVO (1962-2015); The Authors (2016-present)


The use of spatial frequency and orientation (SFO) information in face processing has been extensively studied. Such studies typically involve precise control of external parameters (screen resolution, lighting, viewing distance). Consequently, the study of SFO has been constrained mostly to lab settings, which is an important obstacle to recruiting participants in cross-cultural studies. In this project, we aimed to evaluate whether SFO tunings for face recognition can successfully be measured via online testing. Using the newly released VPixx Pack&Go™ platform, we built a same/different face-matching experiment in MATLAB, using SFO bubbles to filter the faces. The method applies Gaussian filters centered on randomly selected points in Fourier space, akin to Bubbles (Gosselin & Schyns, 2001), with each point sampling a specific SFO combination. We recruited 72 participants from Prolific and 28 participants tested in the lab. Online, we used the credit card test (Li et al., 2020) to calculate the viewing distance needed to maintain a visual angle of 6°. We computed a weighted sum of all the presented filters to build a 2D classification image representing the SFO information needed to succeed at the task. One-sample pixel tests (tCrit = 7.66 in the lab and 6.22 online, p < .001; Chauvin et al., 2005) revealed significant SFO tuning peaking at 10.19 cycles per face, 12° off the horizontal axis, for the online sample, while the lab sample peaked at 14.5 cycles per face, 0.5° off the horizontal axis. An independent-sample pixel test failed to detect significant differences in tuning between the two samples (tCrit = 5.12, p < .05, tMax = 3.62). The highest t values corresponded to the most relevant SFO in both experimental settings, suggesting at most quantitative differences between the two samples. We show that SFO tunings can be measured with precision online, allowing more diverse samples to be tested in psychophysical studies.
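The original experiment was built in MATLAB on the VPixx Pack&Go platform; the authors' code is not shown here. As an illustrative sketch only, the two core analysis steps described above (filtering a face through Gaussian "bubbles" placed at random spatial-frequency/orientation points in Fourier space, then building a classification image as a weighted sum of the presented filters) might look like this in Python. All function names, parameter choices, and the accuracy-based weighting scheme are assumptions for illustration, not the authors' method:

```python
import numpy as np

def sfo_bubble_filter(shape, n_bubbles, sf_sigma, ori_sigma, rng):
    """Sum of Gaussians at randomly sampled (SF, orientation) points in Fourier space.

    sf_sigma is in cycles/pixel, ori_sigma in radians; both are illustrative choices.
    """
    h, w = shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    sf = np.sqrt(fx**2 + fy**2)      # radial spatial frequency of each coefficient
    ori = np.arctan2(fy, fx)         # orientation of each coefficient
    filt = np.zeros(shape)
    for _ in range(n_bubbles):
        sf0 = rng.uniform(0.0, 0.5)              # random SF center (cycles/pixel)
        ori0 = rng.uniform(-np.pi, np.pi)        # random orientation center
        d_ori = np.angle(np.exp(1j * (ori - ori0)))  # wrapped angular distance
        filt += np.exp(-(sf - sf0)**2 / (2 * sf_sigma**2)
                       - d_ori**2 / (2 * ori_sigma**2))
    return np.clip(filt, 0.0, 1.0)   # keep filter gain in [0, 1]

def apply_sfo_filter(img, filt):
    """Filter an image by multiplying its (shifted) Fourier spectrum with the filter."""
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * filt)))

def classification_image(filters, accuracies):
    """Weighted sum of presented filters; trials above mean accuracy weigh positively."""
    w = np.asarray(accuracies, dtype=float)
    w = w - w.mean()
    return np.tensordot(w, np.stack(filters), axes=1)
```

The classification image is then typically converted to z-scores and thresholded with a pixel test (Chauvin et al., 2005) to find the significant SFO region; that statistical step is omitted here.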

