Volume 16, Issue 12
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2016
LabelMeSymmetry: a tool for human symmetry perception
Author Affiliations
  • Chris Funk
    CSE and EE, College of Engineering, PSU
  • Yanxi Liu
    CSE and EE, College of Engineering, PSU
Journal of Vision September 2016, Vol.16, 306. doi:https://doi.org/10.1167/16.12.306

      Chris Funk, Yanxi Liu; LabelMeSymmetry: a tool for human symmetry perception. Journal of Vision 2016;16(12):306. https://doi.org/10.1167/16.12.306.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Even at a young age, most people can identify where symmetry resides within an image. However, there is no open dataset of human symmetry labels on real-world images. A few datasets have been created to compare symmetry detection algorithms, but they are small and labeled by only a few individuals, preventing in-depth study of symmetry perception on real images. We deploy an in-house crowdsourcing tool on Amazon Mechanical Turk (AMT) to collect symmetry labels on real-world images from the publicly available Microsoft COCO dataset. The tool strives not to influence people's understanding of symmetry. At relatively low cost, the tool has produced a dataset of 78,724 symmetry labels on over 1,000 images from more than 300 AMT workers around the world. To our knowledge, this is the largest dataset of labels for both perceived rotation symmetries and reflection (mirror) symmetries on natural photos. The labeled symmetries are clustered automatically to find a statistical consensus for each image. Against these consensus labels, Precision-Recall (PR) curves are compared across the workers' gender, education level, and age group. No statistically significant difference is found between male and female workers, while significant differences in symmetry perception are found across education levels and age groups. For example, PR performance differs with p < 0.001 between workers aged 51-71 and every other age group, and between workers with doctoral/professional degrees and every other education level. The youngest workers (20 years old or younger) and workers with a high-school education perform best in their respective categories.

Meeting abstract presented at VSS 2016
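
As an illustration of the precision-recall analysis described in the abstract, the sketch below is a hypothetical Python example, not the authors' code. It assumes each reflection label is stored as a pair of axis endpoints in pixel coordinates and counts a label as correct when it falls within a distance threshold of a clustered consensus axis; the matching rule, threshold range, and function names are assumptions made for illustration only.

# A minimal sketch (not the authors' pipeline): precision-recall of one
# worker group's reflection-axis labels against clustered consensus axes.
# The data layout and the matching criterion are assumptions.
import numpy as np

def endpoint_distance(a, b):
    """Mean endpoint distance between two axis segments ((x1, y1), (x2, y2))."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    d_same = np.linalg.norm(a - b, axis=1).mean()        # endpoints in same order
    d_flip = np.linalg.norm(a - b[::-1], axis=1).mean()  # endpoints swapped
    return min(d_same, d_flip)

def precision_recall(labels, consensus, threshold):
    """A label is correct if it lies within `threshold` pixels of some
    consensus axis; a consensus axis is recovered if some label matches it."""
    correct = sum(
        any(endpoint_distance(lab, con) <= threshold for con in consensus)
        for lab in labels)
    recovered = sum(
        any(endpoint_distance(lab, con) <= threshold for lab in labels)
        for con in consensus)
    precision = correct / len(labels) if labels else 0.0
    recall = recovered / len(consensus) if consensus else 0.0
    return precision, recall

def pr_curve(labels, consensus, thresholds=range(5, 55, 5)):
    """Trace a PR curve for one worker group by sweeping the match threshold."""
    return [precision_recall(labels, consensus, t) for t in thresholds]

# Example with made-up axis segments (pixel coordinates):
consensus = [((10, 10), (10, 90))]                       # one consensus mirror axis
group_labels = [((12, 8), (11, 88)), ((60, 5), (60, 95))]
print(pr_curve(group_labels, consensus))

Curves like these, computed per demographic group against the same consensus labels, could then be compared with a significance test, which is the kind of comparison the abstract reports.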
