Vision Sciences Society Annual Meeting Abstract | December 2022
Journal of Vision, Volume 22, Issue 14
Open Access
Viewpoint similarity of 3D objects predicted by image-plane position shifts
Author Affiliations & Notes
  • Emma E.M. Stewart
    Justus-Liebig University Giessen, Germany
  • Frieder T. Hartmann
    Justus-Liebig University Giessen, Germany
  • Roland W. Fleming
    Justus-Liebig University Giessen, Germany
  • Footnotes
    Acknowledgements: This project has received funding from the Deutsche Forschungsgemeinschaft (DFG), grant number 460533638.
Journal of Vision December 2022, Vol. 22, 3886. https://doi.org/10.1167/jov.22.14.3886
Emma E.M. Stewart, Frieder T. Hartmann, Roland W. Fleming; Viewpoint similarity of 3D objects predicted by image-plane position shifts. Journal of Vision 2022;22(14):3886. https://doi.org/10.1167/jov.22.14.3886.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Our world is filled with complex objects. To make perceptual inferences, decisions, and action plans, humans must often judge the three-dimensional shape and orientation of objects they encounter. Despite the importance of view-invariant object perception, the computations underlying the 3D rotational discriminability of objects remain unclear. Here, we aimed to devise a metric that predicts viewpoint similarity for rotated novel objects, based on the projected position shifts of surface points as the object rotates. We created six 3D Shepard-Metzler-like objects of varying complexity and rendered each from nineteen equally spaced viewpoints, rotated independently around the horizontal or vertical axis. For each object and viewpoint, participants were shown a standard pair of views of the object along with a test pair, and used a mouse to rotate one of the test-pair objects until the test views appeared the same rotational distance apart as the standard pair. If the reported distance between the test pair was smaller than that between the standard pair, the viewpoint was taken to be more rotationally discriminable, and vice versa. For n = 29 participants, our findings revealed substantial and consistent variations in perceived viewpoint similarity across different object orientations. We then developed a metric that predicted these variations, based on the sum of the projected displacement vectors of visible surface points as the object incrementally rotated from one viewpoint to the next. This metric predicted human rotational discriminability for both horizontal and vertical rotations (R = 0.60), suggesting that to judge viewpoint similarity, observers mentally rotate the object and estimate the projected position shifts of points in the imagined view relative to the seen view. The metric thus provides a computational correlate linking theories of viewpoint discrimination and mental rotation.
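
As a rough illustration, the sketch below shows one way such a metric could be computed. It sums the magnitudes of the image-plane displacements of visible surface points over small rotation steps between two viewpoints. The orthographic projection, the back-face visibility test, the step count, and the function name viewpoint_shift_metric are all assumptions for illustration; the abstract does not specify these details.

import numpy as np

def rotation_y(theta):
    """Rotation about the vertical (y) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def viewpoint_shift_metric(points, normals, theta_a, theta_b, n_steps=10):
    """Sum the projected image-plane displacements of visible surface
    points as the object rotates incrementally from theta_a to theta_b.

    points  : (N, 3) surface-point positions in object space.
    normals : (N, 3) outward unit normals, used here for a crude
              back-face visibility test (an assumption; the abstract
              does not say how visibility was determined).
    """
    thetas = np.linspace(theta_a, theta_b, n_steps + 1)
    total = 0.0
    for t0, t1 in zip(thetas[:-1], thetas[1:]):
        R0, R1 = rotation_y(t0), rotation_y(t1)
        p0, p1 = points @ R0.T, points @ R1.T
        # Orthographic projection: the image plane is (x, y), with the
        # camera on the +z axis looking toward the origin.
        shift = p1[:, :2] - p0[:, :2]
        # A point counts as visible if its rotated normal points
        # toward the camera (positive z component).
        visible = (normals @ R0.T)[:, 2] > 0
        total += np.linalg.norm(shift[visible], axis=1).sum()
    return total

# Example: a random point cloud standing in for a rendered object,
# scored over a 20-degree rotation about the vertical axis.
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
nrm = pts / np.linalg.norm(pts, axis=1, keepdims=True)
score = viewpoint_shift_metric(pts, nrm, 0.0, np.deg2rad(20))

With nineteen equally spaced viewpoints, the step between adjacent views would be the natural unit over which such displacements accumulate.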
