December 2018, Volume 18, Issue 13
Open Access Article
The human visual system estimates angle features in an internal reference frame: A computational and psychophysical study
Author Affiliations & Notes
  • Zhe-Xin Xu
    Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics, Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
    u53@ur.rochester.edu
  • Yan Chen
    Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics, Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
  • Shu-Guang Kuai
    Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics, Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
    Institute of Brain and Education Innovation, East China Normal University
    NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, China
    sgkuai@psy.ecnu.edu.cn
  • Footnotes
    *  Z-XX and YC contributed equally to this article.
Journal of Vision December 2018, Vol.18, 10. doi:https://doi.org/10.1167/18.13.10
Abstract

Angle perception is an important middle-level visual process, combining line features to generate an integrated shape percept. Previous studies have proposed two theories of angle perception: that an angle is a combination of lines, and that it is a holistic feature following Weber's law. However, neither theory explains the dual-peak fluctuations of the just-noticeable difference (JND) across angle sizes. In this study, we found that the human visual system processes the angle feature in two stages: first, by encoding the orientations of the bounding lines and combining them into an angle feature; and second, by estimating the angle in an orthogonal internal reference frame (IRF). The IRF model fits the dual-peak fluctuations of the JND that neither the theory of line combinations nor Weber's law can explain. A statistical analysis of natural images revealed that the IRF aligns with the distribution of angle features in the natural environment, suggesting that the IRF reflects human prior knowledge of angles in the real world. This study provides a new computational framework for angle discrimination, thereby resolving a long-standing debate on angle perception.

Introduction
An angle is a shape feature consisting of two bounding lines sharing a common endpoint. As the angle is one of the most essential visual features, angle perception plays a vital role in object perception (Attneave, 1954; Biederman, 1987; Loffler, 2008). Within the hierarchical stages of visual information processing, angle perception naturally follows the encoding of line orientations. An intuitive account of angle perception is that the human visual cortex extracts the orientations of the two bounding lines and calculates the difference between them (Snippe & Koenderink, 1994). Although this theory is plausible and follows the hierarchical structure of visual information processing, it has been challenged by a series of psychophysical studies showing that the just-noticeable differences (JNDs) of the bounding lines fail to predict the JND of the corresponding angle (S. Chen & Levi, 1996; Heeley & Buchanan-Smith, 1996). Recent studies have demonstrated that angle discrimination performance depends on the global shape of the stimuli, indicating a contribution of high-level shape processing to angle perception (Kennedy, Orbach, & Loffler, 2006; Loffler, 2008, 2015). Moreover, S. Chen and Levi found that the JND in angle is determined more by the size of the angle than by the orientations of the two bounding lines (S. Chen & Levi, 1996). They proposed that an angle is a single independent feature whose JND is proportional to angle size, as Weber's law suggests. However, unlike the classic Weber's law with a fixed ratio of JND to stimulus magnitude, the JND in angle does not increase monotonically with angle size but fluctuates at several points.
To fit the data, S. Chen and Levi implemented a two-segmented Weber's law (Figure 1a). In the first segment, the JND rises linearly with angle size up to 135°; in the second segment, it diminishes as angle size increases from 135° to 180°. However, the two-segmented function still cannot explain the multiple inflection points of the JND function. Moreover, there is no theoretical basis for splitting the function into two sections at 135°. Although S. Chen and Levi's study did not fully resolve the question, their idea of creating a piecewise function offers valuable insight.
Figure 1
 
A comparison of the four models of angle perception. (a) Examples of angle discrimination thresholds (JNDs in angle) in previous studies (left panel: data from S. Chen & Levi's 1996 study; right panel: data from Heeley & Buchanan-Smith's 1996 study) were fitted to the three models: line combinations (solid lines), Weber's law (dashed lines), and two-segmented Weber's law (dashed-dotted lines). (b) An illustration of the orthogonal internal reference frame (IRF) model for angle perception. (c) The fitting results of the IRF model for S. Chen and Levi's data (left panel) and Heeley and Buchanan-Smith's data (right panel), and (d) A comparison of AIC values across the four models fitting S. Chen and Levi's and Heeley and Buchanan-Smith's data.
By reviewing the data from previous studies of the angle discrimination task, we noticed three turning points of the JNDs, at 45°, 90°, and 135° (S. Chen & Levi, 1996; Heeley & Buchanan-Smith, 1996). The JND increases as angle size rises from 0° to 45° but decreases as angle size increases from 45° to 90°. The JND increases again when angle size exceeds 90° and falls once more after 135°, up to 180°. This pattern suggests that humans might build an orthogonal coordinate system as an internal reference frame for estimating angles (Figure 1b). When an angle is presented, the human visual system aligns one of the bounding lines with the closest IRF axis and calculates the orientation difference between the other bounding line and its closest axis. The horizontal and vertical reference axes provide useful contextual cues for human visual information processing. As early as the 1930s, Gibson proposed a normalization hypothesis, holding that the human brain maintains internal visual reference axes (Gibson, 1933). The theory explains the tilt illusion, a contextual effect on orientation perception (Tomassini, Solomon, & Morgan, 2014). However, it remains unknown how internal reference axes affect angle perception in the human brain. In this study, we propose a hierarchical model of angle perception with two stages: first, an orientation-encoding stage extracts the orientations of the bounding lines and combines them into an angle feature; and second, an angle-estimation stage, using an orthogonal internal reference, calculates the orientation difference between the unaligned bounding line and its nearest axis. We conducted a series of model analyses and a psychophysical experiment to support this two-stage model of angle perception.
Methods
Computational models
Line combinations (LCs) model
To simulate observers' sensitivity to the orientation of lines, we employed a rectified sine wave function to model the orientation discrimination thresholds (JNDs in orientation) at various reference orientations (Appelle, 1972; Girshick, Landy, & Simoncelli, 2011):  
\begin{equation}\tag{1}JND\left( \alpha \right) = a\left| {\sin 2\alpha } \right| + b{\rm ,}\end{equation}
where α is the reference orientation, \(\alpha \in \left[ {{0^ \circ },{{180}^ \circ }} \right]\), a is the variation range of the JND, and b is the minimum JND; \(a,b \in \left[ {0, + \infty } \right)\), and \(JND\left( \alpha \right)\) represents the orientation discrimination threshold at orientation α.
As an angle comprises two bounding lines, its size \(\theta \) is defined as the absolute difference between the orientations of the two lines, \({\alpha _1}\) and \({\alpha _2}\):
\begin{equation}\tag{2}\theta = \left| {{\alpha _1} - {\alpha _2}} \right|{\rm .}\end{equation}
 
Thus, the JND in angle equals the root sum of squares of the JNDs of the two lines:
\begin{equation}\tag{3}JND\left( \theta \right) = \sqrt {JN{D^2}\left( {{\alpha _1}} \right) + JN{D^2}\left( {{\alpha _2}} \right)} {\rm .}\end{equation}
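The LCs model (Equations 1 through 3) can be sketched in a few lines of Python (the original analyses used MATLAB; the parameter values below are illustrative placeholders, not fitted values):

```python
import math

def orientation_jnd(alpha_deg, a, b):
    """Eq. 1: rectified sine-wave JND for a line at orientation alpha (degrees)."""
    return a * abs(math.sin(2 * math.radians(alpha_deg))) + b

def lc_angle_jnd(alpha1_deg, alpha2_deg, a, b):
    """Eq. 3: angle JND as the root sum of squares of the two line JNDs."""
    return math.hypot(orientation_jnd(alpha1_deg, a, b),
                      orientation_jnd(alpha2_deg, a, b))
```

For example, with a = 1 and b = 0.5, a line at 45° yields the maximal JND of 1.5, while cardinal orientations yield the minimal JND of 0.5.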
 
Weber's law (WL) model
According to the traditional Weber's law, the angle discrimination threshold monotonically increases with angle size. This model is described by a linear function:  
\begin{equation}\tag{4}JND\left( \theta \right) = k\left( {\theta + a} \right){\rm ,}\end{equation}
where \(\theta \) is the size of the angle, k is the slope of the function, and a is the intercept.
Two-segmented Weber's law (TS) model
The two-segmented Weber's law model proposed by S. Chen and Levi is a piecewise linear function with an increasing segment at smaller angles and a decreasing segment at larger angles:
\begin{equation}\tag{5}JND\left( \theta \right) = \begin{cases} {k_1}\left( {\theta + {a_1}} \right), & \theta \le {\theta _0} \\ {k_2}\left( {\theta + {a_2}} \right), & \theta > {\theta _0} \end{cases}{\rm ,}\end{equation}
where \({k_1}\) and \({k_2}\) are the slopes of the two linear functions; \({a_1}\) and \({a_2}\) are the respective x-axis intercepts; \({\theta _0}\) is the angle at which the two linear functions intersect; and \({a_2}\) is determined by requiring the two segments to meet at \({\theta _0}\):
\begin{equation}\tag{6}{a_2} = {{{k_1}} \over {{k_2}}}\left( {{a_1} + {\theta _0}} \right) - {\theta _0}{\rm .}\end{equation}
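The constraint in Equation 6 makes the TS model continuous at the breakpoint. A minimal Python sketch (parameter values in the usage note are illustrative; a decreasing second segment corresponds to k2 < 0):

```python
def ts_angle_jnd(theta, k1, k2, a1, theta0):
    """Eq. 5-6: two-segmented Weber's law; a2 is fixed by continuity at theta0."""
    a2 = (k1 / k2) * (a1 + theta0) - theta0   # Eq. 6
    if theta <= theta0:
        return k1 * (theta + a1)              # first segment
    return k2 * (theta + a2)                  # second segment
```

By construction the two branches meet at theta0, since k1(theta0 + a1) = k2(theta0 + a2).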
 
Internal reference frame (IRF) model
The size of an angle is estimated according to the IRF, following a rectified sine wave function:  
\begin{equation}\tag{7}JND\left( \theta \right) = {d_i}\left| {\sin 2\omega } \right|{\rm ,}\end{equation}
where \({d_i} \in \left[ {0, + \infty } \right)\) is the parameter determined by the nearest reference axis i:
\begin{equation}\tag{8}{d_i} = \begin{cases} {d_1}, & \text{if } i = x\text{-axis} \\ {d_2}, & \text{if } i = y\text{-axis} \end{cases}{\rm ,}\end{equation}
and \(\omega \) is the difference in orientation between the unaligned bounding line and the nearest reference axis at the angle \(\theta \):
\begin{equation}\tag{9}\omega = {\rm{min}}\left\{ {\theta\ {\rm{mod\ }}{{90}^\circ }, - \theta\ {\rm{mod\ }}{{90}^\circ }} \right\}{\rm .}\end{equation}
 
Here, when the angle is smaller than 45°, \(\omega = \theta \); when it is between 45° and 90°, \(\omega = {90^\circ } - \theta \); when it is between 90° and 135°, \(\omega = \theta - {90^\circ }\); and when it is larger than 135°, \(\omega = {180^\circ } - \theta \).
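Equations 7 through 9 can be sketched as follows (a hedged Python reading of the model; the rule for selecting the nearest axis in Equation 8, including the tie at exactly 45°, is our interpretation):

```python
import math

def irf_angle_jnd(theta_deg, d1, d2):
    """Eq. 7-9: angle JND from the distance to the nearest IRF axis."""
    theta = theta_deg % 180.0
    # Eq. 9: orientation difference to the nearest 90-degree axis
    omega = min(theta % 90.0, -theta % 90.0)
    # Eq. 8: gain depends on which axis is nearest to the unaligned line
    # (x-axis for theta < 45 or theta > 135, y-axis otherwise; assumed rule)
    d = d1 if theta < 45.0 or theta > 135.0 else d2
    return d * abs(math.sin(2 * math.radians(omega)))
```

The function is zero at 0°, 90°, and 180° and peaks at 45° and 135°, reproducing the dual-peak JND profile.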
Model fitting
The models were fitted to the data using the least-squares method. To compare the goodness of fit of the models, we calculated the Akaike information criterion (AIC; Akaike, 1974):
\begin{equation}\tag{10}AIC = 2k + n \cdot \ln \left( {n \cdot MSE} \right){\rm ,}\end{equation}
where k is the number of parameters, n is the number of data points, and MSE is the mean square error of the fitting calculated by the following equation:  
\begin{equation}\tag{11}MSE = {1 \over n}\mathop \sum \limits_{i = 1}^n {\left( {\widehat {{y_i}} - {y_i}} \right)^2}{\rm ,}\end{equation}
where \(\widehat {{y_i}}\) is the output of the model and \({y_i}\) is the observed value.
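Equations 10 and 11 amount to the following computation (a straightforward sketch; the function and variable names are ours):

```python
import math

def aic_from_residuals(k, residuals):
    """Eq. 10-11: AIC of a least-squares fit from its residuals."""
    n = len(residuals)
    mse = sum(r * r for r in residuals) / n   # Eq. 11: mean square error
    return 2 * k + n * math.log(n * mse)      # Eq. 10
```

A lower AIC indicates a better trade-off between fit quality and the number of free parameters.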
Natural image statistics
Estimating the environmental distribution of angles
The distribution of angles was computed from 1,388 images in a public image database (McGill Calibrated Color Image Database; Olmos & Kingdom, 2004), comprising 1920 × 2560 TIF photographs of natural scenes. Each image was converted from the linear RGB color space into the CIE XYZ color space and normalized by its mean luminance. Only the luminance information (i.e., Y values) was used for analysis. We first extracted the edge features in the grayscale image using the Canny method (Canny, 1986). The standard deviation of the Gaussian filter in the Canny method was three pixels, with the lower threshold of the luminance gradient at 0.08 and the higher threshold at 0.2. We then performed the Standard Hough Transform to extract line segments with a minimum line length of 50 pixels. Two lines with the same orientation were merged if the distance between them was less than 15 pixels. We calculated the size of the angle between each pair of intersecting lines and computed the frequency distribution of those angles. The histogram of the angle distribution was smoothed using a moving average filter with a span of 5°.
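The edge-extraction steps depend on image-processing routines (Canny, Hough) that we do not reproduce here; the downstream histogram computation, however, can be sketched as follows, assuming the line segments have already been reduced to orientation pairs (the function name and input format are hypothetical):

```python
from collections import Counter

def angle_histogram(line_pairs, span=5):
    """Smoothed frequency distribution of angle sizes (1-degree bins, 0-180).

    `line_pairs` holds (orientation1, orientation2) tuples in degrees for
    each pair of intersecting line segments; the histogram is smoothed with
    a moving-average filter of `span` degrees, as described in the text."""
    counts = Counter()
    for a1, a2 in line_pairs:
        theta = abs(a1 - a2) % 180            # angle between the two lines
        counts[round(theta)] += 1
    hist = [counts.get(d, 0) for d in range(181)]
    half = span // 2
    # moving-average smoothing; the window is truncated at the ends
    return [sum(hist[max(0, i - half): i + half + 1]) /
            len(hist[max(0, i - half): i + half + 1]) for i in range(181)]
```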
Psychophysical experiments
Participants
Six observers (three males and three females; mean age 21.5 years) with normal or corrected-to-normal vision participated in the psychophysical experiments. All subjects were new to psychophysical experiments and unaware of the purpose of the study. Informed written consent was obtained from each participant before data collection. The study complied with the tenets of the Declaration of Helsinki and was approved by the research ethics committee of East China Normal University, where the data were collected.
Apparatus
We presented the stimuli on a 19-in. CRT monitor (IIYAMA Vision Master Pro 456; mean luminance 56.15 cd/m2). The resolution of the display was 1280 × 1024, and the pixel size was 0.026° × 0.026°. The refresh rate was 85 Hz. We covered the edge of the monitor with black cardboard, leaving only a circular aperture 20.6° in diameter. This aperture prevented observers from using the vertical and horizontal edges of the rectangular screen as cues in the experiment. The stimuli were generated with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) in MATLAB (MathWorks, Natick, MA). A chin rest restricted participants' head movements. Viewing was binocular at a distance of 55 cm. The experiments were conducted in a dark room.
Stimuli
Each stimulus was a circular random-dot pattern (10° of visual angle in diameter) composed of gray dots (0.026° in diameter) against a dark background. The lines were superimposed upon the dots with a 0.15° intergap distance. For the orientation discrimination task, the line lengths varied within 3° ± 0.2°. The midpoints of the lines were randomly jittered ±0.5° horizontally or vertically away from the screen center to prevent participants from using the line endpoints as reliable cues in performing the task (Figure 3a). The jitter also prevented the afterimage of the first stimulus from serving as a cue in the temporal two-alternative forced-choice (2AFC) procedure. In the angle discrimination task, the stimulus was a V-shape angle composed of two dotted lines 3° ± 0.2° long. The vertex of the angle was randomly located within ±0.5° of the screen center (Figure 3a).
Procedure
The JNDs in angle and orientation were measured with a temporal 2AFC procedure using three-down-one-up staircases. Each trial began with a fixation point presented for 500 ms. The test and reference stimuli were then presented in random order (500 ms each), separated by a 500-ms interstimulus interval. In the orientation discrimination task, participants judged whether the stimulus in the second interval was oriented more clockwise or counterclockwise; in the angle discrimination task, they judged which interval contained the larger angle. Each staircase included four preliminary reversals and six experimental ones. The average of the experimental reversals was taken as the JND for each staircase run. The staircase procedure was repeated 10 times for each condition, and the mean JND across the 10 staircases was calculated.
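To illustrate the measurement procedure, here is a hedged simulation of a three-down-one-up staircase for a synthetic observer (the psychometric function, step size, and starting level are our assumptions, not the paper's):

```python
import math, random

def run_staircase(true_jnd, start=10.0, step=1.0, n_reversals=10, seed=0):
    """Simulate a 3-down-1-up staircase; return the mean of the last six
    reversal levels (the four preliminary reversals are discarded),
    mirroring the procedure described in the text."""
    rng = random.Random(seed)
    level, correct_run, last_dir = start, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        # assumed psychometric function for the synthetic observer
        p_correct = 1 - 0.5 * math.exp(-level / true_jnd)
        if rng.random() < p_correct:
            correct_run += 1
            if correct_run == 3:            # three correct -> decrease level
                correct_run = 0
                if last_dir == +1:
                    reversals.append(level) # direction change: a reversal
                last_dir = -1
                level = max(step, level - step)
        else:                               # one error -> increase level
            correct_run = 0
            if last_dir == -1:
                reversals.append(level)
            last_dir = +1
            level += step
    return sum(reversals[4:]) / len(reversals[4:])
```

The 3-down-1-up rule makes the staircase converge near the 79.4%-correct point of the psychometric function.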
Results
Computational modeling: The role of the IRF in angle perception
We proposed that the human visual system uses an orthogonal internal reference to compute an angle feature. In particular, the visual system aligns one bounding line with an IRF axis and calculates the orientation difference between the other bounding line and its closest axis. When the angle is smaller than 45°, the nearest axis for the second bounding line is the axis aligned with the first bounding line. In that configuration, as the angle becomes larger, the orientation difference between the second bounding line and the reference axis increases; according to Weber's law, the JND should therefore increase with angle size. However, when angle size exceeds 45° but remains below 90°, the nearest axis becomes the vertical axis. Increasing angle size then shrinks the gap between the second bounding line and the nearest IRF axis, producing a decrease of the JND in this range. A similar computation rule applies to the ranges of 90°–135° and 135°–180°, resulting in an inflection point at 135°. Using these rules, we built a computational model of the IRF to explain human angle discrimination performance.

We compared this IRF model with three existing models: line combinations (LCs), Weber's law (WL), and two-segmented Weber's law (TS). We fitted the data from the two classic studies of angle discrimination (S. Chen & Levi, 1996; Heeley & Buchanan-Smith, 1996) using the four models. Both studies measured the JNDs for V-shape angles composed of two solid lines while varying the sizes and orientations of the angles. In both studies, the JNDs in angle exhibited a similar bimodal distribution as a function of angle size, yielding four segments with different trends. The LCs model and Weber's law were unable to explain this important aspect of the data. The r2 of the LCs model was 0.19 (p = 0.033, 95% CI = [0.00, 0.51]) for the data of S. Chen and Levi's study and 0.08 (p = 0.029, 95% CI = [0.00, 0.35]) for the data of Heeley and Buchanan-Smith's study. Similarly, the r2 of Weber's law was 0.05 (p = 0.292, 95% CI = [0.04, 0.33]) and 0.01 (p = 0.222, 95% CI = [0.01, 0.23]) in fitting the data of S. Chen and Levi's (1996) and Heeley and Buchanan-Smith's (1996) studies, respectively (Figure 1a). Although the two-segmented Weber's law model proposed by S. Chen and Levi seemed to align with their own data (r2 = 0.80, p < 0.001, 95% CI = [0.58, 0.91]), it fitted the data of Heeley and Buchanan-Smith's study less well (r2 = 0.52, p < 0.001, 95% CI = [0.32, 0.74]; Figure 1a), indicating that the model likely overfitted their own data. The IRF model (Figure 1b) divides the space into four subregions, accounting for the four sections of the JNDs in angle: r2 = 0.88, p < 0.001, 95% CI = [0.74, 0.95], for the data from S. Chen and Levi's study, and r2 = 0.83, p < 0.001, 95% CI = [0.76, 0.93], for the data from Heeley and Buchanan-Smith's study (Figure 1c). All groups of AIC values passed the Anderson-Darling (A-D) and one-sample Kolmogorov-Smirnov (K-S) tests of normality (A-D test: p = 0.484 for the LCs model, p = 0.258 for the WL model, p = 0.119 for the TS model, and p = 0.857 for the IRF model; K-S test: p = 0.744, 0.846, 0.572, and 0.956, respectively). Given these results, we assumed that the AIC values were normally distributed and conducted a repeated-measures ANOVA to compare AICs across models. The results demonstrated a significant main effect of model on AIC values, F(3, 15) = 8.1, p = 0.002, η2 = 0.618 (Figure 1d). A post hoc test further verified that the IRF model had significantly lower AIC values than the other three models, F(1, 5) = 33.4, p = 0.002, η2 = 0.87.
Natural image statistics: Prior distribution of angles supports the IRF
How does the human brain establish the IRF? A possible explanation is that the human visual system establishes the reference frame through experience with natural environments. To test this, we analyzed the statistical distributions of angles in natural images. In particular, we selected more than 1,000 images from a public image database and calculated the sizes of the angles in these images (Figure 2a and b). We computed the frequency distribution of those angles at five different spatial scales. Although the prior distributions differed slightly across spatial scales, they were generally high around 0°, 90°, and 180° (Figure 2c). This result supports the idea that the IRF is established from human experience of natural environments.
Figure 2
 
The prior distribution of angles from natural images and the observers' data. (a) An example natural image from the database; (b) Edge information extracted from the example natural image; (c) A histogram of the angle distribution computed from the images in the database.
Figure 3
 
The JNDs in angle were predicted by the JNDs in orientation. (a) The stimuli and procedure for the orientation discrimination and angle discrimination tasks. Each trial began with a fixation point for 500 ms. The first stimulus was presented for 500 ms, followed by a 500-ms interstimulus interval; the second stimulus was then presented for another 500 ms. Participants then judged whether the orientation of the second line was clockwise or counterclockwise relative to the first (orientation discrimination task) or which stimulus had the larger angle (angle discrimination task). (b) JNDs in orientation as a function of reference orientation. The JNDs were fitted with a rectified sine-wave function (solid line). (c) JNDs in angle as a function of angle orientation. The dashed line represents the quadratic sums of the JNDs in orientation of the two bounding lines. We aligned the predicted values of the quadratic-sum model with the participants' JNDs in angle.
Does orientation encoding of the bounding lines contribute to angle perception?
Does the human brain process an angle as a holistic feature in the IRF, or does it first integrate two bounding lines into an angle feature and then estimate the size of the corresponding angle in the IRF during a second stage? To determine the role of orientation encoding of the bounding lines in angle perception, we conducted the following experiment: We asked participants to perform an angle discrimination task at a right angle (90°). Because the bounding lines of a right angle coincide with the axes of the orthogonal internal reference, the effect of the IRF on the JND in angle is constant and minimized. We then examined whether the JND in angle depended on the orientation sensitivity of the bounding lines.
We asked six participants to perform an orientation discrimination task and an angle discrimination task. In the orientation discrimination task, participants judged whether a line was clockwise or counterclockwise relative to reference lines oriented from 9° to 180° in steps of 9° (Figure 3a). The results showed the classic oblique effect: the JND in orientation was highest at 45° and 135° and lowest at 0°, 90°, and 180° (Figure 3b). We then fitted the JNDs in orientation of each participant using the rectified sine function \(JND\left( \alpha \right) = a\left| {\sin 2\alpha } \right| + b\), where \(\alpha \) is the reference orientation, and a and b are two free parameters.
In the angle discrimination task, the size of the angle was fixed at 90° and the orientation of the angle varied from 9° to 180° in steps of 9°. The JNDs in angle are plotted as a function of angle orientation in Figure 3c. We found a significant main effect of angle orientation on the JND in angle, suggesting that orientation sensitivity to the bounding lines did affect angle perception. To quantitatively determine the relationship between the JNDs in orientation and the JNDs in angle, we computed the quadratic sums of the JNDs in orientation of the two bounding lines to predict the JNDs of the corresponding angle: \(JND\left( \theta \right) = \sqrt {JN{D^2}\left( {{\alpha _1}} \right) + JN{D^2}\left( {{\alpha _2}} \right)} \), where \({\alpha _1}\) and \({\alpha _2}\) are the orientations of the two bounding lines and θ is the angle size. We compared the predicted values with the measured JNDs in angle and found that the LCs model predicted the JNDs in angle well (r2 = 0.82, p < 0.001).
Why does the LCs model explain the JNDs of the right angle (90°) but fail to fit the data of previous studies? The likely reason is that the effects of orientation sensitivity are not equal across angle sizes. Figure 4 shows a computational simulation of the quadratic sums of the JNDs of the bounding lines. The JNDs in orientation were calculated by the equation \(JND\left( \alpha \right) = a\left| {\sin 2\alpha } \right| + b\), where a = 1.241 and b = 1.318; these parameter values come from the fits to our experimental data. If an angle is 45° (Figure 4a), the JNDs of the two bounding lines do not follow the same trend as a function of angle orientation, resulting in very little variation of the quadratic sums of the JNDs across angle orientations. However, when an angle is 90° (Figure 4b), the JNDs of the two bounding lines covary, and their quadratic sums fluctuate over a relatively large range. We computed the absolute difference between the highest and lowest values of the quadratic sum of the JNDs and plotted this variation range as a function of angle size (Figure 4c). The simulation indicated that the variation range was highest at an angle of 90° and lowest at an angle of 45°. Although S. Chen and Levi's (1996) and Heeley and Buchanan-Smith's (1996) studies measured JNDs across angle sizes and orientations, they did not analyze the effect of angle orientation at each angle size separately. In many cases, the quadratic sums of the JNDs varied over only a small range across angle orientations. The contribution of the orientations of the bounding lines was therefore not very visible in the model fitting of their data.
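The simulation in Figure 4 can be reproduced in outline as follows (a Python sketch; we reuse the fitted orientation-JND parameters quoted in the text, assuming the Equation 1 form):

```python
import math

def jnd_orientation(alpha_deg, a=1.241, b=1.318):
    """Orientation JND (Eq. 1) with the parameters fitted in the experiment."""
    return a * abs(math.sin(2 * math.radians(alpha_deg))) + b

def quad_sum_range(theta_deg):
    """Variation range of the quadratic sum of the two line JNDs as an
    angle of size theta_deg is rotated through all orientations."""
    sums = [math.hypot(jnd_orientation(phi), jnd_orientation(phi + theta_deg))
            for phi in range(180)]
    return max(sums) - min(sums)
```

Consistent with Figure 4c, the range is large at 90°, where the two line JNDs covary, and small at 45°, where they counter-vary.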
Figure 4
 
A computational simulation of the quadratic sums of the JNDs in orientation across angle orientations. (a) An illustration of the JNDs of the two bounding lines and their quadratic sum for an angle of 45°. (b) The JNDs in orientation and their quadratic sum for an angle of 90°. (c) The variation ranges of the quadratic sums of the JNDs of the two bounding lines across angle sizes.
A two-stage model of angle perception
Based on our model analysis and experimental findings, we verified that both the combination of the two bounding lines and the IRF are necessary computational processes in angle perception. We therefore propose a two-stage model of angle perception. In the first stage, the human visual system encodes the orientation features of the two bounding lines and combines them to generate an angle feature; the output of this stage is the quadratic sum of the JNDs in orientation of the two bounding lines. In the second stage, the visual system estimates the size of the angle in an internal orthogonal reference frame; the outputs of this stage are determined by the angular distances between the bounding lines and the reference axes. The JND in angle is the quadratic sum of the outputs of the first and second stages. 
Using the model, we conducted a computational simulation of the JNDs in angle across angle sizes and orientations. We sampled 180 angle sizes and 180 angle orientations, from 1° to 180° in steps of 1°. The simulation used the parameters fitted to S. Chen and Levi's (1996), Heeley and Buchanan-Smith's (1996), and our data. The JNDs in orientation were calculated as \(JND\left( \alpha \right) = a\left| {\sin 2\alpha } \right| + b\), where a = 0.251 and b = 0.699. The output of the IRF model was calculated as \(JND\left( \theta \right) = {d_i}\left| {\sin 2\omega } \right|\), where ω is the angular difference between a bounding line and its nearest axis, with d1 = 0.609 and d2 = 1.898. The simulation generated a two-dimensional map of JNDs in angle (Figure 5), which provides an overview of the predicted human angle discrimination performance under all conditions. By comparison, experimental studies measured only a subset of the conditions on this map and thus had a limited view. S. Chen and Levi's and Heeley and Buchanan-Smith's studies assessed conditions varying along the horizontal axis, which highlighted the contributions of the IRF; our experiment measured conditions along the vertical direction, making the contribution of orientation sensitivity to the bounding lines more visible. 
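The full two-stage simulation can be sketched as follows. This is our reconstruction rather than the authors' code: we assume the bounding lines of an angle of size θ at orientation α lie at α and α + θ, and that the final JND is the quadratic sum of all four stage outputs (two orientation JNDs plus the two IRF terms weighted by d1 and d2).

```python
import math

A, B = 0.251, 0.699     # orientation-JND parameters (first stage)
D1, D2 = 0.609, 1.898   # IRF weights for the two bounding lines (second stage)

def orientation_jnd(alpha_deg):
    # Rectified sine-wave model of the orientation JND.
    return A * abs(math.sin(math.radians(2 * alpha_deg))) + B

def irf_term(line_deg, d):
    # omega: angular distance from the line to the nearest axis of the
    # orthogonal internal reference frame (never more than 45 deg).
    omega = min(line_deg % 90, 90 - line_deg % 90)
    return d * abs(math.sin(math.radians(2 * omega)))

def predicted_jnd(angle_deg, ori_deg):
    l1, l2 = ori_deg, ori_deg + angle_deg
    return math.sqrt(orientation_jnd(l1) ** 2 + orientation_jnd(l2) ** 2
                     + irf_term(l1, D1) ** 2 + irf_term(l2, D2) ** 2)

# 180 x 180 map of predicted JNDs across angle sizes and orientations (Figure 5)
jnd_map = [[predicted_jnd(size, ori) for ori in range(1, 181)]
           for size in range(1, 181)]
```

Under this reconstruction, a right angle aligned with the reference axes (size 90° at orientation 90°) yields the smallest predicted JND on the map, since both IRF terms vanish at ω = 0 and both bounding lines sit at cardinal orientations.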
Figure 5
 
A computational simulation of the predicted JNDs across sizes and orientations of angles. S. Chen and Levi's studies measured the conditions on lines b and c, while Heeley and Buchanan-Smith's studies assessed the conditions on lines a, b, c, and d. Our experiment measured the conditions on line e.
Discussion
As an important step in visual information processing, angle estimation plays an essential role in generating object percepts in our daily lives. In this study, we propose a computational model in which the human visual system represents an angle feature in an orthogonal IRF. The IRF model successfully predicted the bimodal distribution of the JND in angle, addressing a problem that the two previous models, line combinations and Weber's law, failed to resolve. 
Moreover, we measured participants' JNDs at an angle of 90° to examine the contribution of orientation sensitivity to the bounding lines. Our results show that orientation sensitivity to the bounding lines still plays a major part in angle perception, and that the effect of line orientation sensitivity on the JND in angle is highest at an angle of 90° and lowest at 45°. These results agree with S. Chen and Levi's (1996) finding that the JND in angle depends on the orientations of the bounding lines within a small range near 90°; S. Chen and Levi treated the data near 90° as a special range in their model analysis. Our results offer a new and unified theoretical explanation of this phenomenon. Based on that explanation, we proposed a two-stage model of angle perception, addressing the debate over whether angle discrimination performance depends on the orientations of the bounding lines or on angle size. 
Our results highlight the important role of an orthogonal IRF in angle perception. A previous study showed that human participants had high accuracy and strong memory when reproducing an angle of 90°, indicating that the human brain may maintain an internal template of the right angle (Gray & Regan, 1996). Our statistical analysis of natural images provides evidence that the human brain may build the IRF through experience with natural environments. By evaluating natural images in an extensive database, we determined that the natural environment contains a considerable proportion of angle features around 0°, 90°, and 180°. With exposure to such an environment, the human visual system comes to expect these angles to be more frequent. Such an expectation serves as prior knowledge in a Bayesian decision-making framework, which is beneficial for processing angle features in natural environments. 
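As a toy illustration of these image statistics, an angle histogram can be approximated by pairing edge orientations and binning the angles between them. This is a deliberately simplified stand-in (the actual analysis ran edge detection, e.g., a Canny detector, on a natural-image database and paired nearby edge segments); the function, the pairing rule, and the 9° bin width below are our own choices.

```python
from collections import Counter

def angle_histogram(edge_orientations_deg, bin_width=9):
    # Count the angles formed by every pair of edge orientations,
    # measured as absolute orientation differences in [0, 180),
    # binned into bin_width-degree bins.
    bins = Counter()
    oris = list(edge_orientations_deg)
    for i, o1 in enumerate(oris):
        for o2 in oris[i + 1:]:
            angle = abs(o1 - o2) % 180
            bins[int(angle // bin_width) * bin_width] += 1
    return bins
```

In scenes dominated by horizontal and vertical contours, as man-made environments tend to be, such a histogram peaks near 0° and 90°, consistent with the distribution shown in Figure 2c.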
The reason for establishing such a computing framework might relate to the role of angle perception in human daily life. If a feature is particularly vital, the brain should prefer a speed-first strategy and create an independent template for that feature, reducing real-time computation, as in face recognition (Kanwisher, McDermott, & Chun, 1997; Xu, 2005). However, this strategy raises a potential problem: the brain might need to store many templates, creating "pressure" on the memory system. To avoid this pressure, the brain tends not to store so many templates but instead computes results from basic units when the feature appears (Riesenhuber & Poggio, 1999; Van Essen, Anderson, & Felleman, 1992). This strategy, of course, has the cost of increasing "pressure" on real-time computing. The brain must therefore balance these two strategies to optimize information processing based on the characteristics of the features involved. The angle is a typical middle-level visual feature: it would be too costly for the memory system to build a large number of templates for all angle features, yet it is also inefficient to compute every angle from basic line features. An internal reference frame could therefore be an efficient computing strategy that balances computational complexity against the number of templates stored in the brain. In mathematics, the Cartesian coordinate system is an efficient computational framework that has been widely used in many fields. The orthogonal reference frame divides space into four quadrants, in which the largest orientation difference between a line and its nearest axis is only 45°, so the coordinate system can process all types of angles effectively. This computational framework appears to be an example of maximizing the use of the brain by balancing memory capacity against computing efficiency. 
The human brain builds different types of reference frames to process, store, and interpret visual information. According to previous studies, human reference frames fall into two general categories: egocentric and allocentric (Chang, Harris, & Troje, 2010; Dyde, Jenkin, & Harris, 2006). In egocentric reference frames, objects are represented in space relative to our retina, head, or body axes. In allocentric reference frames, we use information in the environment as cues, including visual context and gravity. These reference frames play different roles in visual information processing and engage distinct brain mechanisms: the retinotopic visual areas encode low-level visual features in egocentric coordinates (Pouget, Fisher, & Sejnowski, 1993), while the parahippocampal and retrosplenial cortices analyze allocentric visual contextual information (Aminoff, Kveraga, & Bar, 2013). The caudal intraparietal area (CIP) and the visual posterior Sylvian (VPS) area may be involved in creating a gravity-centered allocentric reference frame in the brain (A. Chen, DeAngelis, & Angelaki, 2011; Lacquaniti et al., 2013; Liu, Gu, DeAngelis, & Angelaki, 2013; Rosenberg & Angelaki, 2014). It remains unclear which types of reference frames affect angle perception; future studies should explore the contributions of the different types of reference frames to the IRF in angle perception. 
Conclusions
In conclusion, we propose that the human visual system processes the angle feature in two stages: it first encodes the orientations of the bounding lines and then represents the angle feature in an orthogonal IRF. This theory offers a new and efficient computational account of angle perception, complementing the theory of line combinations and Weber's law. 
Acknowledgments
This research was supported by the National Natural Science Foundation of China Grants (No. 31771209, No. 31571160), the National Social Science Foundation of China Grants (No. 15ZDB016), the Fundamental Research Funds for the Central Universities, and the NYU-ECNU JRI Seed Grants from the Institute of Brain and Cognitive Science at NYU Shanghai to SK. We are grateful to Li Li and Peng Zhang for helpful comments on our manuscript. 
Commercial relationships: none. 
Corresponding author: Shu-Guang Kuai. 
Address: The School of Psychology and Cognitive Science, East China Normal University, Shanghai, China. 
References
Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19 (6), 716–723.
Aminoff, E. M., Kveraga, K., & Bar, M. (2013). The role of the parahippocampal cortex in cognition. Trends in Cognitive Sciences, 17 (8), 379–390.
Appelle, S. (1972). Perception and discrimination as a function of stimulus orientation: The “oblique effect” in man and animals. Psychological Bulletin, 78 (4), 266–278.
Attneave, F. (1954). Some informational aspects of visual perception. Psychological Review, 61 (3), 183–193.
Biederman, I. (1987). Recognition-by-components: A theory of human image understanding. Psychological Review, 94 (2), 115–147.
Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10 (4), 433–436.
Canny, J. (1986). A computational approach to edge detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 8 (6), 679–698.
Chang, D. H., Harris, L. R., & Troje, N. F. (2010). Frames of reference for biological motion and face perception. Journal of Vision, 10 (6): 22, 1–11, https://doi.org/10.1167/10.6.22. [PubMed] [Article]
Chen, A., DeAngelis, G. C., & Angelaki, D. E. (2011). Representation of vestibular and visual cues to self-motion in ventral intraparietal cortex. Journal of Neuroscience, 31 (33), 12036–12052.
Chen, S., & Levi, D. M. (1996). Angle judgment: Is the whole the sum of its parts? Vision Research, 36 (12), 1721–1735.
Dyde, R. T., Jenkin, M. R., & Harris, L. R. (2006). The subjective visual vertical and the perceptual upright. Experimental Brain Research, 173 (4), 612–622.
Gibson, J. J. (1933). Adaptation, after-effect and contrast in the perception of curved lines. Journal of Experimental Psychology, 16 (1), 1–31.
Girshick, A. R., Landy, M. S., & Simoncelli, E. P. (2011). Cardinal rules: Visual orientation perception reflects knowledge of environmental statistics. Nature Neuroscience, 14 (7), 926–932.
Gray, R., & Regan, D. (1996). Accuracy of reproducing angles: Is a right angle special? Perception, 25 (5), 531–542.
Heeley, D., & Buchanan-Smith, H. (1996). Mechanisms specialized for the perception of image geometry. Vision Research, 36 (22), 3607–3627.
Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17 (11), 4302–4311.
Kennedy, G. J., Orbach, H. S., & Loffler, G. (2006). Effects of global shape on angle discrimination. Vision Research, 46 (8–9), 1530–1539.
Lacquaniti, F., Bosco, G., Indovina, I., La Scaleia, B., Maffei, V., Moscatelli, A., & Zago, M. (2013). Visual gravitational motion and the vestibular system in humans. Frontiers in Integrative Neuroscience, 7, 101.
Liu, S., Gu, Y., DeAngelis, G. C., & Angelaki, D. E. (2013). Choice-related activity and correlated noise in subcortical vestibular neurons. Nature Neuroscience, 16 (1), 89–97.
Loffler, G. (2008). Perception of contours and shapes: Low and intermediate stage mechanisms. Vision Research, 48 (20), 2106–2127.
Loffler, G. (2015). Probing intermediate stages of shape processing. Journal of Vision, 15 (7): 19, https://doi.org/10.1167/15.7.19. [PubMed] [Article]
Olmos, A., & Kingdom, F. A. (2004). A biologically inspired algorithm for the recovery of shading and reflectance images. Perception, 33 (12), 1463–1473.
Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10 (4), 437–442.
Pouget, A., Fisher, S. A., & Sejnowski, T. J. (1993). Egocentric spatial representation in early vision. Journal of Cognitive Neuroscience, 5 (2), 150–161.
Riesenhuber, M., & Poggio, T. (1999). Hierarchical models of object recognition in cortex. Nature Neuroscience, 2 (11), 1019–1025.
Rosenberg, A., & Angelaki, D. E. (2014). Gravity influences the visual representation of object tilt in parietal cortex. Journal of Neuroscience, 34 (43), 14170–14180.
Snippe, H. P., & Koenderink, J. J. (1994). Discrimination of geometric angle in the fronto-parallel plane. Spatial Vision, 8 (3), 309–328.
Tomassini, A., Solomon, J. A., & Morgan, M. J. (2014). Which way is down? Positional distortion in the tilt illusion. PLoS One, 9 (10): e110729.
Van Essen, D. C., Anderson, C. H., & Felleman, D. J. (1992, January 24). Information processing in the primate visual system: An integrated systems perspective. Science, 255 (5043), 419–423.
Xu, Y. (2005). Revisiting the role of the fusiform face area in visual expertise. Cerebral Cortex, 15 (8), 1234–1242.
Figure 1
 
A comparison of the four models of angle perception. (a) Examples of angle discrimination thresholds (JNDs in angle) in previous studies (left panel: data from S. Chen & Levi's 1996 study; right panel: data from Heeley & Buchanan-Smith's 1996 study) fitted to three models: line combinations (solid lines), Weber's law (dashed lines), and two-segmented Weber's law (dashed-dotted lines). (b) An illustration of the orthogonal internal reference frame (IRF) model for angle perception. (c) The fitting results of the IRF model for S. Chen and Levi's data (left panel) and Heeley and Buchanan-Smith's data (right panel). (d) A comparison of AIC values across the four models fitting S. Chen and Levi's and Heeley and Buchanan-Smith's data.
Figure 2
 
The prior distribution of angles from natural images and the observers' data. (a) An example natural image from the database. (b) Edge information for the example natural image. (c) A histogram of the angle distribution computed from the images in the database.
Figure 3
 
The JNDs in angle were predicted by the JNDs in orientation. (a) The stimuli and procedure for the orientation discrimination task and the angle discrimination task. Each trial began with a fixation point for 500 ms. The first stimulus was presented for 500 ms, followed by a 500-ms interstimulus interval; the second stimulus was then presented for another 500 ms. Participants then judged whether the orientation of the second line was clockwise or counterclockwise relative to the first (orientation discrimination task) or which stimulus had the larger angle (angle discrimination task). (b) JNDs in orientation as a function of reference orientation. The JNDs were fitted with a rectified sine-wave function (solid line). (c) JNDs in angle as a function of angle orientation. The dashed line represents the quadratic sum of the JNDs in orientation of the two bounding lines; the predicted values of the quadratic sum model are aligned with the participants' JNDs in angle.