Abstract
The RGB primaries vary substantially between monitors. Color perception is known to adapt to the environment; for instance, unique yellow settings shift adaptively across seasons (Welbourne et al., 2015). However, it is unknown whether color perception changes between monitors. Here, we simulated three different RGB primary ratios on one monitor, with R:G:B luminance ratios of 1:2.8:0.4, 1:3.4:0.4, and 1:3.8:0.4. The three simulated monitors differed only in the green primary; the red and blue primaries were identical. Observers (N = 20) ranked the brightness of seven heterochromatic patches (the seven principal colors: R, G, B, RG, RB, GB, and RGB). We employed a non-linear max-weighted RGB model to relate brightness perception to weighted R, G, and B luminance values. The model correctly predicted 88.9% of the observers’ rankings. The weight of the green primary decreased across the three simulated monitors as the intensity of the green primary increased (Ps < 0.01), whereas the weights of the other two primaries did not differ significantly (Ps > 0.3). Thus, when a monitor has a higher intensity in one primary (here, the green primary), observers weight that primary’s contribution to brightness perception less, in adaptation to the monitor’s color statistics. This adaptation was only partial, approximately 61.1% relative to the intensity change. Examining the time course of adaptation, we found that the effect requires a few dozen trials to build up. In conclusion, the present study suggests that our perception of heterochromatic brightness adapts to different monitors.
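
One plausible formalization of the max-weighted model (an illustrative sketch; the abstract does not state the exact equation, so the max rule and the exponent \gamma below are our assumptions) is

\[
\hat{B} = \bigl[\max\bigl(w_R L_R,\; w_G L_G,\; w_B L_B\bigr)\bigr]^{\gamma},
\]

where \(L_R\), \(L_G\), and \(L_B\) are the luminances of the three primaries in a patch, \(w_R\), \(w_G\), and \(w_B\) are the fitted weights, and \(\gamma\) captures the non-linearity; the predicted rank order of the seven patches would then follow the rank order of \(\hat{B}\).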