The Post-Processing Stack is an optional Unity package that provides several tools for modifying rendered images before display, including simulations of optical effects such as chromatic aberration (Post-Processing Stack v2, n.d.). It provides a method called “color grading,” whose purpose is to adjust the color appearance of a rendered scene. Here we show that color grading can be used to compensate for a VR headset’s nonlinear luminance response and make luminance proportional to achromatic RGB value.
Suppose we choose three \(256 \times 256 \times 256\) arrays of integers from 0 to 255. We call them \(R\), \(G\), and \(B\) and denote their entries as \(R_{ijk}\), \(G_{ijk}\), and \(B_{ijk}\), in the usual matrix- and tensor-like fashion, except that subscript indices begin at zero instead of one. We can use these arrays to define a function \(M\) that transforms RGB triplets by mapping \((r, g, b)\) to \(M(r,g,b) \equiv (R_{rgb}, G_{rgb}, B_{rgb})\). For example, the RGB value (50, 100, 150) is mapped to \(M(50, 100, 150) = ( R_{50,100,150}, G_{50,100,150}, B_{50,100,150} )\).
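The mapping \(M\) can be sketched in a few lines of code. The following is a Python/NumPy illustration, not part of the article's MATLAB supplement; the arrays are filled with arbitrary values purely to make the indexing concrete.

```python
import numpy as np

# Three hypothetical 256 x 256 x 256 arrays of 8-bit values, named R, G,
# and B as in the text. Their contents here are arbitrary; only the
# indexing scheme matters for this illustration.
rng = np.random.default_rng(0)
R = rng.integers(0, 256, size=(256, 256, 256), dtype=np.uint8)
G = rng.integers(0, 256, size=(256, 256, 256), dtype=np.uint8)
B = rng.integers(0, 256, size=(256, 256, 256), dtype=np.uint8)

def M(r, g, b):
    """Map an RGB triplet through the three lookup arrays:
    (r, g, b) -> (R[r, g, b], G[r, g, b], B[r, g, b])."""
    return (R[r, g, b], G[r, g, b], B[r, g, b])

# The example from the text: (50, 100, 150) is mapped to the entries
# indexed by (50, 100, 150) in each of the three arrays.
print(M(50, 100, 150))
```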
Color grading in Unity uses such a mapping, with two modifications. First, the maximum array size is \(32 \times 32 \times 32\), and the full range of RGB values (0–255) is mapped using interpolation. Second, the arrays \(R\), \(G\), and \(B\) are encoded as a single \(n \times n^2 \times 3\) color image. To construct this image, we rearrange the \(n \times n \times n\) array \(R\) described above into an \(n \times n^2\) array \(R^{\prime }\) as follows. Let \(R_{::k}\) denote the \(n \times n\) subarray of \(R\) obtained by holding the third subscript constant at some value \(k\). Then we define
\begin{eqnarray}
R^{\prime } = [ R_{::0} \; \; R_{::1} \; \; \ldots \; \; R_{::n-1} ] \quad
\end{eqnarray}
That is, the \(n \times n\) slices of \(R\) obtained by setting \(k=0\), \(k=1\), and so on are concatenated left to right. More concisely,
\begin{eqnarray}
R^{\prime }_{i,j} = R_{i, ( j \: \text{mod} \: n), \lfloor j / n \rfloor } \quad
\end{eqnarray}
for \(0 \le i \lt n\) and \(0 \le j \lt n^2\), where “mod” is modulus and \(\lfloor j / n \rfloor\) is integer division. Arrays \(G\) and \(B\) are rearranged into \(G^{\prime }\) and \(B^{\prime }\) in the same way. Finally, we construct an 8-bit color image with \(R^{\prime }\), \(G^{\prime }\), and \(B^{\prime }\) as the three color channels and save the image in a lossless format such as PNG. In the Unity documentation, such an image is called a “lookup table (LUT) texture.” When the LUT texture is loaded into the Post-Processing Volume component, the LUT transformation described above is applied as a late stage of calculating rendered RGB values. In the Supplementary Material, we provide MATLAB code for generating LUT texture images and instructions for incorporating the Post-Processing Stack into a Unity project (Supplementary Material, Section 1(f)).
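The rearrangement above can be sketched as follows. The actual generator is the MATLAB code in the Supplementary Material; this Python/NumPy sketch builds an identity LUT at the assumed maximum resolution \(n = 32\) purely to illustrate the slice-concatenation step.

```python
import numpy as np

n = 32  # LUT resolution; Unity's maximum is 32 x 32 x 32

# Identity arrays R, G, B: for an identity mapping, the red output
# depends only on the first subscript, green only on the second, and
# blue only on the third.
levels = np.rint(np.linspace(0, 255, n)).astype(np.uint8)
R, G, B = np.meshgrid(levels, levels, levels, indexing="ij")

def flatten_slices(A):
    """Rearrange an n x n x n array into n x n^2 by concatenating the
    slices A[:, :, 0], A[:, :, 1], ..., A[:, :, n-1] left to right,
    so that the result satisfies A2[i, j] = A[i, j mod n, j // n]."""
    m = A.shape[0]
    return np.concatenate([A[:, :, k] for k in range(m)], axis=1)

# Stack the three rearranged channels into an n x n^2 x 3 image, ready
# to be saved in a lossless format such as PNG as a LUT texture.
lut_image = np.stack(
    [flatten_slices(R), flatten_slices(G), flatten_slices(B)], axis=-1)
print(lut_image.shape)  # (32, 1024, 3)
```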
Figure 3a shows that the color grading methods described here succeed in linearizing luminance on an Oculus Rift S headset. The red symbols show luminance measurements (made with an LS-110 spot photometer) from a set of achromatic RGB values assigned to a planar object with an “Unlit” material type and displayed in the headset. The plane was large enough to fill the headset’s field of view, as recommended in Appendix A. The luminance values follow a typical expansive nonlinearity. The green symbols show the corresponding measurements after a LUT texture generated by code provided in the Supplementary Material was loaded into the Post-Processing Volume component. With the LUT in place, luminance is proportional to achromatic RGB.
As a further test of whether we had succeeded in linearizing luminance, Figure 3b shows luminance measurements made with the LUT in place, using the pinhole camera described in Appendix B, which does not rely on the model of light scatter used in Appendix A. Here too, luminance is proportional to achromatic RGB. We do note that measured luminance dips slightly below the linear fit at low luminances and rises slightly above it at high luminances. This may be due to small calibration errors in the pinhole camera.
The Unity documentation reports that color grading interpolates RGB values linearly between the values specified in the LUT texture. We find that there are sometimes small departures from this claim. For example, Figures 4a and 4b show the result of repeating the experiment of Figure 2c but with color grading based on a LUT texture that represents the identity mapping. Here we expect rendered RGB to be equal to the Lambertian material’s RGB value. This is approximately true, but the zoomed-in plot in Figure 4b shows that there are small and consistent deviations. Figure 4c shows a more demanding version of this test, with a LUT texture that has just eight samples and represents a triangular wave that oscillates between RGB values of 100 and 200. Rendered RGB follows the expected pattern approximately, but again with some deviations. Except for unusual mappings like the triangular wave, departures from linear interpolation usually seem to be limited to just one RGB value, so in most applications, they will not interfere with using color grading to linearize luminance. We mention these approximations, though, as another example of the need to examine rendered stimuli closely.
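For readers who wish to run a similar check, the prediction under exact linear interpolation is easy to compute. The following Python/NumPy sketch (not the article's test code; the phase of the triangular wave is an assumption for illustration) tabulates what rendered RGB should be at every 8-bit input if Unity interpolated exactly linearly between eight LUT samples.

```python
import numpy as np

# An assumed eight-sample achromatic mapping forming a triangular wave
# that oscillates between RGB values 100 and 200, as in the test
# described in the text (the starting phase here is arbitrary).
samples = np.array([100, 200, 100, 200, 100, 200, 100, 200], dtype=float)
sample_inputs = np.linspace(0, 255, len(samples))  # where the samples sit

# Ideal linear interpolation between samples: the rendered RGB value
# that exact linear interpolation would predict at each 8-bit input.
inputs = np.arange(256)
expected = np.interp(inputs, sample_inputs, samples)

# Deviations of measured rendered RGB from `expected` would indicate
# departures from the documented interpolation behavior.
print(expected[0], expected[255])  # 100.0 200.0 (endpoints hit samples)
```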
We wished to confirm that the effect of color grading was the same in a few other common Unity configurations. We rendered a sphere with (a) a Lambertian material (Legacy/Diffuse), (b) a Standard material with default settings, (c) a Standard material with an achromatic texture, and (d) a Standard material with baked lighting. We rendered each sphere with and without color grading, using a LUT texture generated by the software in the Supplementary Material to linearize luminance in the Oculus Rift S headset used throughout this article.
Figure 5 shows how color grading transformed the images, by plotting the achromatic RGB value rendered for each pixel with color grading against the achromatic RGB value of the same pixel without color grading. The effect of color grading was identical in all four cases.
Color grading in Unity differs from some other lookup table methods that readers may be familiar with, such as the one implemented for MATLAB in the Psychophysics Toolbox (PTB; Brainard, 1997). One difference is that the PTB lookup table is a 256 × 3 matrix (or \(n \times 3\) for greater color depths), which we can call \(L\), and an RGB triplet \((r, g, b)\) is mapped to \((L_{r1}, L_{g2}, L_{b3})\). This means that each component of the triplet is transformed independently of the other components. In Unity color grading, the mapping of each component can depend on the values of the other components as well (recall that each component is effectively looked up in an \(n \times n \times n\) array), so the transformation is much more flexible. With this degree of flexibility, it should be possible to use color grading to implement standard methods for color calibration as well as luminance calibration (Brainard et al., 2002).
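The difference between the two lookup schemes can be made concrete with a small Python sketch (an illustration, not PTB or Unity code; the cross-channel rule below is a toy assumption standing in for a 3D array entry \(R_{rgb}\)).

```python
import numpy as np

# PTB-style lookup: a 256 x 3 matrix, each channel mapped independently.
# Here each column is an identity mapping for simplicity.
L = np.stack([np.arange(256)] * 3, axis=1)

def ptb_map(r, g, b):
    # Each component looks up only its own column (0-based indexing
    # here; PTB itself uses MATLAB's 1-based rows).
    return (L[r, 0], L[g, 1], L[b, 2])

# Unity-style 3D lookup allows cross-channel dependence. As a toy
# example, let the red output be the mean of all three inputs -- a
# mapping no 256 x 3 per-channel table can reproduce.
def unity_red(r, g, b):
    return (r + g + b) // 3

# A per-channel table gives the same red output whenever r is the same:
assert ptb_map(50, 0, 0)[0] == ptb_map(50, 255, 255)[0]
# The 3D mapping need not:
print(unity_red(50, 0, 0), unity_red(50, 255, 255))  # 16 186
```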
Another difference is that the PTB lookup table does not affect the RGB values in video memory. Instead, the lookup table is written to graphics hardware, and RGB values in video memory are transformed en route to being displayed on the monitor. The RGB values in a screen capture, for example, are unaffected by the PTB lookup table. In Unity, the LUT texture is used in a late stage of the rendering process, and it does modify the RGB values written to video memory. One consequence is that if a display has a PTB-type lookup table, then it still needs to be controlled by some mechanism other than Unity color grading.