HDR
Brightness and color values stored in a framebuffer are normally limited to the range 0.0 to 1.0, so the lights and colors we set in a scene can only take values in this range. This is fine in most cases, but what happens when a scene contains multiple light sources whose combined brightness exceeds 1.0 in some areas? The answer is that every fragment whose brightness or color value exceeds 1.0 is simply clamped to 1.0, which is not the result we want.
Because a large number of fragments have their brightness or color clamped to 1.0, all of those fragments end up stored as white. The result is a large, almost uniformly white region on screen that has obviously lost a lot of detail and looks unrealistic.
One way around this is to lower the intensity of the light sources so that no area of the scene exceeds a brightness of 1.0, but this is not a good solution: it artificially limits the lighting of the scene and reduces its realism. A better solution is to allow color values to temporarily exceed 1.0 and then map them back into the 0.0 to 1.0 range at the end, so that no detail is lost.
Although the display can only show colors in the range 0.0 to 1.0, the lighting calculations have no such limit. With high dynamic range (HDR) we can store fragment color values greater than 1.0. HDR values can cover a range of brightness from the brightest sunlight down to the darkest shadows while keeping the details visible.
HDR was first used in photography: a photographer takes several photos of the same scene at different exposures and combines them into a final HDR image, which preserves detail across a wide range of brightness. For example, a long exposure reveals scene detail in dark areas that a short exposure cannot capture.
This is similar to how the human eye works: whether the light is weak or intense, the eye adapts so that it can make out both dark and very bright areas. In a sense the eye works like a slider that automatically adjusts its exposure to the brightness of the scene.
HDR rendering works on the same principle. We use higher-precision values to store colors across a wide range from dark to bright, and when the final image is output we convert the HDR values back to the low dynamic range (LDR) of [0.0, 1.0]. This conversion is called tone mapping, and there are many tone mapping algorithms that try to preserve as much HDR detail as possible during the conversion.
HDR in real-time rendering not only lets us use values beyond the LDR range of [0.0, 1.0] and retain more detail, it also lets us specify the true brightness of a light source. For example, the sun is far brighter than other light sources, so we could give it a brightness of 10.0. This allows us to set much more realistic lighting parameters for the scene, something that is not possible in LDR rendering because such values would simply be clamped to 1.0.
Since the display can only show values in the range 0.0 to 1.0, we need to convert the HDR color values back into that range. Simply averaging or scaling the values down is not good enough, because large parts of the image would still come out washed-out white. Instead we can use other formulas or curves to perform the HDR-to-LDR conversion; this is the tone mapping mentioned earlier, and it is the final stage of HDR rendering.
Floating-Point Framebuffers
In order to achieve HDR rendering, we need to prevent color values from being clamped after fragment shading. When a framebuffer uses a normalized fixed-point color format (such as GL_RGB) as the internal format of its colorbuffer, OpenGL automatically clamps the values to the 0.0 to 1.0 range before storing them in the framebuffer. This clamping is performed for most color formats; floating-point formats are the exception, and they can be used to store HDR values.
When the internal format of the framebuffer's colorbuffer is set to GL_RGB16F, GL_RGBA16F, GL_RGB32F, or GL_RGBA32F, the framebuffer can store color values outside the 0.0 to 1.0 range, which is exactly what HDR rendering needs.
To create a floating-point framebuffer, you only need to change the internal format parameter of the colorbuffer:
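A minimal sketch of what that looks like, assuming the usual texture-backed framebuffer setup (SCR_WIDTH and SCR_HEIGHT are placeholders for the window size; a depth attachment would normally be added as well but is omitted here):

```cpp
// Create a framebuffer whose color attachment uses a floating-point internal
// format so that values above 1.0 are stored instead of being clamped.
unsigned int hdrFBO;
glGenFramebuffers(1, &hdrFBO);

unsigned int colorBuffer;
glGenTextures(1, &colorBuffer);
glBindTexture(GL_TEXTURE_2D, colorBuffer);
// GL_RGBA16F instead of GL_RGBA is the only real change needed for HDR storage.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, SCR_WIDTH, SCR_HEIGHT, 0, GL_RGBA, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Attach the floating-point texture as the framebuffer's colorbuffer.
glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorBuffer, 0);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
```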
OpenGL's default framebuffer color format uses 8 bits per color component. A floating-point framebuffer in GL_RGB32F or GL_RGBA32F format uses 32 bits per component, consuming four times the memory of the default format. In most cases 32-bit precision is unnecessary; the 16 bits per component of GL_RGBA16F are sufficient in practice.
The scene's color values, including any that exceed 1.0, are now rendered into the floating-point colorbuffer. The simple demo scene of this tutorial uses a large, elongated box to simulate a tunnel lit by four point lights, with a very bright light at the end of the tunnel.
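As a rough illustration (the exact colors and positions in the original demo may differ, and GLM plus `<vector>` are assumed to be available), the point lights can simply be given color values far above 1.0:

```cpp
// Light colors outside the LDR range; only the first light is truly "HDR bright".
std::vector<glm::vec3> lightColors;
lightColors.push_back(glm::vec3(100.0f, 100.0f, 100.0f)); // very bright light at the end of the tunnel
lightColors.push_back(glm::vec3(0.1f, 0.0f, 0.0f));
lightColors.push_back(glm::vec3(0.0f, 0.0f, 0.2f));
lightColors.push_back(glm::vec3(0.0f, 0.1f, 0.0f));
```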
Rendering into a floating-point framebuffer works basically the same as rendering into a normal framebuffer; the difference lies in the fragment shader that later displays the result. Let's first define a simple pass-through fragment shader:
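Something along these lines (the sampler and varying names here are this sketch's own, not necessarily those of the original code):

```glsl
#version 330 core
out vec4 FragColor;
in vec2 TexCoords;

uniform sampler2D hdrBuffer; // the floating-point colorbuffer rendered in the first pass

void main()
{
    vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;
    // Written directly to the default framebuffer, so values above 1.0 get clamped.
    FragColor = vec4(hdrColor, 1.0);
}
```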
Here we sample the floating-point colorbuffer directly and use it as the fragment shader's output. Because the values are written straight to the default framebuffer, the output is still clamped to the 0.0 to 1.0 range, even though the floating-point color texture holds values above 1.0.
The light intensity values at the end of the tunnel are clearly clamped to 1.0: most of that area is pure white, and everything above 1.0 is lost because we converted the HDR values directly to LDR. We need a better conversion that preserves more detail on the way to LDR. This process is called tone mapping.
Tone Mapping
Tone mapping is the process of converting HDR floating-point color values into the final LDR [0.0, 1.0] color range while losing as little detail as possible.
The simplest tone mapping algorithm is Reinhard tone mapping. We apply Reinhard tone mapping in the earlier version of the fragment shader and also apply gamma correction:
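A sketch of that fragment shader, reusing the names from the pass-through version above:

```glsl
void main()
{
    const float gamma = 2.2;
    vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;

    // Reinhard tone mapping: compresses the whole [0, inf) range into [0, 1).
    vec3 mapped = hdrColor / (hdrColor + vec3(1.0));
    // Gamma correction.
    mapped = pow(mapped, vec3(1.0 / gamma));

    FragColor = vec4(mapped, 1.0);
}
```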
With Reinhard tone mapping the bright areas no longer lose their detail, but because it simply compresses brightness values proportionally, the dark areas lose detail and definition.
Here you can once again see the texture of the wooden planks at the end of the tunnel.
Another interesting aspect of tone mapping is the use of an exposure parameter. As mentioned earlier for HDR photography, different exposures reveal different levels of detail. If our scene has a day/night cycle, it makes sense to use a lower exposure during the day and a higher exposure at night, much like the human eye adapting. An exposure parameter lets us adjust the conversion to suit the lighting conditions at different times of day.
A relatively simple exposure-based tone mapping algorithm looks like this:
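A possible version, again building on the same shader:

```glsl
uniform float exposure; // set from the application, 1.0 by default

void main()
{
    const float gamma = 2.2;
    vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;

    // Exposure tone mapping: higher exposure brings out detail in dark areas,
    // lower exposure preserves detail in bright areas.
    vec3 mapped = vec3(1.0) - exp(-hdrColor * exposure);
    // Gamma correction.
    mapped = pow(mapped, vec3(1.0 / gamma));

    FragColor = vec4(mapped, 1.0);
}
```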
Here we define an exposure uniform with a default value of 1.0 that we can adjust according to how dark or bright the scene is. A high exposure value reveals much more detail in the dark areas, while a low exposure value loses detail in the dark areas but shows more detail in the bright areas. See the comparison below:
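On the application side this just means updating the uniform each frame; a hedged sketch (hdrShader is a placeholder for the shader program object, and exposure a float the application adjusts, for example from keyboard input):

```cpp
glUseProgram(hdrShader);
glUniform1f(glGetUniformLocation(hdrShader, "exposure"), exposure);
```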
The image above shows the benefits of HDR rendering well: by adjusting the exposure we can bring out different details of the scene, which is not possible with LDR rendering. At the end of the tunnel, for example, the wood texture is barely visible at normal exposure but clearly visible at low exposure.
More HDR
The two tone mapping algorithms described above are only a small sample of the many that exist, and each has its own strengths and weaknesses. Some algorithms focus on specific colors or intensity ranges, while others handle low and high exposures together to produce images rich in color and detail. There is also a class of techniques called automatic exposure adjustment, or eye adaptation, which simulates the self-regulation of the human eye: the exposure is adjusted (slowly) based on the brightness of the previous frame, so dark areas gradually become less dark and bright areas become less bright.
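As a very rough sketch of the idea, not of any particular algorithm from this article, the application could nudge the exposure toward a target derived from the previous frame's measured average luminance:

```cpp
// previousFrameAvgLuminance, adaptationSpeed and deltaTime are assumed to be
// provided by the application (e.g. luminance measured by downsampling the
// HDR colorbuffer); <algorithm> is assumed to be included for std::max.
float targetExposure = 0.5f / std::max(previousFrameAvgLuminance, 0.0001f);
exposure += (targetExposure - exposure) * adaptationSpeed * deltaTime;
```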
Going further, HDR rendering makes many interesting image effects possible and more realistic. One of them is bloom, which we will cover next time.
The original article and its code can be found at:
http://learnopengl.com/#!Advanced-Lighting/HDR
In Unity, you can compare the results with the HDR in OpenGL described in this article.