Gamma correction is a common concept in graphics and imaging, but many myths surround why it exists and how it should be used. This article traces where gamma correction actually comes from and dispels several common misunderstandings.
Where does Gamma Correction come from?
A common claim is that gamma comes from the eye's nonlinear perception of brightness. I once repeated that claim myself. Wikipedia, however, gives the real origin of gamma:
Gamma encoding was developed to compensate for the input/output characteristics of cathode ray tube (CRT) displays. The current of the electron gun, and hence the brightness of the emitted light, varies nonlinearly with the applied voltage. Gamma compression pre-distorts the input signal to cancel this nonlinearity, so the output image has the expected brightness.
Therefore, gamma correction originally had nothing to do with human vision; it was purely a property of CRTs. Newer display technologies, such as LCD and plasma, deliberately adopt the same nonlinear response as the CRTs of that era to remain compatible. (The exact value actually depends on the system: Mac OS X before 10.6 used 1.8, while other systems, including TVs, use 2.2.)
Computing gamma is simple, just a power function:

V_out = V_in^gamma

where gamma is the value used for correction.
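The power curve above can be sketched in a few lines of Python. This is a minimal illustration, not code from the article; the function names and the 2.2 default are my own choices, with encoding using the reciprocal exponent 1/gamma so that decoding with gamma round-trips back to the linear value:

```python
def gamma_encode(v, gamma=2.2):
    """Compress a linear intensity v in [0, 1] for display (gamma encoding)."""
    return v ** (1.0 / gamma)

def gamma_decode(v, gamma=2.2):
    """Recover linear intensity from a gamma-encoded value (gamma decoding)."""
    return v ** gamma

# A mid-gray linear value of 0.5 encodes to roughly 0.73,
# and decoding the encoded value recovers the original 0.5.
linear = 0.5
encoded = gamma_encode(linear)
decoded = gamma_decode(encoded)
```

Note that the encode/decode pair cancels out exactly, which is why an image stored with gamma compression looks correct on a display whose response is the matching power curve.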
For more information, see http://www.klayge.org/?p=921