GLSL Tutorial
Normalization issues
Vertex shader
The dot product is commonly used to compute the cosine of the angle between two vectors. As we know, this only holds if both vectors are normalized, and this is why we use the normalization operation in our shaders. Here we will see when we can skip this step, and when we must use it.
When a normal vector arrives at a vertex shader, it is common to normalize it:
normal = normalize(gl_NormalMatrix * gl_Normal);
The multiplication by gl_NormalMatrix transforms the incoming normal to eye space. The normalization guarantees a unit-length vector, as required to compute the cosine with a dot product.
So can we avoid the normalization? Well, in some cases we can. If gl_NormalMatrix is orthogonal, then we know that the length of the incoming vector is preserved, i.e. the length of normal is equal to the length of gl_Normal. Therefore, if the normals from the OpenGL application are normalized, which is common, we can avoid the normalization in the shader.
In practice this means that if we use gluLookAt to set the camera, and then perform only rotations and translations on our models, we can skip the normalization of the normal vector in the shader. It also means that a directional light will have its direction already normalized.
Fragment shader
In the fragment shader we often find ourselves normalizing a vector which was just normalized in the vertex shader. Do we really need to do this? Well, the answer is yes, in most cases we do.
Consider a triangle with three different per-vertex normal vectors. The fragment shader receives an interpolated normal, based on the distance from the fragment to the three vertices. The problem is that the interpolated vector, although it has the right direction, does not have unit length.
The following diagram shows why this is the case. The black lines represent the faces (in 2D), and the normals at the vertices are represented in blue. The green vector represents an interpolated normal at a fragment (represented with a dot). All interpolated normals will lie on the dotted line. As can be seen in the figure, the green vector is shorter than the blue vectors (which are unit length, at least that was my intention).
Note that if the vertex normals were not normalized, not only would the length be different from one, but the direction would also be wrong in the general case. Hence, even if a vector isn't used in the vertex shader, if we need to have it normalized in the fragment shader, we must also normalize it in the vertex shader.
There is, however, a case when normalization can be skipped in the fragment shader, as long as the per-vertex vectors are normalized: when the per-vertex vectors all share the same direction, i.e. they are equal. The interpolation of such vectors yields exactly the same vector as the per-vertex vectors, hence it is already normalized (assuming the per-vertex vectors were normalized).
A simple example is when one considers a directional light. The direction is constant for all fragments, so if the direction is previously normalized, we can skip the normalization step in the fragment shader.