This series of articles is primarily translated from and based on "Real-time 3D Rendering with DirectX and HLSL" (thanks to the author of the original book), with some personal understanding and elaboration added. If there are errors in the article, please point them out.
The code and resources from the book are available here.
The environment and tools used in this article are all based on the previous articles; if anything is unclear, please refer to the earlier articles in this series first.
This article's index:
About Lights
Diffuse Lighting
 (1) Directional Lights
 (2) Diffuse Lighting Effect Preamble
 (3) Diffuse Lighting Vertex Shader
 (4) Diffuse Lighting Pixel Shader
 (5) Diffuse Lighting Output
Summary
Reference Links
About Lights
In the real world we would see nothing without light: every object we see either reflects light or emits light itself. Computer rendering simulates the interaction of light with objects to add detail to the surfaces of 3D models. But this interplay of light is a very complex process, and with current technology it is impossible to perform such a large amount of computation at interactive frame rates. As a result, approximate algorithms are typically used: lighting models that describe how light interacts with a 3D model well enough to add convincing detail to what we perceive. This article introduces some basic lighting models.
Diffuse Lighting
Different surfaces reflect light in different ways. A mirror surface reflects light at an angle equal to the angle of incidence; a diffuse surface reflects the incident light equally in all directions.
The simplest and most common model for simulating diffuse lighting is Lambert's cosine law. It states that the brightness of a model's surface is directly determined by the cosine of the angle between two vectors: the light vector and the surface normal. The light vector points from the surface toward the light source, and the surface normal defines the orientation of the surface. As shown in the following illustration:
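Lambert's cosine law boils down to a dot product of unit vectors. Here is a minimal Python sketch (purely illustrative, not part of the book's HLSL code; the function name is made up for the example):

```python
import math

def lambert_brightness(normal, to_light):
    """Cosine of the angle between the unit surface normal and the
    unit surface-to-light vector (Lambert's cosine law)."""
    return sum(n * l for n, l in zip(normal, to_light))

normal = (0.0, 0.0, 1.0)

# Light arriving head-on (0 degrees from the normal): full brightness.
head_on = (0.0, 0.0, 1.0)
# Light arriving at 60 degrees from the normal: cos(60) = 0.5 brightness.
angled = (0.0, math.sin(math.radians(60.0)), math.cos(math.radians(60.0)))

print(lambert_brightness(normal, head_on))  # 1.0
print(lambert_brightness(normal, angled))   # ~0.5
```

Note that both vectors must be unit length for the dot product to equal the cosine of the angle between them.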
The cosine of the angle between the two vectors can be computed with the dot product. A surface normal can be computed as the cross product of two edge vectors, although normally the surface normal information is provided directly in the 3D model. Next, let's discuss how to obtain the light vector.
(1) Directional Lights
In 3D graphics there are three common types of light source: directional lights, point lights, and spotlights. A directional light represents a light source that is infinitely far away and therefore has no specific position in your scene. Because of this, its rays reach the surface of every object along parallel paths, all in the same direction. Sunlight is a good example of this type of light source: although the sun is not strictly an infinitely distant light, its distance from the Earth is great enough that its rays are effectively parallel. The following illustration shows a directional light:
To model a directional light, you only need the direction of the light as a three-dimensional vector. You can also include color and intensity information in the light description, just as with the ambient light. The following code shows how to render an effect lit by a single directional light.
Code Snippet Listing 6.2: DiffuseLighting.fx
#include "include\Common.fxh"

/*************** Resources ***************/

cbuffer CBufferPerFrame
{
    float4 AmbientColor : AMBIENT <
        string UIName = "Ambient Light";
        string UIWidget = "Color";
    > = {1.0f, 1.0f, 1.0f, 0.0f};

    float4 LightColor : COLOR <
        string Object = "LightColor0";
        string UIName = "Light Color";
        string UIWidget = "Color";
    > = {1.0f, 1.0f, 1.0f, 1.0f};

    float3 LightDirection : DIRECTION <
        string Object = "DirectionalLight0";
        string UIName = "Light Direction";
        string Space = "World";
    > = {0.0f, 0.0f, -1.0f};
}

cbuffer CBufferPerObject
{
    float4x4 WorldViewProjection : WORLDVIEWPROJECTION < string UIWidget = "None"; >;
    float4x4 World : WORLD < string UIWidget = "None"; >;
}

Texture2D ColorTexture <
    string ResourceName = "Default_color.dds";
    string UIName = "Color Texture";
    string ResourceType = "2D";
>;

SamplerState ColorSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = WRAP;
    AddressV = WRAP;
};

RasterizerState DisableCulling
{
    CullMode = NONE;
};

/*************** Data Structures ***************/

struct VS_INPUT
{
    float4 ObjectPosition : POSITION;
    float2 TextureCoordinate : TEXCOORD;
    float3 Normal : NORMAL;
};

struct VS_OUTPUT
{
    float4 Position : SV_Position;
    float3 Normal : NORMAL;
    float2 TextureCoordinate : TEXCOORD0;
    float3 LightDirection : TEXCOORD1;
};

/*************** Vertex Shader ***************/

VS_OUTPUT vertex_shader(VS_INPUT IN)
{
    VS_OUTPUT OUT = (VS_OUTPUT)0;

    OUT.Position = mul(IN.ObjectPosition, WorldViewProjection);
    OUT.TextureCoordinate = get_corrected_texture_coordinate(IN.TextureCoordinate);
    OUT.Normal = normalize(mul(float4(IN.Normal, 0), World).xyz);
    OUT.LightDirection = normalize(-LightDirection);

    return OUT;
}

/*************** Pixel Shader ***************/

float4 pixel_shader(VS_OUTPUT IN) : SV_Target
{
    float4 OUT = (float4)0;

    float3 normal = normalize(IN.Normal);
    float3 lightDirection = normalize(IN.LightDirection);
    float n_dot_l = dot(lightDirection, normal);

    float4 color = ColorTexture.Sample(ColorSampler, IN.TextureCoordinate);
    float3 ambient = AmbientColor.rgb * AmbientColor.a * color.rgb;

    float3 diffuse = (float3)0;
    if (n_dot_l > 0)
    {
        diffuse = LightColor.rgb * LightColor.a * n_dot_l * color.rgb;
    }

    OUT.rgb = ambient + diffuse;
    OUT.a = color.a;

    return OUT;
}

/*************** Techniques ***************/

technique10 Main10
{
    pass P0
    {
        SetVertexShader(CompileShader(vs_4_0, vertex_shader()));
        SetGeometryShader(NULL);
        SetPixelShader(CompileShader(ps_4_0, pixel_shader()));
        SetRasterizerState(DisableCulling);
    }
}
(2) Diffuse Lighting Effect Preamble
The first line of the DiffuseLighting.fx file includes a common effect utility file, in the style of a C header. You will need to create a new include folder in your project and add this file to it. The following code snippet shows the contents of this file. Note that the include directive uses double quotation marks, and that the FLIP_TEXTURE_Y macro definition and the get_corrected_texture_coordinate() function have been moved into this file.
Code Snippet Listing 6.3 Common.fxh
#ifndef _COMMON_FXH
#define _COMMON_FXH

/************* Constants *************/

#define FLIP_TEXTURE_Y 1

/************* Utility Functions *************/

float2 get_corrected_texture_coordinate(float2 textureCoordinate)
{
    #if FLIP_TEXTURE_Y
        return float2(textureCoordinate.x, 1.0 - textureCoordinate.y);
    #else
        return textureCoordinate;
    #endif
}

float3 get_vector_color_contribution(float4 light, float3 color)
{
    // Color (.rgb) * Intensity (.a)
    return light.rgb * light.a * color;
}

float3 get_scalar_color_contribution(float4 light, float color)
{
    // Color (.rgb) * Intensity (.a)
    return light.rgb * light.a * color;
}

#endif /* _COMMON_FXH */
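For illustration only, the two kinds of helpers in Common.fxh can be sketched in Python (not HLSL; the Python names are made up for this example):

```python
def get_corrected_texture_coordinate(uv, flip_texture_y=True):
    """Invert the V coordinate, matching the FLIP_TEXTURE_Y path in
    Common.fxh, for assets whose UV origin differs from Direct3D's."""
    u, v = uv
    return (u, 1.0 - v) if flip_texture_y else (u, v)

def get_vector_color_contribution(light_rgba, color_rgb):
    """Color (.rgb) * Intensity (.a), matching the HLSL helper."""
    r, g, b, a = light_rgba
    return tuple(sc * lc * a for sc, lc in zip(color_rgb, (r, g, b)))

print(get_corrected_texture_coordinate((0.25, 0.1)))  # (0.25, 0.9)
# White surface lit by a half-intensity orange light:
print(get_vector_color_contribution((1.0, 0.5, 0.0, 0.5), (1.0, 1.0, 1.0)))  # (0.5, 0.25, 0.0)
```

Packing intensity into the alpha channel of the light color, as these helpers assume, keeps each light down to a single float4 constant.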
CBufferPerFrame also gains new members: LightColor and LightDirection. LightColor serves the same purpose as AmbientColor, representing the color and intensity of the directional light. LightDirection holds the direction of the light source in world space. Both new members carry an Object annotation. These annotations indicate that the members can be associated with a scene object; that is, you can find the corresponding property in FX Composer's Properties panel by its annotation string and adjust it there. This adjustment is described later in this article, in section (5) Diffuse Lighting Output.
CBufferPerObject also gains a new member, World. This member is closely related to the new Normal member of the VS_INPUT structure. Like the vertex data, surface normals are initially expressed in the model's local coordinate system. But when the normal vector and the light vector are combined to compute the lighting of a pixel, the light vector is expressed in world space, so the normal must also be transformed into world space; World is the matrix that performs this transformation. The WorldViewProjection matrix cannot be used for this, because that matrix transforms vectors into homogeneous clip space. Note also that the World matrix may scale the vectors it transforms, and the normals we need are unit vectors, so the normal must be renormalized after being transformed by this matrix.
(3) Diffuse Lighting Vertex Shader
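The world-space normal transformation and the renormalization it requires can be sketched numerically (a Python illustration, not part of the book's code; the function and variable names are made up for the example):

```python
import math

def transform_direction(m3, v):
    """Multiply a direction (row vector) by a 3x3 row-major matrix,
    mirroring HLSL's mul(float4(normal, 0), World).xyz."""
    return tuple(sum(v[i] * m3[i][j] for i in range(3)) for j in range(3))

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

# A world matrix that uniformly scales by 2: the direction survives,
# but the transformed normal is no longer unit length.
world = [[2.0, 0.0, 0.0],
         [0.0, 2.0, 0.0],
         [0.0, 0.0, 2.0]]

normal = (0.0, 0.0, 1.0)
n_world = transform_direction(world, normal)
print(n_world)             # (0.0, 0.0, 2.0) -- length 2, must be renormalized
print(normalize(n_world))  # (0.0, 0.0, 1.0)
```

One caveat beyond the scope of this article: if the world matrix contained non-uniform scaling, renormalizing alone would not be enough; in that case normals are generally transformed by the inverse transpose of the world matrix.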
Two new members were added to the VS_OUTPUT structure: Normal and LightDirection. Normal passes the transformed surface normal along to the pixel shader. LightDirection exists because what the pixel shader needs is the direction from the surface to the light source, whereas the shader constant gives the light-to-surface direction; it is therefore negated in the vertex shader. (You could of course perform this conversion once on the CPU side instead, which would be preferable.)
About the output of the vertex shader
Careful readers will have noticed that in the vertex shader's output structure, the LightDirection member is associated with the semantic TEXCOORD1. Why is a float3 holding a direction associated with a semantic such as TEXCOORD? And why is the number after this semantic 1; could it be 2, 3, 4 or even 5?
What follows is the author's own understanding. The members defined in the struct do not each need a specific semantic that corresponds exactly to them; when no suitable one exists, TEXCOORD can be chosen, because it carries a float4, so a float3 can certainly be passed through it as well. As for the second question, TEXCOORD is defined in the official SDK as TEXCOORD[n], so the trailing number is essentially arbitrary; different numbers simply bind different members, keeping the data separate as it is passed along.
See the material in reference link [1] for more on HLSL semantics.
(4) Diffuse Lighting Pixel Shader
This section describes what has been added to the pixel shader to support diffuse lighting.
Code Snippet Listing 6.4: the pixel shader in the DiffuseLighting.fx file
float4 pixel_shader(VS_OUTPUT IN) : SV_Target
{
    float4 OUT = (float4)0;

    float3 normal = normalize(IN.Normal);
    float3 lightDirection = normalize(IN.LightDirection);
    float n_dot_l = dot(lightDirection, normal);

    float4 color = ColorTexture.Sample(ColorSampler, IN.TextureCoordinate);
    float3 ambient = AmbientColor.rgb * AmbientColor.a * color.rgb;

    float3 diffuse = (float3)0;
    if (n_dot_l > 0)
    {
        diffuse = LightColor.rgb * LightColor.a * n_dot_l * color.rgb;
    }

    OUT.rgb = ambient + diffuse;
    OUT.a = color.a;

    return OUT;
}
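As a numeric sanity check of the listing above (a Python sketch, not the book's HLSL; the Python names are made up for the example), the diffuse term behaves as follows:

```python
def diffuse_term(normal, to_light, light_rgb, intensity, surface_rgb):
    """Lambert diffuse term, mirroring the pixel shader's n_dot_l logic.

    `to_light` must point from the surface toward the light source,
    which is why the vertex shader negates LightDirection."""
    n_dot_l = sum(n * l for n, l in zip(normal, to_light))
    # The 'if (n_dot_l > 0)' guard: light behind the surface adds nothing.
    n_dot_l = max(0.0, n_dot_l)
    return tuple(intensity * n_dot_l * lc * sc
                 for lc, sc in zip(light_rgb, surface_rgb))

n = (0.0, 0.0, 1.0)  # surface normal
white = (1.0, 1.0, 1.0)
print(diffuse_term(n, (0.0, 0.0, 1.0), white, 1.0, white))   # head-on: (1.0, 1.0, 1.0)
print(diffuse_term(n, (1.0, 0.0, 0.0), white, 1.0, white))   # parallel to surface: (0.0, 0.0, 0.0)
print(diffuse_term(n, (0.0, 0.0, -1.0), white, 1.0, white))  # behind surface: (0.0, 0.0, 0.0)
```

The three cases correspond to n_dot_l values of 1, 0, and -1, which the following paragraphs discuss.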
First, the computation of the ambient term and the final output pixel value OUT are separated, to show more clearly that the diffuse contribution is also added into the final color.
Second, the incoming Normal and LightDirection vectors are renormalized, because the values interpolated during rasterization may no longer be unit length. Mistakes here are easy to overlook because they produce only a subtle visual difference, so keep this step in mind when the rendered result looks wrong and you cannot find the cause.
Finally, we take the dot product of the light vector and the surface normal, and use the result, n_dot_l, to compute the final diffuse color. Note the if statement guarding the diffuse calculation: when n_dot_l is less than zero, the light is behind the surface, the surface cannot receive any light, and the diffuse color of those pixels should be pure black. When n_dot_l equals 0, the light is perpendicular to the surface normal, i.e. completely parallel to the surface, so the surface receives no diffuse light. Conversely, when n_dot_l equals 1, the light is aligned with the normal (perpendicular to the surface) and the surface receives all of the incoming light. The final output color combines the ambient and diffuse results. Note that the default ambient light alpha (intensity) in the code is 0, so the ambient light contributes nothing to the model, which is why the unlit side of the model appears completely black.
(5) Diffuse Lighting Output
The pixel color output by the pixel shader is the sum of the ambient and diffuse light, and the pixel's transparency is taken from the texture's alpha channel. The following figure shows the result of applying the previously used Earth texture to the model and setting the ambient light intensity to 0:
Note that in this picture a directional light has been added underneath the model. NVIDIA FX Composer can display created ambient, point, spot, and directional lights in the Render panel. To add a light, select it from the main toolbar or from the Create menu. For the added directional light to drive your shader, you must bind the light to the LightColor and LightDirection constants; this is where the Object annotations come in handy. To bind the light, first select the Earth sphere in the Render panel, then open the Material Instance Properties view in the Properties panel (the fifth icon on that panel, see below). Then select the directional light you just created for the DirectionalLight0 and LightColor0 entries in the panel. Now that the directional light is bound to the shader, your manipulations of the light are reflected in the lighting of the sphere: rotate the directional light and you will see the shaded region of the model change. Note, however, that moving the directional light's position has no effect on the model's lighting, since a directional light has no concept of position. You can also change the light's color and intensity in the Properties panel by selecting it.
Warning
NVIDIA FX Composer supports both manual and automatic binding. When auto-binding is enabled, the tool tries to find the most appropriate light in your project to bind to a shader constant. But it does not always succeed, so you should verify that the constants are bound correctly.
Also, when you rebuild the shader, manually bound constants are lost and must be re-bound.
Summary
This article described the implementation of one of the simplest diffuse lighting models. There are other, more detailed diffuse models: for example, the relevant chapters of the "dragon book" give a Lambert lighting model with more parameters. The diffuse model is the foundation; the specular model in the next article adds a specular highlight computation on top of the diffuse reflection. For more lighting models, refer to the material in reference link [2].
Reference Links
[1] Semantics in HLSL. (http://blog.csdn.net/pizi0475/article/details/6264388)
[2] Simple lighting models. (http://www.cnblogs.com/mavaL/archive/2010/11/01/1866451.html)