One Monkey a Day (1): Basic Lighting Model and RT Post-Processing
1. First, review the basic lighting model: ambient, diffuse, specular... not much needs to be said here; the review was mostly for my own benefit. ^_^
Create a basic RenderMonkey project, and you can start to build a basic lighting model.
2. Vertex Shader:

bool bViewSpace;
float4x4 matView;
float4x4 matViewProjection;
float3 vecLight;
float fSinTime0_X;
float4 vViewPosition;

struct VS_INPUT
{
   float4 Position : POSITION0;
   float2 TexCoord : TEXCOORD0;
   float3 Normal   : NORMAL0;
   float3 Binormal : BINORMAL0;
   float3 Tangent  : TANGENT0;
};

struct VS_OUTPUT
{
   float4 Position : POSITION0;
   float2 TexCoord : TEXCOORD0;
   float3 Normal   : TEXCOORD1;
   float3 Binormal : TEXCOORD2;
   float3 Tangent  : TEXCOORD3;
   float3 Light    : TEXCOORD4;
   float3 View     : TEXCOORD5;
};

VS_OUTPUT vs_main( VS_INPUT Input )
{
   VS_OUTPUT Output;

   float4x4 matTransform = { 1.0f, 0.0f, 0.0f, 0.0f,
                             0.0f, 1.0f, 0.0f, 0.0f,
                             0.0f, 0.0f, 1.0f, 0.0f,
                             0.0f, 0.0f, 0.0f, 1.0f };
   if ( bViewSpace )
   {
      matTransform = matView;
   } // end if ( bViewSpace )

   Output.Position = mul( Input.Position, matViewProjection );
   Output.View     = Output.Position.xyz - vViewPosition.xyz;
   Output.TexCoord = Input.TexCoord;
   Output.Normal   = normalize( mul( Input.Normal, (float3x3)matTransform ) );
   // Pack the [-1, 1] direction components into [0, 1] for interpolation
   Output.Binormal = ( mul( Input.Binormal, (float3x3)matTransform ) + 1.0f ) / 2.0f;
   Output.Tangent  = ( mul( Input.Tangent,  (float3x3)matTransform ) + 1.0f ) / 2.0f;
   Output.Light    = normalize( vecLight );
   // Uncomment this line to animate the light source position:
   // Output.Light.x += fSinTime0_X * 2;

   return Output;
}
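The packing trick in the vertex shader above maps the binormal/tangent components from [-1, 1] into [0, 1] so the interpolated values stay in a safe range. A minimal Python sketch of that mapping and its inverse (the function names are mine, purely illustrative):

```python
def pack_direction(v):
    """Map each component from [-1, 1] to [0, 1], as (v + 1) / 2."""
    return [(c + 1.0) / 2.0 for c in v]

def unpack_direction(v):
    """Inverse mapping: [0, 1] back to [-1, 1], as v * 2 - 1."""
    return [c * 2.0 - 1.0 for c in v]

tangent = [1.0, 0.0, -1.0]
packed = pack_direction(tangent)
print(packed)                    # [1.0, 0.5, 0.0]
print(unpack_direction(packed))  # [1.0, 0.0, -1.0]
```

A pixel shader that wants the true direction back would apply the `* 2 - 1` unpacking before using it.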
3. Pixel Shader:

sampler2D my2DTexture;

struct PS_INPUT
{
   float2 TexCoord : TEXCOORD0;
   float3 Normal   : TEXCOORD1;
   float3 Binormal : TEXCOORD2;
   float3 Tangent  : TEXCOORD3;
   float3 Light    : TEXCOORD4;
   float3 View     : TEXCOORD5;
};

float4 ps_main( PS_INPUT Input ) : COLOR0
{
   float4 ambient = { 0.00006f, 0.00006f, 0.00006f, 1.0f };
   float4 diffuse = { 0.88f, 0.88f, 0.88f, 1.0f };

   float3 Normal   = normalize( Input.Normal );
   float3 LightDir = normalize( Input.Light );
   float3 ViewDir  = normalize( Input.View );

   float  diff = saturate( dot( Normal, LightDir ) );
   // Reflect the light direction about the normal: R = 2(N.L)N - L
   float3 Reflect  = normalize( 2 * diff * Normal - LightDir );
   float  specular = pow( saturate( dot( Reflect, ViewDir ) ), 8 );

   float4 fvBaseColor    = tex2D( my2DTexture, Input.TexCoord );
   float4 fvTotalAmbient = ambient * fvBaseColor;
   float4 fvTotalDiffuse = diffuse * diff * fvBaseColor;

   return fvTotalAmbient + fvTotalDiffuse + specular;
}
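The lighting math in this pixel shader can be mirrored in plain Python to sanity-check the terms. This is a sketch of the same ambient + Lambert diffuse + Phong specular sum, not the shader itself; the helper names and default constants (taken from the listing above) are mine:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def saturate(x):
    return max(0.0, min(1.0, x))

def phong(normal, light_dir, view_dir, base_color,
          ambient=0.00006, diffuse_k=0.88, shininess=8):
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diff = saturate(dot(n, l))
    # Reflection of the light direction about the normal: R = 2(N.L)N - L
    r = normalize([2 * diff * nc - lc for nc, lc in zip(n, l)])
    spec = saturate(dot(r, v)) ** shininess
    # Per-channel: ambient*base + diffuse*diff*base + specular
    return [ambient * c + diffuse_k * diff * c + spec for c in base_color]

# Light, view, and normal all aligned: maximum diffuse and specular
print(phong([0, 0, 1], [0, 0, 1], [0, 0, 1], [1.0, 1.0, 1.0]))
```

With everything aligned, each channel comes out to 0.00006 + 0.88 + 1.0 = 1.88006, which a real pipeline would clamp to 1.0 on output.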
4. Finally, the result is displayed (RenderMonkey makes this easy):
5. Post-processing in RenderMonkey
In RenderMonkey you can create a RenderTarget in one pass, then sample it in a later screen-space pass for post-processing.
Vertex Shader:
struct VS_OUTPUT
{
   float4 pos      : POSITION0;
   float2 texCoord : TEXCOORD0;
};

VS_OUTPUT vs_main( float4 inPos : POSITION )
{
   VS_OUTPUT o = (VS_OUTPUT)0;

   // Snap the quad vertices to the clip-space corners (+/-1)
   inPos.xy = sign( inPos.xy );
   o.pos = float4( inPos.xy, 0.0f, 1.0f );
   // Map clip space [-1, 1] into texture space [0, 1], flipping y
   o.texCoord = ( float2( o.pos.x, -o.pos.y ) + 1.0f ) / 2.0f;

   return o;
}
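The `sign()` trick plus the [0, 1] remap is worth tracing by hand. A small Python sketch (function name is mine) of what this vertex shader does to one quad vertex:

```python
def quad_vertex(in_xy):
    """Mimic the fullscreen-quad VS: snap to clip-space corners, derive UV."""
    # sign(): any oversized quad vertex collapses to +/-1
    x = float((in_xy[0] > 0) - (in_xy[0] < 0))
    y = float((in_xy[1] > 0) - (in_xy[1] < 0))
    pos = (x, y)
    # Clip space [-1, 1] -> texture space [0, 1], flipping y (D3D convention:
    # clip-space +y is up, texture-space +v is down)
    uv = ((pos[0] + 1.0) / 2.0, (-pos[1] + 1.0) / 2.0)
    return pos, uv

print(quad_vertex((-3.0, 3.0)))  # ((-1.0, 1.0), (0.0, 0.0)) -> top-left
print(quad_vertex((3.0, -3.0)))  # ((1.0, -1.0), (1.0, 1.0)) -> bottom-right
```

So any quad larger than the screen becomes an exact fullscreen quad with matching texture coordinates, which is why the vertex positions fed in do not need to be precise.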
Pixel Shader:

sampler2D Texture0;

float4 ps_main( float2 texCoord : TEXCOORD0 ) : COLOR
{
   float4 color = tex2D( Texture0, texCoord );
   // Convert RGB to an intensity (luma) value
   float intensity = color.r * 0.299 + color.g * 0.587 + color.b * 0.114;
   return float4( intensity, intensity, intensity, color.a );
}
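The three weights are the standard Rec. 601 luma coefficients; they sum to 1, so white stays white and the eye's stronger sensitivity to green is respected. A one-function sketch:

```python
def rgb_to_intensity(r, g, b):
    """Rec. 601 luma: green dominates because the eye is most sensitive to it.
    The weights 0.299 + 0.587 + 0.114 sum to 1, so full white maps to 1."""
    return r * 0.299 + g * 0.587 + b * 0.114

print(rgb_to_intensity(1.0, 1.0, 1.0))  # ~1.0 (white stays white)
print(rgb_to_intensity(0.0, 1.0, 0.0))  # 0.587 (pure green is bright)
```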
The process is very simple: render the scene into a designated RT, then do the grayscale conversion in the pixel shader of the pass that samples that RT.
6. Underwater Effect
The underwater effect is a common way to handle water in games. The principle is very simple: add one new texture as a bump map and use it to perturb the rendered result.
The VS remains unchanged. The pixel shader is as follows:
sampler2D Texture0;   // the scene render target
sampler2D Texture1;   // the noise (bump) texture
float fTime0_1;

float4 ps_main( float2 texCoord : TEXCOORD0 ) : COLOR
{
   // Scroll the noise lookup over time, then offset the scene sample by it
   float2 bump  = tex2D( Texture1, texCoord + fTime0_1 * 30 ).rg;
   float2 texel = texCoord + bump / 60;
   float4 color = tex2D( Texture0, texel );
   return color;
}
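The coordinate arithmetic in this shader can be sketched on the CPU. Here the noise texture is replaced by an assumed sinusoidal function (the article uses an actual texture, so `ripple` and the function names are stand-ins of mine):

```python
import math

def perturbed_uv(uv, noise, time01, scroll=30.0, amplitude=60.0):
    """Mimic the underwater PS: noise(u, v) -> (r, g) stands in for the
    tex2D(Texture1, ...).rg sample; time01 plays the role of fTime0_1."""
    nu = uv[0] + time01 * scroll
    nv = uv[1] + time01 * scroll
    r, g = noise(nu, nv)
    # Max displacement is 1/amplitude of the full texture, so a larger
    # `amplitude` divisor means gentler ripples.
    return (uv[0] + r / amplitude, uv[1] + g / amplitude)

# Toy stand-in for the noise texture: one sine cycle across [0, 1]
ripple = lambda u, v: (math.sin(u * 2 * math.pi), math.cos(v * 2 * math.pi))

print(perturbed_uv((0.5, 0.5), ripple, 0.0))
```

At the screen center with `time01 = 0`, the sine term vanishes and the cosine term shifts v by -1/60, i.e. the sample is pulled 1/60 of a texture width: tiny per-pixel offsets are all it takes to read as ripples.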
Press F5, run, and the magic effect is born.
It looks too easy, but do we know why it works? Not knowing would be a shame.
First, look at the noise texture. The code samples only the R and G channels. Open the texture in Photoshop and inspect the channels panel: the light and dark bands in R and G alternate, and sampled as floats they oscillate around a middle value. Added to the texture coordinate as an offset, they produce the band-by-band flowing distortion...
But there is still the question of how to control the effect over time. Here a RenderMonkey predefined float variable, "Time0_1", is used; see the official documentation (http://developer.amd.com/gpu_assets/RenderMonkey%20Documentation.pdf):
"Time0_1"
This variable provides a scaled floating point time value [0 .. 1] which repeats itself
Based on the "Cycle time" set in the RenderMonkey Preferences dialog.
Default this "Cycle time" is set to 120 seconds. This means that the value of this
Variable cycles from 0 to 1 in 120 seconds and then goes back to 0 again.
So fTime0_1 is a value that loops from 0 to 1. Look again at:
float2 bump = tex2D( Texture1, texCoord + fTime0_1 * 30 );
float2 texel = texCoord + bump / 60;
Try changing the 30 and 60. The multiplier on the noise lookup determines how fast the noise scrolls, i.e. the frequency (speed) of the ripple; the divided offset added to texel, the actual coordinate used to sample the RT, determines the amplitude. A rough estimate: if fTime0_1 is multiplied by m, one full sweep of the noise texture takes 120 / m seconds, and with roughly 4 light/dark alternations across the texture's [0, 1] range the visible ripple frequency is about 1 / ((120 / m) / 4).
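The timing estimate above can be written down directly. A small sketch (function name mine) of how the multiplier on fTime0_1 sets the scroll period, given RenderMonkey's default 120-second cycle:

```python
def scroll_period_seconds(multiplier, cycle_time=120.0):
    """fTime0_1 repeats every `cycle_time` seconds (RenderMonkey default:
    120 s). Multiplying it by `multiplier` makes the noise offset sweep
    `multiplier` full texture widths per cycle, i.e. one full sweep every
    cycle_time / multiplier seconds."""
    return cycle_time / multiplier

print(scroll_period_seconds(30))  # 4.0 seconds per full texture sweep
```

With 4 light/dark alternations packed into the texture, each 4-second sweep passes 4 ripple bands, giving roughly one ripple per second on screen.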
Looking further, the disturbance texture can pull some pixels from the left edge of the screen over to the right edge.
In Xiaoyao Jianke's blog I found the corresponding solution: simply set the addressing mode of the RT sampler to clamp, which fixes the problem.
7. Heat Haze Distortion
I originally wanted to achieve an underwater ripple effect, but could not find a suitable image at the moment and unexpectedly produced a heat-haze effect instead. It reminded me of the air distortion above a helicopter's exhaust that I first saw in Battlefield many years ago; I was so impressed then that I never dared to attempt it, yet the principle turns out to be exactly the same (solid fundamentals really do pay off).
The noise texture is as follows:
A static screenshot cannot show the result clearly, so try it yourself. The result is not smooth enough; the fix, also found on Xiaoyao Jianke's blog (real respect for those who came before), is based on the Poisson distribution, which I will discuss in detail tomorrow.
A very satisfying day. I also made a mosaic effect and several other fun screen-space post-processing tricks, and laid groundwork for deferred rendering. Of course, my goal is practical use: to truly master this, various problems still have to be overcome and various solutions and tricks thought through. But attention cannot be scattered; blog posts everywhere are a jumble, so it is better to focus on one problem each day and think it through carefully.
Keep working hard; the learning never ends!