Game Development BASICS (18)

Source: Internet
Author: User
Tags: pixel shaders

Chapter 2
A pixel shader is a program that runs on the GPU during the rasterization of each pixel. (Unlike the vertex shader, Direct3D does not emulate the pixel shader in software.) In essence, the pixel shader replaces the multitexturing stage of the fixed-function pipeline and provides the ability to operate on individual pixels directly and to access each pixel's texture coordinates. This direct access to pixels and texture coordinates enables a variety of special effects, such as multitexturing, depth of field, cloud simulation, fire simulation, and sophisticated shadowing techniques.

You can test whether the graphics card supports a particular pixel shader version by checking the PixelShaderVersion member of the D3DCAPS9 structure and comparing it against the D3DPS_VERSION macro.
Example:
// If the device's supported version is less than version 2.0
if( caps.PixelShaderVersion < D3DPS_VERSION(2, 0) )
    // Then pixel shader version 2.0 is not supported on this device.

Multitexturing is perhaps the simplest technique that can be implemented with a pixel shader. Since the pixel shader replaces the multitexturing stage of the fixed-function pipeline, it is worth having a basic understanding of that stage and what it does.

Multitexturing used to be a fairly involved process that was considered an advanced topic. Moreover, since fixed-function multitexturing has been replaced by the newer and far more powerful pixel shaders, there is no reason to spend any effort discussing the fixed-function version.

The idea of multitexturing is closely related to blending. In Chapter 7 we learned how to blend the pixels being rasterized with the pixels previously written to the back buffer to achieve a specific effect. The same idea carries over to multitexturing: several texture layers are enabled at once, and the way these textures are blended together is defined to achieve a specific effect.
A typical application of multitexturing is lighting. Instead of using Direct3D's lighting model in the vertex-processing stage, special texture maps called light maps specify how a surface is lit. For example, to shine a spotlight on a large crate, you could define the spotlight as a D3DLIGHT9 structure, or you could blend the texture map representing the crate with a texture map representing the spotlight.

As with the blending techniques introduced in Chapter 7, the final image depends on how the textures are blended. In the fixed-function multitexturing stage, the blending equations are controlled through texture render states. With a pixel shader, the blending function can be written programmatically as a simple expression in code, which lets you blend the textures in any way you want.

Blending texture maps to light the crate has the following advantages over Direct3D's lighting model:
# The lighting is calculated in advance and stored in the spotlight light map, so it does not need to be computed at run time, which saves processing time. Of course, lighting can only be precalculated for static objects and fixed light sources.
# Because the light map is precomputed, a far more accurate and sophisticated lighting model than Direct3D's can be used (and better lighting results make the scene more realistic).

Note: The multitexturing stage is typically used to implement a complete lighting engine for static objects. For example, one texture map might store an object's color (such as a crate texture map), a diffuse light map might store the surface's diffuse brightness, a specular light map might store the surface's specular brightness, a fog map might store the amount of fog covering the surface, and a detail map might store small, high-frequency surface details. When these textures are combined, lighting, color, and detail are added to the scene efficiently simply by looking up the precalculated texture maps.

Note: The spotlight light map is a rather basic kind of light map. In general, when a scene and light source are specified, special programs are used to generate the light maps.

Enabling multiple textures
A texture is enabled with the IDirect3DDevice9::SetTexture method, and its sampler states are set with the IDirect3DDevice9::SetSamplerState method.
Example:

HRESULT IDirect3DDevice9::SetTexture(DWORD Stage, IDirect3DBaseTexture9* pTexture);

HRESULT IDirect3DDevice9::SetSamplerState(DWORD Sampler, D3DSAMPLERSTATETYPE Type, DWORD Value);

Note: A particular sampler stage index i is associated with the ith texture stage; that is, sampler stage i specifies the sampler states for the ith set texture.

The texture/sampler stage index identifies the texture/sampler stage whose texture or sampler states you want to set. In this way, multiple textures can be enabled at once, each with its own sampler states set through a different stage index. Previously we always specified stage 0 (the first stage), because only one texture layer was needed at a time. If three texture layers are required, stages 0, 1, and 2 identify the respective layers:
// Set first texture and corresponding sampler states.
Device->SetTexture(0, Tex1);
Device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

// Set second texture and corresponding sampler states.
Device->SetTexture(1, Tex2);
Device->SetSamplerState(1, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(1, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(1, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

// Set third texture and corresponding sampler states.
Device->SetTexture(2, Tex3);
Device->SetSamplerState(2, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(2, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(2, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

Multiple texture coordinates

Recall that for each 3D triangle, a corresponding triangle must be defined on the texture map to identify the texture data mapped onto the 3D triangle. Previously this was done by adding texture coordinates to each vertex, so that every three vertices defining a triangle also defined a corresponding triangle on the texture.

Now that multiple textures are used, for every three vertices defining a triangle we need to define a corresponding triangle on each enabled texture. This is done by adding several sets of texture coordinates to each vertex, one set for each enabled texture layer. For instance, if three texture layers are being blended (and hence enabled), each vertex must have three sets of texture coordinates, one per layer. Such a multitexture vertex structure with three texture layers can be defined as:

struct MultiTexVertex
{
    MultiTexVertex(float x, float y, float z,
                   float u0, float v0,
                   float u1, float v1,
                   float u2, float v2)
    {
        _x = x;   _y = y;   _z = z;
        _u0 = u0; _v0 = v0;
        _u1 = u1; _v1 = v1;
        _u2 = u2; _v2 = v2;
    }

    float _x, _y, _z;
    float _u0, _v0;
    float _u1, _v1;
    float _u2, _v2;

    static const DWORD FVF;
};

const DWORD MultiTexVertex::FVF = D3DFVF_XYZ | D3DFVF_TEX3;
# D3DFVF_TEX3 indicates that the vertex structure contains three sets of texture coordinates. The fixed-function pipeline supports up to eight sets of texture coordinates; to use more than eight, you must use a vertex declaration and the programmable vertex pipeline.

Note: In newer pixel shader versions, one set of texture coordinates can be used to index into multiple textures, making multiple coordinate sets unnecessary, provided, of course, that the same texture coordinates are used for every texture layer. If the coordinates differ per layer, multiple sets are still required.

Input and output of the pixel shader

The input to a pixel shader consists of per-pixel colors and texture coordinates.

Note that the vertex colors are interpolated across the face of the primitive. The per-pixel texture coordinates are simply the (u, v) coordinates that specify the texel to be mapped to the current pixel. Before the pixel shader runs, Direct3D computes the colors and texture coordinates for each pixel from the vertex colors and vertex texture coordinates. The number of colors and texture-coordinate sets input to the pixel shader is determined by the number of colors and texture-coordinate sets output by the vertex shader. For example, if a vertex shader outputs two color values and three sets of texture coordinates, Direct3D computes two colors and three texture-coordinate sets per pixel and inputs them into the pixel shader. The input colors and texture coordinates are mapped to variables in the pixel shader program using the semantic syntax:
struct PS_INPUT
{
    vector c0 : COLOR0;
    vector c1 : COLOR1;
    float2 t0 : TEXCOORD0;
    float2 t1 : TEXCOORD1;
    float2 t2 : TEXCOORD2;
};

For output, a pixel shader outputs a single computed color value for each pixel:
struct PS_OUTPUT
{
    vector finalPixelColor : COLOR0;
};
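As a minimal illustration of how these structures are wired together (the entry-point name Main is an arbitrary choice), a trivial pixel shader could simply pass the first interpolated color through:

```hlsl
PS_OUTPUT Main(PS_INPUT input)
{
    PS_OUTPUT output = (PS_OUTPUT)0;

    // Output the first interpolated vertex color unchanged.
    output.finalPixelColor = input.c0;

    return output;
}
```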

Steps for using a pixel shader:
1. Write and compile the pixel shader.
2. Create an IDirect3DPixelShader9 interface object from the compiled shader code to represent the pixel shader.
3. Enable the pixel shader with the IDirect3DDevice9::SetPixelShader method.

The pixel shader must be destroyed when you are done with it.

Writing and compiling a pixel shader
A pixel shader is compiled the same way as a vertex shader: first write the pixel shader program in HLSL, then compile it with the D3DXCompileShaderFromFile function.

Note: To use a pixel shader, the compilation target must be changed to a pixel shader target (e.g., ps_2_0) instead of a vertex shader target (e.g., vs_2_0). The compilation target is specified by a parameter of the D3DXCompileShaderFromFile function.

Creating a pixel shader
Once the shader code has been compiled, you can obtain a pointer to an IDirect3DPixelShader9 interface, which represents a pixel shader, with the following method:
HRESULT IDirect3DDevice9::CreatePixelShader(
    const DWORD* pFunction,
    IDirect3DPixelShader9** ppShader);

# pFunction: pointer to the compiled shader code
# ppShader: returns a pointer to the IDirect3DPixelShader9 interface
For example, suppose the variable shader is an ID3DXBuffer object containing the compiled shader code. A pointer to the IDirect3DPixelShader9 interface is then obtained like so:
IDirect3DPixelShader9* MultiTexPS = 0;
hr = Device->CreatePixelShader(
        (DWORD*)shader->GetBufferPointer(),
        &MultiTexPS);

Setting a pixel shader
After obtaining a pointer to the IDirect3DPixelShader9 interface representing the pixel shader, enable it with:
HRESULT IDirect3DDevice9::SetPixelShader(IDirect3DPixelShader9* pShader);
The method takes a single parameter: the pointer to the pixel shader you want to set.
Example:
Device->SetPixelShader(MultiTexPS);

Destroying a pixel shader
The IDirect3DPixelShader9 interface must have its Release method called when you are done with it, to free the resources it occupies:
d3d::Release<IDirect3DPixelShader9*>(MultiTexPS);

HLSL sampler objects
To sample a texture in a pixel shader, you use the tex*-related HLSL intrinsic functions.
Note: Sampling means fetching the texel that corresponds to a given pixel, based on the pixel's texture coordinates and the sampler states (texture filter states).

In general, these functions require two things:
# the (u, v) texture coordinates used for the lookup;
# the particular texture to look up.
The (u, v) texture coordinates are, of course, inputs to the pixel shader. The particular texture to look up is identified in the pixel shader by a special HLSL object called a sampler. A sampler object can be thought of as an object identifying a texture and sampler stage.
For example, suppose three texture layers are used; then each layer must be referenced in the pixel shader. In the pixel shader program you would write:
sampler FirstTex;
sampler SecondTex;
sampler ThirdTex;
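Putting the sampler declarations together with the input and output structures, a complete multitexturing pixel shader might look like the following sketch. The sampler and variable names, and the choice of a modulate blend, are illustrative rather than prescribed:

```hlsl
sampler FirstTex;
sampler SecondTex;
sampler ThirdTex;

struct PS_INPUT
{
    float2 t0 : TEXCOORD0;
    float2 t1 : TEXCOORD1;
    float2 t2 : TEXCOORD2;
};

struct PS_OUTPUT
{
    vector finalPixelColor : COLOR0;
};

PS_OUTPUT Main(PS_INPUT input)
{
    PS_OUTPUT output = (PS_OUTPUT)0;

    // Sample each enabled texture layer.
    vector a = tex2D(FirstTex,  input.t0);
    vector b = tex2D(SecondTex, input.t1);
    vector c = tex2D(ThirdTex,  input.t2);

    // Modulate (multiply) the layers together to blend them.
    output.finalPixelColor = a * b * c;

    return output;
}
```

Because the blend is just an expression in code, swapping the multiplications for additions or any other formula changes the effect without touching any render states.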

Direct3D associates each sampler object with a unique texture stage. In the application, all you have to do is find out which texture stage a sampler object is associated with, and then set the appropriate texture and sampler states for that stage. The following code demonstrates how to set the texture and sampler states for FirstTex.
// Create the texture.
IDirect3DTexture9* Tex;
D3DXCreateTextureFromFile(Device, "tex.bmp", &Tex);
...
// Get a handle to the constant.
FirstTexHandle = MultiTexCT->GetConstantByName(0, "FirstTex");

// Get a description of the constant.
D3DXCONSTANT_DESC FirstTexDesc;
UINT count;
MultiTexCT->GetConstantDesc(FirstTexHandle, &FirstTexDesc, &count);
...
// Set texture/sampler states for the sampler FirstTex. We identify
// the stage FirstTex is associated with from the
// D3DXCONSTANT_DESC::RegisterIndex member.
Device->SetTexture(FirstTexDesc.RegisterIndex, Tex);
Device->SetSamplerState(FirstTexDesc.RegisterIndex, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(FirstTexDesc.RegisterIndex, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
Device->SetSamplerState(FirstTexDesc.RegisterIndex, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

Note: Instead of the generic sampler type, you can also use the more specific and strongly typed sampler1D, sampler2D, sampler3D, and samplerCUBE types. These types offer stricter type checking and can only be used with the corresponding tex* functions. For example, a sampler2D object can only be used with tex2D* functions, and a sampler3D object can only be used with tex3D* functions.
(End)
