What is a Shader
A shader is a program that runs on the GPU and operates on 3D geometry. Shaders are not a unified standard: each graphics API has its own shading language. OpenGL uses GLSL, NVIDIA developed Cg, and Microsoft's Direct3D uses the High-Level Shading Language (HLSL). Unity's shader system is a code-generation framework: it embeds traditional shader code (written in Cg/HLSL) inside its own descriptive structure, then automatically generates the appropriate low-level shaders for each hardware platform, which is how it achieves cross-platform support.
Unity shaders are not really difficult. Beginners are often confused because there are so many fixed commands and structures, and understanding what those commands do requires some familiarity with 3D rendering.
Shader Types
Both OpenGL and Direct3D provide three classes of shaders:
- Vertex shader: runs once per vertex. It projects the vertex's spatial position onto the screen, i.e. computes the vertex's two-dimensional coordinates, and also computes the value used for the depth buffer (z-buffer). A vertex shader can modify properties such as position, color, and texture coordinates, but cannot create new vertices. Its output is passed to the next stage of the pipeline: if a geometry shader is defined, it processes the vertex shader's output; otherwise the rasterizer continues the pipeline.
- Pixel shader (Direct3D), often called a fragment shader (OpenGL): processes the data coming from the rasterizer. The rasterizer fills each polygon and feeds it through the pipeline to the pixel shader, which computes a color per pixel. Pixel shaders are commonly used for scene lighting and related effects such as bump mapping and color grading. The name fragment shader is arguably more accurate, because shader invocations and pixels on screen do not correspond one to one: for a single pixel the fragment shader may be invoked several times, and only after the depth buffer sorts the objects front to back is it decided which occluded fragments are discarded and what the final color is.
- Geometry shader: can add and remove vertices in a polygon mesh. It can procedurally generate geometry that would be too expensive for the CPU and add detail to models. Direct3D 10 added an API for geometry shaders as part of Shader Model 4.0; OpenGL originally exposed geometry shaders only through an extension.
Unity shaders come in two flavors: surface shaders (Surface Shader) and vertex/fragment shaders (Vertex and Fragment Shader). The surface shader is a concept introduced by Unity. Writing shaders that interact with lighting is complex: there are many types of light sources, different shadow options, and different render paths (forward and deferred rendering), and the surface shader abstracts this complexity away. Unity recommends surface shaders for anything lighting-related. Vertex and fragment shaders are the same concepts as in OpenGL and Direct3D. They are freer and more powerful than surface shaders, but lighting must then be handled by hand. Unity also allows geometry shaders, though they are rarely used.
Shader Program Structure
Shader syntax:
Shader "name" {[Properties] subshaders [Fallback] [Customeditor]}
// Properties syntax
Properties { Property [Property ...] }
// Subshader syntax
SubShader { [Tags] [CommonState] Passdef [Passdef ...] }
// Pass syntax
Pass { [Name and Tags] [RenderSetup] }
// Fallback syntax
Fallback "name"
Properties: define the inputs of the shader, which can be set while editing the material.
SubShader: a shader can contain more than one SubShader. They are independent of each other, and only one will run on the target platform; multiple SubShaders exist to solve compatibility issues, and Unity itself picks the SubShader compatible with the end platform.
Fallback: if no SubShader works on the target platform, the alternate shader named by Fallback is used instead.
Pass: a pass is one draw. A surface shader compiles to exactly one pass, so it has no Pass section. Vertex/fragment shaders can have multiple passes; multi-pass rendering enables effects such as still showing a character's outline when the character is occluded by the environment.
CG code: each pass can contain custom CG code, starting at CGPROGRAM and ending at ENDCG.
Example of a basic surface shader:
Shader "Custom/newshader" {
Properties {
_maintex ("Base (RGB)", 2D) = "White" {}
}
subshader {
Tags {"Rendertype" = "Opaque"}
LOD
cgprogram
#pragma surface surf Lambert
sampler2d _maintex;
struct Input {
float2 uv_maintex;
};
void Surf (Input in, InOut surfaceoutput o) {
Half4 c = tex2d (_maintex, In.uv_maintex);
O.albedo = C.rgb;
O.alpha = C.A;
}
ENDCG
}
FallBack "diffuse"
}
Example of a basic vertex fragment shader:
Shader "Vertexinputsimple" {
Subshader {
Pass {
cgprogram
#pragma vertex vert
#pragma fragment Frag
#include "unitycg.cginc"
struct v2f {
float4 pos:sv_position;
Fixed4 color:color;
};
v2f Vert (Appdata_base v)
{
v2f o;
O.pos = Mul (UNITY_MATRIX_MVP, V.vertex);
O.COLOR.XYZ = v.normal * 0.5 + 0.5;
O.COLOR.W = 1.0;
return o;
}
Fixed4 Frag (v2f i): sv_target {return i.color;}
ENDCG
}}}
Shader Input
A shader's inputs come from two sources: variables defined in the Properties block, and global values set through the Shader.SetGlobalXxx family of methods.
Property-defined variables: variables in the Properties block are the primary way to parameterize a shader. They vary per material; every material using the shader can set them in the Inspector or from a script. Besides being defined in the Properties section, these parameters must also be declared in the CG code before they can be used. For example, in the surface shader example above we define _MainTex with type 2D, and we must also declare sampler2D _MainTex; in the CG code.
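A minimal sketch of this pairing (the property name _Color is made up for illustration):
    Shader "Custom/ColorExample" {
        Properties {
            _Color ("Main Color", Color) = (1,1,1,1)
        }
        SubShader {
            CGPROGRAM
            #pragma surface surf Lambert
            // the CG declaration must repeat the property name with a matching type
            fixed4 _Color;
            struct Input {
                float4 color : COLOR;
            };
            void surf (Input IN, inout SurfaceOutput o) {
                o.Albedo = _Color.rgb;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }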
Global variables: the Shader class has a set of SetGlobalXxx methods (e.g. Shader.SetGlobalColor, Shader.SetGlobalFloat) that can set uniform variables declared in CG without defining them in the Properties block. The setting is global: every shader that declares that uniform is affected. For example, to make the scene change color over time, declare a uniform global color in the shaders the scene uses and update it from a script; the same trick works for darkening the scene when a character releases a skill.
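A minimal sketch of the shader side, assuming a made-up global named _GlobalTint that a script would set with Shader.SetGlobalColor("_GlobalTint", someColor):
    Shader "Custom/GlobalTintExample" {
        SubShader {
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"
                // not in Properties: set globally from script via Shader.SetGlobalColor
                uniform fixed4 _GlobalTint;
                struct v2f { float4 pos : SV_POSITION; };
                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    return o;
                }
                fixed4 frag (v2f i) : SV_Target {
                    return _GlobalTint; // every material using this shader picks up the global value
                }
                ENDCG
            }
        }
    }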
The property types allowed in a Unity shader are:
| Keyword | Type | Corresponding CG type | Example |
| --- | --- | --- | --- |
| Float | Floating-point number | float | _MyFloat ("My float", Float) = 0.5 |
| Range | Floating-point number (within a given range) | float | _MyRange ("My range", Range (0.01, 0.5)) = 0.1 |
| Color | Floating-point quadruple | float4 | _MyColor ("Some color", Color) = (1,1,1,1) |
| Vector | Floating-point quadruple | float4 | _MyVector ("Some vector", Vector) = (1,1,1,1) |
| 2D | Texture with power-of-two size | sampler2D | _MyTexture ("Texture", 2D) = "white" {} |
| Rect | Texture with non-power-of-two size | sampler2D | _MyRect ("My rect", Rect) = "white" {} |
| CUBE | Cube map | samplerCUBE | _MyCubemap ("Cubemap", CUBE) = "" {} |
Note: a cube map is built from six linked 2D textures and is used mainly for reflections (such as skyboxes and dynamic reflections).
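As a hedged sketch of using one (the property name _Cube is illustrative), a cube map is sampled with a direction vector rather than UV coordinates; here the world-space reflection vector provided by the surface shader Input structure described later:
    Shader "Custom/CubeReflectExample" {
        Properties {
            _Cube ("Reflection Cubemap", CUBE) = "" {}
        }
        SubShader {
            Tags { "RenderType" = "Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert
            samplerCUBE _Cube; // CUBE properties map to samplerCUBE in CG
            struct Input {
                float3 worldRefl; // world-space reflection vector
            };
            void surf (Input IN, inout SurfaceOutput o) {
                // texCUBE samples the cube map along a direction
                o.Emission = texCUBE (_Cube, IN.worldRefl).rgb;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }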
SubShader
Besides its passes, a SubShader has two elements worth attention: LOD and Tags.
LOD
LOD is short for level of detail; here it means the shader's level of detail, because Unity also has a model LOD concept, which is a different thing. Only shader LOD is covered here; model LOD is not.
Shader LOD lets us set a value that decides which SubShader can be used. The maximum allowed LOD can be set with Shader.maximumLOD or Shader.globalMaximumLOD; a SubShader is unavailable when its declared LOD is greater than the current maximum. With LOD we can write a series of SubShaders for one material: the larger the LOD, the better the rendering effect and, of course, the higher the hardware requirements. Setting the global maximum LOD according to the end device's hardware then yields the best balance between quality and performance.
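A sketch of the idea with made-up LOD values: two SubShaders, the more expensive one first. With Shader.globalMaximumLOD = 150, the first SubShader (LOD 200) would be skipped and the cheaper one used:
    Shader "Custom/LodExample" {
        SubShader {
            LOD 200 // used only when the allowed maximum LOD is at least 200
            Pass {
                // ... expensive rendering, e.g. per-pixel lighting ...
            }
        }
        SubShader {
            LOD 100 // cheap fallback for low-end quality settings
            Pass {
                // ... simple rendering, e.g. vertex lighting ...
            }
        }
        FallBack "Diffuse"
    }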
Unity's built-in shaders define a set of LOD values that we can use as a reference when choosing LOD values for our own shaders:
- VertexLit and its series = 100
- Decal, Reflective VertexLit = 150
- Diffuse = 200
- Diffuse Detail, Reflective Bumped Unlit, Reflective Bumped VertexLit = 250
- Bumped, Specular = 300
- Bumped Specular = 400
- Parallax = 500
- Parallax Specular = 600
Tags
A SubShader can be decorated with a number of tags; the rendering engine reads these tags to decide when and how the SubShader is rendered.
The more common tags are:
Queue
This important tag defines an integer that determines the rendering order: the smaller the number, the earlier the object is rendered, and rendering early means it may be covered by things rendered later.
The predefined queues are:
| Name | Value | Description |
| --- | --- | --- |
| Background | 1000 | Rendered first; used for skyboxes or backgrounds |
| Geometry | 2000 | The default; used for opaque objects (in general most objects in a scene are opaque) |
| AlphaTest | 2450 | Used for alpha-tested pixels; a separate queue exists for efficiency reasons |
| Transparent | 3000 | Used for transparent objects, rendered in back-to-front order |
| Overlay | 4000 | Used for overlay effects; the final stage of rendering (e.g. lens flares) |
RenderType
"Opaque" or "Transparent" is two commonly used rendertype. If the output is non-transparent object, it is written in opaque, if you want to render transparent or translucent pixels, it should be written in transparent. This tag is mainly used shaderreplacement, generally this tag seems to have no effect.
CommonState
A SubShader can define a set of render states, which are basically switch-like rendering options. Defined here, they apply to all passes in the SubShader, hence the name common. The same render states can also be set separately in each pass, as described in detail below.
Pass Render State
Render states are switch-like options that control the rendering process, such as whether alpha blending is enabled and whether depth testing is enabled.
The commonly used render states are:
Cull
Usage: Cull Back | Front | Off
The polygon face-culling switch. Back culls back faces, Front culls front faces, and Off disables culling, i.e. double-sided rendering. Thin things such as skirts and streamers are often modeled as single quads; these need Cull Off so both sides are rendered, otherwise the back side comes out black.
ZWrite
Usage: ZWrite On | Off
Controls whether the current object's pixels are written to the depth buffer; on by default. In general, ZWrite is left on when drawing opaque objects and turned off for transparent or translucent objects.
Depth buffer: when the graphics card renders an object, the depth (the z-coordinate) of each generated pixel is stored in a buffer called the z-buffer or depth buffer, usually organized as a two-dimensional x-y array holding one depth per screen pixel. If another object in the scene produces a result for the same pixel, the card compares the two depths and keeps the object closer to the viewer, saving its depth to the buffer. Based on the depth buffer, the card can then produce the usual depth-correct result: nearer objects occlude farther ones.
Understanding the depth buffer also explains why transparent and translucent objects need ZWrite Off: if it were left on, a transparent object's depth would be written to the depth buffer and would cull the objects behind it, so they would not be rendered even though they should show through the transparent object. So we need ZWrite Off whenever we use alpha blending.
ZTest
Usage: ZTest Less | Greater | LEqual | GEqual | Equal | NotEqual | Always
Controls how the depth test is performed, i.e. the depth comparison described above. The default is LEqual.
It is worth mentioning that when using alpha blending, ZWrite must be off but ZTest must stay on: if an opaque object stands in front of a transparent one, the transparent object should still be occluded.
Blend
Blending. Controls how each shader output is combined with the colors already on the screen.
Usage:
Blend Off: turn off blending
Blend SrcFactor DstFactor: final color = color generated by the shader × SrcFactor + color already on screen × DstFactor
Blend SrcFactor DstFactor, SrcFactorA DstFactorA: same as above, except the alpha channel is blended using the latter two factors
The common blend modes are:
Blend SrcAlpha OneMinusSrcAlpha // alpha blending
Blend One One // additive
Blend OneMinusDstColor One // soft additive
Blend DstColor Zero // multiplicative
Blend DstColor SrcColor // 2x multiplicative
For the full list of blend factors, see the Unity documentation.
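Putting these render states together, a minimal alpha-blended shader might look like the following sketch: the object goes into the Transparent queue, depth writing is off while depth testing stays on, and classic alpha blending is enabled (the constant half-transparent color is just for illustration):
    Shader "Custom/TransparentExample" {
        SubShader {
            Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
            Pass {
                Cull Off                        // double-sided, e.g. for thin geometry
                ZWrite Off                      // transparent objects do not write depth
                ZTest LEqual                    // but are still tested against opaque depth
                Blend SrcAlpha OneMinusSrcAlpha // classic alpha blending
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"
                struct v2f { float4 pos : SV_POSITION; };
                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    return o;
                }
                fixed4 frag (v2f i) : SV_Target {
                    return fixed4(1, 1, 1, 0.5); // half-transparent white
                }
                ENDCG
            }
        }
    }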
Starting with Unity 5, the following fixed-function shader commands are marked obsolete; their functionality should now be implemented in CG code. They are listed here to make older shaders easier to read:
- Lighting On | Off
- Material { Material Block }
- SeparateSpecular On | Off
- Color color-value
- ColorMaterial AmbientAndDiffuse | Emission
- Fog { Fog Block }
- AlphaTest (Less | Greater | LEqual | GEqual | Equal | NotEqual | Always) CutoffValue
- SetTexture textureProperty { combine options }
Surface Shader
Surface shaders hide many of the details of lighting and are designed to let users accomplish a lot with just a few directives (#pragma); they encapsulate many common lighting models and functions. Compared to raw vertex and fragment shaders, a surface shader is more restricted: it can only have one pass. For ordinary functionality that needs lighting, a surface shader is the fast and convenient choice; for more advanced shaders, vertex and fragment shaders are recommended.
A surface shader consists of two main parts: the directives after #pragma, and the surf function.
The pragma's syntax is #pragma surface surfaceFunction lightModel [optionalparams]
- surfaceFunction is usually named surf, but the function name is up to you.
The surf function's prototype is: void surf (Input IN, inout SurfaceOutput o)
- lightModel is one of Unity's built-in lighting models, e.g. Lambert or BlinnPhong.
- optionalparams: covers many directives; see the Unity documentation for the detailed parameter reference.
The surf function takes an Input structure and writes its output into a SurfaceOutput structure.
Input
The Input structure must be defined in the shader. It may contain the following fields; simply defining a field makes its value available in the surf function (a nice bit of magic):
- UV coordinates of one or more textures; the name must follow the format uv + texture name, e.g. float2 uv_MainTex.
- float3 viewDir - the view direction. Include it to compute effects such as parallax and rim lighting (see the sketch after this list).
- float4 with the COLOR semantic - the interpolated per-vertex color.
- float4 screenPos - the position in screen space, needed for reflection-style effects; for example the WetStreet shader in Dark Unity uses it.
- float3 worldPos - the position in world space.
- float3 worldRefl - the reflection vector in world space. Included if the surface shader does not write to o.Normal. See the Reflect-Diffuse shader example.
- float3 worldNormal - the normal vector in world space. Included if the surface shader does not write to o.Normal.
- float3 worldRefl; INTERNAL_DATA - the world-space reflection vector for shaders that do write to o.Normal; use WorldReflectionVector (IN, o.Normal) to fetch it.
- float3 worldNormal; INTERNAL_DATA - the world-space normal vector for shaders that do write to o.Normal; use WorldNormalVector (IN, o.Normal) to fetch it.
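As an illustration of these Input fields, here is a hedged sketch of rim lighting derived from viewDir and the surface normal (the property names _RimColor and _RimPower are invented for the example):
    Shader "Custom/RimExample" {
        Properties {
            _MainTex ("Base (RGB)", 2D) = "white" {}
            _RimColor ("Rim Color", Color) = (0.3, 0.5, 1, 1)
            _RimPower ("Rim Power", Range (0.5, 8)) = 3
        }
        SubShader {
            Tags { "RenderType" = "Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert
            sampler2D _MainTex;
            fixed4 _RimColor;
            half _RimPower;
            struct Input {
                float2 uv_MainTex; // the uv + texture name convention
                float3 viewDir;    // view direction, interpolated per pixel
            };
            void surf (Input IN, inout SurfaceOutput o) {
                o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
                // the rim term grows as the surface turns away from the viewer
                half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
                o.Emission = _RimColor.rgb * pow(rim, _RimPower);
            }
            ENDCG
        }
        FallBack "Diffuse"
    }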
SurfaceOutput
SurfaceOutput describes the characteristics of the surface (albedo, normal, emission, specularity, gloss, etc.). The structure is fixed and does not need to be redefined in the shader.
struct SurfaceOutput {
    half3 Albedo;   // reflectance; generally the base color before lighting
    half3 Normal;   // normal
    half3 Emission; // self-illumination; brightens the object as if it emitted light
    half Specular;  // specular power
    half Gloss;     // gloss intensity
    half Alpha;     // transparency
};
Unity 5 introduced physically based lighting models and, with them, two new output structures:
struct SurfaceOutputStandard
{
    fixed3 Albedo;   // base (diffuse or specular) color
    fixed3 Normal;   // tangent-space normal, if written
    half3 Emission;
    half Metallic;   // 0=non-metal, 1=metal
    half Smoothness; // 0=rough, 1=smooth
    half Occlusion;  // occlusion (default 1)
    fixed Alpha;     // alpha for transparencies
};
struct SurfaceOutputStandardSpecular
{
    fixed3 Albedo;   // diffuse color
    fixed3 Specular; // specular color
    fixed3 Normal;   // tangent-space normal, if written
    half3 Emission;
    half Smoothness; // 0=rough, 1=smooth
    half Occlusion;  // occlusion (default 1)
    fixed Alpha;     // alpha for transparencies
};
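A minimal sketch of a surface shader targeting the physically based Standard lighting model; note the #pragma surface surf Standard directive (the property names here are illustrative):
    Shader "Custom/StandardExample" {
        Properties {
            _MainTex ("Albedo (RGB)", 2D) = "white" {}
            _Metallic ("Metallic", Range (0, 1)) = 0
            _Smoothness ("Smoothness", Range (0, 1)) = 0.5
        }
        SubShader {
            Tags { "RenderType" = "Opaque" }
            CGPROGRAM
            #pragma surface surf Standard
            sampler2D _MainTex;
            half _Metallic;
            half _Smoothness;
            struct Input {
                float2 uv_MainTex;
            };
            void surf (Input IN, inout SurfaceOutputStandard o) {
                o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
                o.Metallic = _Metallic;     // 0 = non-metal, 1 = metal
                o.Smoothness = _Smoothness; // 0 = rough, 1 = smooth
                o.Alpha = 1;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }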
Unity provides some basic surface shader examples in its documentation that help show how these inputs and outputs are used.
Vertex Shader
If you don't want to use a surface shader, you can write the vertex and fragment shaders familiar from OpenGL and Direct3D by embedding a CG snippet in a pass:
Pass {
    // ... the usual pass state setup ...
    CGPROGRAM
    // compilation directives for this snippet, e.g.:
    #pragma vertex vert
    #pragma fragment frag
    // the CG/HLSL code itself
    ENDCG
    // ... the rest of the pass setup ...
}
Here vert is the vertex shader function and frag is the fragment shader function. As a rule, any calculation that can be done in the vertex shader should not be done in the fragment shader: the vertex shader runs once per vertex while the fragment shader runs once per pixel, and a model has far fewer vertices than covered pixels.
Writing vertex and fragment shaders typically involves a Unity-provided helper file, UnityCG.cginc, which defines common structures and functions. On Windows it is located at {Unity install path}/Data/CGIncludes/UnityCG.cginc; on Mac at /Applications/Unity/Unity.app/Contents/CGIncludes/UnityCG.cginc.
In code we only need to add #include "UnityCG.cginc" to use the structures and functions inside.
Input
The prototype of the vertex shader is v2f vert (appdata v).
appdata is the input structure; you can define it yourself or use one of Unity's predefined ones. UnityCG.cginc predefines three commonly used input structures: appdata_base, appdata_tan, and appdata_full.
struct appdata_base {
    float4 vertex : POSITION;
    float3 normal : NORMAL;
    float4 texcoord : TEXCOORD0;
};
struct appdata_tan {
    float4 vertex : POSITION;
    float4 tangent : TANGENT;
    float3 normal : NORMAL;
    float4 texcoord : TEXCOORD0;
};
struct appdata_full {
    float4 vertex : POSITION;
    float4 tangent : TANGENT;
    float3 normal : NORMAL;
    float4 texcoord : TEXCOORD0;
    float4 texcoord1 : TEXCOORD1;
    float4 texcoord2 : TEXCOORD2;
    float4 texcoord3 : TEXCOORD3;
#if defined(SHADER_API_XBOX360)
    half4 texcoord4 : TEXCOORD4;
    half4 texcoord5 : TEXCOORD5;
#endif
    fixed4 color : COLOR;
};
Notice that the fields in these structures differ from the fields in a surface shader: each has a colon and a label appended. These labels are the fields' semantics, which tell the GPU where the data for each field should be read from and written to. The GPU is, after all, hardware designed for graphics computation, and much of it is fixed; we just have to remember the few names that are available.
| Type | Name | Semantic | Notes |
| --- | --- | --- | --- |
| float4 | vertex | POSITION | Vertex position in the model coordinate system |
| float3 | normal | NORMAL | Vertex normal vector |
| float4 | tangent | TANGENT | Vertex tangent vector |
| float4 | color | COLOR | Vertex color |
| float4 | texcoord | TEXCOORD0 | The vertex's first UV coordinate |
| float4 | texcoord1 | TEXCOORD1 | The vertex's second UV coordinate (up to 5 are available) |
Output
The output of the vertex shader is also a structure you define yourself, but its content is likewise relatively fixed: generally it holds the vertex's projected position, UVs, vertex color, and so on, and you can add values that the later fragment shader needs but that are better computed in the vertex shader. This output becomes the input of the fragment shader.
struct v2f
{
    float4 pos : SV_POSITION;
    half2 uv : TEXCOORD0;
};
The fields you can use are:
| Type | Semantic | Description |
| --- | --- | --- |
| float4 | SV_POSITION | The vertex position in projection space (unlike the input, which is in model space). This field must be set; the coordinate transformation is the vertex shader's main job |
| float3 | NORMAL | Normal vector in view space |
| float4 | TEXCOORD0 | UV coordinates of the first texture |
| float4 | TEXCOORD1 | UV coordinates of the second texture |
| float4 | TANGENT | Tangent vector, mainly used to correct normal maps |
| fixed4 | COLOR | The first vertex color |
| fixed4 | COLOR1 | The second vertex color |
| any | any | Other custom fields |
Coordinate Transformation
One of the vertex shader's important tasks is coordinate transformation. The coordinates in the vertex shader's input are in the model coordinate system (object space), while what finally gets drawn to the screen uses projected coordinates.
In a shader, one line completes the coordinate conversion; this is also the simplest possible vertex shader:
v2f vert (appdata_base v) {
    v2f o;
    o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
    return o;
}
The projected coordinates are obtained by multiplying the UNITY_MATRIX_MVP matrix by the vertex's coordinates in the model coordinate system.
UNITY_MATRIX_MVP is Unity's built-in model-view-projection matrix. The built-in matrices are:
- UNITY_MATRIX_MVP: the current model × view × projection matrix. (Note: the model matrix transforms from local/object space to world space.)
- UNITY_MATRIX_MV: the current model × view matrix
- UNITY_MATRIX_V: the current view matrix
- UNITY_MATRIX_P: the current projection matrix
- UNITY_MATRIX_VP: the current view × projection matrix
- UNITY_MATRIX_T_MV: the transpose of the model × view matrix
- UNITY_MATRIX_IT_MV: the inverse transpose of the model × view matrix, used to transform normals from object space into view space. Why can't normals be transformed with UNITY_MATRIX_MV the way positions are? First, a normal is a 3-component vector while UNITY_MATRIX_MV is a 4x4 matrix; second, a normal is a direction that we only want to rotate, and if the transform contains non-uniform scaling, the direction would be skewed.
- UNITY_MATRIX_TEXTURE0 to UNITY_MATRIX_TEXTURE3: texture transformation matrices
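A one-line sketch of the normal transform just described, inside a vertex shader (the cast to float3x3 keeps only the rotation/scale part of the matrix):
    // transform the object-space normal into view space with the inverse-transpose model-view matrix
    float3 viewNormal = normalize(mul((float3x3)UNITY_MATRIX_IT_MV, v.normal));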
Below is a brief introduction to the coordinate systems involved:
Model coordinate system: also called the object coordinate system. When modeling, each model is built in its own coordinate system; if a character model's foot is at the (0,0,0) point, the other points of its body are all relative to that origin at the foot.
World coordinate system: the scene is a world with its own origin. When a model is placed in the scene, every vertex on it acquires a new world coordinate, obtained by multiplying the model matrix by the vertex's model coordinates.
View coordinate system: also known as the observer coordinate system, with the observer (camera) at the origin. Objects in the scene are drawn to the screen only through a camera, which controls the visible range with its viewport and clipping planes. Since all of that is relative to the camera, world coordinates are converted to view coordinates for easier processing.
Projection coordinate system: the scene is 3D but the screen is 2D; the projection transformation performs this reduction in dimension, turning 3D coordinates into 2D ones. There are two kinds of projection, orthographic (parallel) and perspective, which can be set on Unity's camera.
Screen coordinate system: the coordinates finally drawn to the screen, with the origin at the bottom-left corner of the screen.
In addition to the built-in matrices, Unity also has built-in helper functions usable inside the vertex shader:
- float3 WorldSpaceViewDir (float4 v): returns the world-space direction (not normalized) from the given object-space vertex position towards the camera
- float3 ObjSpaceViewDir (float4 v): returns the object-space direction (not normalized) from the given object-space vertex position towards the camera
- float2 ParallaxOffset (half h, half height, half3 viewDir): calculates the UV offset for parallax normal mapping
- fixed Luminance (fixed3 c): converts a color to luminance (grayscale)
- fixed3 DecodeLightmap (fixed4 color): decodes a color from a Unity lightmap (RGBM or dLDR depending on platform)
- float4 EncodeFloatRGBA (float v): encodes a floating-point number in the [0..1] range into an RGBA color, for storage in low-precision render targets
- float DecodeFloatRGBA (float4 enc): decodes an RGBA color back into a float
- float2 EncodeViewNormalStereo (float3 n): encodes a view-space normal into two numbers in the 0..1 range
- float3 DecodeViewNormalStereo (float4 enc4): decodes a view-space normal from enc4.xy
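A small sketch using two of these helpers inside a vertex shader (v is the appdata input):
    // normalized direction from the vertex towards the camera, in world space
    float3 viewDir = normalize(WorldSpaceViewDir(v.vertex));
    // collapse an RGB color to a grayscale value
    fixed gray = Luminance(fixed3(1.0, 0.5, 0.0));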
Fragment Shader
TODO
Reference:
A Unity3D Shader beginner's guide that even a cat can learn
Original address: http://blog.csdn.net/ring0hx/article/details/46440037