Unity3d Shader Getting Started

Source: Internet
Author: User

What is shader

A shader is a program executed by the GPU that operates on a 3D object: it combines the input mesh with the input textures or colors in a specified way and produces an output, from which the drawing unit can draw the image onto the screen. The input textures or colors, the corresponding shader, and the specific parameter values set on the shader are packaged and stored together; the result is a material. We can then assign the material to the appropriate renderer for rendering (output).

Shader is not a unified standard, and the shading languages of different graphics APIs differ. OpenGL's shading language is GLSL, Nvidia developed Cg, and Microsoft's Direct3D uses the High-Level Shading Language (HLSL). Unity's shader is a code-generation framework that embeds shaders written for the traditional graphics APIs (in Cg/HLSL) in its own descriptive structure, and finally generates the appropriate shader for each hardware platform automatically, enabling cross-platform use.

Shader species

Both OpenGL and Direct3D provide three types of shaders:

    • Vertex shader: processes each vertex, projecting its position in space onto the screen, i.e. computing the vertex's two-dimensional screen coordinates. It is also responsible for computing the vertex's depth-buffer (z-buffer) value. A vertex shader can control properties such as a vertex's position, color, and texture coordinates, but cannot generate new vertices. The output of the vertex shader is passed to the next stage of the pipeline: if a geometry shader is defined, it processes the vertex shader's output; otherwise the rasterizer continues the pipeline.
    • Pixel shader (Direct3D), often called a fragment shader (OpenGL): processes data from the rasterizer. The rasterizer fills the polygon and passes it down the pipeline to the pixel shader, which computes a color per pixel. Pixel shaders are commonly used for scene lighting and related effects such as bump mapping and color grading. The name fragment shader is arguably more accurate, because shader invocations and pixels on the screen do not correspond one to one: for a single pixel, the fragment shader may be invoked several times to determine its final color, since occluded objects are all computed until the depth buffer finally sorts the objects front to back.
    • Geometry shader: can add and remove vertices in a polygon mesh. It can generate geometry procedurally, performing work that would be too onerous for the CPU, and add detail to models. Direct3D version 10 added API support for geometry shaders, making them part of Shader Model 4.0. OpenGL originally exposed geometry shaders only through an extension.

Broadly speaking, Unity's shaders fall into two categories:

    • Surface shader - does most of the work for you; a lot of good results can be achieved with simple code. Think of a point-and-shoot camera: good results without much effort.
    • Fragment shader - can do more things, but is also harder to write. The main reason to use fragment shaders is to work at a lower level, for more complex (or more efficient) effects.

Shader Program Structure

Example of a basic surface shader:

Shader"Custom/diffuse Texture"{Properties {_maintex("Base (RGB)", 2D) =" White"{}} subshader {Tags {"Rendertype"="Opaque"} LOD $Cgprogram#pragma surface surf Lambert sampler2d_maintex;      struct Input {float2 Uv_maintex;      }; void Surf (Input in, InOut surfaceoutput o) {half4 c= tex2d (_maintex, In.uv_maintex); O.albedo=C.rgb; O.alpha=C.A; } ENDCG} FallBack"Diffuse"}

Example of a basic vertex fragment shader:

Shader"Vertexinputsimple"{subshader {Pass {cgprogram#pragma vertex vert#pragma fragment Frag#include"Unitycg.cginc"struct V2F {float4 pos:sv_position;      Fixed4 Color:color;      };          v2f Vert (Appdata_base v) {v2f o; O.pos=Mul (UNITY_MATRIX_MVP, V.vertex); O.COLOR.XYZ= V.normal *0.5+0.5; O.COLOR.W=1.0; returno; } fixed4 Frag (v2f i): sv_target {returnI.color;} ENDCG} }}

The rest of this article mainly describes the surface shader.

Properties{} defines the shader's properties; the properties defined here are provided as input to all sub-shaders. Each property is defined with the following syntax:

_Name("Display Name", type) = defaultValue[{options}]

    • _Name - the name of the property, essentially its variable name, used to refer to the property's contents throughout the rest of the shader code.
    • Display Name - the string shown in Unity's material editor as the human-readable name of the property.
    • Type - the type of the property. The possible types, and the content they represent, are:
      • Float - a single floating-point number. CG type: float. Example: _MyFloat ("My float", Float) = 0.5
      • Range - a floating-point number within a specified range (a value between the minimum and maximum, typically used as a tunable shader parameter; e.g. the cutoff value for transparency rendering might range from 0 to 1). CG type: float. Example: _MyRange ("My range", Range (0.01, 0.5)) = 0.1
      • Color - a floating-point four-tuple (a color, defined by the four RGBA quantities: red, green, blue, and alpha). CG type: float4. Example: _MyColor ("Some color", Color) = (1,1,1,1)
      • Vector - a floating-point four-tuple (a four-dimensional vector). CG type: float4. Example: _MyVector ("Some vector", Vector) = (1,1,1,1)
      • 2D - a texture whose size is a power of two (e.g. 256, 512). After sampling, the texture is mapped to per-pixel colors according to the model's UVs, which is what is finally displayed. CG type: sampler2D. Example: _MyTexture ("Texture", 2D) = "white" {}
      • Rect - a texture whose size is not a power of two. CG type: sampler2D. Example: _MyRect ("My rect", Rect) = "white" {}
      • CUBE - a cubemap (cube texture), which is simply a combination of six linked 2D maps, mainly used for reflection (such as skyboxes and dynamic reflections); sampling converts to the corresponding point. CG type: samplerCube. Example: _MyCubemap ("Cubemap", CUBE) = "" {}
    • defaultValue - the default value of the property. Supplying a conforming default specifies the initial value of the property. (Some effects may require certain parameter values to look right; these can be adjusted later, but specifying the desired value as the default saves adjusting it every time.)
      • Color - an RGBA color with components in 0~1, such as (1,1,1,1);
      • 2D/Rect/Cube - for textures, the default can be a string naming a default tint color: either the empty string, or one of "white", "black", "gray", "bump";
      • Float, Range - a specified floating-point number;
      • Vector - a four-dimensional vector, written as (x,y,z,w).
    • {options} - this only applies to 2D, Rect, or Cube textures; at the very least we must write an empty pair of braces {} after the texture. When we need to enable a specific option, we write it inside the braces. To enable several options at once, separate them with whitespace. The possible options are ObjectLinear, EyeLinear, SphereMap, CubeReflect, and CubeNormal, all of which are TexGen modes in OpenGL.
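As a sketch of the option syntax (the property name _ReflCube is our own, and TexGen options only apply to legacy fixed-function shaders, not to CG-based ones), a cubemap property with an option enabled might be declared like this:

```shaderlab
// Hypothetical example: a cubemap property with the CubeReflect
// TexGen option written inside the curly braces
_ReflCube ("Reflection Cubemap", CUBE) = "" { TexGen CubeReflect }
```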

So a set of property declarations might look like this:

// Define a color with a default value of semi-transparent blue
_MainColor ("Main Color", Color) = (0,0,1,0.5)
// Define a texture with a default of white
_Texture ("Texture", 2D) = "white" {}
Tag

A SubShader can be decorated with a number of tags; the hardware examines these tags to decide when the shader is invoked. For example, the first line of the SubShader in our example:

" Rendertype "="Opaque" }

The more common tags are:

  • Queue
    This tag is important: it defines an integer that determines the order in which the shader is rendered. The smaller the number, the earlier it is rendered; being rendered early means it may be covered by things rendered later.
    The predefined queues are:
      • Background (1000) - rendered first; used for skyboxes or backgrounds
      • Geometry (2000) - the default; used for non-transparent objects (in general, most objects in a scene are non-transparent)
      • AlphaTest (2450) - used for alpha-tested pixels; giving alpha testing its own queue is done for efficiency reasons
      • Transparent (3000) - used for transparent objects, rendered in back-to-front order
      • Overlay (4000) - used for overlay effects; the final stage of rendering (such as lens flares)

    • RenderType
      "Opaque" and "Transparent" are the two commonly used RenderTypes. If the output is a non-transparent object, write Opaque; to render transparent or translucent pixels, write Transparent. This tag is mainly used by shader replacement; ordinarily it seems to have no effect.

Other useful tags are "IgnoreProjector"="True" (unaffected by Projectors), "ForceNoShadowCasting"="True" (never cast shadows), and "Queue"="xxx" (specify the render queue).
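Putting these tags together, the opening of a SubShader for a transparent object might look like this (a sketch; the tag values shown are the standard ones for transparent rendering):

```shaderlab
SubShader {
    // Render in the Transparent queue (3000), mark the output as
    // transparent, and ignore Projector effects
    Tags { "Queue"="Transparent" "RenderType"="Transparent" "IgnoreProjector"="True" }
    // ... passes go here ...
}
```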

LOD

LOD is short for Level of Detail, here specifically the shader level of detail, because Unity also has a model LOD concept; the two are different things. We only cover shader LOD here; for model LOD please refer to the Unity documentation.

Shader LOD lets us set a value that determines which shader can be used. The maximum allowed LOD can be set via Shader.maximumLOD or Shader.globalMaximumLOD; a SubShader is unavailable when the allowed maximum is less than the LOD that SubShader specifies. With LOD we can write a set of SubShaders for one material: the larger the LOD, the better the rendering and, of course, the higher the hardware requirements. We can then set globalMaximumLOD according to the hardware configuration of each device to achieve the best balance of quality and performance.

Unity's built-in shaders define a set of LOD values that we can use as a reference when setting LOD values in our own shaders.

    • VertexLit and its series = 100
    • Decal, Reflective VertexLit = 150
    • Diffuse = 200
    • Diffuse Detail, Reflective Bumped Unlit, Reflective Bumped VertexLit = 250
    • Bumped, Specular = 300
    • Bumped Specular = 400
    • Parallax = 500
    • Parallax Specular = 600
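As a sketch of how several SubShaders with different LOD values fit together in one shader (the shader name and comments are our own):

```shaderlab
Shader "Custom/LODExample" {
    SubShader {
        LOD 400   // chosen when the allowed maximum LOD is at least 400
        // ... expensive, high-quality passes ...
    }
    SubShader {
        LOD 200   // cheaper version, used when 200 <= maximum LOD < 400
        // ... simpler passes ...
    }
    FallBack "Diffuse"
}
```

On the scripting side, setting Shader.globalMaximumLOD = 200; at startup would then select the cheaper SubShader on low-end hardware.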
Shader body

After all these preliminaries, we can finally look at the most important part: the code that turns input into output. For convenience, the body of the SubShader above is reproduced here.

CGPROGRAM
#pragma surface surf Lambert
sampler2D _MainTex;
struct Input {
    float2 uv_MainTex;
};
void surf (Input IN, inout SurfaceOutput o) {
    half4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Alpha = c.a;
}
ENDCG

First comes CGPROGRAM. This is a start tag indicating that a CG program begins here (we use the Cg/HLSL language when writing Unity's shaders). The ENDCG on the last line corresponds to CGPROGRAM and marks where the CG program ends.

Next is a compiler directive: #pragma surface surf Lambert. It declares that we are writing a surface shader and specifies the lighting model. Its general form is:

#pragma surface surfaceFunction lightModel [optionalparams]

    • surface - declares that this is a surface shader.
    • surfaceFunction - the name of the function containing the shader code.
    • lightModel - the lighting model to use.

So in our example we declare a surface shader whose actual code is in the surf function (found below), using Lambert (i.e. ordinary diffuse) as the lighting model.

The next line is sampler2D _MainTex;. What is sampler2D? In Cg, sampler2D is a data-container interface bound to a texture. Wait a minute... that phrasing is still too complicated. Put simply, a texture (map) is just a block of memory that stores RGB (and possibly A) channels, with 8 bits of data per channel. To know the correspondence between pixels and coordinates, and to fetch the data, we don't want to compute memory addresses or offsets ourselves every time; instead we manipulate the texture through a sampler2D. More simply, sampler2D is the type of a 2D texture in Cg; correspondingly there are sampler1D, sampler3D, samplerCUBE, and so on.

Having explained what sampler2D is, we need to explain why _MainTex must be declared again here, since we already declared it as a texture in Properties. The answer is that the shader in our example actually consists of two relatively separate blocks: the outer property declarations, fallback, and so on, which Unity can use and compile directly; and the CGPROGRAM...ENDCG code block, which is a CG program. For the CG program to access a variable defined in Properties, it must be declared again, with the same name as the earlier variable. So what sampler2D _MainTex; actually does is re-declare and link _MainTex, so that the CG program that follows can use this variable.
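To make the pairing concrete, here is a sketch of the two declarations side by side (the _Cutoff property is our own addition for illustration; the surf function is omitted):

```shaderlab
Properties {
    _MainTex ("Base (RGB)", 2D) = "white" {}
    _Cutoff ("Alpha cutoff", Range (0, 1)) = 0.5
}
SubShader {
    CGPROGRAM
    #pragma surface surf Lambert
    // Each CG-side declaration must match a Properties entry by name
    sampler2D _MainTex;   // 2D property    -> sampler2D
    float _Cutoff;        // Range property -> float (or half/fixed)
    // ... Input struct and surf function go here ...
    ENDCG
}
```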

Next comes a struct. Structures should already be familiar, so let's skip ahead and look at the surf function below it. The #pragma directive above stated that our shader function is named surf; this is where the core work of the shader happens. As we have said more than once, a shader is code that, given input, computes a colored output. CG dictates the signature of the function declared as a surface shader function (ours is surf), so we are not free to choose the types of surf's input and output parameters; they can only be written according to the rules. The rule is: the first parameter is an Input structure, and the second is an inout SurfaceOutput structure.

Input is a structure we must define ourselves; it gives us the opportunity to put whatever data we need for the computation into this structure and pass it into the surf function. SurfaceOutput is an output structure whose type is already defined, but whose content is initially blank; we need to fill in the output to complete the shading. Let's look at Input more closely; you can now jump back and look at the Input structure defined above:

struct Input {
    float2 uv_MainTex;
};

The struct used as input must be named Input. It defines a float2 variable: float followed by the number 2. float (and vec) can be followed by a number from 2 to 4, representing 2 to 4 values of the same type packed together. For example, the following definitions:

// Define a 2D vector variable
float2 coordinate;
// Define a color variable
float4 color;
// Use components of both
float3 result = color.rgb * coordinate.x;

When we access these values, we can use the name alone to get the whole set of values, or use component subscripts (such as .xyzw, .rgba, or parts of them such as .x) to get a single value.
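A few lines of CG illustrating these component accesses (the variable names are our own):

```shaderlab
float4 color = float4 (0.2, 0.4, 0.6, 1.0);
float3 rgb = color.rgb;      // the first three components: (0.2, 0.4, 0.6)
float red = color.r;         // the same value as color.x
float2 firstTwo = color.xy;  // the same values as color.rg
```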

In this example, we declare a variable named uv_MainTex containing two floating-point numbers.

If you have been around 3D development at all, the two letters UV will be familiar. UV mapping maps points on a 2D texture onto a 3D model according to certain rules; it is the most common vertex-processing method in 3D rendering. In a CG program there is a convention: prefixing a texture variable's name (in our case _MainTex) with the two letters uv gives the variable holding its UV value (which is really the two-dimensional coordinate of a point on the texture). We can then obtain the coordinates of the point currently being computed for this texture simply by accessing uv_MainTex in the surf program.

Back to the surf function and its two parameters. The first is the Input: when computing the output, the shader calls surf many times, each time passing the coordinates of a point on the texture for which to compute the output. The second parameter is a writable SurfaceOutput. SurfaceOutput is a predefined output structure, and the goal of our surf function is to fill in this output structure according to the input. The SurfaceOutput structure is defined as follows:

struct SurfaceOutput {
    half3 Albedo;    // the color of the pixel
    half3 Normal;    // the normal of the pixel
    half3 Emission;  // the emissive color of the pixel
    half Specular;   // the specular power of the pixel
    half Gloss;      // the gloss intensity of the pixel
    half Alpha;      // the transparency of the pixel
};

The half here is similar to our familiar float and double: all represent floating-point numbers, but at different precision. You may be familiar with single-precision floating point (float) and double-precision floating point (double); half is half-precision floating point. It has the lowest precision of the three, but is cheaper than the higher-precision types, so it is used extensively in shaders.

In the example, the things we do are very simple:

half4 c = tex2D (_MainTex, IN.uv_MainTex);
o.Albedo = c.rgb;
o.Alpha = c.a;

tex2D is a function used in CG programs to sample a point in a texture, returning a float4. Here _MainTex is sampled at the input point; the RGB value of the resulting color is assigned to the output pixel's color, and the A value is assigned to its transparency. With that, the shader knows how to work: find the corresponding UV point on the texture and use its color information directly for shading.
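As a small exercise, the example can be extended to multiply the sampled color by a tint color (the _Tint property and the shader name here are our own additions, not part of the original example):

```shaderlab
Shader "Custom/Tinted Texture" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Tint ("Tint", Color) = (1,1,1,1)
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200
        CGPROGRAM
        #pragma surface surf Lambert
        sampler2D _MainTex;
        fixed4 _Tint;
        struct Input {
            float2 uv_MainTex;
        };
        void surf (Input IN, inout SurfaceOutput o) {
            // Sample the texture, then tint it before writing the output
            half4 c = tex2D (_MainTex, IN.uv_MainTex) * _Tint;
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```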

Next...

I think you can now read some of the simplest shaders. The next thing I would recommend is looking at Unity's surface shader examples, which cover a variety of basic shaders.
