Unity3d Shader Getting Started Guide (Part 1)


I have been using Unity3d for a while, but mostly at a superficial level: driving the engine through scripts is easy, and I never dug into its deeper layers. Unity is deliberately designed as a simple 3D engine that hides complexity from the developer, but staying on the surface means you may never reach the results you want, so this has to change. Where to start? There is a saying that whoever can write shaders is a master, so I decided to start from shaders and try to raise my level a little. Hence this series (which I hope I can persist in writing, although it will probably drag on for about half a year).

All rendering work in Unity3d depends on shaders (Shader). If, like me, you have recently become interested in shader programming, perhaps you share my confusion: how do I get started? Unity3d provides some shader manuals and documentation (here, here and here), but the content is scattered and the learning curve is a bit steep, which is rather unfriendly to newcomers who have never touched the subject before. There are shader introductions and write-ups at home and abroad, but they suffer from the same scattering: many tutorials introduce basic concepts in one chapter and then immediately jump to a super-complex example, leaving much basic usage unexplained. That may be no problem for experienced shader developers, but I believe there are plenty of people like me. After much fruitless searching, I felt it necessary to write a tutorial that introduces the basic steps of shader development from a beginner's perspective. In fact, it is less a tutorial than a self-summary, which I hope can help people in need.

So the intended audience of this "tutorial" is roughly people new to shader development: perhaps you know what a shader is and have used other people's shaders, but you only know the names of some basic built-in shaders and have never opened them to view their source code; or you want to learn more about shaders and need to develop them, but have no prior shader development experience.

Of course, since I am also a big rookie at shader development, much of this article is only my own understanding plus some possibly unreliable verification and summarizing. The examples in this article surely have better implementations, so if you are a master who happens to pass by and knows a better way to implement something, I implore you to leave a comment; I will keep updating and maintaining this article.


Some basic concepts: shaders and materials

If you do 3D game development, these two words cannot be unfamiliar. A shader is actually a small program responsible for combining an input mesh with input textures or colors in a specified way, and then producing an output; the drawing unit can draw the image onto the screen based on this output. The input textures or colors, plus the corresponding shader and the specific parameter settings for that shader, are packaged and stored together, and the result is a material. We can then assign the material to an appropriate renderer for rendering (output).

So there is nothing particularly magical about a shader: it is just a program given well-defined inputs (colors, textures, etc.) and outputs (points and colors the renderer can read). What shader developers do is generate the output from the input through computation and transformation.

Shaders can be broadly divided into two categories. Simply put: a surface shader (Surface Shader) does most of the work for you; with just a few simple lines you can achieve many good effects, analogous to a point-and-shoot camera — good results without much effort. A fragment shader (Fragment Shader) can do more, but it is also harder to write. The main purpose of using fragment shaders is to work at a lower level toward more complex (or more efficient) goals.

Since this is an introductory article, the following introduction will be focused on the surface shader.


Basic structure of shader program

Because shader code is highly specialized, its basic structure is prescribed. The structure of an ordinary shader looks like this:

The structure of a shader program

First come some property definitions, which specify what inputs this code takes. Next come one or more sub-shaders; when the shader actually runs, the platform decides which sub-shader to use. The sub-shader is the body of the code, and each sub-shader contains one or more passes. When computing shading, the platform selects the best usable sub-shader, runs its passes in turn, and produces the output. Finally, a fallback is specified to handle the case where none of the sub-shaders can run (for example, the target device is too old and every sub-shader uses features it does not support).

It should be stated in advance that, when actually developing a surface shader, we write our code directly at the SubShader level, and the system compiles our code into a number of appropriate passes. Enough preamble; let us actually enter the world of shaders.
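As a sketch of the structure just described, a shader's skeleton looks roughly like this (the name and contents here are illustrative placeholders, not a working shader):

```shaderlab
Shader "Example/StructureSketch" {
    // 1. Property definitions: the inputs this shader exposes
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    // 2. One or more sub-shaders; the platform picks one it can support
    SubShader {
        // Each sub-shader contains one or more passes, run in turn
        Pass {
            // rendering state and shader program for this pass
        }
    }
    // 3. Fallback used when no sub-shader can run on the device
    Fallback "Diffuse"
}
```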


Hello Shader

A hundred lines of documentation are not as good as one example. Below is a simple piece of shader code; we will use it to verify the structure described above and explain some basic shader syntax. Since this article is about writing shaders for Unity3d, we will also use Unity3d to demonstrate. First, create a new shader: in the Project panel, choose Create, select Shader, and name it Diffuse Texture:


Create a new shader in Unity3d

Just use a text editor to open the newly created shader:

Shader "Custom/Diffuse Texture" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        LOD 200

        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            half4 c = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}


If you have never seen shader code before, the details will not make sense yet. But with the introduction to the basic structure above, you should be able to identify the components of this shader: a Properties section, a SubShader, and a FallBack. In addition, the first line simply declares this shader and gives it a name; for our example shader, you can find it at the corresponding position when selecting a shader in the material panel.


Find the newly-created shader in Unity3d

Next we will walk through this shader to understand the meaning of each statement.


Properties

A shader's properties are defined in Properties {}; the properties defined here are provided as input to all sub-shaders:

_Name ("Display Name", type) = defaultValue [{options}]

- _Name: the name of the property, simply put its variable name; the rest of the shader code uses this name to access the property's contents.
- Display Name: the string shown in Unity's material editor as the human-readable label for the property.
- type: the type of the property. The possible types are:
  - Color: a color defined by four components, RGBA (red, green, blue, and alpha/transparency);
  - 2D: a texture whose size is a power of 2 (256, 512, and so on). After sampling, the texture is converted, based on the model's UVs, into the corresponding color of each pixel, which is finally displayed;
  - Rect: a texture of non-power-of-2 size;
  - Cube: a cube map texture, which is simply a combination of six linked 2D textures, mainly used for reflection effects (such as skyboxes and dynamic reflections); it is likewise sampled into the corresponding per-point colors;
  - Range(min, max): a floating-point number between the minimum and maximum values, generally used as a parameter to tune some characteristic of the shader (for example, the cutoff value for alpha-tested rendering can range from 0 to 1);
  - Float: any floating-point number;
  - Vector: a four-component vector.
- defaultValue: the default value of the property; entering a conforming value specifies its initial value (some effects need certain parameter values to look right, and although those values can be tweaked later, specifying the desired value as the default saves adjustment time). Per type:
  - Color: an RGBA color with components in 0~1, such as (1,1,1,1);
  - 2D/Rect/Cube: for a texture, the default value can be a string naming a default tint color: an empty string, or "white", "black", "gray", or "bump";
  - Float, Range: a specified floating-point number;
  - Vector: a four-component vector, written as (x,y,z,w).
- {options}: relevant only to the 2D, Rect, and Cube types. At a minimum we must write an empty pair of braces {} after the texture's default value; when we need to enable specific options we write them inside these braces, separated by whitespace if there are several. The possible options are ObjectLinear, EyeLinear, SphereMap, CubeReflect, and CubeNormal, all of which are texgen modes in OpenGL.

So a set of property declarations might look like this:

// Define a color with a default value of semi-transparent blue
_MainColor ("Main Color", Color) = (0,0,1,0.5)
// Define a texture with a default of white
_MainTex ("Texture", 2D) = "white" {}

Next comes the SubShader section.


Tags

A surface shader can be modified by a number of tags; by examining these tags, the hardware decides when to invoke the shader. For example, the first line of the SubShader in our example,

Tags { "RenderType" = "Opaque" }

tells the system to call us when rendering non-transparent (opaque) objects. Unity defines a number of such render categories; the obvious counterpart of "RenderType" = "Opaque" is "RenderType" = "Transparent", which is used when rendering objects containing transparent effects. The RenderType tag in effect declares what your shader outputs: if it outputs opaque objects, write Opaque; if you want to render transparent or semi-transparent pixels, write Transparent.

Other useful tags include "IgnoreProjector" = "True" (not affected by projectors), "ForceNoShadowCasting" = "True" (never cast shadows), and "Queue" = "xxx" (specify the render queue). The Queue tag deserves emphasis: if you have ever mixed transparent and opaque objects in Unity, you have probably run into opaque objects that cannot appear behind a transparent one. That is usually caused by an incorrect shader render order. Queue specifies the order in which objects are rendered, and the predefined queues are:

- Background: rendered first, used for skyboxes or backgrounds
- Geometry: the default value, used for opaque objects (in general, most objects in a scene should be opaque)
- AlphaTest: used for alpha-tested pixels; giving AlphaTest its own queue is an efficiency consideration
- Transparent: renders transparent objects in back-to-front order
- Overlay: renders overlay effects; this is the final stage of rendering (lens flares and similar effects)

These predefined values are essentially a set of defined integers: Background = 1000, Geometry = 2000, AlphaTest = 2450, Transparent = 3000, and finally Overlay = 4000. When we set a Queue value we are not limited to the predefined names; we can also specify our own value, written like "Queue" = "Transparent+100", which means a queue value 100 after Transparent. By adjusting Queue values we can make sure that certain objects are always rendered before or after others, a technique that is sometimes very useful.
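As a small illustration of the custom-queue syntax above (the tag block here is a made-up example, not from our shader), a transparent effect forced to draw 100 positions after the standard Transparent queue might be tagged like this:

```shaderlab
SubShader {
    // "Transparent+100" resolves to queue value 3100,
    // i.e. 100 after the predefined Transparent queue (3000)
    Tags { "RenderType" = "Transparent" "Queue" = "Transparent+100" }
    // ... passes go here ...
}
```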


LOD

LOD is very simple: it is the abbreviation of Level of Detail. In our example we set it to 200 (which is in fact the value used by Unity's built-in Diffuse shader). This value determines which shaders we can use: in Unity's Quality settings we can set the maximum allowed LOD, and when a SubShader's specified LOD exceeds that maximum, the SubShader is unavailable. Unity's built-in shaders define a set of LOD values that we can use as a reference when setting LOD in our own shaders, so that image quality can be controlled precisely according to the device's graphics performance:

- VertexLit and its variants = 100
- Decal, Reflective VertexLit = 150
- Diffuse = 200
- Diffuse Detail, Reflective Bumped Unlit, Reflective Bumped VertexLit = 250
- Bumped, Specular = 300
- Bumped Specular = 400
- Parallax = 500
- Parallax Specular = 600
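As a hedged sketch of how this plays out (names and contents invented for illustration), a shader can provide several sub-shaders at decreasing LODs, and Unity uses the first one whose LOD does not exceed the allowed maximum:

```shaderlab
Shader "Example/LodSketch" {
    SubShader {
        LOD 400   // used when the allowed maximum LOD is 400 or higher
        // ... expensive version ...
    }
    SubShader {
        LOD 200   // used when the allowed maximum LOD is 200 to 399
        // ... cheaper version ...
    }
    // last resort when no sub-shader qualifies or runs on the device
    Fallback "VertexLit"
}
```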


Shader Body

After all that miscellany, we can finally look at the most important part: the code that turns the input into the output. For your convenience, allow me to copy the body of the SubShader above.

CGPROGRAM
#pragma surface surf Lambert

sampler2D _MainTex;

struct Input {
    float2 uv_MainTex;
};

void surf (Input IN, inout SurfaceOutput o) {
    half4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Alpha = c.a;
}
ENDCG

Let's go row by row. First is CGPROGRAM. This is a start tag indicating that a CG program begins here (we use the CG/HLSL language when writing Unity shaders). The ENDCG on the last line corresponds to CGPROGRAM and indicates that the CG program ends there.

Next is a compiler directive: #pragma surface surf Lambert. It declares that we are writing a surface shader and specifies the lighting model. Its general form is:

#pragma surface surfaceFunction lightModel [optionalparams]

- surface: declares that this is a surface shader
- surfaceFunction: the name of the method containing the shader code
- lightModel: the lighting model to use.

Here we declare a surface shader whose actual code is in the surf function, using Lambert (that is, ordinary diffuse) as the lighting model.

The next line is sampler2D _MainTex;. What is sampler2D? In CG, sampler2D is a data-container interface bound to a texture. Wait... that phrasing is still too abstract. Simply understood, a texture (map) is just a block of memory that stores RGB (and possibly A) channels, with 8 bits of data per channel. To know the correspondence between pixels and coordinates and fetch the data, we cannot keep computing memory addresses or offsets by hand each time, so sampler2D lets us operate on the texture instead. More simply, sampler2D is the type of a 2D texture in CG; correspondingly there are sampler1D, sampler3D, samplerCUBE, and so on.

Having explained what sampler2D is, we need to explain why _MainTex is declared again here, when we have already declared it as a texture in Properties. The answer is that the shader in our example is actually made up of two relatively separate blocks: the outer property declarations, fallback, and so on are ShaderLab that Unity can use and compile directly; what we have now, between CGPROGRAM and ENDCG, is a CG program. For this CG program to access a variable defined in Properties, it must declare a variable with exactly the same name. So what sampler2D _MainTex; does is re-declare and link _MainTex, so that the CG program that follows can use this variable.
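To make the pairing concrete, here is a minimal sketch (the _Tint property is invented for this illustration) of how ShaderLab properties are mirrored inside the CG block:

```shaderlab
Properties {
    _MainTex ("Base (RGB)", 2D) = "white" {}
    _Tint    ("Tint", Color)    = (1,1,1,1)
}
SubShader {
    CGPROGRAM
    // Each CG-side declaration must repeat the property's name with a
    // compatible type, or the CG code cannot see that property.
    sampler2D _MainTex;   // matches the 2D property _MainTex
    float4    _Tint;      // matches the Color property _Tint
    // ...
    ENDCG
}
```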

Next is a struct. We are all familiar with structures, so let's skip it for now and look directly at the surf function below. The #pragma line above named our shader method surf, which means this code is the core of our shader's work. We have said more than once that a shader is code that, given inputs, produces colored output. CG fixes the type and name pattern of a method declared as a surface shader, so we are not free to choose surf's input and output parameter types; they can only be written according to the rules: the first parameter is an Input structure, and the second is an inout SurfaceOutput structure.

What are they, respectively? Input is a structure that we need to define ourselves, which gives us an opportunity to put the data we need for the computation into this Input structure and pass it into the surf function. SurfaceOutput is an already-defined output structure, but its contents are initially blank; we need to fill in the output, and with that the shading is complete. Take a closer look at Input — you can jump back now and look at the Input structure defined above:

struct Input {
    float2 uv_MainTex;
};

The struct used as input must be named Input, and here it defines a float2 variable... You read that right — float2: the floating-point type float followed by the digit 2. What does that mean? Nothing magical: both float and vec can be followed by a number from 2 to 4, representing 2 to 4 values of that type packed together. For example:

// Define a 2D vector variable
vec2 coordinate;
// Define a color variable
float4 color;
// Multiply out a color
float3 multipliedColor = color.rgb * coordinate.x;

When accessing these values, we can use the name alone to get the whole packed value, or use swizzles (such as .xyzw, .rgba, or parts of them like .x) to get individual components. In this example, we declare a variable containing two floating-point numbers named uv_MainTex.

If you have any exposure to 3D development, the letters UV will be familiar. UV mapping maps points of a 2D texture onto a 3D model according to certain rules; it is the most common vertex-processing approach in 3D rendering. In CG programs we have a convention: prefixing a texture variable's name (in our case _MainTex) with the letters uv denotes extracting its UV value (really a two-component coordinate of a point on the texture). We can then read uv_MainTex directly in surf to get the coordinates of the point of the texture currently being computed.

If you have persisted to this point, congratulations: you are only one step away from successfully reading your first shader. Back to the surf function and its two parameters. The first is Input: when computing the output, the shader calls surf many times, each time handing it the texture coordinates of one point to compute the output for. The second is the writable SurfaceOutput; SurfaceOutput is a predefined output structure, and the goal of our surf function is to fill in that structure according to the input. The SurfaceOutput structure is defined as follows:

struct SurfaceOutput {
    half3 Albedo;     // the color of the pixel
    half3 Normal;     // the normal of the pixel
    half3 Emission;   // the emissive color of the pixel
    half Specular;    // the specular highlight of the pixel
    half Gloss;       // the gloss intensity of the pixel
    half Alpha;       // the transparency of the pixel
};

The half here is similar to float and double: all represent floating-point numbers, but at different precisions. You may be familiar with single-precision floating-point numbers (float) and double-precision floating-point numbers (double); half is a half-precision floating-point number. With the lowest precision of the three (but typically the fastest to compute), half is used extensively in shader development.

In the example, the things we do are very simple:

half4 c = tex2D (_MainTex, IN.uv_MainTex);
o.Albedo = c.rgb;
o.Alpha = c.a;

Here tex2D is the CG function for sampling a point in a texture; it returns a float4. We sample _MainTex at the input point, assign the RGB value of its color to the output's pixel color, and the A value to the transparency. As a result, the shader knows how to work: find the point of the texture corresponding to the UVs and use its color information directly for shading. Done.
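As a small exercise on the same skeleton (the _ColorTint property below is an invention for this sketch, not part of the example shader), you might tint the sampled texture before writing the output:

```shaderlab
// Assumes a property: _ColorTint ("Color Tint", Color) = (1,1,1,1)
// and a matching CG declaration: float4 _ColorTint;
void surf (Input IN, inout SurfaceOutput o) {
    half4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb * _ColorTint.rgb;  // modulate the texture color by the tint
    o.Alpha = c.a * _ColorTint.a;       // modulate the transparency as well
}
```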


Next ...

I think you can now read some of the simplest shaders. What I would recommend next is to look at Unity's Surface Shader examples, which cover a variety of basic shaders. On the basis of this tutorial, plus some Google work, fully understanding that shader example page should not be a problem. If you can understand it without pressure, it means you have a good foundation and can go deeper into shaders (perhaps even start writing some effects of your own before my next tutorial). If it is temporarily difficult, that is fine: learning shaders is definitely a gradual process, and because there are many conventions and common techniques, accumulation and practice will naturally bring progress and mastery.

In the next tutorial we will look at some practical examples, starting from the basics and gradually getting a little more complicated, so we can see the power of shaders in real use. I hope I can finish this series as soon as possible, but my time really is limited, so I do not know when it will be ready... When it is written, I will update this article to point to the new one. If you are worried about missing it, you can also subscribe by email or RSS (although Google Reader has shut down).
