Even a Cat Can Learn It: Unity3D Shader Getting Started Guide (Part 1)


Motivation

It has been a while since I started using Unity3D, but most of that time I have only scratched the surface, treating the engine as something to drive with scripts and understanding little of its deeper layers. Unity is designed to be a simple 3D engine that shields the developer from low-level details, but using it only superficially means you may never reach the results you want, so this has to change! Where to start? There is a saying that whoever can write shaders is a master, so I want to begin with shaders and see if I can raise my own level a notch. Hence this series (which I hope I can keep writing, although it will probably drag on for about half a year).

All rendering work in Unity3D depends on shaders. If, like me, you have recently become interested in shader programming, you may share my confusion: how do you get started? Unity3D provides some shader manuals and documentation (here, here and here), but the content is scattered and the learning curve is a bit steep, which is rather unfriendly to newcomers who have never touched the subject before. There are shader introductions and write-ups both at home and abroad, but they suffer from the same scattered content; many tutorials introduce the basic concepts in one chapter and then immediately jump to a hugely complex example, leaving much of the basic usage unexplained. That may be no problem for experienced shader developers, but I believe there are more than a few people like me. After much fruitless searching, I felt it necessary to write a tutorial that introduces the basic steps of shader development from a beginner's perspective. In fact, rather than a tutorial, it is more of a self-summary, which I hope can help those who need it.

So, the intended audience of this "tutorial" is:

    • Developers who are generally new to shader development: perhaps you know what a shader is and have used other people's shaders, but you only know the names of some basic built-in shaders and have never opened them to look at the source code.
    • Developers who want to learn more about shaders and have a need for shader development, but have no prior shader development experience.

Of course, since I am also a complete rookie at shader development, much of this article is only my own understanding plus some verification and summarizing that may not be entirely reliable. The examples here surely have better implementations, so if you are an expert who happens to pass by and knows a better way to implement something, I implore you to leave a comment; I will keep updating and maintaining this article.

Some basic concepts: Shader and Material

If you do 3D game development, these two terms cannot be unfamiliar to you. A shader is in fact a small program responsible for combining an input mesh with input textures or colors in a specified way and then producing an output; the rendering unit can draw the image onto the screen based on this output. The input textures or colors, together with the corresponding shader and the specific parameter settings of that shader, are packaged and stored together, and the result is a material. We can then assign the material to a suitable renderer for rendering (output).

So there is nothing particularly magical about a shader: it is just a program with well-defined inputs (colors, textures, etc.) and outputs (points and colors that the renderer can read). What a shader developer does is generate the output from the input through calculation and transformation.

Broadly speaking, shaders can be divided into two categories:

    • Surface Shader - does most of the work for you; simple techniques can achieve very good effects. Like a point-and-shoot camera, you can get good results without much effort.
    • Fragment Shader - can do more, but is also harder to write. The main purpose of using fragment shaders is to work at a lower level toward more complex (or more efficient) goals.

Since this is an introductory article, the following introduction will focus on surface shaders.

Basic structure of shader program

Because shader code is highly specialized, a basic structure is prescribed for it. The structure of an ordinary shader looks like this:

First come some property definitions that specify what inputs this code will take. Next come one or more sub-shaders; which sub-shader is actually used is determined by the platform the shader runs on. A sub-shader is the body of the code, and each sub-shader contains one or more passes. When computing shading, the platform selects the best-matching sub-shader and then runs its passes in turn to obtain the output. Finally, a fallback is specified to handle the case where no sub-shader can run (for example, the target device is too old and every sub-shader uses features it does not support).
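
The structure just described can be sketched as a skeleton (the names and comments here are placeholders, not a runnable shader):

```
Shader "Category/ShaderName" {
    Properties {
        // property definitions: the inputs this shader exposes
    }
    SubShader {
        // body for capable platforms: one or more passes
        Pass { /* ... */ }
    }
    SubShader {
        // optional simpler variant for weaker platforms
        Pass { /* ... */ }
    }
    Fallback "VertexLit" // used when no SubShader above can run
}
```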

It should be stated in advance that, when actually developing surface shaders, we write our code directly at the SubShader level and the system compiles it into a number of appropriate passes. Enough preamble; let us actually enter the world of shaders.

Hello Shader

A hundred lines of documentation are not as good as one example. Below is a simple piece of shader code; we will use it to verify the structure described above and explain some basic shader syntax. Since this article is about writing shaders for Unity3D, we will also use Unity3D to demonstrate. First, create a new shader: in the Project panel, choose Create, select Shader, and name it Diffuse Texture:

Open the newly created shader with a text editor:

Shader "Custom/Diffuse Texture" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            half4 c = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}

If you have not seen shader code before, the details will not make sense yet. But with the introduction of the basic structure above, you should be able to identify the components of this shader, such as a Properties section, a SubShader, and a Fallback. In addition, the first line simply declares this shader and gives it a name; for our example shader, you can find it at the corresponding position when selecting a shader in the material panel.

Next we will walk through the shader to understand the meaning of each statement.

Properties

Inside Properties{} we define the shader's properties; the properties defined here are provided as input to all sub-shaders. The syntax for defining each property is:

_Name("Display Name", type) = defaultValue[{options}]

  • _Name - the name of the property, which is simply the variable name; it will be used throughout the rest of the shader code to access the property's contents.
  • Display Name - this string is shown in Unity's material editor as human-readable content for the shader's user.
  • type - the type of this property; the possible types and the content they represent are:
    • Color - a color defined by four RGBA values (red, green, blue, and alpha/transparency);
    • 2D - a texture whose size is a power of 2 (256, 512, and so on). After sampling, this texture is converted to per-pixel colors according to the model's UVs and finally displayed;
    • Rect - a texture whose size is not a power of 2;
    • Cube - a cube map texture, which is simply a combination of six linked 2D textures, mainly used for reflections (such as skyboxes and dynamic reflections); it is likewise sampled and converted to colors at the corresponding points;
    • Range(min, max) - a floating-point number between the minimum and maximum values, typically used as a parameter to adjust some property of the shader (for example, the cutoff value for transparency rendering can range from 0 to 1);
    • Float - any floating-point number;
    • Vector - a four-component vector;
  • defaultValue - defines the default value of this property; entering a conforming value specifies the property's initial value (some effects may require certain parameter values to look right; although these values can be adjusted later, specifying the desired value as the default saves adjusting it every time):
    • Color - an RGBA color with components in 0~1, such as (1,1,1,1);
    • 2D/Rect/Cube - for textures, the default value can be a string naming a default tint color: either an empty string or one of "white", "black", "gray", "bump";
    • Float, Range - a specified floating-point number;
    • Vector - a four-component vector, written as (x,y,z,w).
  • There is also {options}, which applies only to 2D, Rect, or Cube textures: we must write at least a pair of empty braces {} after the texture input, and when we need to enable particular options we write them inside this pair of braces. To enable several options at once, separate them with whitespace. The possible options are ObjectLinear, EyeLinear, SphereMap, CubeReflect, and CubeNormal, all of which are OpenGL TexGen modes; we may get a chance to come back to them later.

So a set of property declarations might look like this:

// define a color with a default value of semi-transparent blue
_MainColor ("Main Color", Color) = (0,0,1,0.5)
// define a texture with a default of white
_Texture ("Texture", 2D) = "white" {}

Now looking back at the Properties section of the shader above (and of any other shader) should pose no problem. Next is the SubShader section.

Tags

A surface shader can be decorated with a number of tags; by examining these tags, the hardware decides when to call the shader. For example, the first line of the SubShader in our example,

Tags { "RenderType"="Opaque" }

tells the system to call us when rendering non-transparent objects. Unity defines a number of such render types in its rendering pipeline; a shader tagged "RenderType" = "Transparent", by contrast, is obviously called when rendering objects that contain transparency. The tag here effectively declares what your shader's output is: if the output is an opaque object, write Opaque; if you want to render transparent or semi-transparent pixels, write Transparent.

Other useful tags include "IgnoreProjector"="True" (not affected by Projectors), "ForceNoShadowCasting"="True" (never cast shadows), and "Queue"="xxx" (specify the render queue). The Queue tag deserves special mention: if you have ever used Unity to mix transparent and opaque objects, you have probably run into cases where an opaque object fails to show up behind a transparent one. Such cases are most likely caused by an incorrect shader render order. Queue specifies the order in which objects are rendered; the predefined queues are:

    • Background - rendered first, used for skyboxes or backgrounds
    • Geometry - the default, used for opaque objects (in general, most objects in a scene should be opaque)
    • AlphaTest - used for alpha-tested pixels; for efficiency reasons it gets a queue separate from Geometry
    • Transparent - renders transparent objects in back-to-front order
    • Overlay - used for overlay effects, the final stage of rendering (such as lens flares)

These predefined names are essentially a set of defined integers: Background = 1000, Geometry = 2000, AlphaTest = 2450, Transparent = 3000, and finally Overlay = 4000. When setting the Queue value we are not limited to the predefined names above; we can also specify our own value, written like "Queue"="Transparent+100", meaning a queue position 100 after Transparent. By adjusting the Queue value we can make sure that certain objects are always rendered before or after others, a technique that is sometimes quite useful.
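
Putting several of these tags together, the opening of a SubShader for a semi-transparent effect might look like this (a sketch, not part of the example shader above):

```
SubShader {
    // render as transparent, 100 queue positions after the standard
    // Transparent queue, and ignore Projectors
    Tags { "RenderType"="Transparent" "Queue"="Transparent+100" "IgnoreProjector"="True" }
    // ... passes or surface shader code ...
}
```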

LOD

LOD is very simple: it stands for Level of Detail. In this example we specify it as 200 (which is in fact the value used by Unity's built-in Diffuse shader). This value determines which shaders can be used. In Unity's Quality settings we can set the maximum allowed LOD; when the maximum is set below the LOD specified by a SubShader, that SubShader becomes unavailable. Unity's built-in shaders define a set of LOD values which we can use as a reference when setting LODs in our own shaders, so that image quality can be controlled more precisely according to the device's graphics performance:

    • VertexLit and its series = 100
    • Decal, Reflective VertexLit = 150
    • Diffuse = 200
    • Diffuse Detail, Reflective Bumped Unlit, Reflective Bumped VertexLit = 250
    • Bumped, Specular = 300
    • Bumped Specular = 400
    • Parallax = 500
    • Parallax Specular = 600

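
The selection mechanism can be sketched like this (a hypothetical shader: Unity uses the first SubShader whose LOD does not exceed the current maximum):

```
Shader "Custom/LODSketch" {
    SubShader {
        LOD 400
        // expensive version, e.g. bumped specular
        // ...
    }
    SubShader {
        LOD 200
        // cheaper diffuse-only version, used when the maximum LOD is below 400
        // ...
    }
    Fallback "VertexLit"
}
```
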
Shader body

Having gotten the miscellany out of the way, we can finally look at the most important part: the code that turns input into output. For convenience, allow me to repeat the body of the SubShader above.

CGPROGRAM
#pragma surface surf Lambert

sampler2D _MainTex;

struct Input {
    float2 uv_MainTex;
};

void surf (Input IN, inout SurfaceOutput o) {
    half4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Alpha = c.a;
}
ENDCG

Going line by line, the first is CGPROGRAM. This is an opening marker indicating that a CG program starts here (we use the CG/HLSL language when writing Unity shaders). The ENDCG on the last line pairs with CGPROGRAM and indicates that the CG program ends there.

Next is a compiler directive, #pragma surface surf Lambert, which declares that we are writing a surface shader and specifies the lighting model. It is written like this:

#pragma surface surfaceFunction lightModel [optionalparams]

    • surface - declares that this is a surface shader
    • surfaceFunction - the name of the function containing the shader code
    • lightModel - the lighting model to use

So in our example we declare a surface shader whose actual code is in the surf function (found further below), using Lambert (i.e. ordinary diffuse) as the lighting model.

The next line is sampler2D _MainTex;. What is sampler2D? In CG, sampler2D is a data-container interface bound to a texture. Wait... that explanation is still too complicated. To understand it simply: a texture is, in the end, just a block of memory that stores data in RGB (and possibly A) channels, with 8 bits per channel. What we need is the correspondence between pixels and coordinates, and a way to read that data without computing memory addresses or offsets ourselves; sampler2D lets us manipulate the texture for exactly that purpose. More simply still, sampler2D is the type for 2D textures; correspondingly there are sampler1D, sampler3D, samplerCUBE, and so on.

Having explained what sampler2D is, we still need to explain why _MainTex must be declared again here, since we already declared it as a texture in Properties. The answer is that the shader in our example actually consists of two relatively separate blocks: the outer part (property declarations, fallback, and so on), which Unity can use and compile directly, and the CGPROGRAM...ENDCG block, which is a CG program. For this CG program to access a variable defined in Properties, the variable must be re-declared with the same name. So what sampler2D _MainTex; actually does is re-declare _MainTex and link it, so that the CG program that follows can use the variable.
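
As a sketch of this pairing (the _Color and _Cutoff property names are made up for illustration), every property used inside the CG block gets a matching declaration with the same name:

```
Properties {
    _MainTex ("Base (RGB)", 2D) = "white" {}
    _Color ("Main Color", Color) = (1,1,1,1)
    _Cutoff ("Cutoff", Range(0,1)) = 0.5
}
SubShader {
    CGPROGRAM
    // re-declare with identical names to link them to the properties above
    sampler2D _MainTex;
    float4 _Color;
    float _Cutoff;
    // ...
    ENDCG
}
```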

Finally, we can continue. Next comes a struct. Structures should be familiar to everyone, so let us skip it for now and look directly at the surf function below. The #pragma directive above stated that our shader function is named surf; in other words, this code is the core of our shader's work. We have said more than once that a shader is code that, given inputs, produces colored output. CG stipulates the signature of a function declared as a surface shader function (ours being surf), so we have no say over the types of surf's input and output parameters and can only write it according to the rules. The rule: the first parameter is an Input structure, and the second is an inout SurfaceOutput structure.

What are they, respectively? Input is a structure that we need to define ourselves; it gives us an opportunity to put the data we need for the computation into it and pass it into the surf function. SurfaceOutput is an output structure whose type is already defined, but whose content is initially blank; we need to fill in the output so that shading can be completed. Now jump back and take a closer look at the Input structure defined above:

struct Input {float2 uv_MainTex;};

The struct used as input must be named Input, and it defines a float2 variable... You read that right, and I did not write it wrong: that is float2, the word "float" followed by the number 2. What does that mean? Actually there is nothing magical about it: both float and vec can be followed by a number from 2 to 4, representing 2 to 4 values of the same type packed together. For example, the definitions below:

// define a 2D vector variable
vec2 coordinate;
// define a color variable
float4 color;
// multiply to produce a color
float3 multipliedColor = color.rgb * coordinate.x;

When accessing these values, we can use just the name to get the whole group, or use swizzles (such as .xyzw and .rgba, or parts of them such as .x) to get single values. In this example we declare a variable named uv_MainTex consisting of two floating-point numbers.
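
The swizzle access just described can be sketched like this (the variable names are made up):

```
float4 color = float4(1.0, 0.5, 0.0, 1.0);
float3 rgb = color.rgb;   // (1.0, 0.5, 0.0)
float2 xy  = color.xy;    // the same storage, accessed with the spatial names
float  red = color.r;     // 1.0, identical to color.x
float3 bgr = color.bgr;   // components can also be reordered: (0.0, 0.5, 1.0)
```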

If you have had any exposure to 3D development, the two letters UV will not be unfamiliar. UV mapping maps points on a 2D texture onto a 3D model according to certain rules, and it is the most common vertex-processing technique in 3D rendering. In CG programs we have a convention: prefixing a texture variable's name (in our case _MainTex) with the two letters "uv" denotes extracting its UV values (which are really just the two-dimensional coordinates of points on the texture). We can then directly access uv_MainTex in the surf function to get the texture coordinates of the point currently being computed.
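
The naming convention can be sketched with a hypothetical second texture (_DetailTex is an assumed property name, not part of the example shader):

```
sampler2D _MainTex;
sampler2D _DetailTex;

struct Input {
    float2 uv_MainTex;   // "uv" + property name: the UVs used to sample _MainTex
    float2 uv_DetailTex; // the same convention brings in _DetailTex's UVs
};
```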

If you have persisted to this point, congratulations: you are only one step away from successfully reading a complete shader. Back to the surf function and its two parameters. The first is the Input: when computing the output, the shader calls the surf function many times, each time passing in a point's texture coordinates so the output can be computed. The second parameter is a writable SurfaceOutput. SurfaceOutput is a predefined output structure, and the goal of our surf function is to fill in that output structure based on the input. The SurfaceOutput structure is defined as follows:

struct SurfaceOutput {
    half3 Albedo;   // base color of the pixel
    half3 Normal;   // normal of the pixel
    half3 Emission; // emissive color of the pixel
    half Specular;  // specular power of the pixel
    half Gloss;     // gloss intensity of the pixel
    half Alpha;     // alpha (transparency) of the pixel
};

The half here is like the float we know: both represent floating-point numbers, just with different precision. You may be familiar with single-precision floating-point numbers (float) and double-precision floating-point numbers (double); half refers to half-precision floating-point numbers, with the lowest precision of the three. Because that precision is usually enough for shading and is cheaper to compute, half is used extensively in shaders.

In the example, what we do is very simple:

half4 c = tex2D (_MainTex, IN.uv_MainTex);
o.Albedo = c.rgb;
o.Alpha = c.a;

tex2D is a function used in CG programs to sample a point in a texture; it returns a float4. Here _MainTex is sampled at the input point; the RGB values of the resulting color are assigned to the output's pixel color (Albedo), and the A value to its transparency (Alpha). With that, the shader knows how to work: find the point on the texture corresponding to the UV coordinates and use its color information directly for shading. Done.
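
As a small variation on the example (a sketch: the _Color property is an assumption and would also need to be added to the Properties block and re-declared in the CG code), the sampled color could be tinted before it is output:

```
float4 _Color; // assumed extra property, e.g. _Color ("Tint", Color) = (1,1,1,1)

void surf (Input IN, inout SurfaceOutput o) {
    half4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb * _Color.rgb; // modulate the texture color by the tint
    o.Alpha = c.a * _Color.a;
}
```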

