Esfog_UnityShader Tutorial: UnityShader Syntax Example Analysis

It has been quite a while since the preface; I have been fairly busy, but it is the weekend and I can't put this off any longer. After some thought, I have decided that this tutorial series will avoid going into too much detail. That keeps readers from being misled by side issues, keeps the articles from becoming long and hard to read, and leaves you more time to think for yourselves. If I want to cover the details, I will start a separate series for them.

An analysis of UnityShader syntax by example

Last time, in the preface, I gave a rough overview of how rendering works and where shaders fit into it. This tutorial series focuses on practical application, so to lay the groundwork for later chapters, this section analyzes the grammatical structure of a UnityShader. If you haven't read the preface yet, I suggest you read it first; the link is http://www.cnblogs.com/Esfog/p/3534435.html.

Let's start by looking at a very simple but complete piece of shader code. Unless otherwise noted, every UnityShader in this series is a vertex & fragment shader.

  

 1 Shader "Esfog/simpleshader" 2 {3 Properties 4 {5 _maintex ("Base (RGB)", 2D) = "White" {} 6} 7             8 Subshader 9 {Pass11 {ten Tags {"Rendertype" = "Opaque"}13 14 CGPROGRAM15 #pragma vertex vert16 #pragma fragment frag17 #include "Unitycg.cgin  C "Uniform sampler2d _maintex;19 struct VERTEXOUTPUT21 {float4 pos:sv_position;23 float2 uv_maintex:texcoord0;24};25 vertexoutput Vert (APPDA Ta_base input) {vertexoutput o;29 O.pos = Mul (unity_matrix_mvp,input.vert ex); O.uv_maintex = input.texcoord.xy;31 return o;32}33 Floa                 T4 frag (vertexoutput i): COLOR35 {float4 col = tex2d (_maintex,i.uv_maintex); 37       Return col;38      }39 ENDCG41}42} FallBack "Diffuse" 44} 

Okay, let's analyze this code bit by bit. First, a suggestion: if you run into something you don't understand for the moment, feel free to skip it. I won't be exhaustive here; many points only become clear later, once we combine them with concrete examples. Learning this material is not an overnight thing, it builds up with practice, and many things that made no sense to me at first became clear only slowly. What matters is that you keep at it and stay curious.

The 1st line, Shader "Esfog/SimpleShader", is similar to giving the shader a path-style name: the part before the slash acts like a folder, and the whole string determines where this shader appears when you pick a shader for a material in the Inspector.

Lines 3 to 6 are where you define the shader's adjustable parameters. Once you have assigned the shader to a material in Unity, you can tweak these variables in the Inspector panel to change the shader's appearance. We often cannot fix every parameter in advance, so we expose these variables and let the artists (or ourselves) tune them to get the desired effect. Here we define only a single 2D texture variable. Let's look at the line _MainTex ("Base (RGB)", 2D) = "white" {}. _MainTex (the name is arbitrary, but a leading underscore is recommended) is the identifier we use for this variable inside the shader. Base (RGB) is the display name shown in the Inspector. The 2D after it is the variable's type; the Unity documentation lists several others, such as Float, Color and Range, so I won't go through them all. After the equals sign comes the default value, and the default-value syntax differs from type to type, so remember this form for 2D textures.
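For reference, here is a small sketch of a Properties block with a few more types. Only _MainTex comes from the article's shader; the names _MyFloat, _MyRange, _MyColor and _MyVector are made up purely for illustration:

    Properties
    {
        _MainTex  ("Base (RGB)", 2D) = "white" {}      // a 2D texture, default "white"
        _MyFloat  ("A Float",  Float) = 1.0             // a single number
        _MyRange  ("A Range",  Range(0, 1)) = 0.5       // a number shown as a slider
        _MyColor  ("A Color",  Color) = (1, 1, 1, 1)    // an RGBA color
        _MyVector ("A Vector", Vector) = (0, 0, 0, 0)   // a 4-component vector
    }

Notice how each type has its own default-value syntax, which is exactly the point made above.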

Line 8, SubShader, needs some explanation. A shader can contain any number of SubShaders, but only one of them is selected by the graphics card and actually executed. Our shader code may have to run on many different GPU models, and each rendering device has different capabilities, so we often write several SubShaders targeted at devices of different rendering ability; the graphics card then automatically picks the one best suited to its own performance. If none of the SubShaders we wrote ends up selected, look at line 43: the shader named after FallBack is executed instead. In general, though, learning to write a single SubShader that works on your own machine is enough for now.
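As a structural sketch of that idea (the fixed-function Color passes are placeholders I made up so that each SubShader draws something; they are not from the article):

    Shader "Esfog/Hypothetical/TwoSubShaders"
    {
        SubShader        // tried first; used if this device can run everything in it
        {
            Pass { Color (1, 0, 0, 1) }        // placeholder: draw solid red
        }
        SubShader        // a simpler alternative for weaker devices
        {
            Pass { Color (0.5, 0.5, 0.5, 1) }  // placeholder: draw solid grey
        }
        FallBack "Diffuse"                     // used only if no SubShader above can run
    }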

The Pass on line 10 is also very important. Inside one SubShader we can define multiple Passes, and unlike SubShaders, every Pass you write is executed, in order. The concrete shader code goes inside the Pass. Why would there be multiple Passes? Usually to achieve some special effect; the shaders written by the experts you see online often have several. One more thing worth mentioning: you cannot write a Pass yourself in a Unity surface shader, because Unity's lighting model generates the Passes automatically and will not let you add extra ones.
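To make the "every Pass runs, in order" point concrete, here is a sketch of a SubShader with two Passes (again using made-up fixed-function Color statements as placeholders, not code from the article):

    Shader "Esfog/Hypothetical/TwoPasses"
    {
        SubShader
        {
            // Both Passes execute each time the object is drawn, in this order.
            Pass { Color (0, 0, 0, 1) }        // first pass: flat black
            Pass
            {
                Blend One One                   // additive blending over the first pass
                Color (0.2, 0.2, 0.2, 1)        // second pass: brightens the result slightly
            }
        }
    }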

Line 12, Tags { "RenderType" = "Opaque" }, is one of the configurable options that ShaderLab (the part of a UnityShader outside the CG code) provides. You can write one per Pass, or put it directly under the SubShader so that all Passes share the same configuration. Besides Tags there are also Cull, Blend, ZTest and so on. They are all very important; we will cover them in later chapters when concrete problems call for them, and interested readers can look them up in the documentation now. Briefly, the syntax inside Tags is similar to the way HTML defines attributes on an element, and it is used to configure certain rendering parameters. Here "RenderType" = "Opaque" tells the rendering device that a material using this shader is an opaque object. What is that for? A scene contains many translucent and opaque objects, and different kinds of materials are rendered somewhat differently so that the final result shown in the game scene comes out correct.
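As a hedged sketch of what such configuration can look like when written at the SubShader level, here is a hypothetical setup for a translucent object (the tag values and state commands are illustrative choices of mine, not part of the article's shader):

    Shader "Esfog/Hypothetical/TransparentExample"
    {
        SubShader
        {
            Tags { "RenderType" = "Transparent" "Queue" = "Transparent" }
            Pass
            {
                Cull Off                          // draw back faces as well as front faces
                ZWrite Off                        // don't write depth for a translucent object
                Blend SrcAlpha OneMinusSrcAlpha   // classic alpha blending
                Color (1, 1, 1, 0.5)              // placeholder: half-transparent white
            }
        }
    }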

Lines 14 to 40, the part between CGPROGRAM and ENDCG, are the real CG code; our vertex shader and fragment shader are written in here. Let's go through it piece by piece.

Lines 15 and 16 tell the rendering device what the vertex shader (vertex) and fragment shader (fragment) functions are called, so that the graphics card can find them when rendering; they must be declared. We then use exactly these names when we write the shader functions below. They do not have to be vert and frag; just make sure they match the function names you use later.

Line 17, #include "UnityCG.cginc", is there because Unity provides us with a large number of ready-made variables and constants, such as various space-transformation matrices, which improve our development efficiency; you need this include in order to use them. After Unity 4.0 you can omit this line, since it is included by default, but for backwards compatibility it is still recommended to write it.

Line 18, uniform sampler2D _MainTex;, declares a variable named _MainTex whose type is sampler2D (a 2D texture that can be sampled). The leading uniform may be omitted; it means the variable is assigned from outside the CG code, and for rigor I suggest keeping it. You will notice that this _MainTex has exactly the same name as the _MainTex in our Properties block; as mentioned before, this is how the variables declared in Properties become usable inside the shader. Of course, Properties is not the only way to feed external data to a shader; you can also assign shader variables directly from a script, which we will cover later with a concrete example.
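Just to illustrate the name-matching, here is a sketch of a hypothetical Properties block and the CG declarations that would pair with it (the _TintColor and _Power entries are my own additions, not part of the article's shader); each Properties entry is matched by a CG declaration of a compatible type and the same name:

    Properties
    {
        _MainTex   ("Base (RGB)", 2D) = "white" {}
        _TintColor ("Tint Color", Color) = (1, 1, 1, 1)
        _Power     ("Power", Range(0, 8)) = 1
    }
    // ...and inside CGPROGRAM/ENDCG the matching declarations would be:
    //     uniform sampler2D _MainTex;
    //     uniform float4    _TintColor;
    //     uniform float     _Power;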


Lines 20 to 24 declare the output structure of the vertex shader, which is also the input structure of the fragment shader. You could also return the results through out parameters without using a struct, but for readability a struct is better. It looks like a C struct, but the types differ somewhat from C: there are types such as float4, float2 and float4x4, which you can look up in the Unity documentation. Remember that these are basic CG types, not C types; the graphics card does not work the way a CPU does, and these are the kinds of variables a rendering device handles most conveniently.

The biggest difference from C is that each member declaration is followed by something like : SV_POSITION. In CG this is called a semantic, and its purpose is to associate a shader variable with a corresponding register on the graphics card; this is closely tied to the rendering pipeline. A simple way to understand it: the graphics card sets aside specific registers to hold certain special variables, so they can be fetched directly when needed during rendering; this is faster and also makes communication between the vertex shader and the pixel shader more convenient. That is my personal understanding; for a more professional description, see Kang Yuzhi's book on GPU programming and the CG language.

SV_POSITION is used specifically to store the model vertex's coordinates in projection space; if the transformations involved are not clear to you, refer to my earlier preface. TEXCOORD0 can store whatever you want to pass along, and that value is interpolated during rasterization; TEXCOORD1, TEXCOORD2 and so on are similar, and how many are available depends on the graphics card. Our struct has two members: the vertex's projection-space position, which is required, and the vertex's texture coordinate, which is not required but which I use here to color the model with that 2D texture.

A quick word on what texture coordinates are: they are the UV coordinates. When artists build a model, in order to wrap a 2D texture around a 3D model, each vertex of the model is assigned a texture coordinate. It is a two-dimensional coordinate, and you use it to look up the corresponding color on the texture. During rasterization the texture coordinates are interpolated, so even though we only specify texture coordinates at a few vertices, the whole model ends up covered with texture color. If this is still unclear, look it up, or skip it for now.
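As a small sketch of where extra interpolated data could go (the struct name and the worldNormal field are my own additions for illustration, not part of the article's shader):

    struct VertexOutputEx
    {
        float4 pos         : SV_POSITION;  // projection-space position, required
        float2 uv_MainTex  : TEXCOORD0;    // texture coordinates, interpolated during rasterization
        float3 worldNormal : TEXCOORD1;    // extra per-vertex data can ride along in TEXCOORD1, TEXCOORD2, ...
    };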

Lines 26 to 32: finally the exciting part, the vertex shader itself. The shader (just think of it as a function, don't worry about the terminology) returns the output struct we just declared, and its name is the one we associated with #pragma vertex at the start of the CG code. Note its parameter: the vertex shader receives the model's most primitive data, that is, the per-vertex data the artists built into the model, including vertex position, normal, texture coordinates, color and so on. How does the shader get hold of these? Create a new Cube in the scene and you will see that any object rendered to the screen has a MeshRenderer component; the MeshRenderer passes the vertex data to the shader, and inside the shader these values are bound not by variable names but through the semantics we just mentioned. appdata_base is a struct Unity has already defined for us, pulled in by the #include "UnityCG.cginc" we wrote above. You can find its source code in the Unity install directory, in the Editor/Data/CGIncludes folder:

struct appdata_base {
    float4 vertex : POSITION;
    float3 normal : NORMAL;
    float4 texcoord : TEXCOORD0;
};
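Incidentally, nothing forces you to use appdata_base: you could declare your own input struct, because it is the semantics, not the struct or field names, that determine which piece of mesh data lands in each field. A sketch (my own illustration, not from the article):

    struct MyVertexInput
    {
        float4 vertex   : POSITION;   // model-space vertex position
        float3 normal   : NORMAL;     // model-space normal
        float4 texcoord : TEXCOORD0;  // first UV set
    };
    // vert(MyVertexInput input) would then behave exactly like vert(appdata_base input).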

So inside the vertex shader we can read the model's vertex data directly as appdata_base.xxx (here, input.xxx). Since we need to return an output struct, line 28 first defines a struct variable, and then we assign its members one by one. The model's original vertex position is in model space, while the position in our output struct must be in projection space, so we have to perform a series of space transformations. If you read my preface you will know the order: first the "model space -> world space" transformation, then "world space -> camera space", and finally "camera space -> projection space". That sounds cumbersome, but Unity has already done it for us: UNITY_MATRIX_MVP is the combined matrix of this whole series of transformations. If you want to understand the matrix math, consult a linear algebra reference. So we just multiply our model-space position by this transformation matrix. Matrix multiplication is done with the CG function mul(). But note that mul(UNITY_MATRIX_MVP, input.vertex) and mul(input.vertex, UNITY_MATRIX_MVP) are not the same thing: with the convention Unity uses, we must put the matrix first and right-multiply it by the vertex position to get the correct result; again, consult a linear algebra reference if you want the reason. So with o.pos = mul(UNITY_MATRIX_MVP, input.vertex) we have finished assigning the first member.

Now look at o.uv_MainTex = input.texcoord.xy;. The model's texture coordinates were already in TEXCOORD0 to begin with, so why do we have to copy them into an output member with the same semantic again? I am not entirely sure; my guess is that only the values returned by the vertex shader get interpolated, so they need to be written out again. If anyone knows, please tell me. Since we only need the first two components of the texture coordinate, we extract them with .xy. This .rgba / .xyzw syntax is a special feature of CG: because CG variables are mostly float3 or float4, CG provides this quick way of pulling out sub-components, which improves efficiency. Why do we discard the z and w of texcoord? Because we don't use them; at least in this example, x and y are enough to find the corresponding color on the texture, so to reduce the amount of computation we drop them. With the output struct fully assigned, we can return it.
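A few standalone sketches of that .xyzw/.rgba component syntax in plain CG, just to show what it can do (these lines are not tied to this shader and would sit inside some function body):

    float4 v = float4(1.0, 2.0, 3.0, 4.0);
    float2 a = v.xy;    // first two components: (1.0, 2.0)
    float3 b = v.rgb;   // same components with color-style names: (1.0, 2.0, 3.0)
    float4 r = v.wzyx;  // components can also be reordered: (4.0, 3.0, 2.0, 1.0)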

Lines 34 to 38 are the code for the fragment shader. The fragment shader's task is to compute a final color that is handed to the rendering device for final processing, so its return type is float4 (RGBA). The function name matches what we declared with #pragma fragment, and its parameter is simply the vertex shader's output struct. At the very end there is a : COLOR, which associates the return value with the register of the COLOR semantic, so the rendering device picks up the fragment shader's result there for the last stage of processing. Inside we only use one function, tex2D, which looks up a color in a 2D texture according to a texture coordinate: the first argument is the texture, that is, the _MainTex we declared, and the second is the coordinate to sample at, which is simply the texture coordinate we filled into the vertex output struct. It returns the corresponding color from the texture. One extra remark: tex2D cannot be used in the vertex shader, due to the structure of the rendering pipeline and the limits of current graphics hardware, but this will surely be solved in the near future.
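As a small variation (my own sketch, assuming a hypothetical _TintColor property had been added to Properties and declared as uniform float4 _TintColor in the CG code), the sampled color could be tinted before being returned:

    float4 frag(VertexOutput i) : COLOR
    {
        float4 col = tex2D(_MainTex, i.uv_MainTex);  // sample the texture as before
        return col * _TintColor;                     // _TintColor is a hypothetical added uniform
    }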

  

(~o~)~ Having written this far, the chapter is almost over and I can relax; without noticing it I have been writing for three hours. Finally, let's look at the result. This shader only does the most basic thing: applying the texture's color.

If you have never used a shader at all, I recommend reading the official documentation; here I will just briefly describe the steps:

1. Create a new shader asset first, then write your code into it.

2. Create a new material (or use an existing one), pick the shader you just wrote from the material's Shader dropdown, and set the appropriate parameters; in this example, drag your texture onto the texture slot.

  

3. Drag the material onto your model, or modify the shader of the material your model already uses.

It's best to look at the result and at the two textures used, to get a better feel for what texture coordinates mean.

  

The above is the default "Diffuse" shader, with no texture applied.

This one uses the shader we just wrote, with the texture applied.

Below are the two textures we used. In actual game development these are provided by the artists, but if you want to go into technical art you can learn to make them as well.

  

All right everyone, that's the end of this chapter. If anything here is unclear, or you find a mistake, please leave a comment and point it out. I also encourage you to write blog posts of your own; sharing with others is a good way to improve. Thank you~

Please respect other people's work. Reprinting is welcome, but please credit the author, Esfog, and the original address http://www.cnblogs.com/Esfog/p/3562022.html.

