"Translator" Unity3d Shader Beginner's tutorial (1/6)

Source: Internet
Author: User

When you start Unity3D shader programming, you will find that the documentation on shaders is quite scattered, which deters many beginners. This first article in the series (translator's note: this article, followed by five more) covers surface shaders in Unity3D in detail, laying the groundwork for learning more complex shader programming.

Motivation

If you're new to shader programming, you may not know where to begin. This tutorial takes you step by step through a surface shader (Surface Shader) and a fragment shader (Fragment Shader). It also describes some of the functions and variables used in Unity3D shader programming, which may differ from what you see elsewhere on the web!

I think you should read this article if you meet any of the following conditions:

    • You are a novice at shader programming.
    • You want to use shaders to do some cool things in your game, but you can't find a usable shader on the Internet (translator's note: O(╯-╰)o, so you have to roll your own).
    • A lack of basic knowledge makes it impossible for you to use the Strumpy Shader Editor (translator's note: Strumpy Shader Editor, a graphical way to write shaders; looks tempting!).
    • You want to handle textures manually in your shader code.

This is the first article in this series of tutorials; later we'll build some more complex shaders. By comparison, this first article is really simple.

About the author

I am also a novice at shader programming, which is exactly why I decided to write this tutorial to help you get started: I had a lot of trouble getting started myself. In fact, I'm no shader programming expert.

When I wanted to learn shader programming, I read the official documents over and over, but I eventually concluded that their order was not suited to how I learn shaders. So I decided to write a tutorial and share the knowledge I had gained. Interestingly, after finishing the tutorial, the official documents made much more sense when I read them again.

Although all the examples in this tutorial work, I'm sure there are better ways to implement them. If you have a better idea for any of the shaders in these examples, please leave a comment in the comments section!

I learned shader programming because I needed to create something for the game world I was building, a world populated with many different characters; to keep performance reasonable, I had to build each character as a single combined mesh assembled from multiple parts, so that each character costs only one draw call. (Translator: I wasn't sure about this passage, so the original text follows for readers to judge for themselves.)

My reason for getting into shader programming is that I needed to build something for a world populated with an endless array of different characters. I needed to build a combined mesh out of multiple parts so I would only have one draw call per character.

By turning the characters' clothing items on and off, I modify each character's base meshes using Megafiers (a morphing plugin). The difficulty is that I have only one texture, but I want different colors for each character's skin, costume, and other features. The method I hit upon was to use a different tiny 4x4 texture for each character, and use a shader to paint the model with it. I'll describe this shader in detail over the course of the tutorial, but for now, I expect you can't wait to see the characters I created performing an impromptu flash-mob dance (translator's note: a screenshot in the original web page).

Shaders and materials

What a shader does is render a model's mesh to the screen. A shader can define a series of properties (translator's note: just like the parameters of a function; changing the arguments changes the function's output), and by changing these properties you change how the model is rendered to the screen. These property values are stored in a place called a material.

Unity3D shaders come in the following types:

    • Surface shaders – do the vast majority of the work for you automatically in the background, which reduces your workload; they fit the vast majority of situations that call for a shader.
    • Fragment shaders – let you do more, but are harder to write. You can also use them for lower-level work such as vertex lighting (vertex lighting stores the lighting information at each vertex), which is useful on mobile devices. Fragment shaders are also needed for some advanced rendering effects that require multiple passes.

In this article, we will focus on the surface shader.

Resources for learning shaders

If you want to learn shader programming, I recommend the following resources:

    • Martin Kraus's fantastic wiki book GLSL Programming/Unity
    • Unity's Shader Reference
    • NVIDIA's tutorial on the Cg programming language
    • CreativeTD's video series on writing surface shaders

How shaders work: the pipeline

(Translator's note: a shader's workflow is also called the shader pipeline, because a shader works much like an assembly line: a model's vertex data and other inputs go in one end, are processed by the stages that make up the shader, and a cool effect comes out the other end.)

You will see various terms for the stages of the shader pipeline; I will use my own words to keep them as easy to understand as possible.

A shader's job is to take some 3D geometry as input and, after processing, turn it into 2D pixels rendered on the screen. The advantage is that during this process you only need to change a few properties to produce different effects. For surface shaders, the workflow looks like this:

(Translator's note: a brief explanation of this flowchart. The object to be rendered first passes its geometry into the shader, and the system obtains the object's vertex information; you can then optionally process those vertices with a vertex function. Next comes rasterization, which maps the 3D geometry onto the 2D screen (a rough analogy: flattening the 3D model onto the screen, after which you only deal with the pixels on the screen), and finally each pixel gets its final color value from your shader code.)

Note that the color of the pixel is not calculated until the surface shader's function exits. This means that you can, for example, modify the normals passed along from the vertices and thereby influence the lighting calculation.

The fragment shader (Fragment Shader) has the same workflow, except that the vertex function, which is optional in a surface shader, is mandatory in a fragment shader, and you must do much more work in the pixel-processing phase to produce the final pixel; the surface shader hides all of this from you. (Translator's note: in other words, the fragment shader gives the user more hooks for more advanced rendering.)

The following diagram shows how your code is called and how the code is composed.

As we can see, when you write a shader you define some properties and one or more SubShaders. Which SubShader is used for processing depends on the platform you are running on. You should also specify a fallback shader: when none of your SubShaders can run on the target device, the fallback shader is used instead (translator's note: a bit like a spare tire).

Each SubShader has at least one pass, which is where data goes in and comes out. You can use multiple passes to perform different operations: for example, in a grab pass you can capture the pixels that are about to be rendered to the screen (translator's note: similar to a frame buffer in GLSL), which is useful when you want to create advanced distortion effects; while you're just starting to learn shader programming, you may not use it. Another reason for multiple passes is that at different times you may need to enable or disable writing to the depth buffer.
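As a structural sketch only (the shader name is hypothetical, and the actual distortion math is beyond this article), a two-pass SubShader with a grab pass looks like this:

Shader "Custom/GrabSketch" {
    SubShader {
        Tags { "Queue" = "Transparent" }
        // first pass: capture everything rendered so far into _GrabTexture
        GrabPass { }
        // second pass: would normally sample _GrabTexture to distort what is
        // behind the object; here it just blends a faint constant color
        Pass {
            ZWrite Off                          // don't update the Z-buffer
            Blend SrcAlpha OneMinusSrcAlpha     // ordinary alpha blending
            Color (1, 1, 1, 0.2)
        }
    }
}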

When you write a surface shader, we write the code directly at the SubShader level, and the system compiles our code into several appropriate passes.

Although a shader ultimately produces 2D pixels, those pixels hold not only their XY coordinates but also a depth value (how far the content of each pixel was from the camera in the original 3D scene), so that when an object near the camera and an object far from the camera land on the same spot on the screen, the near object's pixel values overwrite the far object's.

You can control the depth buffer (Z-buffer) in your shader to produce certain effects, using instructions in a pass to determine whether the shader may write to the Z-buffer. For example, with ZWrite Off, nothing you output will update the Z-buffer values; that is, writing to the Z-buffer is turned off.

You can use this Z-buffer technique to punch a hole through another object: first write the depth values of the hole area, but not its pixel values; then the objects behind your model will not be drawn there (because the Z-buffer thinks your model is blocking them). (Translator's note: so the hole area shows whatever was rendered first, typically the background, creating the effect of a hole punched through those objects.)
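A minimal sketch of such a hole-punching (depth mask) shader, with an assumed name and an illustrative queue offset:

Shader "Custom/DepthHole" {
    SubShader {
        // render slightly before regular geometry so the depth is laid down first
        Tags { "Queue" = "Geometry-10" }
        Pass {
            ZWrite On      // write the hole's depth values...
            ColorMask 0    // ...but output no color at all
        }
    }
}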

Here is some shader code:
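This listing is the default diffuse surface shader that Unity generates for a new shader asset (our NewShader from the steps below); the rest of the article walks through it piece by piece:

Shader "Custom/NewShader" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}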

I hope you can see that the code above is made up of three blocks: Properties, SubShader, and Fallback.

Understanding Shader Code

The rest of the article explains what this simple code does. The real substance starts right away, and you'll need to master it.

When you do shader programming, you must use exactly the right variable and function names; on the plus side, in some cases a variable's name alone gives you a glimpse of its specific meaning.

Create and use the default shader

(Translator's note: before introducing shaders in detail, here is a brief look at how to use one.)

1. First open Unity (my version is 4.6.1), create a new project, and create three directories under the Assets folder.

2. Next, create a cube.

The material used by the newly created cube can be seen in the Inspector panel.

3. Open the Material folder, and create a shader and a material there.

At this point, the default shader for the new material is Diffuse.

Drag the NewShader onto the new material.

You can see that the shader used by the material has become our NewShader. Of course, you can also click the Shader drop-down box in the material editor and select the appropriate shader directly.

4. Finally, drag the new material onto the cube. You can see that the material and shader used by the cube are now the ones we just created.

Introduction to Properties (property values)

The Properties { ... } section of your shader code defines the shader's property values (a property value is data the user passes to the shader, such as a texture, which the shader then processes to produce its effect; you can think of a property as a global variable, with the shader as the main function, and Unity conveniently exposes these globals in the Inspector panel). Note that properties are shared by all SubShaders, meaning these property values can be used in every SubShader's code.

A property definition takes the form:

_Name ("Displayed Name", type) = default value [{options}]

    • _Name is the property's name as used inside the shader code, as distinct from the Displayed Name below, which is shown in the Inspector panel as an input hint to the user.
    • Displayed Name is the property's label in the material editor, displayed in the Inspector panel.

For example: open the NewShader we created. You can see that _MainTex is used in the code, while Base (RGB) appears in the material editor.

    • type is the property's type, one of the following:
      • Color – a solid color, in RGBA notation.
      • 2D – a texture whose dimensions are powers of 2 (2, 4, 8, 16, ... 256, 512).
      • Rect – also a texture, but unlike the one above, its dimensions need not be powers of 2.
      • Cube – a cube map used in 3D; the cube maps often mentioned for skyboxes are this type.
      • Range (min, max) – a value between min and max, adjusted in the panel with a slider.
      • Float – any single floating-point number.
      • Vector – a 4-dimensional vector value, essentially a type made up of four floating-point numbers.


  • default value is the property's initial value, equivalent to initializing a variable.
    • Color – (red, green, blue, alpha), a color in RGBA format where alpha is transparency; for example (1,1,1,1).
    • 2D/Rect/Cube – for the texture types introduced above, the initial value can be an empty string, or "white", "black", "gray", or "bump" (the last indicating that the texture is a bump map).
    • Float/Range – nothing special to say; the same as initializing a floating-point number.
    • Vector – a 4-component vector of floating-point numbers (x, y, z, w).
  • {options} – note that {options} applies only to the texture types (2D, Rect, and Cube mentioned above). For these types, if you have no options to specify, you must at least write an empty { }, or you will get a compile error. Separate multiple options with spaces. The available options include:
    • TexGen texgenmode: how texture coordinates are generated automatically. It can be one of ObjectLinear, EyeLinear, SphereMap, CubeReflect, or CubeNormal; these correspond directly to the texture-coordinate generation modes in OpenGL. Note that when you write a vertex function, this texture-coordinate generation mode is ignored.

Here are some examples of property definitions:

// defines a translucent (alpha = 0.5) red as the default color value
_MainColor ("Main Color", Color) = (1,0,0,0.5)
// defines a texture whose default value is plain white
_Texture ("Texture", 2D) = "white" {}

Note that there is no semicolon at the end of a property definition.

Tags

Your surface shader can be decorated with one or more tags. These tags tell the hardware when to call your shader code.

In our case we used Tags { "RenderType" = "Opaque" }, which means our shader is called when opaque geometry is being rendered; Unity defines a series of such render types. Another easy-to-understand tag is Tags { "RenderType" = "Transparent" }, meaning our shader will only output translucent or transparent pixel values.

Other useful tags include "IgnoreProjector" = "True", which means the object you render will not be affected by Projectors.

"queue" = "xxxx"(the label of the render queue is affixed to the object to which shader belongs). The queue label can produce some very interesting effects when the object type being rendered is a transparent object. The label determines the order in which the objects are rendered (the translator notes: I guess it works the same way, there are many objects in a scene, when these objects are rendered, there must be a rendering order, such as the background should be rendered before other objects, otherwise the background will be obscured by the previously rendered object, This is done by pasting a "Queue" = "backfround" label in the shader used in the background so that objects using the shader will be labeled background. In summary, when rendering the entire scene, unity determines in what order to render the object to which the corresponding tag belongs, based on the label of these render queues.

    • Background – rendered before all other objects; used for skyboxes and similar background effects.
    • Geometry (the default tag) – suited to most objects; opaque geometry uses this render order.
    • AlphaTest – pixels that pass the alpha test (alpha test means that if a pixel's alpha is below some threshold, the pixel is discarded) use this render order. It is a separate queue because alpha-tested objects render more efficiently after all solid objects have been drawn.
    • Transparent – objects with this tag are rendered after the Geometry and AlphaTest objects, and everything tagged Transparent is rendered back to front. Any alpha-blended object should use this tag (translator's note: alpha blending uses the pixel's alpha as a blending factor to mix the new pixel value with what is already in the buffer before writing; note that such shaders do not write to the depth buffer, because if depth writing were not turned off, objects behind the transparent object would be culled by the depth test and we would not see through it). Glass and particle effects are good fits for this queue.
    • Overlay – suited to overlay effects; anything rendered last can use this tag, such as a lens flare.

The interesting thing is that you can add to or subtract from these basic queue tags. The predefined values are really just a set of integers: Background = 1000, Geometry = 2000, AlphaTest = 2450, Transparent = 3000, and finally Overlay = 4000. (Translator's note: from this we can also see that larger values render later.) These offsets matter a great deal for transparent objects: for example, if a lake plane would cover the trees you made with billboards, you can use "Queue" = "Transparent-102" for your trees so that they are drawn before the lake.
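For instance, the billboard-tree shader just mentioned might combine the tags we have seen so far (a hypothetical example):

Tags { "Queue" = "Transparent-102" "IgnoreProjector" = "True" "RenderType" = "Transparent" }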

The overall structure of a shader

Let's review the structure of the shader code.

The line #pragma surface surf Lambert means: surface declares this as a surface shader, the function that produces the output is named surf, and the lighting model used is the Lambert lighting model.

Our CG program is written in a modified C-like language, Cg (a shading language developed jointly by NVIDIA and Microsoft). Refer to NVIDIA's documentation; I will also introduce some basic Cg usage in this article.

Floating-point (float) and vector (vec) types usually have one of the numbers 2, 3, or 4 appended (float2, float4, vec3, ...) to indicate how many components the type contains. This makes numerical operations easier: you can operate on the values as a whole or use their components individually.

// defines a two-dimensional floating-point coordinate
float2 coordinate;
// defines a color variable (a color with 4 floating-point components)
float4 color;
// a 3-component color produced by multiplying by a scalar
float3 multipliedColor = color.rgb * coordinate.x;

You can use .xyzw or .rgba to indicate the intended meaning of the type you're using: .xyzw might represent a quaternion rotation, .xyz a position or normal vector, and .rgba a color. Of course, you can also just use float as a single floating-point value. Using component accessors like .rgba this way is known as swizzling; it is especially handy for color processing, such as color-space conversions, for example: color = color.abgr;

You will also encounter the half and double types. half (generally 16-bit) has half the precision of a normal float (generally 32-bit), and double (generally 64-bit) has twice the precision (measured in bits of storage, not in representable range). half is often used for performance reasons. There is also fixed, a fixed-point type that differs from the floating-point types and has lower precision.

When you want to clamp color values into the 0-1 range, think of the saturate function: saturate(x) returns 0 if x is less than 0, returns 1 if x is greater than 1, and returns x itself if x is between 0 and 1. saturate can also be applied to swizzled variables, such as saturate(someColor.rgb);

You can use the length function to get the length of a vector, e.g. float size = length(someVec4.xz);

How to output information from a surface shader

Our surface function is called once per pixel; the system has already worked out the input values for the pixel currently being processed (the Input struct, whose exact contents are up to you), interpolated across each triangle of the mesh.

Take a look at our surf function:

void surf (Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}

Clearly we set the o.Albedo value, a member of the SurfaceOutput struct that Unity defines for us; Albedo represents the color of the pixel. Let's look at what members SurfaceOutput actually defines:

struct SurfaceOutput {
    half3 Albedo;    // the color of the pixel
    half3 Normal;    // the normal of the pixel
    half3 Emission;  // the color of light the pixel emits itself
    half Specular;   // specular power
    half Gloss;      // specular intensity
    half Alpha;      // transparency of the pixel
};

All you have to do is fill in the values of this struct and hand it to Unity; Unity automatically produces the final effect from these values, without your needing to worry about the details.

I promise you: the real substance is below.

First, what is the input to our surf function?

We have defined an input structure as follows:

struct Input {
    float2 uv_MainTex;
};

Simply by declaring this struct, we tell the system to fetch the texture coordinates of _MainTex for the pixel each time the surf function is called. If we had a second texture called _OtherTexture, we could get its texture coordinates by adding a line to the Input struct:

struct Input {
    float2 uv_MainTex;
    float2 uv_OtherTexture;
};

If the other texture uses the model's second set of texture coordinates, we can do this:

struct Input {
    float2 uv_MainTex;
    float2 uv2_OtherTexture;
};

For every texture we use, our Input struct contains a set of uv coordinates (or uv2 coordinates).

If our shader is more complex and needs to know other information about the pixel, we can request it by including the following variables in the Input struct:

  • float3 viewDir – the view direction. To compute parallax effects, rim lighting, and so on, you need to include the view direction (see the rim-lighting sketch after this list).
  • float4 with the COLOR semantic (for example float4 currentColor : COLOR; the name is up to you) – the interpolated per-vertex color.
  • float4 screenPos – the position in screen space, needed for reflection-style effects.
  • float3 worldPos – the position in world space.
  • float3 worldRefl – the reflection vector in world space. Provided only if the surface shader does not write to Normal in the SurfaceOutput struct.
  • float3 worldNormal – the normal vector in world space. Provided only if the surface shader does not write to Normal in the SurfaceOutput struct.
  • INTERNAL_DATA – appended to the two declarations above (float3 worldRefl; INTERNAL_DATA and float3 worldNormal; INTERNAL_DATA) when the surface shader does write to Normal in the SurfaceOutput struct. To get a reflection vector based on the per-pixel normal map, use WorldReflectionVector (IN, o.Normal), where o.Normal is a tangent-space normal, not a world-space one.
  • You might ask, what does the COLOR semantic mean? When you write a regular fragment shader, you have to declare what each variable in your input struct represents. If you were crazy enough, you could write float2 myUncleFred : TEXCOORD0; and tell everyone that myUncleFred holds the model's UV coordinates. In a surface shader, the only semantic you need to care about is COLOR: float4 currentColor : COLOR; can be read as "the pixel color as interpolated so far". Beyond that you don't have to worry, but standard, descriptive names are recommended, for your own convenience and everyone else's.
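To illustrate viewDir, here is a rim-lighting sketch adapted from the standard Unity surface shader examples; _RimColor and _RimPower are assumed extra properties, not part of our NewShader:

struct Input {
    float2 uv_MainTex;
    float3 viewDir;   // requested here, so Unity fills it in per pixel
};

sampler2D _MainTex;
float4 _RimColor;    // assumed property: the color of the rim glow
float _RimPower;     // assumed property: how tight the rim is

void surf (Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
    // 0 when looking straight at the surface, 1 at grazing angles
    half rim = 1.0 - saturate (dot (normalize (IN.viewDir), o.Normal));
    o.Emission = _RimColor.rgb * pow (rim, _RimPower);
}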

What does the shader actually do?

That leaves two lines of code we haven't discussed in detail:

sampler2D _MainTex;

For each property we define in the Properties section, we declare a corresponding variable to be used in the CG program. The two names must be identical.

Note that uv_MainTex in the Input struct is "uv" plus the corresponding property name (_MainTex; the leading underscore is the officially recommended Cg naming convention), and if you use the second UV set, it becomes uv2_MainTex. The _MainTex variable declared by sampler2D _MainTex is a sampler2D (which you can think of as a reference to a 2D texture), and it refers to the _MainTex in Properties (note that the two names are the same). Given a pair of UV coordinates, the sampler provides the pixel value at that point on the texture, and the role of uv_MainTex here is to supply the UV coordinates into the texture _MainTex.

Having explained what a sampler2D is, why does _MainTex need a declaration here at all, when we already declared it as a texture in Properties? The answer is that our example shader actually consists of two relatively independent blocks: the outer part (the property declarations, the fallback, and so on) is ShaderLab, which Unity can use and compile directly, while everything between CGPROGRAM ... ENDCG is a CG program. For the CG program to access a variable defined in Properties, you must declare a variable with the same name inside it. So what sampler2D _MainTex really does is re-declare _MainTex and link it up, so that the CG program that follows can use the variable.

If we had defined a _Color property, the corresponding variable would be declared as:

float4 _Color;
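As a sketch of how the link works (assuming we add _Color ("Main Color", Color) = (1,1,1,1) to the Properties block), the re-declared variable can then be used in surf to tint the texture:

// inside CGPROGRAM ... ENDCG
sampler2D _MainTex;
float4 _Color;   // links to the _Color property by name

void surf (Input IN, inout SurfaceOutput o) {
    // tint the texture with the user-chosen color
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb * _Color.rgb;
}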

The only line of code in our surf function is:

o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;

What tex2D does is sample the texture _MainTex at the UV coordinates given by IN.uv_MainTex (note that since we did not specify a TexGen mode, the IN.uv_MainTex here is generated automatically). For o.Albedo we take only the RGB components of the color; the alpha value (transparency) is not needed for now; at least for an opaque object, alpha doesn't mean much.

If you want to set the alpha value as well, you can assign it like this:

float4 texColor = tex2D (_MainTex, IN.uv_MainTex);
o.Albedo = texColor.rgb;
o.Alpha = texColor.a;

Summary

You've learned a lot of terminology today, and admittedly the shader we wrote is quite limited. But once we finish the second part of this tutorial, we'll be able to make some cool shaders, because in part two we start using fun techniques such as multiple textures, normals, and more.

    • In part two, we create a shader that implements a snow effect, modifying the model so it shows different amounts of snow cover.
    • In part three, we improve our shader to blend the snow at the edges of the rock.
    • In part four, we use black outlines and a ramp texture to create a cartoon-style shader.
    • In part five, we create a vertex/fragment multipass bumped shader, whose complexity goes well beyond surface shaders.
    • In part six, we create a vertex/fragment shader that produces a better cartoon effect than the surface shader version we made in part four.

