Shader Programming Learning Notes (II): Shaders and the Rendering Pipeline

Shaders and the rendering pipeline

What is a shader

A shader is a short program fragment that tells the graphics hardware how to calculate and output an image. Shaders were originally written in assembly language, but today they can be written in high-level languages. In a nutshell: a shader is an algorithm fragment in the programmable graphics pipeline.
Shaders are divided into two main categories: vertex shaders and fragment shaders.
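
To make this concrete, here is a minimal Unity ShaderLab sketch showing the two kinds of shader side by side in one pass (the shader name and the output color are arbitrary):

    Shader "Notes/MinimalExample"   // hypothetical name
    {
        SubShader
        {
            Pass
            {
                CGPROGRAM
                #pragma vertex vert     // entry point of the vertex shader
                #pragma fragment frag   // entry point of the fragment shader
                #include "UnityCG.cginc"

                struct appdata { float4 vertex : POSITION; };
                struct v2f     { float4 pos : SV_POSITION; };

                // Vertex shader: runs once per vertex
                v2f vert (appdata v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex); // object space -> clip space
                    return o;
                }

                // Fragment shader: runs once per covered pixel
                fixed4 frag (v2f i) : SV_Target
                {
                    return fixed4(1, 0, 0, 1); // solid red
                }
                ENDCG
            }
        }
    }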

What is a rendering pipeline

The rendering pipeline, also known as the graphics pipeline, is made up of parallel processing units inside the display chip that process graphics signals independently of one another. A pipeline is a sequence of stages that run in parallel, each in a fixed order. Like a car assembly line, where different stations work on different cars at the same time, the traditional graphics hardware pipeline processes large numbers of vertices, geometric primitives, and fragments in a streaming fashion.

Note that the stages feed one another: the output of one stage becomes the input of the next. For example, the data computed by the vertex program is handed on as raw material for the further processing done by the fragment program.
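
In Unity's Cg/HLSL this handoff is written out explicitly as a vertex-to-fragment struct. A minimal sketch (the struct and field names are my own choice for illustration):

    // The vertex shader fills this struct; the rasterizer interpolates it
    // and hands the result to the fragment shader as its input.
    struct v2f
    {
        float4 pos    : SV_POSITION; // clip-space position, consumed by the rasterizer
        float3 normal : TEXCOORD0;   // extra data we choose to pass downstream
    };

    v2f vert (appdata_base v)
    {
        v2f o;
        o.pos    = UnityObjectToClipPos(v.vertex);
        o.normal = UnityObjectToWorldNormal(v.normal); // computed here...
        return o;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        // ...and used here as raw material for the per-pixel calculation.
        return fixed4(i.normal * 0.5 + 0.5, 1);
    }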

To understand the rendering pipeline more visually, take a look at the following illustration.

  The top is "3D app or Game", "3D app or game" will call "3D application Interface", namely OpenGL or DirectX. OpenGL and DirectX are the middle tiers that make it easy for applications to access and invoke hardware. Without them, we need to write a very complex hardware-specific driver for the hardware.
Next comes the dividing line between the CPU and the GPU: everything above the line runs on the CPU, and everything below it runs on the GPU.
The GPU works from left to right. The stages from the "GPU front end" through "primitive assembly" used to form the fixed T&L (transform and lighting) pipeline, an operation sequence designed directly into the hardware that could not be controlled by a program. Once graphics hardware became programmable, we could write our own logic for this stage in a vertex shader, replacing the operations that were previously wired into the hardware. When this stage is finished, processing moves on to the "rasterization and interpolation" stage.
Rasterization is the step in computer graphics that subdivides the data being processed so it maps onto each concrete pixel the screen displays. Rasterization is not the same as displaying pixels: what a pixel finally shows is a color, while the result of rasterization lands in the frame buffer. Into this process we can insert a fragment shader, another programmable step. The goal of the fragment shader is to calculate the final color of every pixel that will be shown on the screen.
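
A small sketch of what rasterization means for the fragment shader's input: a value written once per vertex arrives smoothly interpolated at every pixel the triangle covers (this assumes the mesh carries vertex colors):

    struct v2f
    {
        float4 pos : SV_POSITION;
        fixed4 col : COLOR;          // interpolated by the rasterizer
    };

    v2f vert (appdata_full v)
    {
        v2f o;
        o.pos = UnityObjectToClipPos(v.vertex);
        o.col = v.color;             // written once per vertex
        return o;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        return i.col;                // read per pixel, already interpolated;
                                     // this value is what lands in the frame buffer
    }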

For Unity, the process above can be described as follows.

The top "Geometry" is a geometric model, the geometric model into "Unity", it can be understood that the geometric model mesh, mesh and other data to unity,unity import, through the Unity engine to invoke the "Grphics APU" graphics API, The process of invoking the graphics API is to drive the GPU for processing operations.
The first thing the GPU runs is the "Vertex Processor", which executes the "Vertex Shader" we write. The result of the vertex shader is handed to the "Pixel Processor", that is, the fragment processor, for which we write a "Pixel Shader" (pixel/fragment shader) program to do the per-pixel processing. Its output is the color information that finally reaches the screen, stored in what we call the "frame buffer". The frame buffer holds the data the computer displays frame after frame, and not only color data: it also carries additional information such as depth values.
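
For reference, the mesh data entering the vertex processor is declared through semantics that bind each field to a mesh attribute; a typical input struct looks like this (the field names are arbitrary, the semantics are not):

    // Each field is bound to a mesh attribute by its semantic.
    struct appdata
    {
        float4 vertex : POSITION;   // vertex position from the mesh
        float3 normal : NORMAL;     // vertex normal from the mesh
        float2 uv     : TEXCOORD0;  // first UV channel from the mesh
    };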

Here is the rendering pipeline diagram from the official Unity manual.

In the leftmost part of this rendering pipeline, "Transform" refers to the spatial transformation of the model, chiefly the geometric transformation applied to its vertices; "TexGen" (texture generation) refers to generating texture coordinates, i.e., producing UV values in the proper range from the vertices; "Lighting" refers to lighting. Together this part is the old T&L (transform and lighting) pipeline; once graphics hardware gained programmability, this fixed module was replaced by the "Vertex Shader".
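
A sketch of those three fixed-function jobs re-expressed as a vertex shader: a crude per-vertex Lambert term, assuming a forward-base pass (so that _WorldSpaceLightPos0 holds the main directional light) and a _MainTex property on the shader:

    sampler2D _MainTex;
    float4 _MainTex_ST;

    struct v2f
    {
        float4 pos  : SV_POSITION;
        float2 uv   : TEXCOORD0;
        fixed  diff : COLOR0;        // per-vertex lighting result
    };

    v2f vert (appdata_base v)
    {
        v2f o;
        o.pos  = UnityObjectToClipPos(v.vertex);           // Transform: object to clip space
        o.uv   = TRANSFORM_TEX(v.texcoord, _MainTex);      // TexGen: UVs scaled into range
        float3 n = UnityObjectToWorldNormal(v.normal);
        o.diff = max(0, dot(n, _WorldSpaceLightPos0.xyz)); // Lighting: Lambert diffuse
        return o;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        return tex2D(_MainTex, i.uv) * i.diff;             // modulate the sampled texture
    }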

After the vertex shader has run, Unity enters the "Culling & Depth test" stage, i.e., culling and depth testing. Culling exploits the fact that when an object is shown in front of the camera, only the faces oriented toward the camera can be observed; the faces turned away from the camera cannot. Culling removes those unseen faces outright to reduce the amount of data the GPU has to process, and the vertex data belonging to those polygons never needs to be handled, which speeds up graphics processing. The depth test reflects a property of the camera: a computer has no concept of infinity, its data are discrete and bounded, so anything outside the range between the nearest and farthest planes is removed.
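
Unity exposes both operations as ShaderLab state commands inside a Pass; the values shown here are the usual defaults:

    Pass
    {
        Cull Back     // skip faces pointing away from the camera
        ZTest LEqual  // keep a fragment only if it is at least as close as what is stored
        ZWrite On     // record the surviving fragment's depth in the depth buffer
        // ... CGPROGRAM block goes here ...
    }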

The next stage is texture sampling ("Texturing") and fog ("Fog"). This is where rasterization actually happens, in the sense of deciding what color each pixel shows on the screen. Here we perform texture sampling: a texture map holds a great deal of data, and reading the color value at a particular point of the texture is called texture sampling. Fog decides, based on the computed result, whether to apply an atmospheric fade, so that nearby objects look crisp while distant ones look hazy. This part falls within the programmable scope of the fragment shader.
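
In a Unity fragment program, texture sampling is a single tex2D call, and fog is wrapped in helper macros from UnityCG.cginc. A sketch, assuming the shader declares a _MainTex property and enables the fog variants:

    // inside CGPROGRAM; requires:  #pragma multi_compile_fog  and  #include "UnityCG.cginc"
    sampler2D _MainTex;
    float4 _MainTex_ST;

    struct v2f
    {
        float4 pos : SV_POSITION;
        float2 uv  : TEXCOORD0;
        UNITY_FOG_COORDS(1)                  // interpolator for the fog data
    };

    v2f vert (appdata_base v)
    {
        v2f o;
        o.pos = UnityObjectToClipPos(v.vertex);
        o.uv  = TRANSFORM_TEX(v.texcoord, _MainTex);
        UNITY_TRANSFER_FOG(o, o.pos);        // compute the per-vertex fog amount
        return o;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        fixed4 col = tex2D(_MainTex, i.uv);  // texture sampling: read the map at this point
        UNITY_APPLY_FOG(i.fogCoord, col);    // fade toward the scene fog color with distance
        return col;
    }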

The pipeline also performs an "Alpha Test", which concerns drawing translucent or fully transparent objects. After the "Alpha Test" comes "Blending", which mixes the fragment's color into the final image.
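
Both steps show up in ShaderLab: the alpha test is usually written as a clip() call in the fragment shader, and blending is a per-pass state command. A sketch, with _Cutoff as an assumed shader property:

    // ShaderLab pass state (outside CGPROGRAM):
    Blend SrcAlpha OneMinusSrcAlpha    // mix this fragment with what is already in the frame buffer

    // Inside the fragment program (with "fixed _Cutoff;" declared and exposed as a property):
    fixed4 frag (v2f i) : SV_Target
    {
        fixed4 col = tex2D(_MainTex, i.uv);
        clip(col.a - _Cutoff);         // alpha test: discard pixels below the cutoff
        return col;                    // survivors get blended by the Blend state above
    }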

That is the flow of the rendering pipeline. The main thing to grasp is that our programmability covers two parts: the vertex shader, which handles geometric transformation and lighting, and the fragment shader, which handles things like texture sampling, color calculation, and fog.

It is important to note that one of the main parts of Unity optimization is reducing draw calls. A draw call is the process by which the application asks the graphics hardware (the GPU) to render. The application must prepare a great deal of data (vertex data, matrices, vectors, and so on), all of which has to be passed from the application to the GPU, so the CPU must gather the data and go through a scheduling process to issue an API call. That process is expensive, and if the call is started over and over, the collecting and parameter passing becomes time consuming and creates a bottleneck in the running application or game. Therefore, minimize draw calls: keep the number of CPU-to-GPU calls as small as possible.
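
One concrete way to cut draw calls in Unity is GPU instancing, which draws many copies of a mesh in a single call. A minimal sketch of the shader side (the material must also have instancing enabled; everything here beyond the standard Unity macros is illustrative):

    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #pragma multi_compile_instancing         // also compile an instanced variant
    #include "UnityCG.cginc"

    struct appdata
    {
        float4 vertex : POSITION;
        UNITY_VERTEX_INPUT_INSTANCE_ID       // per-instance index
    };

    struct v2f { float4 pos : SV_POSITION; };

    v2f vert (appdata v)
    {
        v2f o;
        UNITY_SETUP_INSTANCE_ID(v);          // resolve this instance's transform
        o.pos = UnityObjectToClipPos(v.vertex);
        return o;
    }

    fixed4 frag (v2f i) : SV_Target { return fixed4(1, 1, 1, 1); }
    ENDCG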

The relationship between shaders, materials, and textures

A shader is really a small program responsible for combining the input vertex data with the input textures or colors in a specified way and then producing an output; the drawing unit can draw the image onto the screen based on that output. The input textures or colors, plus the shader chosen for them, plus the concrete values set for the shader's parameters, are packaged and stored together, and the result (shader plus input parameters) is a material. We can then assign the material to a three-dimensional object for rendering (output).
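
In ShaderLab, the parameters a material stores values for are declared in the shader's Properties block; a sketch with arbitrary names and defaults:

    Shader "Notes/MaterialExample"   // hypothetical name
    {
        Properties
        {
            _MainTex ("Texture", 2D)    = "white" {}  // the input map
            _Color   ("Tint",    Color) = (1,1,1,1)   // the input color
        }
        // A material asset = this shader + concrete values for these slots.
        SubShader { /* passes as in the earlier sketches */ }
    }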

The material is like the engine's finished product, the shader is the process that produces the product, and the texture maps are the raw material.

Summary

A shader is a program fragment in the programmable graphics pipeline. It is divided mainly into vertex shaders and fragment shaders.

The rendering pipeline describes how the computer goes from data to the final rendered image.

A material can be understood as a finished product, the shader as the method of processing that product, and the texture maps as the raw materials used in the processing.
