http://codingnow.cn/opengles/1504.html
OpenGL ES 2.0 implements a programmable graphics pipeline that is more complex and flexible than the 1.x fixed-function pipeline. It consists of two specifications: the OpenGL ES 2.0 API specification and the OpenGL ES Shading Language specification. The following figure shows the OpenGL ES 2.0 rendering pipeline; the shaded parts are the programmable stages of OpenGL ES 2.0.
1. Vertex shader
The vertex shader implements a general-purpose programmable method for operating on vertices. Its input data consists of the following:
Attributes: per-vertex data supplied in vertex arrays, typically used for values that differ from vertex to vertex, such as vertex position and color.
Uniforms: constant data used by the vertex shader; these cannot be modified by the shader and are typically used for values shared by all vertices of a 3D object, such as the position of the current light source.
Samplers: optional; a special type of uniform that represents a texture used by the vertex shader.
Shader program: the source code or executable of the vertex shader, which describes the operations to be performed on each vertex.
The output of the vertex shader is a set of varying variables. During primitive rasterization, a varying value is computed for each generated fragment, and these values become the input data of the fragment shader. The mechanism that produces a per-fragment varying value from the varying values assigned to each vertex is called interpolation. The inputs and outputs of the vertex shader are shown in the following figure.
Vertex shaders are used for traditional vertex-based operations, such as transforming positions by a matrix, computing lighting to generate per-vertex colors, and generating or transforming texture coordinates. And because the vertex shader is specified by the application, it can be used to perform arbitrary custom vertex transformations.
The following is vertex shader source code written in the OpenGL ES Shading Language. It takes a position and an associated color as input, transforms the position by a 4x4 matrix, and then outputs the transformed position and the color.
1  // Uniforms used by the vertex shader
2  uniform mat4 u_mvpMatrix;    // matrix to convert P from model
3                               // space to normalized device space
4
5  // Attributes input to the vertex shader
6  attribute vec4 a_position;   // position value
7  attribute vec4 a_color;      // input vertex color
8
9  // Varying variables – input to the fragment shader
10 varying vec4 v_color;        // output vertex color
11
12 void main()
13 {
14    // pass the color through and transform the position
15    v_color = a_color;
16    gl_Position = u_mvpMatrix * a_position;
17 }
Line 2 defines a uniform variable u_mvpMatrix; mat4 denotes a 4x4 floating-point matrix, and this variable stores the combined model-view and projection matrix. Lines 6 and 7 define the vertex shader's input attributes; vec4 denotes a vector of four floating-point numbers, a_position is the vertex position attribute, and a_color is the vertex color attribute. Line 10 defines a variable of type varying named v_color; varying variables pass data from the vertex shader to the fragment shader, and v_color is the vertex shader's output, storing the color of each vertex. The main function spanning lines 12-17 is the entry point of the vertex shader (as it is of the fragment shader). Line 15 reads the value of the input attribute a_color and assigns it to the output variable v_color. On line 16, gl_Position is a built-in output variable that does not need to be declared; the vertex shader must assign the transformed position to it.
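To make the transform on line 16 concrete, the same computation can be sketched on the CPU in plain Python. The matrix values, vertex position, and helper name below are illustrative assumptions, not part of the article; a real application would upload the matrix to the GPU rather than multiply on the CPU.

```python
def mat4_mul_vec4(m, v):
    """Multiply a 4x4 matrix (given as a row-major list of rows) by a vec4."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# A simple translation-by-(1, 2, 3) matrix standing in for u_mvpMatrix.
u_mvpMatrix = [
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 2.0],
    [0.0, 0.0, 1.0, 3.0],
    [0.0, 0.0, 0.0, 1.0],
]
a_position = [0.5, -0.5, 0.0, 1.0]   # one input vertex (x, y, z, w)

# Equivalent of: gl_Position = u_mvpMatrix * a_position;
gl_Position = mat4_mul_vec4(u_mvpMatrix, a_position)
print(gl_Position)   # [1.5, 1.5, 3.0, 1.0]
```

Note that GLSL itself stores matrices in column-major order; the row-major layout here is purely a convenience for the Python sketch.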
2. Primitive assembly
After the vertex shader, the next stage of the rendering pipeline is primitive assembly. A primitive is a geometric object that can be drawn with an OpenGL ES drawing command; the drawing command specifies a set of vertex attributes describing the primitive's geometry, together with a primitive type. The vertex shader uses these vertex attributes to compute each vertex's position, color, and texture coordinates so that they can be passed on to the fragment shader. In the primitive assembly stage, the shader-processed vertices are assembled into individual geometric primitives, such as triangles, lines, and point sprites. For each primitive, it must be determined whether it lies within the viewport (the region of 3D space that is visible on the screen): if a primitive is partially inside the viewport it must be clipped, and if it lies entirely outside the viewport it is discarded. After clipping, vertex positions are converted to screen coordinates. A back-face culling operation is also performed: depending on whether the primitive faces toward or away from the viewer, back-facing primitives are discarded. After clipping and back-face culling, the pipeline proceeds to the next stage: rasterization.
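The back-face test described above can be sketched in a few lines of Python (this is a conceptual illustration, not OpenGL code; function names and the choice of counter-clockwise winding as front-facing are assumptions, matching OpenGL's default):

```python
def signed_area(p0, p1, p2):
    """Twice the signed area of a 2D triangle; the sign encodes winding order."""
    return ((p1[0] - p0[0]) * (p2[1] - p0[1])
            - (p2[0] - p0[0]) * (p1[1] - p0[1]))

def is_back_facing(p0, p1, p2, front_face_ccw=True):
    """A triangle whose screen-space winding is opposite the front-face
    convention faces away from the viewer and can be culled."""
    area = signed_area(p0, p1, p2)
    return area < 0 if front_face_ccw else area > 0

# CCW triangle -> front-facing (kept); CW triangle -> back-facing (culled).
print(is_back_facing((0, 0), (1, 0), (0, 1)))   # False
print(is_back_facing((0, 0), (0, 1), (1, 0)))   # True
```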
3. Rasterization
The rasterization stage converts primitives into a set of fragments, which are then handed to the fragment shader for processing; these fragments represent the pixels that can be drawn to the screen. As shown in the following illustration:
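The interpolation of varying values mentioned earlier happens at this stage: for each fragment inside a triangle, the rasterizer computes barycentric weights and uses them to blend the per-vertex varyings. The sketch below illustrates the idea in plain Python; the triangle, colors, and function names are illustrative assumptions, not part of the article.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of point p with respect to triangle (a, b, c)."""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (v[0] - o[0]) * (u[1] - o[1])
    area = cross(a, b, c)
    return (cross(p, b, c) / area,
            cross(p, c, a) / area,
            cross(p, a, b) / area)

def interpolate_varying(p, tri, colors):
    """Blend a per-vertex varying (here an RGB color) for one fragment."""
    wa, wb, wc = barycentric(p, *tri)
    return tuple(wa * ca + wb * cb + wc * cc
                 for ca, cb, cc in zip(*colors))

tri = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
colors = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]  # red, green, blue
print(interpolate_varying((1.0, 1.0), tri, colors))   # (0.5, 0.25, 0.25)
```

A fragment near the red vertex ends up mostly red, one midway between vertices gets an even blend, and so on; this is exactly how v_color in the earlier shader arrives at the fragment shader.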
4. Fragment shader
The fragment shader implements a general-purpose programmable method for operating on each fragment generated in the rasterization stage. It requires the following input data:
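Although the article breaks off here, a minimal fragment shader that would pair with the vertex shader above can be sketched as follows (an illustrative sketch, assuming the v_color varying from the earlier example; not code from the original article):

```glsl
precision mediump float;   // a default float precision is required in
                           // OpenGL ES 2.0 fragment shaders

varying vec4 v_color;      // interpolated per-fragment color, produced
                           // from the vertex shader's output during
                           // rasterization

void main()
{
    // write the interpolated color as this fragment's output
    gl_FragColor = v_color;
}
```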