The slowest stage in the rendering pipeline determines the rendering speed of the entire pipeline.
We typically describe the processing speed of a stage by its throughput rather than by the frame rate, because the frame rate is locked to the display's refresh, so the observed frame rate can be lower than what a stage's actual throughput would allow.
An example:
Suppose a display runs at 60 Hz, i.e. it refreshes once every 16.67 ms, and suppose one pipeline stage takes exactly 62.5 ms to complete. Since 62.5 ms is greater than 16.67 ms × 3 ≈ 50 ms but less than 16.67 ms × 4 ≈ 66.67 ms, the finished frame has to wait for the next display refresh before it can be shown, so the effective time is rounded up from 62.5 ms to about 66.67 ms. Of course, if vertical synchronization is turned off, this no longer applies.
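To make the rounding explicit, here is a minimal C++ sketch; the function name `effectiveFrameTimeMs` is my own illustrative choice, not from the original. It simply rounds a stage time up to the next whole refresh interval:

```cpp
#include <cmath>
#include <cstdio>

// With vsync on, a frame can only be presented at the next display refresh,
// so the effective frame time is the stage time rounded up to a whole
// number of refresh intervals.
double effectiveFrameTimeMs(double stageTimeMs, double refreshHz) {
    const double interval = 1000.0 / refreshHz;          // 16.67 ms at 60 Hz
    return std::ceil(stageTimeMs / interval) * interval; // round up to next refresh
}

int main() {
    // 62.5 ms falls between 3 and 4 refresh intervals (50 ms and 66.67 ms),
    // so the frame is not shown until the 4th refresh after it started.
    std::printf("%.2f ms\n", effectiveFrameTimeMs(62.5, 60.0)); // prints 66.67 ms
}
```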
A rendering pipeline can be roughly divided into three stages in order of execution:
- Application stage
- Geometry stage
- Rasterization stage
Each of these can be subdivided into further sub-stages. Note that this is a functional division; actual implementations often merge or split stages for efficiency and other reasons.
First, the application stage
The goal of this stage is to produce the rendering primitives that the geometry stage needs. In general a lot of work is done here, such as collision detection, animation, and input handling, and of course some algorithms for accelerating the pipeline, such as hierarchical view frustum culling.
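As an illustration of one such acceleration algorithm, here is a minimal sketch of hierarchical view frustum culling; the `Plane`/`Sphere`/`Node` types and the inward-facing-normal convention are assumptions made for this example, not a fixed API:

```cpp
#include <array>
#include <vector>

struct Vec3 { float x, y, z; };
struct Plane { Vec3 n; float d; };   // plane: dot(n, p) + d = 0, n points inward
struct Sphere { Vec3 c; float r; };

struct Node {                        // scene-graph node with a bounding sphere
    Sphere bounds;
    std::vector<Node> children;
};

float dist(const Plane& pl, const Vec3& p) {
    return pl.n.x * p.x + pl.n.y * p.y + pl.n.z * p.z + pl.d;
}

// A sphere is outside the frustum if it lies fully behind any of the six planes.
bool outside(const std::array<Plane, 6>& frustum, const Sphere& s) {
    for (const Plane& pl : frustum)
        if (dist(pl, s.c) < -s.r) return true;
    return false;
}

// The "hierarchical" part: if a node's bounds are outside, its whole subtree
// is rejected without testing any child individually.
void cull(const std::array<Plane, 6>& frustum, const Node& node,
          std::vector<const Node*>& visible) {
    if (outside(frustum, node.bounds)) return;  // reject entire subtree
    visible.push_back(&node);
    for (const Node& child : node.children)
        cull(frustum, child, visible);
}
```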
Second, the geometry stage
The geometry stage focuses on per-polygon and per-vertex operations.
- Model-view transform: model coordinates -> world coordinates -> camera coordinates.
- Vertex shading: evaluates the shading equation to determine the lighting effect, based on material data (position, normal, color, and whatever else the shading equation needs); the result can be colors, vectors, texture coordinates, or other kinds of shading data. This typically happens in world space, though sometimes the relevant entities are transformed into another space and the computation is done there.
- Projection: transforms the view volume into the canonical unit cube (details omitted).
- Clipping: primitives entirely outside the unit cube are discarded, primitives entirely inside it are kept, and primitives that intersect it are clipped: new vertices are generated and the old ones outside are discarded.
- Screen mapping: only the primitives that survive clipping are passed on, and the coordinates are still three-dimensional; x and y are converted to screen coordinates. Two details deserve attention: how floating-point coordinates map to integer pixels, and the fact that the screen-coordinate origin differs between systems (e.g. bottom-left in OpenGL, top-left in DirectX). A combined sketch of these steps follows this list.
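To tie these steps together, here is a minimal sketch of a single vertex's trip through the geometry stage, assuming column-vector math and an OpenGL-style canonical view volume (-w <= x, y, z <= w). Note that a real pipeline clips whole primitives and generates new vertices, rather than discarding lone vertices as this simplification does:

```cpp
struct Vec4 { float x, y, z, w; };
struct Vec2 { float x, y; };

struct Mat4 {
    float m[4][4];
    Vec4 operator*(const Vec4& v) const {
        return { m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w,
                 m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w,
                 m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w,
                 m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w };
    }
};

// Model -> world/camera -> clip space, then the perspective divide and the
// screen mapping. Returns false if the vertex lies outside the view volume.
bool vertexToScreen(const Mat4& model, const Mat4& view, const Mat4& proj,
                    Vec4 v, int width, int height, Vec2& screen) {
    Vec4 clip = proj * (view * (model * v));   // model-view transform + projection
    // Clipping test against the canonical view volume: -w <= x, y, z <= w.
    if (clip.x < -clip.w || clip.x > clip.w ||
        clip.y < -clip.w || clip.y > clip.w ||
        clip.z < -clip.w || clip.z > clip.w) return false;
    float ndcX = clip.x / clip.w;              // perspective divide -> NDC
    float ndcY = clip.y / clip.w;
    // Screen mapping: NDC [-1, 1] -> pixel coordinates (origin convention
    // and any y-flip depend on the API, as noted above).
    screen.x = (ndcX + 1.0f) * 0.5f * width;
    screen.y = (ndcY + 1.0f) * 0.5f * height;
    return true;
}
```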
Third, the rasterization stage
- Triangle setup: computes differentials and other data for the triangle's surface; this data is used for the scan conversion in the next sub-stage and for interpolating the shading data produced by the geometry stage.
- Triangle traversal: checks each pixel whose center is covered by the triangle and generates a fragment for the overlapping part. Finding which pixels (or samples) lie inside a triangle is called scan conversion. Each generated fragment's data is interpolated from the three triangle vertices and includes depth as well as the various shading data from the geometry stage.
- Pixel shading: takes the interpolated shading data as input, performs all per-pixel shading computations, and passes one or more colors on to the next sub-stage. Texturing also happens in this sub-stage.
- Merging: the information for each pixel is stored in the color buffer, a rectangular array of colors (each with red, green, and blue components). The merging stage combines the fragment color produced by the shading stage with the color currently stored in the buffer, and also handles visibility testing, stencil operations, and so on. A depth-only sketch of these sub-stages follows this list.
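The following depth-only sketch shows triangle setup (edge functions), triangle traversal (sampling at pixel centers), barycentric interpolation of per-vertex data, and a minimal merge step (z-buffer visibility test). Counter-clockwise winding in screen space is assumed, and pixel shading is omitted to keep it short:

```cpp
#include <algorithm>
#include <vector>

struct Vtx { float x, y, z; };  // screen-space vertex with depth

// Edge function: its sign tells which side of the edge a->b the point lies on;
// all three are non-negative exactly when the point is inside the triangle.
float edge(const Vtx& a, const Vtx& b, float px, float py) {
    return (b.x - a.x) * (py - a.y) - (b.y - a.y) * (px - a.x);
}

void rasterize(const Vtx& v0, const Vtx& v1, const Vtx& v2,
               int width, int height, std::vector<float>& zbuffer) {
    // Triangle setup: the full edge function at v2 equals twice the signed area.
    float area = edge(v0, v1, v2.x, v2.y);
    if (area <= 0) return;  // degenerate or wrong winding (assumed CCW)

    // Traverse only the triangle's bounding box, clamped to the screen.
    int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(width - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x) {
            float px = x + 0.5f, py = y + 0.5f;  // sample at the pixel center
            float w0 = edge(v1, v2, px, py);
            float w1 = edge(v2, v0, px, py);
            float w2 = edge(v0, v1, px, py);
            if (w0 < 0 || w1 < 0 || w2 < 0) continue;  // center not covered
            // Barycentric interpolation of per-vertex data (here: depth only).
            float z = (w0 * v0.z + w1 * v1.z + w2 * v2.z) / area;
            // Merge: keep the fragment only if it is closer than what is stored.
            float& stored = zbuffer[y * width + x];
            if (z < stored) stored = z;
        }
}
```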
Finally, let's summarize what the various shaders usually do (a minimal sketch follows the list):
Vertex shader: model-view transform, vertex shading, projection
Geometry shader: operates on the vertices of a whole primitive (see the note on primitives below): per-primitive shading, creating/destroying primitives
Pixel (fragment) shader: pixel shading
Geometry primitives: points, lines, triangles
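As a concrete illustration of this division of labor, here is a minimal GLSL vertex/fragment shader pair embedded as C++ raw string literals; the uniform and attribute names are my own choices for the example:

```cpp
// Vertex shader: the classic job is model-view transform + projection.
const char* vertexSrc = R"glsl(
#version 330 core
layout(location = 0) in vec3 position;  // model-space vertex
uniform mat4 modelView;                 // model-view transform
uniform mat4 projection;                // projection
void main() {
    gl_Position = projection * modelView * vec4(position, 1.0);
}
)glsl";

// Pixel (fragment) shader: per-pixel shading; here just a constant color.
const char* fragmentSrc = R"glsl(
#version 330 core
out vec4 fragColor;
void main() {
    fragColor = vec4(1.0, 0.5, 0.2, 1.0);
}
)glsl";
```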
Real-Time Rendering (1): The Graphics Rendering Pipeline