Real-Time Rendering (I) -- The Graphics Rendering Pipeline

The slowest stage in the rendering pipeline determines the speed of the pipeline as a whole.

We typically describe the processing speed of a stage by its throughput rather than by frame rate, because the frame rate is locked to the display's refresh, so it can understate how fast the pipeline actually runs.

An example:

Suppose a display runs at 60 Hz, i.e. it refreshes once every 16.67 ms, and one pipeline stage takes exactly 62.5 ms to complete. Since 62.5 ms is greater than 3 x 16.67 ms (about 50 ms) but less than 4 x 16.67 ms (about 66.67 ms), the finished frame has to wait for the next refresh before it can be displayed, so the effective frame time is pushed up from 62.5 ms to about 66.67 ms. Of course, with vertical synchronization turned off it is a different matter.
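
A minimal sketch of this rounding effect (the function and its name are my own illustration, assuming a fixed refresh rate and vsync on):

    #include <cmath>
    #include <cstdio>

    // With vsync on, a frame that takes `workMs` of pipeline time can only be
    // presented at the next display refresh, so the effective frame time is
    // `workMs` rounded up to a whole number of refresh intervals.
    double effectiveFrameTimeMs(double workMs, double refreshHz) {
        double interval = 1000.0 / refreshHz;            // 16.67 ms at 60 Hz
        return std::ceil(workMs / interval) * interval;  // round up to next refresh
    }

    int main() {
        // 62.5 ms of work lands between the 3rd refresh (50 ms) and the
        // 4th (66.67 ms), so it is displayed at ~66.67 ms.
        std::printf("%.2f ms\n", effectiveFrameTimeMs(62.5, 60.0));  // 66.67 ms
    }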

A rendering pipeline can be roughly divided into three stages in order of execution:

    1. Application stage
    2. Geometry stage
    3. Rasterization stage

Each of these stages can be subdivided into further sub-stages. Note that this is a functional division; real implementations often merge some stages or split others for efficiency and other reasons.

First, the application stage

The goal of this stage is to produce the rendering primitives to be fed to the geometry stage. In general a lot of work happens here: collision detection, animation, input handling, and of course pipeline-acceleration algorithms such as hierarchical view frustum culling.
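
As a flavor of the culling work done here, below is a minimal sketch of a sphere-versus-frustum test; the structs and names are my own assumptions, and a hierarchical version would run this test on the bounding volumes of a scene graph so that a single rejection culls a whole subtree:

    #include <array>

    struct Vec3 { float x, y, z; };

    // Plane in the form n.x*x + n.y*y + n.z*z + d = 0, with the normal
    // pointing toward the inside of the frustum.
    struct Plane { Vec3 n; float d; };

    struct Sphere { Vec3 center; float radius; };

    float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Returns true if the sphere is entirely outside the frustum and can be
    // culled. In a hierarchical scheme, culling a node culls its whole subtree.
    bool outsideFrustum(const Sphere& s, const std::array<Plane, 6>& frustum) {
        for (const Plane& p : frustum) {
            float signedDist = dot(p.n, s.center) + p.d;
            if (signedDist < -s.radius)   // completely behind one plane
                return true;
        }
        return false;  // inside or intersecting: must be processed further
    }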

Second, the geometry stage

The geometry stage is concerned with per-polygon and per-vertex operations.

    1. Model-view transform: model coordinates -> world coordinates -> camera coordinates.
    2. Vertex shading: evaluates the shading equation that determines the lighting effect, based on material data (position, normal, color, and whatever else the shading equation needs). The result may be colors, vectors, texture coordinates, or other kinds of shading data. This typically happens in world space; sometimes the relevant entities are transformed into another space and the computation is done there.
    3. Projection: maps the view volume into the unit cube (details omitted).
    4. Clipping: primitives entirely outside the unit cube are discarded, and primitives entirely inside it are kept; primitives that intersect the cube are clipped, generating new vertices and discarding the ones outside.
    5. Screen mapping: only the primitives that survive clipping are passed on, and their coordinates are still three-dimensional. The x and y coordinates are converted to coordinates in the screen plane, called screen coordinates. Two things to note: how floating-point coordinates map to integer pixels, and that the screen-coordinate origin differs between systems (e.g. top-left versus bottom-left). A sketch of sub-stages 1, 3, and 5 follows this list.
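
To make sub-stages 1, 3, and 5 concrete, here is a minimal sketch of one vertex's path to the screen; the matrix layout, conventions, and names are my assumptions, and real APIs differ in details such as the clip-space depth range:

    struct Vec4 { float x, y, z, w; };
    struct Mat4 { float m[4][4]; };   // row-major

    Vec4 mul(const Mat4& a, const Vec4& v) {
        Vec4 r;
        r.x = a.m[0][0]*v.x + a.m[0][1]*v.y + a.m[0][2]*v.z + a.m[0][3]*v.w;
        r.y = a.m[1][0]*v.x + a.m[1][1]*v.y + a.m[1][2]*v.z + a.m[1][3]*v.w;
        r.z = a.m[2][0]*v.x + a.m[2][1]*v.y + a.m[2][2]*v.z + a.m[2][3]*v.w;
        r.w = a.m[3][0]*v.x + a.m[3][1]*v.y + a.m[3][2]*v.z + a.m[3][3]*v.w;
        return r;
    }

    // Model coords -> world -> camera -> clip space (what a vertex shader outputs).
    Vec4 toClipSpace(const Mat4& model, const Mat4& view, const Mat4& proj, Vec4 v) {
        return mul(proj, mul(view, mul(model, v)));
    }

    // After clipping: perspective divide to the unit cube, then screen mapping.
    // x/y become pixel coordinates; z is kept for depth testing later.
    void screenMap(Vec4 clip, int width, int height, float out[3]) {
        float ndcX = clip.x / clip.w;   // normalized device coords in [-1, 1]
        float ndcY = clip.y / clip.w;
        float ndcZ = clip.z / clip.w;
        out[0] = (ndcX * 0.5f + 0.5f) * width;    // screen x
        out[1] = (ndcY * 0.5f + 0.5f) * height;   // screen y (origin convention varies)
        out[2] = ndcZ;                            // depth
    }

In this sketch, clipping would happen between toClipSpace and screenMap, in homogeneous clip space before the divide.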

Third, the rasterization stage

    1. Triangle setup: computes differentials and other data for the triangle's surface. This data is used by the scan conversion in the next sub-stage, as well as for interpolating the shading data produced by the geometry stage.
    2. Triangle traversal: checks each pixel whose center is covered by the triangle and generates a fragment for the covered part. Finding which pixels (or samples) lie inside a triangle is called scan conversion. Each generated fragment's properties are interpolated from the triangle's vertices and include its depth and the various shading data from the geometry stage.
    3. Pixel shading: takes the interpolated shading data as input, performs all per-pixel shading computations, and passes one or more colors on to the next sub-stage. Texturing is also done at this stage.
    4. Merging: the color of each pixel is stored in the color buffer, a rectangular array of colors (each with red, green, and blue components). The merging stage combines the fragment color produced by the shading stage with the color currently stored in the buffer; it also performs visibility determination (e.g. z-buffer depth testing), stencil operations, and so on. A toy sketch of sub-stages 1, 2, and 4 follows this list.
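
Below is a toy sketch of sub-stages 1, 2, and 4: triangle traversal with edge functions, barycentric interpolation of depth, and a z-buffer merge. The structure and names are my own assumptions, not how any particular GPU implements this:

    #include <algorithm>
    #include <vector>

    struct Vert { float x, y, z; };   // screen-space position with depth

    // Edge function: twice the signed area of (a, b, p); its sign tells on
    // which side of edge (a, b) the point p lies.
    float edge(const Vert& a, const Vert& b, float px, float py) {
        return (b.x - a.x) * (py - a.y) - (b.y - a.y) * (px - a.x);
    }

    // Traverses the pixels covered by one triangle, interpolates depth with
    // barycentric weights, and merges via a z-buffer test. depthBuf is assumed
    // to be width*height floats initialized to a far value before drawing.
    void rasterize(const Vert& v0, const Vert& v1, const Vert& v2,
                   int width, int height, std::vector<float>& depthBuf) {
        float area = edge(v0, v1, v2.x, v2.y);
        if (area <= 0.0f) return;   // back-facing or degenerate

        // Triangle setup done, now walk the bounding box of the triangle.
        int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
        int maxX = std::min(width - 1, (int)std::max({v0.x, v1.x, v2.x}));
        int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
        int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

        for (int y = minY; y <= maxY; ++y) {
            for (int x = minX; x <= maxX; ++x) {
                float px = x + 0.5f, py = y + 0.5f;   // sample at pixel center
                float w0 = edge(v1, v2, px, py);
                float w1 = edge(v2, v0, px, py);
                float w2 = edge(v0, v1, px, py);
                if (w0 < 0 || w1 < 0 || w2 < 0) continue;  // outside triangle

                // Barycentric interpolation; any per-vertex shading data
                // (colors, texture coords, ...) is interpolated the same way.
                float z = (w0 * v0.z + w1 * v1.z + w2 * v2.z) / area;

                // Merging: visibility test against the depth buffer.
                float& stored = depthBuf[y * width + x];
                if (z < stored) {
                    stored = z;
                    // ...the fragment's shaded color would be written to the
                    // color buffer here.
                }
            }
        }
    }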

Finally, let's summarize what the various shaders typically do:

Vertex shader: model-view transform, vertex shading, projection.

Geometry shader: operates on the vertices of a primitive (primitives are explained below); shading per primitive, and creating or destroying primitives.

Pixel (fragment) shader: pixel shading.

Geometry primitives: points, lines, triangles.
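
As a flavor of pixel shading, here is a minimal Lambertian diffuse computation over interpolated fragment data; the names and inputs are my assumptions, and real pixel shaders run on the GPU in a shading language such as GLSL or HLSL rather than in C++:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    Vec3 scale(const Vec3& v, float s) { return {v.x * s, v.y * s, v.z * s}; }
    float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    Vec3 normalize(const Vec3& v) {
        return scale(v, 1.0f / std::sqrt(dot(v, v)));
    }

    // Per-fragment Lambert diffuse: the normal is interpolated shading data
    // from triangle traversal, and baseColor could come from a texture fetch.
    Vec3 shadeFragment(Vec3 normal, Vec3 lightDir, Vec3 baseColor) {
        float ndotl = std::max(0.0f, dot(normalize(normal), normalize(lightDir)));
        return scale(baseColor, ndotl);
    }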
