"Giser&&painter" WebGL rendering first experience (i)


Following the earlier post on OpenGL rendering principles, I have spent the past two weeks getting in touch with WebGL drawing. Since I am just getting started, many things were quite obscure, so I deliberately spent half a day tidying up my notes and am recording them here.

One. Canvas && Brush: Create a Canvas and Get the WebGL Context

Before we begin the WebGL drawing story, we have to get to know canvas, because it is the backplane on which we draw: "The canvas element creates a fixed-size canvas and provides one or more rendering contexts for drawing and processing the content to be displayed" (excerpt from MDN). By this definition, we can think of it as a staging area for rendering tasks: the final drawing output hands data to the screen, and the canvas is merely a broker that stages the data to be rendered, a container whose data structure mimics the screen's pixel matrix, similar in concept to an intermediate cache. So why not print directly to the screen? One explanation I have heard goes like this: the role of the cache is that when the next frame is not rendered in time (rendering time exceeds what the human eye perceives as continuous motion, generally 24 frames per second), the previous frame's data can stand in for the next frame, preserving the continuity of the picture.

The WebGL API provides the ability to draw interactive 2D/3D graphics in supporting browsers without plug-ins. WebGL's rendering context, WebGLRenderingContext, is based on the OpenGL ES 2.0 drawing context and is typically used to draw inside an HTML5 <canvas> element. WebGLRenderingContext can be seen as the CPU of the rendering task: every operation, from viewport clipping, state management, and data buffers to shader creation and invocation and buffer drawing, is closely tied to it. So the first thing any WebGL program must do before drawing is create a canvas as the data container and bind it to a WebGL context; with both a canvas and a paintbrush in hand, we can start drawing the shapes we want.
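As a minimal sketch of this first step (the id "webgl-canvas", the dimensions, and the fallback message are my own assumptions, not from the original):

<canvas id="webgl-canvas" width="400" height="300"></canvas>

<script>
  // Grab the canvas element and ask it for a WebGL rendering context
  var canvas = document.getElementById('webgl-canvas');
  var gl = canvas.getContext('webgl');
  if (!gl) {
    // WebGL is unsupported or disabled in this browser
    alert('Unable to initialize WebGL.');
  }
</script>

From here on, gl is the WebGLRenderingContext that all subsequent calls go through.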

Two. Palette: Shaders

In the traditional OpenGL fixed pipeline, we have limited control over the rendering process. As the earlier OpenGL post covered, in vertex operations, primitive assembly, texturing, fragment coloring, and so on, all we can control are the parameters of the interfaces provided by the underlying hardware vendors; the processing itself runs through a fixed program. This level of control is very weak. Not long ago I came across a metaphor for fixed-pipeline control, the "railway switch", which I find very apt: like a switch point on a railway, the train's direction can only be chosen among the tracks already laid, and where there is no track, the train cannot go. Rendering works the same way: if you have higher, more flexible requirements for the rendered result (or cannot accept customizing it only through the complex parameter sets the hardware vendors expose), the fixed pipeline can no longer meet the demand.

Figure 1: A railway switch

The emergence of the programmable rendering pipeline answers the challenges above. As shown in Figure 2, the programmable pipeline adds two processors: the vertex shader and the fragment shader. These two processors replace the traditional fixed stages of vertex processing, primitive assembly, and fragment coloring; through their programmable nature, they take over the work of vertex coordinate transformation and per-pixel color calculation:

1) In the vertex shader stage, vertex data is read from GPU memory. The vertex shader (VS) can perform per-vertex processing such as model-view transformation and projection transformation for each vertex, replacing the vertex-processing stage of the fixed pipeline. The vertex shader used in this demo is shown below;

<!-- Vertex shader -->
<script id="2d-vertex-shader" type="x-shader/x-vertex">
  attribute vec2 a_position;
  attribute vec4 a_color;

  uniform mat3 u_matrix;

  varying vec4 v_color;

  void main() {
    gl_Position = vec4(vec3(a_position, 1).xy, 0, 1);
    v_color = a_color;
  }
</script>

2) When the vertex shader finishes, the pipeline rasterizes the primitives built from those vertices. The vertex shader runs once per vertex, so n vertices yield n vertex colors; but how do we determine the color of the non-vertex pixels inside a primitive that spans many pixels? Here we need to introduce a new player: the varying data type. As you can see in the short vertex shader demo above, we defined variables with several qualifiers: attribute, uniform, and varying. Taking them out of order, let us meet varying first.

A varying is a variable that acts as a messenger joining the vertex shader and the fragment shader. In general, the vertex shader computes values such as the color and coordinates of each vertex and stores them in varying variables. Back to the question just raised: how do non-vertex pixels determine their values? The fragment shader needs a way to understand this messenger, and between the vertex shader and the fragment shader stands the rasterizer as matchmaker: when the vertex shader outputs per-vertex values of type varying, the rasterizer interpolates them across the primitive and hands the fragment shader an interpolated value for every pixel it renders.

3) The fragment shader comes into play after rasterization (as mentioned in the earlier OpenGL post, a fragment is an element produced by rasterization). The main job of the fragment shader (FS) is to supply a color value for each rasterized pixel; it is invoked once per pixel on screen, and each invocation writes its color into the special built-in variable gl_FragColor.

<!-- Fragment shader -->
<script id="2d-fragment-shader" type="x-shader/x-fragment">
  precision mediump float;

  varying vec4 v_color;

  void main() {
    gl_FragColor = v_color;
  }
</script>
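The post does not show how the two script blocks above are turned into a GPU program. Below is a minimal sketch of the usual compile-and-link steps, assuming the gl context from earlier; the helper names createShader and createProgram are mine, not from the original:

// Hypothetical helpers (not from the original post): compile one shader
// from GLSL source text, then link the vertex/fragment pair into a program.
function createShader(gl, type, source) {
  var shader = gl.createShader(type); // gl.VERTEX_SHADER or gl.FRAGMENT_SHADER
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    console.log(gl.getShaderInfoLog(shader));
    gl.deleteShader(shader);
    return null;
  }
  return shader;
}

function createProgram(gl, vertexShader, fragmentShader) {
  var program = gl.createProgram();
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    console.log(gl.getProgramInfoLog(program));
    gl.deleteProgram(program);
    return null;
  }
  return program;
}

// Usage: read the GLSL source out of the two <script> blocks above
var vsSource = document.getElementById('2d-vertex-shader').text;
var fsSource = document.getElementById('2d-fragment-shader').text;
var vertexShader = createShader(gl, gl.VERTEX_SHADER, vsSource);
var fragmentShader = createShader(gl, gl.FRAGMENT_SHADER, fsSource);
var program = createProgram(gl, vertexShader, fragmentShader);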

Figure 2: The programmable rendering pipeline

Three. Sketching the Skeleton Lines: Vertex Buffer Objects

If the previous steps were exploratory, the next one is concrete: create a vertex buffer object (VBO) in video memory. The VBO concept comes from OpenGL and is defined as a buffer, held in dedicated GPU memory, that stores per-vertex attribute information (such as vertex coordinates, vertex normals, and colors) and can be read straight from video memory when a rendering command executes. Since the whole process happens on the GPU, unlike the traditional drawing approach (where the CPU commands the GPU to draw and repeatedly transfers large amounts of vertex data to it, making rendering slower), the VBO is generally regarded as an object that improves data-transfer efficiency.

So how is the VBO used in WebGL? We usually first create a buffer object with the createBuffer method; as MDN's description in Figure 3 shows, the returned buffer can hold colors or vertex coordinate values. We then call gl.bindBuffer to associate that buffer with the ARRAY_BUFFER binding point of the GL context, and finally call gl.bufferData to write the data into ARRAY_BUFFER. At that point the vertex data has been successfully written into GPU memory.

Figure 3: WebGLBuffer (MDN)

function main() {
  ...
  // Create a buffer and bind it to the GL context
  var positionBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);

  // Fill the buffer with geometry data
  setGeometry(gl);
  ...
}

// Fill the currently bound buffer with the coordinates of two triangles
function setGeometry(gl) {
  gl.bufferData(
      gl.ARRAY_BUFFER,
      new Float32Array([
        -150, -100,
         150, -100,
        -150,  100,
         150, -100,
        -150,  100,
         150,  100]),
      gl.STATIC_DRAW);
}
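One initialization piece the rendering code in the next section relies on, but the post does not show, is looking up the attribute location. A one-line sketch, assuming the linked program from the shader section:

// Assumed initialization step (not shown in the original): ask the linked
// program where the vertex shader's a_position attribute lives.
var positionAttributeLocation = gl.getAttribLocation(program, 'a_position');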

At this point, the initialization before rendering is basically done: the canvas has been created and bound to a WebGL environment ---> the custom shaders have been created and specified ---> the vertex buffer (VBO) has been created and filled with vertex data. Everything is ready; a mighty army of ordinary screen pixels awaits a clear drawing instruction. The moment GL raises the signal flag, under the GPU's command, tens of millions of screen pixels will take up their assigned positions and colors and, with the speed of lightning, assemble into the combination you want on your screen.

Four. Draw!

Having shown off my literary talent, let's look at the final step: how the draw command is actually issued.

In a nutshell, the rendering part of the drawing can be divided into the following three parts, as the drawScene code after this list shows:

1) Clear the canvas

2) Set up the environment

3) Execute the shader program

// Rendering code
function drawScene() {

  // ----------------------- 1) Clear the canvas -----------------------
  webglUtils.resizeCanvasToDisplaySize(gl.canvas);
  // Convert from clip space to pixels
  gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
  // Clear the canvas
  gl.clear(gl.COLOR_BUFFER_BIT);
  // Tell WebGL to use our shader program
  gl.useProgram(program);

  // ----------------------- 2) Set up the environment -----------------------
  // Turn on the attribute
  gl.enableVertexAttribArray(positionAttributeLocation);
  // Bind the current state: positionBuffer, already filled with vertex data
  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
  // Tell the attribute how to pull data out of positionBuffer (ARRAY_BUFFER)
  var size = 2;          // 2 components per iteration
  var type = gl.FLOAT;   // the data is 32-bit floats
  var normalize = false; // don't normalize the data
  var stride = 0;        // bytes per vertex; 0 = move forward size * sizeof(type) each iteration
  var offset = 0;        // start at the beginning of the buffer
  // vertexAttribPointer acts as a signpost for the vertex attribute:
  // it tells the GPU how to read vertex data from the currently bound buffer
  gl.vertexAttribPointer(positionAttributeLocation, size, type, normalize, stride, offset);

  // ----------------------- 3) Execute the shader program -----------------------
  // Draw the geometry
  var primitiveType = gl.TRIANGLES; // primitive mode
  offset = 0;                       // index of the first vertex to read from the buffer
  var count = 6;                    // number of vertices to draw, i.e. how many times the vertex shader runs
  gl.drawArrays(primitiveType, offset, count);
}
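As a hedged usage sketch: for a static demo like this one, drawScene can simply be called once after initialization; for animation it would typically be re-invoked via requestAnimationFrame. The render-loop wiring below is my own assumption, not shown in the original:

// Call once after initialization for a static image...
drawScene();

// ...or, for animation, redraw every frame (assumption: any per-frame
// state updates would happen inside or before drawScene)
function tick() {
  drawScene();
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);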

I will not repeat here the specific interfaces that WebGLRenderingContext provides; this post is only meant to give you a sense of how drawing and rendering happen in the browser. Each of these stages deserves its own topic later; after all, I have just fallen into this pit, and there are many days ahead! (I have to get to work, so I'll write more next time, haha.)

"Giser&&painter" WebGL rendering first experience (i)

Contact Us

The content source of this page is from Internet, which doesn't represent Alibaba Cloud's opinion; products and services mentioned on that page don't have any relationship with Alibaba Cloud. If the content of the page makes you feel confusing, please write us an email, we will handle the problem within 5 days after receiving your email.

If you find any instances of plagiarism from the community, please send an email to: info-contact@alibabacloud.com and provide relevant evidence. A staff member will contact you within 5 working days.

A Free Trial That Lets You Build Big!

Start building with 50+ products and up to 12 months usage for Elastic Compute Service

  • Sales Support

    1 on 1 presale consultation

  • After-Sales Support

    24/7 Technical Support 6 Free Tickets per Quarter Faster Response

  • Alibaba Cloud offers highly flexible support services tailored to meet your exact needs.