[WebGL Primer] 26, texture drawing


Note: This article is translated from http://wgld.org/, by the original author doxas. Where I have added notes of my own, they are marked with [Lufy:]. My research on WebGL is not deep, so if any technical terms are mistranslated, corrections are welcome.



(Figure: the result of running this demo)


WebGL and Textures

Last time, the method of shading with light from a point light source was introduced. Calculating lighting in the fragment shader produces very beautiful shadow and highlight effects, and greatly improves the realism of a 3D scene. It can also be combined with vertex colors, so after absorbing the content explained so far, you should be able to produce comparatively high-quality 3D renderings.

This time, we move on to textures. A texture, in a nutshell, is image data that can be pasted onto a polygon, and of course it can be used in WebGL.

WebGL differs from HTML in that common image formats (gif, jpg, png, etc.) cannot be used directly; they must first be converted into textures. A canvas can also be converted into a texture. In short, drawing with images requires a different approach.

So this time, let's look at the most basic way of drawing with textures.

Limitations of textures in WebGL

As mentioned above, the image formats commonly used in HTML are converted into textures before they can be used in WebGL. The idea is to turn image data used on the Web into a form that WebGL can handle.

However, there is one thing to be aware of: the image data used for a texture must have dimensions that are powers of two. That is, each side must be a power of two in pixels, such as 32x32 or 128x128.

With some extra processing, image data whose size is not a power of two can also be used, but as a rule, image data used as a texture should have power-of-two dimensions.

In addition, as you can feel when browsing an ordinary web page, loading image data takes some time. When converting an image to a texture, the conversion must happen after the image has finished loading, which requires some special handling; if you are not very familiar with JavaScript, this may be a stumbling block. More on this later.
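As a side note, the power-of-two check itself is simple bit arithmetic. Below is a minimal sketch; the helper name isPowerOfTwo and the file name are my own illustration, not from the original article.

// returns true when value is a power of two (32, 64, 128, ...)
function isPowerOfTwo(value){
    return value > 0 && (value & (value - 1)) === 0;
}

// example: warn before using an image whose size is not a power of two
var img = new Image();
img.onload = function(){
    if(!isPowerOfTwo(img.width) || !isPowerOfTwo(img.height)){
        console.warn('texture size is not a power of two: ' + img.width + 'x' + img.height);
    }
};
img.src = 'texture.png';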


Creation and use of textures

So, let's start with the steps of using textures.

Textures are handled in WebGL through texture objects, and a texture object is generated with the createTexture function.

> Example of createTexture

var tex = gl.createTexture();
This function takes no parameters and simply returns a texture object. After the code above runs, the variable tex holds an empty texture object.
Once the texture object has been generated, the next step is to bind it to WebGL.

Think back: buffer objects in WebGL also had to be bound before use, for example when working with VBOs. It is the same here as with buffers. To manipulate texture data, you first bind the texture, and then the functions that operate on textures apply their processing to the currently bound texture object.

The function that binds a texture object to WebGL is bindTexture.

> Example of bindTexture

gl.bindTexture(gl.TEXTURE_2D, tex);
This function takes two parameters. The first is the texture type; for drawing ordinary 2D images, gl.TEXTURE_2D is normally used as the parameter. The second is the texture object to bind. Quite simple.
The texture object is now bound to WebGL, but the most essential part, the image data, has not been attached yet. The function that connects image data to a texture is texImage2D.

> Example of texImage2D

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
Looking at the code above, you probably think "what is all this?". The function takes six parameters in total; it looks complicated but is actually quite simple.
The first parameter is the texture type, the same as in bindTexture; gl.TEXTURE_2D is fine here too. The second parameter is the mipmap level; don't worry about it for now and set it to 0. The third and fourth parameters both specify gl.RGBA; at this stage, just use them as-is. Likewise, there is nothing special about the fifth parameter for now; just specify gl.UNSIGNED_BYTE. In short, at this stage it is fine to set the first through fifth parameters exactly as above. The important one is the sixth parameter, which specifies the image data; the image data given here is assigned to the bound texture.


Image loading timing considerations

As mentioned, the texImage2D function assigns image data to the texture, but loading an image in a web page takes some time.

The point to be careful about is that texImage2D must be called after the image has finished loading. If you call texImage2D before the image has loaded, the image data will not be assigned to the texture correctly.

So texImage2D needs to be called from the event that fires when the image has finished loading.

The concrete flow is: use JavaScript to create an image object; the Image object has an onload event that fires when loading completes, and that is where the texture processing is done; finally, assign the image URL to the image object, which starts the load.

The important point here is that before the image starts loading, the onload handler containing the texture-related processing is attached, so that the texture is processed automatically once the image has loaded.

Writing the processing above as a function gives the following.

> Texture Generation function

function create_texture(source){
    // generate an image object
    var img = new Image();
    
    // trigger the texture processing once the image data has loaded
    img.onload = function(){
        // generate a texture object
        var tex = gl.createTexture();
        
        // bind the texture
        gl.bindTexture(gl.TEXTURE_2D, tex);
        
        // assign the image to the texture
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
        
        // generate the mipmap
        gl.generateMipmap(gl.TEXTURE_2D);
        
        // unbind the texture
        gl.bindTexture(gl.TEXTURE_2D, null);
        
        // store the generated texture in a global variable
        texture = tex;
    };
    
    // specify the image object's source (starts loading)
    img.src = source;
}
This custom function create_texture takes an image URL as its argument. Inside the function, the image object is generated first, and before the image data starts loading, the onload handler is attached; inside the handler the texture is generated, bound, and assigned the image data. One thing in the function may look unfamiliar: after texImage2D is executed, the generateMipmap function is called in order to generate the mipmap.
A mipmap is a mechanism, used not only in WebGL but throughout 3D programming, that prepares the same image data at several different sizes in advance. Mipmaps play a big role when the texture image needs to be displayed scaled down: because reduced versions of the image data have been prepared beforehand and are switched in as appropriate, the texture renders beautifully even when drawn small.

Executing the generateMipmap function generates the mipmap; its parameter is the same as for bindTexture, namely gl.TEXTURE_2D.
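Incidentally, the generated mipmap is consumed through the texture's minification filter. The original article does not set filter parameters at this point, but for reference, one common choice looks like the following sketch; these calls would go while the texture is still bound, before unbinding.

// sample through the mipmap chain when the texture is drawn smaller
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
// plain linear interpolation when it is drawn larger
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);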

As with VBOs, WebGL can only have one texture bound at a time, so the texture is unbound at the end (by binding null). The generated texture object is stored in a global variable, because there is no way to return it from inside onload. In the example above, the variable texture must be in a scope that create_texture can reference.

At the very end, the image object is assigned the image URL. This is done last because the onload handler has already been attached; once the image finishes loading, onload fires automatically and the texture-generating code runs.
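Incidentally, if you would rather avoid the global variable, the same steps can be wrapped so that the texture is delivered through a callback. Below is a sketch of a Promise-based variant; this is my own restructuring for illustration, not part of the original article.

// same processing as create_texture, but resolving with the texture
// instead of assigning it to a global variable
function createTextureAsync(gl, source){
    return new Promise(function(resolve, reject){
        var img = new Image();
        img.onload = function(){
            var tex = gl.createTexture();
            gl.bindTexture(gl.TEXTURE_2D, tex);
            gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
            gl.generateMipmap(gl.TEXTURE_2D);
            gl.bindTexture(gl.TEXTURE_2D, null);
            resolve(tex);
        };
        img.onerror = reject;
        img.src = source;
    });
}

// usage: start rendering only once the texture is ready
createTextureAsync(gl, 'texture.png').then(function(tex){
    texture = tex;
});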


Texture coordinates and vertex attributes

Well, now that you know how to create a texture object, the next question is how to apply the texture to a polygon.

To apply a texture to a polygon, the polygon must carry information about how the texture is placed on it when it is generated. So a new vertex attribute has to be added to the vertices.

As explained in detail in an earlier article (9: the basics of vertex buffers), adding information to vertices requires a new VBO. This time, the newly added VBO will hold the vertices' texture coordinates. Texture coordinates indicate which position in the texture a vertex maps to. They range from 0 to 1, in two directions, horizontal and vertical, so a texture coordinate needs two elements, written like (0.0, 0.0). (Lufy: The translation is a bit of a tongue-twister; it will become clear when you see how it is used later.)

There is, however, one slightly odd point. In general, the coordinate system of image data is thought of as having its origin at the top left, as in the following figure.

(Figure: image coordinate system, origin at the top left)

The values of x and y grow from the origin in the top-left corner, to the right and downward.

The texture coordinate system in WebGL, on the other hand, looks like this.

(Figure: WebGL texture coordinate system, origin at the bottom left)

The coordinate system is flipped vertically. That is, in the texture coordinate system, the origin is at the bottom left, and larger values in the vertical direction mean further up. However, if you compare the two diagrams, you will see that there is no need to think too hard when specifying texture coordinates. Why? Because the image is applied upside-down as well, so if you work as though the image's origin were at the top left, the results come out consistent.

At this stage, it is enough to know that the texture-space coordinate system is flipped vertically. *When doing special processing with textures, you will need to understand this in more detail.
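For reference, when you do want texture space's bottom-left origin to line up with a conventional y-up view of the image, WebGL provides a pixel-storage flag that flips the image vertically at upload time. This is standard WebGL, though the original article does not use it here; it must be set before texImage2D is called.

// flip the image's Y axis during texImage2D uploads, so that the
// bottom-left origin of texture space lines up with the image's bottom row
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);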


Modifying the JavaScript

Next, let's modify the code so that the program can use the texture.

This time there is no lighting effect; only a polygon with the image applied is rendered. A torus could be used, but that involves some extra details, so we start with a simple polygon model.

First, prepare the model's vertex data. As just discussed, a new vertex attribute is added to hold the texture coordinates, and since there is no lighting effect this time, normal data is not used.

> Preparation of vertex data

// get the attributeLocations as an array
var attLocation = new Array();
attLocation[0] = gl.getAttribLocation(prg, 'position');
attLocation[1] = gl.getAttribLocation(prg, 'color');
attLocation[2] = gl.getAttribLocation(prg, 'textureCoord');

// number of elements per attribute
var attStride = new Array();
attStride[0] = 3;
attStride[1] = 4;
attStride[2] = 2;

// vertex positions
var position = [
    -1.0,  1.0,  0.0,
     1.0,  1.0,  0.0,
    -1.0, -1.0,  0.0,
     1.0, -1.0,  0.0
];

// vertex colors
var color = [
    1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0
];

// texture coordinates
var textureCoord = [
    0.0, 0.0,
    1.0, 0.0,
    0.0, 1.0,
    1.0, 1.0
];

// vertex indices
var index = [
    0, 1, 2,
    3, 2, 1
];

// generate the VBOs and the IBO
var vPosition = create_vbo(position);
var vColor = create_vbo(color);
var vTextureCoord = create_vbo(textureCoord);
var VBOList = [vPosition, vColor, vTextureCoord];
var iIndex = create_ibo(index);

// register the VBOs and the IBO
set_attribute(VBOList, attLocation, attStride);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iIndex);
This defines a quadrilateral with four vertices. Looking closely at the vertex position data, you can see that the center of the quadrilateral is the origin, and the vertices are ordered the way the letter Z is written. The vertex color is defined as opaque white.
The texture coordinates, as mentioned earlier, are defined with two elements each.
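To make the correspondence concrete, here is how each vertex in the arrays above pairs with its texture coordinate; this is just a restatement of the data for clarity.

// vertex 0: position (-1.0,  1.0, 0.0) -> texcoord (0.0, 0.0)  image top-left
// vertex 1: position ( 1.0,  1.0, 0.0) -> texcoord (1.0, 0.0)  image top-right
// vertex 2: position (-1.0, -1.0, 0.0) -> texcoord (0.0, 1.0)  image bottom-left
// vertex 3: position ( 1.0, -1.0, 0.0) -> texcoord (1.0, 1.0)  image bottom-right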

The vertex data is expressed as arrays, and the VBOs and IBO are generated from it just as before; this part is exactly the same as in previous articles. With that, the vertex-related processing, that is, the data handled by the attribute variables in the shader, is ready.

This time, with no lighting, only the coordinate transformation matrix needs to be prepared for vertex processing; the inverse matrix and the light source position are not required.

As for the variables with the uniform modifier: a uniform variable refers to data that is processed uniformly for all vertices, so the texture data, which is used identically by all vertices, must be passed in as a uniform variable.

> uniform-related processing

// get the uniformLocations as an array
var uniLocation = new Array();
uniLocation[0] = gl.getUniformLocation(prg, 'mvpMatrix');
uniLocation[1] = gl.getUniformLocation(prg, 'texture');
Two uniform variables are used this time: one for the coordinate transformation matrix and one for passing in the texture data.



Enabling a texture unit

Textures are managed through the concept of texture units, numbered slots to which textures are assigned; by default, texture unit number 0 is the active one. Texture units come into play when handling multiple textures. This time only a single texture is used, so the default unit number 0 will do.

The activeTexture function is used to make a specific texture unit active.

> Example of activeTexture

// specify the texture unit to make active
gl.activeTexture(gl.TEXTURE0);
The parameter used here is the constant gl.TEXTURE0; the trailing 0 is the texture unit's number. To make unit number 1 active, you would use gl.TEXTURE1. Unless there is a particular reason not to, texture units should be used in order from the smallest number up.
>> The maximum number of texture units
When several textures are used at the same time, multiple texture units must be used, and the maximum number of units is determined by the execution environment. Since WebGL runs not only on desktop computers but also on phones and other devices, it is hard to know in advance how many texture units can be used.
Because this depends on the hardware's capabilities, a sensible approach is to query the limit first and branch accordingly. The maximum number of texture units available in the execution environment is queried with the getParameter function.
Here is an example:
gl.getParameter(gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS);
Passing the rather long constant gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS to getParameter returns an integer giving the maximum number of texture units that can be used. If the return value is 10, then the usable texture units are gl.TEXTURE0 through gl.TEXTURE9.


Passing texture data to the shader

After making the appropriate texture unit active, the texture has to be bound to WebGL. This is the same operation that was done when converting the image data into a texture, so it is simple, as follows.

> Binding the texture to WebGL

// bind the texture
gl.bindTexture(gl.TEXTURE_2D, texture);
Once the texture is bound, its information can be sent to the shader. This is done through a uniform variable, using the uniformLocation obtained earlier, as follows.

> Passing texture data to shaders

// register the texture with the uniform variable
gl.uniform1i(uniLocation[1], 0);
Note that this differs from passing a matrix or vector to the shader: what is passed in is the number of the texture unit. The uniform1i function is used to pass an integer to the shader. The second parameter, the integer 0, is what gets passed; in other words, the integer passed here must match the texture unit that was made active earlier.
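To see how these calls line up in the rendering function, here is a minimal sketch of the per-frame sequence. The names texture, uniLocation, and mvpMatrix follow the listings above; the exact surrounding code is in the full sample linked at the end.

// make texture unit 0 active and bind the loaded texture to it
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);

// tell the sampler uniform to read from unit 0
gl.uniform1i(uniLocation[1], 0);

// pass the coordinate transformation matrix
gl.uniformMatrix4fv(uniLocation[0], false, mvpMatrix);

// draw the indexed quadrilateral (6 indices = 2 triangles)
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);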

Shader modifications

Next, the shaders are modified, starting with the vertex shader.

> Vertex Shader Code

attribute vec3 position;
attribute vec4 color;
attribute vec2 textureCoord;
uniform   mat4 mvpMatrix;
varying   vec4 vColor;
varying   vec2 vTextureCoord;

void main(void){
    vColor        = color;
    vTextureCoord = textureCoord;
    gl_Position   = mvpMatrix * vec4(position, 1.0);
}
In the vertex shader, the vertex position, vertex color, and vertex texture coordinates are declared with the attribute modifier. The vertex color and texture coordinates are passed straight through to the fragment shader without any processing.
There is nothing difficult in the vertex shader-related processing; next, the fragment shader.

> Fragment Shader Code

precision mediump float;

uniform sampler2D texture;
varying vec4      vColor;
varying vec2      vTextureCoord;

void main(void){
    vec4 smpColor = texture2D(texture, vTextureCoord);
    gl_FragColor  = vColor * smpColor;
}
The fragment shader receives the texture data through a uniform variable. Note the variable type sampler2D; the name comes from "sampler", and for now you can think of it simply as texture data.
In addition, the texture2D function is used. It takes two parameters: the first is the sampler (the texture data), and the second is vec2 data giving the texture coordinates.

So in this program, the vertex shader takes the texture coordinates defined as a vertex attribute, passes them to the fragment shader through a varying variable, and the fragment shader hands those coordinates to the texture2D function.

In this way, the texture's color information is fetched with the texture2D function, multiplied by the vertex color (the varying variable vColor), and the result becomes the final color.


Summary

Textures took quite a lot of space to explain, but the usage should be clear by now.

The processing around textures is rather long-winded, but the main points are texture coordinates, texture objects, the handling needed to pass the data to the shader, and the concept of texture units, which, used properly, make it possible to handle multiple textures, and so on.

Although this covers only the most basic usage, quite a lot of things changed, so normally all of the code would be posted at the end (Lufy: I won't post it; you can view it directly in the browser). In addition, a link to the running demo is given at the end.

Next time: using multiple textures.

Demo: a textured quadrilateral

http://wgld.org/s/sample_014/


Reprint notice: reprinted from Lufy_legend's blog, http://blog.csdn.net/lufy_legend

