[WebGL Primer] 25: Lighting with a point light


Note: This article is translated from http://wgld.org/; the original author is doxas. Where I have added my own comments, I mark them with [Lufy:]. My knowledge of WebGL is not yet very deep, so if any technical terms are translated incorrectly, corrections are welcome.



The result of running this demo


Point light

The previous article introduced Gouraud shading and Phong shading. With Phong (per-fragment) shading you can render much more natural shading and the 3D effect looks more realistic, but it has the drawback of requiring more computation. Which method to use has to be decided case by case, which is a bit of a nuisance.

Well, the topic this time is light again. I can almost hear some of you sighing, "not light again..."
The topic this time is implementing a point light. A point light is exactly what its name says: the light source is a single point, like a vertex.

So far, all of the lighting has used a directional (parallel) light source. A directional light can be thought of as light arriving from infinitely far away in a fixed direction, so every model in the 3D scene is lit from the same direction. A point light, on the other hand, has a fixed position in 3D space, and each model is lit from a different direction depending on where it sits relative to that position.

In the real world this is similar to a light bulb. Strictly speaking, though, the light from a bulb attenuates: the farther away an object is, the weaker the light. The point light implemented this time does not take attenuation into account; no matter how far an object is from the light, it receives light of the same intensity, so it is not a complete simulation of a real light bulb.
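Attenuation is not covered in this article, but as a reference, here is a rough sketch of how a fragment shader like the one shown later could be extended with simple distance attenuation. This is not part of the original demo, and the coefficients 0.1 and 0.02 are arbitrary example values.

> Fragment shader with distance attenuation (sketch, not part of this demo)
precision mediump float;

uniform mat4 invMatrix;
uniform vec3 lightPosition;
uniform vec4 ambientColor;
varying vec3 vPosition;
varying vec3 vNormal;
varying vec4 vColor;

void main(void){
    vec3  lightVec    = lightPosition - vPosition;
    float dist        = length(lightVec);
    // arbitrary example coefficients; not from the original demo
    float attenuation = 1.0 / (1.0 + 0.1 * dist + 0.02 * dist * dist);
    vec3  invLight    = normalize(invMatrix * vec4(lightVec, 0.0)).xyz;
    float diffuse     = clamp(dot(vNormal, invLight), 0.0, 1.0) * attenuation;
    gl_FragColor      = vColor * vec4(vec3(diffuse), 1.0) + ambientColor;
}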

Implementing the point light

Implementing a point light source is not difficult.

With a directional light, the light vector itself is given, that is, the direction of the light is fixed. With a point light, it is the position of the light source that is given, so you have to compute the vector from the light source to the vertex and use that as the light vector when calculating the shading.

Since the vector from the light source to each vertex has to be computed, there is a little more computation than with a directional light, but once the light vector has been obtained, the rest of the calculation is the same as for the directional light used so far, so it will not be too difficult.

Modifying the vertex shader

This time, as last time, the shading is done with Phong shading, so most of the changes are in the fragment shader, but there are a few changes in the vertex shader as well.

As just mentioned, processing a point light requires computing the vector from the light source to the vertex, and to compute that light vector you inevitably need the position information of the vertex.

To pass the vertex position from the vertex shader to the fragment shader, a new varying variable is needed, but there is a subtlety in the vertex position information.

Vertex positions are normally passed to the vertex shader as local coordinates, so if the model is moved or rotated by the model coordinate transformation, the actual position of each vertex changes. That is, even if the local coordinates are (1.0, 1.0, 1.0), once the vertex has been translated or rotated its coordinates may become, for example, (0.5, 2.0, 5.5).

The light vector of light emitted from a point light source must take into account the vertex position after the model coordinate transformation. Therefore a new uniform, the model coordinate transformation matrix, has to be passed to the vertex shader. Now let's modify the vertex shader code.
> Vertex Shader Code
attribute vec3 position;
attribute vec3 normal;
attribute vec4 color;
uniform   mat4 mvpMatrix;
uniform   mat4 mMatrix;
varying   vec3 vPosition;
varying   vec3 vNormal;
varying   vec4 vColor;

void main(void){
    vPosition   = (mMatrix * vec4(position, 1.0)).xyz;
    vNormal     = normal;
    vColor      = color;
    gl_Position = mvpMatrix * vec4(position, 1.0);
}
There are two changes compared to last time.

The first is the new varying variable vPosition, which passes the position information of the vertex to the fragment shader. It is declared as vec3 because it holds the vertex position.

The second is the new uniform variable mMatrix. As written above, the vertex positions received by the vertex shader are in the local coordinate system, so to convert them into the appropriate form (that is, world coordinates) with the model coordinate transformation matrix, a variable declared with the uniform modifier is used to receive that matrix on the shader side.

When the vertex position is passed to the fragment shader, the model matrix mMatrix is multiplied by the local-coordinate position of the vertex, and the result is assigned to vPosition. This way the fragment shader can use the vertex position after the model coordinate transformation.


Modifying the fragment shader

Next are the changes on the fragment shader side, where the light vector has to be computed from the vertex position and the position of the point light.

Computing the light vector is very easy: it only takes a simple subtraction.

In addition, since this time the processing is based on the light source's position, the uniform variable lightDirection, which represented the light vector, is replaced by the uniform variable lightPosition, which represents the light's position.
> Fragment Shader Code
precision mediump float;

uniform mat4 invMatrix;
uniform vec3 lightPosition;
uniform vec3 eyeDirection;
uniform vec4 ambientColor;
varying vec3 vPosition;
varying vec3 vNormal;
varying vec4 vColor;

void main(void){
    vec3  lightVec  = lightPosition - vPosition;
    vec3  invLight  = normalize(invMatrix * vec4(lightVec, 0.0)).xyz;
    vec3  invEye    = normalize(invMatrix * vec4(eyeDirection, 0.0)).xyz;
    vec3  halfLE    = normalize(invLight + invEye);
    float diffuse   = clamp(dot(vNormal, invLight), 0.0, 1.0) + 0.2;
    float specular  = pow(clamp(dot(vNormal, halfLE), 0.0, 1.0), 50.0);
    vec4  destColor = vColor * vec4(vec3(diffuse), 1.0) + vec4(vec3(specular), 1.0) + ambientColor;
    gl_FragColor    = destColor;
}
In the first line of the shader's main function, the light vector from the point light source to the vertex is assigned to the variable lightVec. As mentioned above, this is just a simple subtraction, very easy. Then, using the light vector obtained here, the inverse-matrix transformation and the half vector are computed exactly as with the earlier directional light, and the diffuse and specular lighting are calculated from them.

Once you understand the structure, you can see that not much has actually changed compared with the previous demo. Mainly the handling of the light vector is different; the lighting method itself is basically the same.

JavaScript modifications

With the shaders modified, next come the changes to the main JavaScript program.

There are quite a few changes in this part, so let's go through them from the beginning. So far the demos have rendered only a torus; this time a sphere is added alongside the torus, as you can see in the picture at the top of the article. The vertex data for the torus and the vertex data for the sphere are prepared separately.

The vertex data for the sphere model is generated with the following function, which is similar to the function that generates the vertex data for the torus.
> Functions for generating vertex data for a sphere
// function that generates a sphere
function sphere(row, column, rad, color){
    var pos = new Array(), nor = new Array(),
        col = new Array(), idx = new Array();
    for(var i = 0; i <= row; i++){
        var r  = Math.PI / row * i;
        var ry = Math.cos(r);
        var rr = Math.sin(r);
        for(var ii = 0; ii <= column; ii++){
            var tr = Math.PI * 2 / column * ii;
            var tx = rr * rad * Math.cos(tr);
            var ty = ry * rad;
            var tz = rr * rad * Math.sin(tr);
            var rx = rr * Math.cos(tr);
            var rz = rr * Math.sin(tr);
            if(color){
                var tc = color;
            }else{
                tc = hsva(360 / row * i, 1, 1, 1);
            }
            pos.push(tx, ty, tz);
            nor.push(rx, ry, rz);
            col.push(tc[0], tc[1], tc[2], tc[3]);
        }
    }
    r = 0;
    for(i = 0; i < row; i++){
        for(ii = 0; ii < column; ii++){
            r = (column + 1) * i + ii;
            idx.push(r, r + 1, r + column + 2);
            idx.push(r, r + column + 2, r + column + 1);
        }
    }
    return {p : pos, n : nor, c : col, i : idx};
}
To form the vertices of a sphere, imagine wrapping a thin sheet made of many polygons into a ball. This sphere function takes four parameters: the first is the number of vertical divisions of the polygon sheet that forms the sphere (in terms of the Earth, the latitude direction); the second is the number of horizontal divisions (the longitude direction); the third is the radius of the sphere; the fourth is the color of the sphere as an array of four elements, and if no color is specified an HSV-derived color is assigned automatically by the hsva helper (sketched below).
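The hsva function called inside sphere() is the HSV-to-RGBA conversion helper introduced in an earlier article of this series; it is not listed in this article, so the following is a reconstructed sketch rather than a verbatim quote.

> HSV-to-RGBA helper (sketch, assumed from an earlier article)
function hsva(h, s, v, a){
    if(s > 1 || v > 1 || a > 1){return;}
    var th = h % 360;
    var i  = Math.floor(th / 60);
    var f  = th / 60 - i;
    var m  = v * (1 - s);
    var n  = v * (1 - s * f);
    var k  = v * (1 - s * (1 - f));
    var color = new Array();
    if(s === 0){
        // no saturation: grayscale
        color.push(v, v, v, a);
    }else{
        // pick the RGB pattern for the current 60-degree hue segment
        var r = new Array(v, n, m, m, k, v);
        var g = new Array(k, v, v, n, m, m);
        var b = new Array(m, m, k, v, v, n);
        color.push(r[i], g[i], b[i], a);
    }
    return color;
}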
To use the sphere function, pass in the appropriate parameters and receive the return value. The return value is an object, and you refer to the appropriate properties of that object. The actual code is as follows.

> Using the sphere function
// generate VBOs from the sphere's vertex data and keep them
var sphereData = sphere(64, 64, 2.0, [0.25, 0.25, 0.75, 1.0]);
var sPosition  = create_vbo(sphereData.p);
var sNormal    = create_vbo(sphereData.n);
var sColor     = create_vbo(sphereData.c);
var sVBOList   = [sPosition, sNormal, sColor];

// generate the IBO for the sphere
var sIndex = create_ibo(sphereData.i);
The code above generates a sphere with 64 divisions both vertically and horizontally and a radius of 2.0, and this time the specified color is blue. The important point is that, for the sake of later processing, the VBOs are stored in an array; doing this makes the work of linking attribute locations and VBOs much more convenient, as described later.

Next is the uniformLocation handling. Since we are switching from a directional light to a point light, the part that specified the light's direction is replaced by one that specifies the light's position.

> Uniform-related processing
// get the uniformLocations into an array
var uniLocation = new Array();
uniLocation[0] = gl.getUniformLocation(prg, 'mvpMatrix');
uniLocation[1] = gl.getUniformLocation(prg, 'mMatrix');
uniLocation[2] = gl.getUniformLocation(prg, 'invMatrix');
uniLocation[3] = gl.getUniformLocation(prg, 'lightPosition');
uniLocation[4] = gl.getUniformLocation(prg, 'eyeDirection');
uniLocation[5] = gl.getUniformLocation(prg, 'ambientColor');

// position of the point light
var lightPosition = [0.0, 0.0, 0.0];
The changes made to the uniform variables in the shaders are reflected here, and the position of the point light in this demo is set to the origin.

To make the effect of the point light easier to see, the torus and the sphere in the demo revolve continuously around the position of the point light, which involves generating the model coordinate transformation matrices. Because two models are drawn at the same time, each model is rendered in the rendering loop using its own VBOs and IBO.

The code is a little long, so read it carefully. It mainly does what was just described, using the arrays that hold the VBOs; the VBOs are bound with a self-made helper function.

> Continuous-cycle drawing processing
// increment the counter
count++;

// calculate the radian and the various coordinates from the counter
var rad = (count % 360) * Math.PI / 180;
var tx  = Math.cos(rad) * 3.5;
var ty  = Math.sin(rad) * 3.5;
var tz  = Math.sin(rad) * 3.5;

// set the VBOs and IBO of the torus
set_attribute(tVBOList, attLocation, attStride);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, tIndex);

// generate the model coordinate transformation matrix
m.identity(mMatrix);
m.translate(mMatrix, [tx, -ty, -tz], mMatrix);
m.rotate(mMatrix, -rad, [0, 1, 1], mMatrix);
m.multiply(tmpMatrix, mMatrix, mvpMatrix);
m.inverse(mMatrix, invMatrix);

// register the uniform variables and draw
gl.uniformMatrix4fv(uniLocation[0], false, mvpMatrix);
gl.uniformMatrix4fv(uniLocation[1], false, mMatrix);
gl.uniformMatrix4fv(uniLocation[2], false, invMatrix);
gl.uniform3fv(uniLocation[3], lightPosition);
gl.uniform3fv(uniLocation[4], eyeDirection);
gl.uniform4fv(uniLocation[5], ambientColor);
gl.drawElements(gl.TRIANGLES, torusData.i.length, gl.UNSIGNED_SHORT, 0);

// set the VBOs and IBO of the sphere
set_attribute(sVBOList, attLocation, attStride);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, sIndex);

// generate the model coordinate transformation matrix
m.identity(mMatrix);
m.translate(mMatrix, [-tx, ty, tz], mMatrix);
m.multiply(tmpMatrix, mMatrix, mvpMatrix);
m.inverse(mMatrix, invMatrix);

// register the uniform variables and draw
gl.uniformMatrix4fv(uniLocation[0], false, mvpMatrix);
gl.uniformMatrix4fv(uniLocation[1], false, mMatrix);
gl.uniformMatrix4fv(uniLocation[2], false, invMatrix);
gl.drawElements(gl.TRIANGLES, sphereData.i.length, gl.UNSIGNED_SHORT, 0);

// redraw the context
gl.flush();
After the various coordinate transformation matrices and the inverse matrix have been generated, the position of the point light source and the eye direction vector are passed to the shaders along with the VBO and IBO binding, and the draw call is issued.

The same series of steps is carried out twice in order to draw the two models; as long as you are careful not to mix them up, there is nothing particularly difficult going on.
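The torus-side buffers (torusData, tVBOList, tIndex) and the set_attribute helper used in this loop come from earlier articles in this series and are not repeated here. The following is a reconstructed sketch of that helper, together with comments noting the assumed torus-side setup; treat it as an assumption rather than a verbatim quote.

> Sketch of the set_attribute helper (assumed from earlier articles)
// assumed to exist from the earlier torus articles:
// var torusData = torus(...);   // torus vertex data object with p, n, c, i
// var tVBOList  = [create_vbo(torusData.p), create_vbo(torusData.n), create_vbo(torusData.c)];
// var tIndex    = create_ibo(torusData.i);

// bind every VBO in the list to its attribute location
function set_attribute(vbo, attL, attS){
    for(var i in vbo){
        gl.bindBuffer(gl.ARRAY_BUFFER, vbo[i]);
        gl.enableVertexAttribArray(attL[i]);
        gl.vertexAttribPointer(attL[i], attS[i], gl.FLOAT, false, 0, 0);
    }
}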


Summary

The idea of lighting with a point light is basically the same as with a directional light: the shading is produced from the dot products of the light vector with the vertex normal and of the half vector (derived from the eye vector) with the normal. The difference from a directional light is simply that, while a directional light's light vector is a fixed value, a point light computes the light vector from the model-transformed vertex position and the light's position, which adds a certain amount of computation.
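Restated as formulas (nothing new here, this is just what the fragment shader above computes, with N the normal, E the eye direction, and both the light and eye vectors brought into the model's local space by the inverse matrix):

$$\vec{L} = \operatorname{normalize}(\mathit{lightPosition} - \mathit{vPosition}), \qquad \vec{H} = \operatorname{normalize}(\vec{L} + \vec{E})$$
$$\mathit{diffuse} = \operatorname{clamp}(\vec{N}\cdot\vec{L},\,0,\,1) + 0.2, \qquad \mathit{specular} = \operatorname{clamp}(\vec{N}\cdot\vec{H},\,0,\,1)^{50}$$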
With a directional light the direction of the light is constant, so everything is lit uniformly. A point light, on the other hand, lights each fragment according to the actual vertex coordinates. This demo, like the previous one, performs the lighting calculation in the fragment shader using Phong shading, so it produces a very good-looking render.

All you need to take away from this article is the basic part of implementing this kind of lighting. Depending on how much effort you put into shading in WebGL, you can render all sorts of effects; from here on you should be able to apply these techniques yourself, and some special techniques will be introduced in detail later on.

As usual, an actual demo is provided this time as well; please click the link to try it.

Next time, we will start using images in rendering. Look forward to it.

Rendering a torus and a sphere with a point light

http://wgld.org/s/sample_013/


Please indicate the source when reprinting: from Lufy_legend's blog, http://blog.csdn.net/lufy_legend

