[WebGL Primer] 25: Lighting with a point light




Note: This article is translated from http://wgld.org/; the original author is doxas. Anything I add myself is marked with [Lufy:]. My own knowledge of WebGL is still limited, so if any technical terms are translated incorrectly, corrections are welcome.









The result of running this article's demo








Point lights

Last time we covered specular highlights and per-fragment (Phong) shading.






Phong shading can render more natural shading and gives a more convincing 3D effect, but it has the drawback of requiring a large amount of computation.



Which approach to use has to be decided case by case depending on the situation, which is a bit of a nuisance.




Well, this time the topic is lights again. I can almost hear people groaning, "not lighting again..."
This time we implement a point light source. As the name suggests, a point light emits light from a single point, like a vertex.
Until now, all lighting has been done with a directional (parallel) light source, which can be thought of as light arriving from infinitely far away in one fixed direction: every model in the 3D scene is lit from the same direction. A point light, in contrast, has a fixed position in 3D space, and each model is lit from a different direction depending on where it sits relative to that position.
It is similar to a light bulb in the real world.



A real light bulb, however, attenuates: the farther away something is from the light, the weaker the light that reaches it.



The point light implemented in this article does not take that attenuation into account. No matter how far an object is from the light source, it is lit with the same intensity, so this is not a complete simulation of a real-world light bulb.
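If you did want to simulate that weakening, one conventional approach (not used in this article's demo; the formula and coefficient values below are illustrative, not from the original) is to scale the light's intensity by a distance-based attenuation factor:

```javascript
// One conventional attenuation model: 1 / (kc + kl*d + kq*d*d),
// where d is the distance from the light to the surface point.
// The coefficients are illustrative values, not taken from the demo.
function attenuation(distance, kc, kl, kq){
    return 1.0 / (kc + kl * distance + kq * distance * distance);
}

// At distance 0 the light is at full strength; it falls off farther away.
console.log(attenuation(0.0, 1.0, 0.1, 0.01));        // 1
console.log(attenuation(10.0, 1.0, 0.1, 0.01) < 1.0); // true
```

Multiplying the diffuse term by such a factor would make the light dim with distance, like a real bulb.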


Implementing a point light

Implementing a point light source is actually not difficult.
With a directional light, the light vector, that is, the direction of the light, is fixed. With a point light, it is the position of the light source that is fixed, so you compute the vector from the light source to each vertex and use that as the light vector when calculating the shading.
Since the vector from the light source to the vertex must be computed, the amount of calculation is larger than for a directional light, but once the light vector is obtained, everything proceeds exactly as in the earlier directional-light calculations. So it will not be too hard.
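The core of the change can be sketched in plain JavaScript (the function name here is hypothetical; in the demo itself this subtraction happens inside the fragment shader):

```javascript
// For a directional light the light vector is one fixed direction.
// For a point light it is computed per vertex:
// lightPosition - vertexPosition.
function pointLightVector(lightPosition, vertexPosition){
    return [
        lightPosition[0] - vertexPosition[0],
        lightPosition[1] - vertexPosition[1],
        lightPosition[2] - vertexPosition[2]
    ];
}

// Two vertices on opposite sides of a light at the origin
// receive light vectors pointing in opposite directions.
var lightPosition = [0.0, 0.0, 0.0];
console.log(pointLightVector(lightPosition, [ 2.0, 0.0, 0.0])); // [-2, 0, 0]
console.log(pointLightVector(lightPosition, [-2.0, 0.0, 0.0])); // [2, 0, 0]
```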







Changes to the vertex shader

This time, as last time, shading is done per fragment (Phong shading), so although most of the changes are in the fragment shader, there are a few changes in the vertex shader as well.
For point-light processing, as just described, we need the vector from the light source to the vertex, and to compute that light vector we need the vertex's position.
To pass position information from the vertex shader to the fragment shader, a new varying variable is required. But passing only the vertex's local position would cause a problem.
Vertex positions are usually handed to the vertex shader as local coordinates, so when the model is moved or rotated by the model transformation, the actual position of the vertex changes. Even if a vertex's local coordinates are (1.0, 1.0, 1.0), after translation, rotation, and other transformations its coordinates might become, for example, (0.5, 2.0, 5.5).
The light vector of a point light must be computed against the vertex position after the model transformation. Therefore a new uniform carrying the model transformation matrix must be passed into the vertex shader. Let's change the vertex shader code accordingly.
> Vertex Shader Code
attribute vec3 position;
attribute vec3 normal;
attribute vec4 color;
uniform   mat4 mvpMatrix;
uniform   mat4 mMatrix;
varying   vec3 vPosition;
varying   vec3 vNormal;
varying   vec4 vColor;

void main(void){
    vPosition   = (mMatrix * vec4(position, 1.0)).xyz;
    vNormal     = normal;
    vColor      = color;
    gl_Position = mvpMatrix * vec4(position, 1.0);
}
Compared with last time, there are two changes.






The first change is the varying variable vPosition, added to pass the vertex's position to the fragment shader. Since it holds the vertex's position, it is declared as a vec3.
The second change is the new uniform variable mMatrix. As just explained, vertex coordinates arrive at the vertex shader in the local coordinate system, so to transform them into the appropriate form (the world coordinate system), the model transformation matrix is received on the shader side through this variable declared with the uniform qualifier.




When passing the vertex position to the fragment shader, the local position of the vertex is multiplied by the model transformation matrix mMatrix and the result is assigned to vPosition. This way, the fragment shader can work with the vertex position after the model transformation has been applied.
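What that multiplication does can be imitated in plain JavaScript (the transformPoint helper is illustrative, not part of the demo; WebGL matrices are stored column-major):

```javascript
// Multiply a column-major 4x4 matrix by a point (w = 1.0), keeping xyz,
// mirroring what the vertex shader does to get the transformed position.
function transformPoint(m, p){
    var x = p[0], y = p[1], z = p[2];
    return [
        m[0] * x + m[4] * y + m[8]  * z + m[12],
        m[1] * x + m[5] * y + m[9]  * z + m[13],
        m[2] * x + m[6] * y + m[10] * z + m[14]
    ];
}

// A translation matrix that moves everything by (0.5, 1.0, 4.5):
var mMatrix = [
    1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 1, 0,
    0.5, 1.0, 4.5, 1
];
// The local coordinate (1.0, 1.0, 1.0) ends up at (1.5, 2.0, 5.5).
console.log(transformPoint(mMatrix, [1.0, 1.0, 1.0]));
```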





Changes to the fragment shader

Next are the changes on the fragment shader side. In this demo, the light vector is computed in the fragment shader from the vertex position and the position of the point light.






The way the light vector is calculated here is very easy: a simple subtraction is enough.
Also, since the processing is now based on a light position, the uniform variable lightDirection, which represented the light vector, is replaced by the uniform variable lightPosition, which represents the light's position.
> Fragment Shader Code


precision mediump float;

uniform mat4 invMatrix;
uniform vec3 lightPosition;
uniform vec3 eyeDirection;
uniform vec4 ambientColor;
varying vec3 vPosition;
varying vec3 vNormal;
varying vec4 vColor;

void main(void){
    vec3  lightVec  = lightPosition - vPosition;
    vec3  invLight  = normalize(invMatrix * vec4(lightVec, 0.0)).xyz;
    vec3  invEye    = normalize(invMatrix * vec4(eyeDirection, 0.0)).xyz;
    vec3  halfLE    = normalize(invLight + invEye);
    float diffuse   = clamp(dot(vNormal, invLight), 0.0, 1.0) + 0.2;
    float specular  = pow(clamp(dot(vNormal, halfLE), 0.0, 1.0), 50.0);
    vec4  destColor = vColor * vec4(vec3(diffuse), 1.0) + vec4(vec3(specular), 1.0) + ambientColor;
    gl_FragColor    = destColor;
}
In the first line of the shader's main function, the light vector from the point light source to the vertex is stored in the variable lightVec.





As mentioned above, this is just a simple subtraction, so it is very easy. Then, using the light vector obtained here, the inverse matrix and the half vector are used to calculate the diffuse and specular light exactly as with the earlier directional light.
Once you understand the structure, it should be clear that not much has actually changed from the previous demo.
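To see the numbers involved, the diffuse and specular terms can be reproduced in plain JavaScript (illustrative helpers, not demo code; the inverse-matrix step from the shader is omitted here for simplicity):

```javascript
function dot(a, b){ return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
function normalize(v){
    var l = Math.sqrt(dot(v, v));
    return [v[0] / l, v[1] / l, v[2] / l];
}
function clamp(x, lo, hi){ return Math.min(Math.max(x, lo), hi); }

// Same shape as the fragment shader: Lambert diffuse (plus the 0.2 bias)
// and half-vector specular with a shininess exponent of 50.
function lighting(normal, lightVec, eyeVec){
    var invLight = normalize(lightVec);
    var invEye   = normalize(eyeVec);
    var halfLE   = normalize([invLight[0] + invEye[0],
                              invLight[1] + invEye[1],
                              invLight[2] + invEye[2]]);
    var diffuse  = clamp(dot(normal, invLight), 0.0, 1.0) + 0.2;
    var specular = Math.pow(clamp(dot(normal, halfLE), 0.0, 1.0), 50.0);
    return {diffuse: diffuse, specular: specular};
}

// A surface facing the light head-on, viewed from the light's direction:
// full diffuse (1.0 + 0.2) and maximal specular (1.0).
console.log(lighting([0, 0, 1], [0, 0, 5], [0, 0, 1]));
```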



Mainly, the handling of the light vector differs; the lighting method itself is basically the same.


Changes to the JavaScript

With the shaders changed, here is how the main program's JavaScript changes.


There are quite a few changes in this part, so let's go through them one by one.



So far, the demos have rendered only a torus; this time a sphere is added alongside the torus, as you can see in the picture at the beginning of the article. The vertex data for the torus and for the sphere are prepared separately.


The sphere model's vertex data is created with the following function, which is quite similar to the function that generates the torus's vertex data.
> Function for generating the sphere's vertex data
// function that generates a sphere
function sphere(row, column, rad, color){
    var pos = new Array(), nor = new Array(),
        col = new Array(), idx = new Array();
    for(var i = 0; i <= row; i++){
        var r = Math.PI / row * i;
        var ry = Math.cos(r);
        var rr = Math.sin(r);
        for(var ii = 0; ii <= column; ii++){
            var tr = Math.PI * 2 / column * ii;
            var tx = rr * rad * Math.cos(tr);
            var ty = ry * rad;
            var tz = rr * rad * Math.sin(tr);
            var rx = rr * Math.cos(tr);
            var rz = rr * Math.sin(tr);
            if(color){
                var tc = color;
            }else{
                tc = hsva(360 / row * i, 1, 1, 1);
            }
            pos.push(tx, ty, tz);
            nor.push(rx, ry, rz);
            col.push(tc[0], tc[1], tc[2], tc[3]);
        }
    }
    r = 0;
    for(i = 0; i < row; i++){
        for(ii = 0; ii < column; ii++){
            r = (column + 1) * i + ii;
            idx.push(r, r + 1, r + column + 2);
            idx.push(r, r + column + 2, r + column + 1);
        }
    }
    return {p : pos, n : nor, c : col, i : idx};
}
To form the sphere's vertices, imagine wrapping a large film-like sheet of polygons into a ball. This sphere function accepts four parameters. The first is the number of vertical divisions of the polygon sheet that forms the sphere; in terms of the Earth, this is the latitude direction.





The second parameter is the number of horizontal divisions; in the Earth analogy, the longitude direction.



The third parameter is the radius of the sphere. The fourth parameter is the sphere's color, an array of four elements; if no color is specified, HSV colors are assigned automatically.




Call this function with appropriate arguments and receive its return value.

The return value is an object, and you use its properties as needed. The actual code looks like the following.

> Using the sphere function


// generate the sphere's vertex data and create VBOs
var sphereData = sphere(64, 64, 2.0, [0.25, 0.25, 0.75, 1.0]);
var sPosition  = create_vbo(sphereData.p);
var sNormal    = create_vbo(sphereData.n);
var sColor     = create_vbo(sphereData.c);
var sVBOList   = [sPosition, sNormal, sColor];

// generate an IBO from the sphere's index data
var sIndex = create_ibo(sphereData.i);
The code above generates a sphere with 64 divisions both vertically and horizontally and a radius of 2.0; this time the specified color is blue. Note that, for later processing, the VBOs are stored in an array; this makes the work of linking attribute locations to VBOs very convenient, as described later.
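As a quick sanity check on the sizes involved: since the loops in the sphere function run inclusively, a sphere with row vertical and column horizontal divisions has (row + 1) × (column + 1) vertices and row × column × 2 triangles. The sketch below is not part of the demo; it just restates that arithmetic:

```javascript
// Predict the array lengths produced by sphere(row, column, rad, color).
function sphereSizes(row, column){
    var vertices  = (row + 1) * (column + 1);  // loops run inclusively
    var triangles = row * column * 2;          // two triangles per grid cell
    return {
        position: vertices * 3,   // x, y, z per vertex
        normal:   vertices * 3,
        color:    vertices * 4,   // r, g, b, a per vertex
        index:    triangles * 3   // three indices per triangle
    };
}

// For the 64 x 64 sphere used in this demo:
console.log(sphereSizes(64, 64));
```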
Next is the uniform location handling. Since we move from a directional light to a point light this time, the part that specified the light's direction is replaced by one that specifies the light's position.

> Uniform-related processing
// get the uniformLocations and store them in an array
var uniLocation = new Array();
uniLocation[0] = gl.getUniformLocation(prg, 'mvpMatrix');
uniLocation[1] = gl.getUniformLocation(prg, 'mMatrix');
uniLocation[2] = gl.getUniformLocation(prg, 'invMatrix');
uniLocation[3] = gl.getUniformLocation(prg, 'lightPosition');
uniLocation[4] = gl.getUniformLocation(prg, 'eyeDirection');
uniLocation[5] = gl.getUniformLocation(prg, 'ambientColor');

// position of the point light
var lightPosition = [0.0, 0.0, 0.0];
The changes made to the uniform variables in the shaders are reflected here, and the position of the demo's point light is set to the origin.
To make the effect of the point light easier to see, in this demo the torus and the sphere continuously revolve around the position of the point light, which is reflected in how the model transformation matrices are generated.





Since two models are drawn at the same time, each model is rendered with the appropriate VBOs and IBO inside the rendering loop.




The code is a little long, but a careful look makes it clear. As just mentioned, it uses the array holding the VBOs, which is bound through a self-made helper function.

> Drawing processing in the rendering loop


// increment the counter
count++;

// compute radians and various coordinates from the counter
var rad = (count % 360) * Math.PI / 180;
var tx = Math.cos(rad) * 3.5;
var ty = Math.sin(rad) * 3.5;
var tz = Math.sin(rad) * 3.5;

// set the torus VBOs and IBO
set_attribute(tVBOList, attLocation, attStride);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, tIndex);

// generate the model transformation matrix
m.identity(mMatrix);
m.translate(mMatrix, [tx, -ty, -tz], mMatrix);
m.rotate(mMatrix, -rad, [0, 1, 1], mMatrix);
m.multiply(tmpMatrix, mMatrix, mvpMatrix);
m.inverse(mMatrix, invMatrix);

// register the uniforms and draw
gl.uniformMatrix4fv(uniLocation[0], false, mvpMatrix);
gl.uniformMatrix4fv(uniLocation[1], false, mMatrix);
gl.uniformMatrix4fv(uniLocation[2], false, invMatrix);
gl.uniform3fv(uniLocation[3], lightPosition);
gl.uniform3fv(uniLocation[4], eyeDirection);
gl.uniform4fv(uniLocation[5], ambientColor);
gl.drawElements(gl.TRIANGLES, torusData.i.length, gl.UNSIGNED_SHORT, 0);

// set the sphere VBOs and IBO
set_attribute(sVBOList, attLocation, attStride);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, sIndex);

// generate the model transformation matrix
m.identity(mMatrix);
m.translate(mMatrix, [-tx, ty, tz], mMatrix);
m.multiply(tmpMatrix, mMatrix, mvpMatrix);
m.inverse(mMatrix, invMatrix);

// register the uniforms and draw
gl.uniformMatrix4fv(uniLocation[0], false, mvpMatrix);
gl.uniformMatrix4fv(uniLocation[1], false, mMatrix);
gl.uniformMatrix4fv(uniLocation[2], false, invMatrix);
gl.drawElements(gl.TRIANGLES, sphereData.i.length, gl.UNSIGNED_SHORT, 0);

// redraw the context
gl.flush();
After the various coordinate transformation matrices, including the inverse matrix, are generated, the position of the point light and the eye vector are passed into the shaders along with the VBO and IBO bindings, and the draw call is issued.


Because two models are drawn, there is a series of similar processing; take care not to mix the two up, but none of it is particularly difficult.





Summary

The concept of lighting with a point light is basically the same as with a directional light: shading is added based on the dot products of the light vector with the vertex normal and the eye vector. The difference from a directional light is simply that instead of a fixed light vector, a point light computes the light vector from the light's position and the vertex position after the model transformation, which adds a certain amount of computation.



With a directional light, the direction of the light is constant, so broadly speaking everything receives equal illumination. A point light, by contrast, lights each point concretely according to its actual coordinates. This demo, like the last one, performs the lighting calculation in the fragment shader with per-fragment (Phong) shading, so it renders very beautifully.


What this article covers is only the basics of lighting. Depending on how much effort you put into the shading, WebGL can render all sorts of effects; now that the fundamentals are in place to build on, some specific advanced techniques will be introduced from here on.


As always, an actual demo is provided; please click the link below to try it.



Next time, we will start rendering with images. Look forward to it.


Use point light to render torus and spheres

http://wgld.org/s/sample_013/


Reprints, please note the source: from Lufy_legend's blog, http://blog.csdn.net/lufy_legend



