How to project a texture



Source: SGI OpenGL tutorial
Translation: Xin LAN Pan Li Liang. Email: Xheartblue@etang.com

Preface:
Shadows have two classic implementations: the first is shadow volumes, and the second is shadow mapping. How can we use a light map to implement projected shadows? This requires projective texturing. Projective texturing projects a texture into the scene like a slide: imagine a film projector playing a movie, with the frames cast onto the wall along the direction the projector points. Projective texturing is similar, and the texture we use plays the role of the film in the projector.

Below is an article I found in the SGI tutorial, shared here for everyone.

How to project a texture
http://www.sgi.com/software/opengl/advanced98/notes/node49.html

Projecting a texture image into your synthetic environment requires many of the same steps that are used to project the rendered scene onto the display. The key to projecting a texture is the contents of the texture transform matrix. The matrix is the concatenation of three transformations:

1. A modelview transform to orient the projection in the scene.

2. A projective transform (perspective or orthogonal).

3. A scale and bias to map the near clipping plane to texture coordinates.

The modelview and projection parts of the texture transform can be computed in the same way, and with the same tools, as the ordinary modelview and projection transforms. For example, you can use gluLookAt() to orient the projection, and glFrustum() or gluPerspective() to define a perspective transformation.

The modelview transform is used in the same way as it is in the OpenGL viewing pipeline: it moves the viewer to the origin, with the projection centered along the negative z axis. In this case, the viewer can be thought of as a light source, and the near clipping plane of the projection as the location of the texture image, which can be thought of as printed on a transparent film. Alternatively, you can picture a viewer at the view location, looking through the texture on the near plane at the surfaces to be textured.

The projection operation converts eye space into normalized device coordinate (NDC) space. In this space, the x, y, and z coordinates range from -1 to 1. When used in the texture matrix, the coordinates are s, t, and r instead. The projected texture can be visualized as lying on the near plane of the oriented projection defined by the modelview and projection parts of the transform.

The final part of the transform scales and biases the texture map, which is defined in texture coordinates ranging from 0 to 1, so that the entire texture image (or the desired portion of it) covers the near plane defined by the projection. Since the near plane is defined in NDC coordinates, mapping the NDC near plane onto the texture image requires scaling by 1/2 and then biasing by 1/2, in both s and t (note: [-1, 1] * 1/2 + 1/2 = [0, 1]). The texture image is then centered on, and covers, the entire near plane. The texture could also be rotated if the orientation of the projected image needs to be changed.
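In matrix form, the scale and bias amount to one extra transform applied after the projection (a sketch; the z row is left as-is, matching the glTranslatef/glScalef calls listed below):

| 0.5  0    0    0.5 |      s = 0.5 * x_ndc + 0.5
| 0    0.5  0    0.5 |      t = 0.5 * y_ndc + 0.5
| 0    0    1    0   |
| 0    0    0    1   |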

The projections are ordered the same way as in the graphics pipeline: the modelview transform happens first, then the projection, then the scale and bias that position the near plane onto the texture image (a worked-out sketch follows this list):

1. glMatrixMode(GL_TEXTURE)

2. glLoadIdentity() (start over)

3. glTranslatef(0.5f, 0.5f, 0.0f)

4. glScalef(0.5f, 0.5f, 1.0f) (texture covers entire NDC near plane)

5. Set the perspective transform (e.g., glFrustum()).

6. Set the modelview transform (e.g., gluLookAt()).
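Putting the steps together, a minimal sketch of the texture-matrix setup might look like the following. The projector position, aim point, and frustum values here are made-up placeholders, not part of the original article:

/* Build the texture matrix: scale/bias, then projection, then modelview.
   Remember that OpenGL post-multiplies, so the calls appear in this order. */
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glTranslatef(0.5f, 0.5f, 0.0f);         /* bias NDC [-1, 1] into [0, 1] */
glScalef(0.5f, 0.5f, 1.0f);             /* scale s and t by 1/2 */
gluPerspective(30.0, 1.0, 0.25, 10.0);  /* projector "lens" (assumed values) */
gluLookAt(2.0, 3.0, 2.0,                /* assumed projector position */
          0.0, 0.0, 0.0,                /* assumed point being projected at */
          0.0, 1.0, 0.0);               /* up vector */
glMatrixMode(GL_MODELVIEW);             /* restore the usual matrix mode */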

What about the texture coordinates of the primitives that the texture will be projected onto? Since the projection and modelview parts of the matrix have been defined in terms of eye space (where the entire scene is assembled), the most straightforward method is to create a one-to-one mapping between eye space and texture space. This can be done by enabling eye-linear texture coordinate generation and setting the eye planes to a one-to-one mapping (see OpenGL's texture coordinate generation for details; Direct3D has an equivalent mechanism):

GLfloat Splane[] = {1.f, 0.f, 0.f, 0.f};
GLfloat Tplane[] = {0.f, 1.f, 0.f, 0.f};
GLfloat Rplane[] = {0.f, 0.f, 1.f, 0.f};
GLfloat Qplane[] = {0.f, 0.f, 0.f, 1.f};
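Applying these planes uses OpenGL's texgen mechanism; a minimal sketch follows. Note that glTexGen transforms eye planes by the inverse of the modelview matrix in effect at the time of the call, so load identity first to get a true one-to-one eye-space mapping:

glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();   /* eye planes are transformed by the inverse modelview at call time */
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGeni(GL_Q, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGenfv(GL_S, GL_EYE_PLANE, Splane);
glTexGenfv(GL_T, GL_EYE_PLANE, Tplane);
glTexGenfv(GL_R, GL_EYE_PLANE, Rplane);
glTexGenfv(GL_Q, GL_EYE_PLANE, Qplane);
glPopMatrix();
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
glEnable(GL_TEXTURE_GEN_R);
glEnable(GL_TEXTURE_GEN_Q);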

You could also use object-space mapping, but then you'd have to take the current modelview transform into account when setting up the planes.
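For reference, the object-space variant uses GL_OBJECT_LINEAR planes, which are expressed in object coordinates; the model transform would have to be folded into the plane coefficients by hand (the fold-in is omitted in this sketch):

glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
glTexGenfv(GL_S, GL_OBJECT_PLANE, Splane);   /* and likewise for T, R, and Q */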

So when you've done all this, what happens? As each primitive is rendered, texture coordinates matching the x, y, and z values that have been transformed by the modelview matrix are generated, then transformed by the texture transformation matrix. The matrix applies a modelview and projection transform; this orients and projects the primitive's texture coordinate values into NDC space (-1 to 1 in each dimension). These values are then scaled and biased into texture coordinates, and normal filtering and texture environment operations are performed using the texture image.

If transformation and texturing are being applied to all the rendered polygons, how do you limit the projected texture to a single area? There are a number of ways to do this. The simplest is to render only the polygons you intend to project the texture onto while projective texturing is active and the projection is in the texture transformation matrix, but this method is crude. Another way is to use the stencil buffer in a multipass algorithm to control which parts of the scene are updated by the projected texture. The scene can be rendered without the projected texture, the stencil buffer set to mask off an area, and the scene re-rendered with the projected texture, using the stencil buffer to mask off all but the desired area. This lets you create an arbitrary outline for the projected image, or project a texture onto a surface that already has a surface texture (that is, texturing the surface twice, without requiring ARB_multitexture).
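As a rough sketch of the stencil approach (the pass structure and the draw helpers here are assumptions, not spelled out in the original):

/* Pass 1: draw the scene normally, writing 1 into the stencil buffer
   only where the projected texture should later appear. */
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
drawReceiverPolygons();                 /* hypothetical helper */
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawRestOfScene();                      /* hypothetical helper */

/* Pass 2: re-draw the receivers with projective texturing enabled,
   touching only pixels where stencil == 1. */
glStencilFunc(GL_EQUAL, 1, 1);
enableProjectiveTexture();              /* texgen + texture matrix from above */
drawReceiverPolygons();
glDisable(GL_STENCIL_TEST);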

There is a very simple method that works when you want to project a non-repeating texture onto an untextured surface: set the texture environment to GL_MODULATE, set the texture wrap mode to GL_CLAMP, and set the texture border color to white. When the texture is projected, surfaces outside the texture itself default to the texture border color, white, and are modulated with it. This leaves those areas unchanged, since each color component is scaled by one.
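In OpenGL calls, that setup might look like this (a minimal sketch for a 2D texture):

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
GLfloat white[] = {1.f, 1.f, 1.f, 1.f};
/* outside the projected image, fragments sample the white border,
   and modulating by white leaves their colors unchanged */
glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, white);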

Filtering considerations are the same as for normal texturing: the size of the projected texture relative to screen pixels determines minification or magnification. If the projected image will be relatively small, mipmapping may be required to get good-quality results. Using good filtering is especially important if the projected texture moves from frame to frame.

Note that, like the viewing projection, the texture projection is not truly optical. Unless special steps are taken, the texture will affect all surfaces within the projection, both in front of and behind the projection point, which of course does not match real optics. Since there is no implicit view-volume clipping (as there is in the OpenGL viewing pipeline), the application must be modeled carefully to avoid undesired texture projections, or user-defined clipping planes can be used to control where the projected texture appears.
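One way to suppress the reverse projection, as a sketch: add a user clip plane at the projector's near plane so geometry behind the projector is never drawn with the projected texture. The plane values here are placeholders that would be derived from the projector's actual position and direction:

GLdouble clip[] = {0.0, 0.0, -1.0, 0.0};   /* assumed: projector at the eye-space
                                              origin, looking down -z */
glClipPlane(GL_CLIP_PLANE0, clip);
glEnable(GL_CLIP_PLANE0);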

 
