Software rasterizer work schedule

References:
Book: Computer Graphics
Book: DirectX 9.0 3D Game Development Programming Basics, Chapter 13: Cameras
http://wenku.baidu.com/view/a08bf9ee4afe04a1b071decc.html (rotation around an arbitrary axis)
http://wenku.baidu.com/view/700512d5360cba1aa811da70.html (view matrix)
http://blog.csdn.net/popy007/article/details/1797121 (projection matrix)
http://blog.csdn.net/popy007/article/details/4091967 (projection matrix)
http://blog.csdn.net/popy007/article/details/5556639 (texture mapping)
http://blog.csdn.net/popy007/article/details/5570803 (texture mapping perspective correction)
http://hi.baidu.com/tinya/blog/item/37f0ffdeea8dd85ecdbf1a6b.html
http://www.chrishecker.com/Miscellaneous_Technical_Articles
http://zoomquiet.org/res/scrapbook/ZqFLOSS/data/20100709102006/

5.20 build the application framework (point drawing via API)
5.20 vector and matrix definitions
5.20 vertex definition
5.20 triangle definition
5.20 WVP matrix transformation
5.20 viewport (VP) space transformation
5.21 line drawing
5.21 triangle filling
5.21 adjust the data structures (revise the data structure definitions to make computation easier)
5.22 Blinn-Phong illumination model (lighting)
5.22 model definition (render a cube)
5.23 transformation from model space to world space (basic implementation of affine transformations)
5.23 Gouraud shading (gradient fill)
5.23 texture definition (PNG textures for now)
5.23 texture mapping
5.24 perspective correction
5.24 interface rework (from triangle-based to vertex-based, then considering batch processing of vertices and face lookups, laying the groundwork for clipping)
5.24 back-face culling (define the camera orientation)
5.24 CVV clipping
5.24 viewport clipping
*** 5.25 hidden surface removal (Z-buffer) [efficiency problem, temporarily disabled]
5.25 camera operations
5.28 create an SVN repository
5.28 fix UV misalignment
5.28 2D line clipping algorithm [dropped: even after clipping, determining the new line segments is too complicated and possibly wrong] (instead, interpolate the CVV coordinates of every pixel and test whether each point is in range)
5.28 fix matrix calculation errors, part 1: vec4 * mat44 was written incorrectly, implementing a column-major calculation
5.29 fix matrix calculation errors, part 2: the VP matrix was written incorrectly and came out as its inverse
5.29 fix the back-face culling error (back-face culling must be performed after the perspective division)
5.29 reorganize the basic data structures, changing them from class to struct
5.29 add a UI for the test unit
*** 5.29 UV calculation goes wrong when the camera gets very close []
5.30 add a profiling tool to investigate performance problems
5.30 profile data organization
5.30 profile data display
5.31 use the profiler to analyze specific performance problems (rendering 640*480 points is inefficient)
5.31 introduce a DirectDraw surface for a comparison test [couldn't get it working; sort out the GDI path first]
5.31 change the GDI point-drawing workflow: write the pixels into a memory block, then copy the whole block to the render target (there is a memory-locking issue here) (290 ms)
6.01 get the DirectDraw surface working and complete the comparison (200 ms, slightly faster than GDI)
6.01 improve point-drawing efficiency (keep the off-screen surface locked for its whole lifetime; 200 ms -> 57 ms)
6.01 fix incorrect color values
6.01 optimize the data structures used when filling triangle faces (no significant change in efficiency; the data volume is too small, at most 480)
6.01 texture sampling optimization (22 ms saved)
*** 6.01 optimize linear interpolation (13 ms saved, but the result is incorrect)

 

I have several questions about perspective-projection texture mapping:
1. How does the UV interpolated in window space get matched to a particular pixel? On one side there is the UV interpolated from the vertices (x as float, y as int); on the other, the rasterized points in viewport space (x as int, y as int).
2. Suppose that UV is taken as the UV coordinate of the rasterized pixel. During a row scan, is the UV of every pixel in the row obtained purely by linear interpolation in x? That can't be right; should it be interpolated in 1/z instead?
3. (Is this right?) After computing a pixel's UV coordinate in [0, 1], multiply by the texture size to convert it into texture space, then apply a bias of (-0.5, -0.5) to align the UV coordinates with the screen coordinates [see question 4]; is that the texture sampling point?
4. If the -0.5 is required: suppose a pixel's U coordinate is 0.0, which is still 0.0 after converting to texture space. After subtracting 0.5, if the texture uses wrap addressing, wouldn't we fetch the color around 9.5 instead of the color at 0 (for a texture of width 10)?
5. With the -0.5 offset in place, if the value is rounded up it becomes 10, and if it is truncated it becomes 9; either way it ends up farther from 0, doesn't it?
6. (Do I understand this correctly?) After finding the texel corresponding to the UV coordinate, apply a texture sampling algorithm; for example, do we take the left and right neighbours for a three-point average, or the four neighbours above, below, left, and right for a five-point average?
7. What do the so-called 2x sampling and 4x sampling mean?
8. If a pixel's U coordinate is exactly 0.0, which texel is used, 0 or 10? And what if it is 1.0?
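
Questions 3 through 6 and 8 all hinge on the texel-addressing convention. Below is a minimal 1D sketch in C#, assuming the common convention that texel centers sit at (i + 0.5) / width and that wrap addressing is used; the class and method names are my own, and this is not necessarily what the renderer in this log does. With nearest sampling no half-texel bias is needed at all; the -0.5 only appears for bilinear filtering.

    static class TexelAddressing
    {
        static int Wrap(int i, int width)
        {
            i %= width;
            return i < 0 ? i + width : i;
        }

        // Nearest (point) sampling: no half-texel bias is needed here.
        public static int NearestTexel(float u, int width) =>
            Wrap((int)System.Math.Floor(u * width), width);

        // Bilinear sampling along U: this is where the -0.5 bias shows up,
        // because it shifts texel centers (i + 0.5) / width onto integers.
        public static float BilinearU(float[] texels, float u)
        {
            int width = texels.Length;
            float x = u * width - 0.5f;
            int x0 = (int)System.Math.Floor(x);   // left neighbour
            float t = x - x0;                     // blend factor in [0, 1)
            float a = texels[Wrap(x0, width)];
            float b = texels[Wrap(x0 + 1, width)];
            return a + (b - a) * t;
        }
    }

Under this convention and wrap addressing, nearest sampling maps both u = 0.0 and u = 1.0 to texel 0 for a width-10 texture, and bilinear sampling at u = 0.0 blends texels 9 and 0 equally, which is one possible answer to questions 4 and 8.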

 

The vertex data contains UV coordinates; they are created during asset production, and the artists call them UVs.
The UV values remain unchanged while the vertex goes through the WVP and viewport transforms, and are carried into the rasterization step.
Interpolation then gives the UV of each pixel; sampling the texture at that UV and multiplying by the lighting color gives the final color.
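
As a sketch of the flow just described (and of question 2 above): the quantities that interpolate linearly across a scanline in window space are u/w, v/w, and 1/w; the per-pixel UV is recovered by dividing, and the final color is the texture sample modulated by the lighting color. The names and the ARGB packing below are my own assumptions, not code from this project.

    static class PerspectiveCorrectSpan
    {
        // Component-wise multiply of two ARGB colors ("texture * light").
        static uint Modulate(uint a, uint b)
        {
            uint r = (((a >> 16) & 0xFF) * ((b >> 16) & 0xFF)) / 255;
            uint g = (((a >> 8) & 0xFF) * ((b >> 8) & 0xFF)) / 255;
            uint bl = ((a & 0xFF) * (b & 0xFF)) / 255;
            return 0xFF000000u | (r << 16) | (g << 8) | bl;
        }

        // One scanline segment. The attributes passed in are u/w, v/w and 1/w:
        // these interpolate linearly in screen space, while plain u and v do not.
        public static void Draw(
            float uOverW0, float vOverW0, float invW0,
            float uOverW1, float vOverW1, float invW1,
            int xStart, int xEnd,
            System.Func<float, float, uint> sampleTexture,
            uint lightColor, uint[] row)
        {
            for (int x = xStart; x <= xEnd; x++)
            {
                float t = (xEnd == xStart) ? 0f : (float)(x - xStart) / (xEnd - xStart);
                float uw = uOverW0 + (uOverW1 - uOverW0) * t;
                float vw = vOverW0 + (vOverW1 - vOverW0) * t;
                float iw = invW0 + (invW1 - invW0) * t;
                float u = uw / iw;                     // recover the true UV
                float v = vw / iw;
                row[x] = Modulate(sampleTexture(u, v), lightColor);
            }
        }
    }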

 

C# assignment semantics, truly dizzying.
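
For context on why this is dizzying (and why the 5.29 entry above switches the basic data structures from class to struct): in C#, assigning a class instance copies a reference, while assigning a struct copies the value. A minimal illustration:

    // Classes are reference types: assignment copies the reference.
    class Vec3Class { public float X, Y, Z; }

    // Structs are value types: assignment copies the whole value.
    struct Vec3Struct { public float X, Y, Z; }

    static class AssignmentDemo
    {
        public static void Run()
        {
            var a = new Vec3Class { X = 1 };
            var b = a;      // b and a refer to the same object
            b.X = 2;        // now a.X is also 2

            var c = new Vec3Struct { X = 1 };
            var d = c;      // d is an independent copy
            d.X = 2;        // c.X is still 1
        }
    }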

 

My matrices are broken... X points to the left... it looks like I ended up with a right-handed coordinate system.
Studying the matrix derivations, preparing to fix it.
I had simply been using the matrices from the DX documentation without understanding why.

[2012-05-25]
Kun (Xiao Long) 00:38:30
I know what went wrong. I mixed up the camera's right, up, and look vectors with the u, v, n axes of the view coordinate system, that is, the expressions u(x, y, z), v(x, y, z), n(x, y, z). In fact I never defined the camera's orientation; I only defined how u, v, n are expressed in XYZ coordinates.
So when I treated look as the orientation and pointed it into the screen, i.e. unit_z, the whole view coordinate system came out inverted, which is equivalent to a 180-degree rotation around the Y axis; and since my projection matrix was derived for a left-handed coordinate system, everything projected came out reversed.
Kun (Xiao Long) 00:42:11
Expression of uvn in XYZ coordinates ==> expression of the uvn unit vectors in XYZ coordinates.
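
For reference, here is a minimal sketch of how the u, v, n axes (right, up, look, expressed in world XYZ) and the camera position are usually assembled into a view matrix for row vectors in a left-handed, D3D-style setup, which appears to be what this log uses; it is the generic construction, not necessarily the exact code in this project.

    static class ViewMatrix
    {
        public struct Vec3 { public float X, Y, Z; }

        static float Dot(Vec3 a, Vec3 b) => a.X * b.X + a.Y * b.Y + a.Z * b.Z;

        // Row-vector view matrix from the camera axes expressed in world space
        // (u = right, v = up, n = look) and the camera position (eye).
        // Left-handed, D3D style: pView = pWorld * View.
        public static float[,] Build(Vec3 right, Vec3 up, Vec3 look, Vec3 eye)
        {
            return new float[4, 4]
            {
                { right.X,          up.X,          look.X,          0f },
                { right.Y,          up.Y,          look.Y,          0f },
                { right.Z,          up.Z,          look.Z,          0f },
                { -Dot(eye, right), -Dot(eye, up), -Dot(eye, look), 1f },
            };
        }
    }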

Kun (Xiao Long) 09:16:19
Clipping is a big pitfall...
Faces or line segments that cross the frustum, if they are not clipped in 3D space, still get projected, as if onto a virtual projection plane. I have not had time to do 2D clipping yet. Make sure the coordinates of all vertices involved in the calculation stay between -1 and 1; otherwise you get what I call the "virtual projection" problem.
A triangle with all three vertices outside the CVV can, roughly (I have not worked it out carefully), be cut by the CVV into as many as six vertices. Those vertices have to be assembled back into triangle faces while keeping the winding consistent with the normal of the original triangle; for Gouraud shading you also have to interpolate the lighting color at the new vertices.
Er... 2D clipping turned out to be useless... so I'm sure something is wrong... the virtual projection still shows up.
Maybe it is like this: if a face or line segment has a vertex that falls into the "virtual frustum" (the backward extension of the frustum), new vertices have to be computed; otherwise just let it be projected and hand it to 2D clipping, because 2D computation is easier.
Hmm, it seems to be related to the clipping plane.
For 2D clipping I used the most basic method: test point by point whether the pixel is inside the window.
Conclusion:
In view space (before the transform to the CVV), anything that falls behind the eye (along the N axis) needs 3D clipping; otherwise the virtual projection problem appears.
My approach is to test the Z value in the CVV: if Z < -1, skip the point entirely. A super simple 3D clip.

Kun (Xiao Long) 09:16:40
The result is correct. No virtual projection will appear when the camera moves freely.
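
As I read it, the "super simple 3D clip" above amounts to a per-point rejection after the perspective division. A minimal sketch with my own names; the z < -1 threshold follows this log's convention, and the fuller CVV test assumes all three coordinates lie in [-1, 1] when visible.

    static class CvvTest
    {
        // The simple rejection described above: after the perspective divide,
        // skip any point whose CVV z says it lies behind the eye. Other
        // pipelines use different z ranges (for example [0, 1] in Direct3D).
        public static bool IsBehindEye(float cvvZ) => cvvZ < -1f;

        // A fuller test: is the post-divide point inside the CVV at all?
        public static bool IsInsideCvv(float x, float y, float z) =>
            x >= -1f && x <= 1f &&
            y >= -1f && y <= 1f &&
            z >= -1f && z <= 1f;
    }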

Kun (Xiao Long) 10:13:49

Another problem is that my cube is severely deformed at the edge of the screen...
Kun (Xiao Long) 10:14:13
It seems like it is not linear.

Kun (Xiao Long) 10:18:40
Once the cube has a texture on it, it is impossible to make sense of without hiding the covered faces.
Kun (Xiao Long) 10:18:47
My eyes are going blurry...
Kun (Xiao Long) 12:18:23
All kinds of pitfalls... the texture looks very strange... the pixels are not evenly distributed, the density in the middle is higher than at the two sides...
Kun (Xiao Long) 12:18:47
It must be the same serious problem as the cube deformation at the edge of the screen.
Kun (Xiao Long) 12:18:51
Same Matrix
Kun (Xiao Long) 12:24:01
Oh, the FOV should be combined with the aspect ratio. Right, that reminds me of the earlier bug.
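
On the "FOV should use the aspect ratio" point: in the usual left-handed perspective projection (the D3DXMatrixPerspectiveFovLH layout), the vertical FOV sets the y scale and the x scale is that same value divided by the aspect ratio. A sketch, not this project's actual code:

    static class Projection
    {
        // Left-handed perspective projection for row vectors, laid out like
        // D3DXMatrixPerspectiveFovLH: fovY is the vertical field of view in
        // radians, and the x scale is the y scale divided by the aspect ratio.
        public static float[,] PerspectiveFovLH(float fovY, float aspect, float zn, float zf)
        {
            float yScale = 1f / (float)System.Math.Tan(fovY * 0.5f);
            float xScale = yScale / aspect;
            return new float[4, 4]
            {
                { xScale, 0f,     0f,                   0f },
                { 0f,     yScale, 0f,                   0f },
                { 0f,     0f,     zf / (zf - zn),       1f },
                { 0f,     0f,     -zn * zf / (zf - zn), 0f },
            };
        }
    }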
Niu dun 12:39:34
If the deformation is that severe, my guess is the view matrix is wrong.
Kun (Xiao Long) 14:00:57
Crying... In the scanline color interpolation I added handling for the Y0 = Y1 case by copying the X-based interpolation code... when swapping in (x, y) I only replaced part of it... too many Xs and Ys, I went cross-eyed...
Kun (Xiao Long) 14:03:11
The relationship turned into y = 1/x, which is why the pixel density is higher in the middle...
Kun (Xiao Long) 14:04:14
Funny how the math stays consistent even when it is wrong. If I had realized the pattern was a reciprocal, I could have gone straight to the linear interpolation code and spotted the problem.
Kun (Xiao Long) 14:27:50
With only eight vertices there is no way to express the UVs perfectly. It looks like more vertices are needed, or a multi-layer UV feature, to make one image display identically on all six faces.
Is there an attribute on the face that decides which set of UV coordinates the vertices use?
Kun (Xiao Long) 14:34:10
Confirmed with Qiu: there isn't ~ just add more vertices.
Kun (Xiao Long) 15:40:05
My camera wasn't recording its rotation, and it turned out the view-space axes were no longer parallel to XYZ.
Kun (Xiao Long) 15:40:19
As a result the view matrix was hopelessly wrong.
Kun (Xiao Long) 16:00:43
Tell me if my understanding is wrong:
Record the camera's rotation as rx, ry, rz, the accumulated rotation amounts around X, Y, and Z.
Each rotation changes only one component (rx + angle a gives rx').
Then rotate the camera's look and right axes in the order rx', ry', rz', and get up from look.crossProduct(right).
Is that right?
Kun (Xiao Long) 16:02:06
Constructing the matrix in one go just means composing the three rotation matrices in the order rx', ry', rz'; the position isn't needed.
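
A minimal sketch of the bookkeeping described above, with my own names: accumulate the Euler angles rx, ry, rz, rebuild the camera axes by rotating the canonical look (+Z) and right (+X) axes in X, then Y, then Z order, and take up = cross(look, right). This is only one possible reading of the scheme in the log.

    static class CameraAxes
    {
        public struct Vec3 { public float X, Y, Z; }

        static Vec3 Cross(Vec3 a, Vec3 b) => new Vec3
        {
            X = a.Y * b.Z - a.Z * b.Y,
            Y = a.Z * b.X - a.X * b.Z,
            Z = a.X * b.Y - a.Y * b.X,
        };

        static Vec3 RotX(Vec3 v, float a) => new Vec3
        {
            X = v.X,
            Y = (float)(v.Y * System.Math.Cos(a) - v.Z * System.Math.Sin(a)),
            Z = (float)(v.Y * System.Math.Sin(a) + v.Z * System.Math.Cos(a)),
        };

        static Vec3 RotY(Vec3 v, float a) => new Vec3
        {
            X = (float)(v.X * System.Math.Cos(a) + v.Z * System.Math.Sin(a)),
            Y = v.Y,
            Z = (float)(-v.X * System.Math.Sin(a) + v.Z * System.Math.Cos(a)),
        };

        static Vec3 RotZ(Vec3 v, float a) => new Vec3
        {
            X = (float)(v.X * System.Math.Cos(a) - v.Y * System.Math.Sin(a)),
            Y = (float)(v.X * System.Math.Sin(a) + v.Y * System.Math.Cos(a)),
            Z = v.Z,
        };

        // Rebuild the camera basis from the accumulated angles, applied in
        // X, then Y, then Z order, starting from the canonical axes.
        public static void Rebuild(float rx, float ry, float rz,
                                   out Vec3 right, out Vec3 up, out Vec3 look)
        {
            look  = new Vec3 { X = 0, Y = 0, Z = 1 };   // canonical +Z
            right = new Vec3 { X = 1, Y = 0, Z = 0 };   // canonical +X
            look  = RotZ(RotY(RotX(look,  rx), ry), rz);
            right = RotZ(RotY(RotX(right, rx), ry), rz);
            up    = Cross(look, right);                 // as described in the log
        }
    }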

 

After a triangle is projected to 2D and clipped, at most six edges are produced. Suppose the vertices are numbered 1 to 6: how do we decide that the edges are, say, 1-2, 3-4, 5-6, 2-3, 4-5, 6-1 rather than 1-3, 2-4, 5-1, 6-2? Some rule is needed for how the points are ordered. What is the general rule?
I think I understand now: it comes from the normal. Since the front face is determined by V0->V1 and V0->V2, the new vertices can be sorted into a definite order; it does not depend on the actual values of V0, V1, and V2, only on their sequence. (A general approach is also sketched after the case analysis below.)

 

The method I tried is especially troublesome; it has to distinguish N separate cases.
Kun (Xiao Long) 14:00:34
The new endpoints are 1 through 6, and the new line segments are 2-3, 4-5, 6-1.
Take the segment 2-3 as an example:
No corner included: the segment is simply 2-3.
One corner included (top-left, top-right, bottom-left, or bottom-right): the segments are 2-corner and corner-3.
Two corners included (the two left corners, the two right corners, the two top corners, or the two bottom corners): the segments are
2-corner1, corner1-corner2, corner2-3.
In this way the new line segments are obtained.
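
For what it's worth, the standard way to avoid this case analysis is Sutherland-Hodgman polygon clipping: clip the polygon against one window edge at a time, and because each pass walks the edges in their original order, the output vertices come out already ordered and the winding is preserved. A minimal 2D sketch against a rectangular window, with my own names; it is a generic implementation, not the code from this log.

    using System;
    using System.Collections.Generic;

    static class PolyClip
    {
        public struct Pt { public float X, Y; }

        static Pt Lerp(Pt a, Pt b, float t) =>
            new Pt { X = a.X + (b.X - a.X) * t, Y = a.Y + (b.Y - a.Y) * t };

        // Clip an ordered polygon against one half-plane. Edges are walked in
        // their original order, so the output keeps the original winding.
        static List<Pt> ClipAgainst(List<Pt> poly, Func<Pt, bool> inside,
                                    Func<Pt, Pt, Pt> intersect)
        {
            var result = new List<Pt>();
            for (int i = 0; i < poly.Count; i++)
            {
                Pt a = poly[i];
                Pt b = poly[(i + 1) % poly.Count];
                bool aIn = inside(a), bIn = inside(b);
                if (aIn) result.Add(a);
                if (aIn != bIn) result.Add(intersect(a, b));
            }
            return result;
        }

        // Clip against the window [xmin, xmax] x [ymin, ymax], one side at a time.
        public static List<Pt> ClipToWindow(List<Pt> poly,
            float xmin, float xmax, float ymin, float ymax)
        {
            poly = ClipAgainst(poly, p => p.X >= xmin, (a, b) => Lerp(a, b, (xmin - a.X) / (b.X - a.X)));
            poly = ClipAgainst(poly, p => p.X <= xmax, (a, b) => Lerp(a, b, (xmax - a.X) / (b.X - a.X)));
            poly = ClipAgainst(poly, p => p.Y >= ymin, (a, b) => Lerp(a, b, (ymin - a.Y) / (b.Y - a.Y)));
            poly = ClipAgainst(poly, p => p.Y <= ymax, (a, b) => Lerp(a, b, (ymax - a.Y) / (b.Y - a.Y)));
            return poly;
        }
    }

Window corners appear automatically as intersections produced by successive passes, so no corner-by-corner case analysis is needed; the same one-plane-at-a-time idea also works in 3D against the CVV planes, interpolating UV and lighting color along with position.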

 
