Reconstructing Object Position from the Depth Map


Reprints must credit the source: http://blog.csdn.net/tianhai110


Method 1: store the projected z/w as the depth. To reconstruct, combine it with x/w and y/w, multiply by the inverse of the projection matrix, and divide by w to recover the object's position.

// Vertex shader for the depth pass:
output.vPositionCS = mul(input.vPositionOS, g_matWorldViewProj);
output.vDepthCS.xy = output.vPositionCS.zw;

// Pixel shader for the depth pass (outputs z/w):
return input.vDepthCS.x / input.vDepthCS.y;

 

The corresponding deferred-rendering shader:

 

// Converts a depth value into a view-space vertex position.
// vTexCoord is the texture coordinate of the full-screen quad;
// x = 0 is the left edge of the screen, y = 0 is the top.
float3 VSPositionFromDepth(float2 vTexCoord)
{
    // Sample the depth value for this pixel
    float z = tex2D(DepthSampler, vTexCoord);

    // Derive x/w and y/w from the viewport position
    float x = vTexCoord.x * 2 - 1;
    float y = (1 - vTexCoord.y) * 2 - 1;

    float4 vProjectedPos = float4(x, y, z, 1.0f);

    // Multiply by the inverse of the projection matrix
    float4 vPositionVS = mul(vProjectedPos, g_matInvProjection);

    // Divide by w to get the vertex position
    return vPositionVS.xyz / vPositionVS.w;
}
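The round trip in Method 1 can be checked numerically. The sketch below (Python/NumPy, not part of the original shaders) uses a hypothetical D3D-style left-handed projection matrix and the same row-vector convention as the HLSL above: it projects a view-space point, keeps only x/w, y/w, z/w as the depth pass would, then recovers the original position through the inverse projection and the divide by w.

```python
import numpy as np

def perspective(fov_y, aspect, z_near, z_far):
    # D3D-style left-handed perspective matrix (row-vector convention:
    # p_clip = p_view @ M), mapping view-space z in [near, far] to [0, 1].
    f = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, z_far / (z_far - z_near), 1.0],
        [0.0, 0.0, -z_near * z_far / (z_far - z_near), 0.0],
    ])

proj = perspective(np.pi / 3, 16 / 9, 0.1, 100.0)
p_view = np.array([1.5, -0.7, 10.0, 1.0])   # a view-space point

# Depth pass: project and perspective-divide, keeping x/w, y/w, z/w
p_clip = p_view @ proj
x, y, z = p_clip[:3] / p_clip[3]

# Reconstruction: (x, y, z, 1) times the inverse projection, then divide by w
p = np.array([x, y, z, 1.0]) @ np.linalg.inv(proj)
p_reconstructed = p[:3] / p[3]

print(np.allclose(p_reconstructed, p_view[:3]))  # True
```

The divide by w at the end is what undoes the perspective divide baked into x, y, and z; skipping it is the classic bug in this reconstruction.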

Method 2: store the normalized view-space z as the depth. Because view-space depth is linear, the stored values are uniformly distributed, and reconstruction needs no projection or inverse projection: to recover the vertex position, simply multiply the depth value by a ray pointing at the far plane of the view frustum.

Rendering linear depth

void DepthVS(in  float4 in_vPositionOS  : POSITION,
             out float4 out_vPositionCS : POSITION,
             out float  out_fDepthVS    : TEXCOORD0)
{
    // Compute the vertex position in view space and clip space
    float4x4 matWorldView = mul(g_matWorld, g_matView);
    float4 vPositionVS = mul(in_vPositionOS, matWorldView);
    out_vPositionCS = mul(vPositionVS, g_matProj);
    out_fDepthVS = vPositionVS.z;
}

float4 DepthPS(in float in_fDepthVS : TEXCOORD0) : COLOR0
{
    // Negate z and divide by the far-plane distance, so the depth
    // lies in [0, 1) (no negation in a left-handed coordinate system)
    float fDepth = -in_fDepthVS / g_fFarClip;
    return float4(fDepth, 1.0f, 1.0f, 1.0f);
}

Deferred shader that reconstructs the vertex position

 

// Render a full-screen quad. The vertex shader:
void QuadVS(in  float3 in_vPositionOS            : POSITION,
            in  float3 in_vTexCoordAndCornerIndex : TEXCOORD0,
            out float4 out_vPositionCS           : POSITION,
            out float2 out_vTexCoord             : TEXCOORD0,
            out float3 out_vFrustumCornerVS      : TEXCOORD1)
{
    // Offset the position by half a pixel
    out_vPositionCS.x = in_vPositionOS.x - (1.0f / g_vOcclusionTextureSize.x);
    out_vPositionCS.y = in_vPositionOS.y + (1.0f / g_vOcclusionTextureSize.y);
    out_vPositionCS.z = in_vPositionOS.z;
    out_vPositionCS.w = 1.0f;

    // Pass along the texture coordinate and the frustum-corner position.
    // The corner positions are interpolated so that the pixel shader
    // receives a ray it can use to recover the vertex position.
    out_vTexCoord = in_vTexCoordAndCornerIndex.xy;
    out_vFrustumCornerVS = g_vFrustumCornersVS[in_vTexCoordAndCornerIndex.z];
}

// Pixel shader: reconstructs the view-space position
float3 VSPositionFromDepth(float2 vTexCoord, float3 vFrustumRayVS)
{
    float fPixelDepth = tex2D(DepthSampler, vTexCoord).r;
    return fPixelDepth * vFrustumRayVS;
}

g_vFrustumCornersVS holds the four corners of the view frustum's far plane, in view space.
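How those corners are computed is not shown above. A minimal sketch (Python/NumPy, not original code; assumes a symmetric left-handed frustum given by vertical field of view, aspect ratio, and far-plane distance) derives them and checks the Method-2 identity: scaling a far-plane corner ray by the stored normalized depth d = z_view / z_far lands exactly on the surface point.

```python
import numpy as np

def far_plane_corners(fov_y, aspect, z_far):
    # The four corners of the view frustum's far plane, in view space
    # (left-handed: +z points into the screen).
    half_h = z_far * np.tan(fov_y / 2.0)   # half height at the far plane
    half_w = half_h * aspect               # half width at the far plane
    return np.array([
        [-half_w,  half_h, z_far],   # top-left
        [ half_w,  half_h, z_far],   # top-right
        [ half_w, -half_h, z_far],   # bottom-right
        [-half_w, -half_h, z_far],   # bottom-left
    ])

z_far = 100.0
corners = far_plane_corners(np.pi / 3, 16 / 9, z_far)

# Take a pixel whose interpolated ray happens to be the top-right corner
# ray, and a surface at view-space depth 25 along that line of sight.
ray = corners[1]
p_view = ray * (25.0 / z_far)   # the actual surface point
depth = p_view[2] / z_far       # normalized linear depth, as stored

# Reconstruction: stored depth times the interpolated frustum ray
p_reconstructed = depth * ray
print(np.allclose(p_reconstructed, p_view))  # True
```

The identity holds because every ray's z component equals z_far, so depth * ray restores the point's z exactly, and x and y scale along with it.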
