Direct2D 1.1 Development Notes, Effects Part 3: A Simple Pixel Shader Effect



This time we implement a custom transformation.


Implementing a custom Direct2D transform with a shader requires writing HLSL (High Level Shading Language).

HLSL is the language shaders are written in, and it can only be used with D3D.

A shader is a short program executed by the graphics card, which can run very efficiently (in parallel).


Never learned it? That's fine, I'm no expert either, and I won't explain it in detail here (are you kidding me?). Please refer to the official documentation.


D2D effects can use HLSL Shader Model 4.0 and later, which was introduced with D3D10.

However, for programming convenience we will simply require that the graphics card support D3D11; after all, even a lowly integrated GPU supports D3D11 these days.


D2D effects can use the following kinds of shaders: pixel shaders, vertex shaders, and compute shaders.


In this article we write a simple pixel shader transform: invert, that is, inverting the colors.

Imagine doing this on the CPU: we would first have to invert several megabytes of data and only then hand it to the graphics card for display, which is hardly efficient.


Let's take a look at the D3D11 rendering pipeline.

Of course, D2D is nowhere near this complex, and rasterization is simple; it is only 2D after all.

Just glance at the pipeline; unless you are writing D3D11 programs, all you need to know for D2D effects is that the pixel shader is the last of these stages (apart from the OM, the output merger).



To implement a D2D pixel shader effect, you need to implement ID2D1DrawTransform. A look at the header file shows the inheritance chain:

ID2D1DrawTransform --inherits from--> ID2D1Transform --inherits from--> ID2D1TransformNode --inherits from--> IUnknown


ID2D1TransformNode is the "transform node" mentioned in the previous article.


Implementing the interface:

Since we want to implement this interface, let's go through its methods (in practice, read the documentation...):

0. The three IUnknown methods:

These need no explanation.

1. The single ID2D1TransformNode method:

ID2D1TransformNode::GetInputCount: returns the number of inputs. Our effect is "invert", which needs one input, so simply return 1, for example:
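A minimal sketch of that override (assuming the usual IFACEMETHODIMP macros from the Windows SDK):

IFACEMETHODIMP_(UINT32) GetInputCount() const override
{
    // The invert effect consumes exactly one input image.
    return 1;
}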

2. The three ID2D1Transform methods:

ID2D1Transform::MapInputRectsToOutputRect: called every time D2D renders this transform; it is used to calculate the output area.

The parameters are: the array of input rectangles, the array of input opaque rectangles, and the length of those arrays. The last two parameters are outputs: the output rectangle and the output opaque rectangle.

In theory the opaque-rectangle parameters could be dropped, but transparent regions must be blended with the image underneath, and D2D effects provide many blend modes with complex calculations. A D2D pixel shader only cares about its own output; everything underneath is handed over to D2D automatically.

These two parameters exist purely for optimization. If you do not know in advance which region is opaque, set the output opaque rectangle to (0, 0, 0, 0).

Below is an image provided by Microsoft:

For a Gaussian blur, assuming the radius is 5, (L-5, T-5, R+5, B+5) is the output rectangle and (L+5, T+5, R-5, B-5) is the opaque rectangle.


Since we have only one input and no extra transparency information to provide, a simple implementation looks like this:

if (inputRectCount != 1) return E_INVALIDARG;
*pOutputRect = pInputRects[0];
m_inputRect = pInputRects[0];
*pOutputOpaqueSubRect = *pOutputRect;
return S_OK;


Of course, we need to save the input rectangle (which here is also the output rectangle); otherwise, where would we draw?
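Putting the pieces together, a minimal sketch of the whole override might look like the following (the member name m_inputRect simply follows the snippet above):

IFACEMETHODIMP MapInputRectsToOutputRect(
    _In_reads_(inputRectCount) const D2D1_RECT_L* pInputRects,
    _In_reads_(inputRectCount) const D2D1_RECT_L* pInputOpaqueSubRects,
    UINT32 inputRectCount,
    _Out_ D2D1_RECT_L* pOutputRect,
    _Out_ D2D1_RECT_L* pOutputOpaqueSubRect) override
{
    if (inputRectCount != 1) return E_INVALIDARG;
    // The output covers exactly the input, and we remember it for later.
    *pOutputRect = pInputRects[0];
    m_inputRect = pInputRects[0];
    // The article's snippet treats the whole output as opaque; if the input may
    // contain transparency, report pInputOpaqueSubRects[0] or (0, 0, 0, 0) instead.
    *pOutputOpaqueSubRect = *pOutputRect;
    return S_OK;
}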


ID2D1Transform::MapOutputRectToInputRects: D2D calls this after the previous method. It specifies where D2D should read the input image; if a requested pixel has no data (it lies outside the input image), D2D automatically samples transparent black.

As above, Microsoft provides a diagram to make this easier to understand:

You can think of it this way: if, for example, the top-left output pixel is the average (or weighted average) of the N pixels around it, then you need to expand the input range accordingly.

Here, of course, we can simply return the input rectangle we saved earlier; after all, this is a point-to-point effect.
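A minimal sketch of that method for our point-to-point effect, reusing the m_inputRect saved above:

IFACEMETHODIMP MapOutputRectToInputRects(
    _In_ const D2D1_RECT_L* pOutputRect,
    _Out_writes_(inputRectCount) D2D1_RECT_L* pInputRects,
    UINT32 inputRectCount) const override
{
    if (inputRectCount != 1) return E_INVALIDARG;
    // Point-to-point: we only ever need to read the saved input rectangle.
    pInputRects[0] = m_inputRect;
    return S_OK;
}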


ID2D1Transform::MapInvalidRect: unlike the two methods above, this one is not necessarily called. The official description is that it sets the input rectangles for this rendering pass into the transform; the first parameter is the input (rectangle) index, and the same considerations about expanding the rectangle apply.
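For a point-to-point effect like invert, a minimal sketch would simply pass the invalid region straight through (my reading of the method, not code from the article):

IFACEMETHODIMP MapInvalidRect(
    UINT32 inputIndex,
    D2D1_RECT_L invalidInputRect,
    _Out_ D2D1_RECT_L* pInvalidOutputRect) const override
{
    // An invalidated input region invalidates exactly the same output region.
    *pInvalidOutputRect = invalidInputRect;
    return S_OK;
}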


3. The ID2D1DrawTransform method:

ID2D1DrawTransform::SetDrawInfo takes a single ID2D1DrawInfo parameter. For now only one of its methods is needed:

ID2D1DrawInfo::SetPixelShader takes the GUID of a pixel shader; obviously, a shader must first have been registered under that GUID.
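A minimal sketch, using the GUID_MyInvertShader GUID that is registered in the code further below:

IFACEMETHODIMP SetDrawInfo(_In_ ID2D1DrawInfo* pDrawInfo) override
{
    // Tell D2D which registered pixel shader this transform uses.
    return pDrawInfo->SetPixelShader(GUID_MyInvertShader);
}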



Implementing ID2D1EffectImpl

This was covered in the previous article, so I won't repeat it. This time, however, there is only a single transform, so just call SetSingleTransformNode.
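A minimal sketch of Initialize, assuming (as in many samples) that the same class implements both ID2D1EffectImpl and ID2D1DrawTransform; if your transform is a separate object, pass that object instead:

IFACEMETHODIMP Initialize(
    _In_ ID2D1EffectContext* pEffectContext,
    _In_ ID2D1TransformGraph* pTransformGraph) override
{
    // The effect's graph consists of this single transform node.
    HRESULT hr = pTransformGraph->SetSingleTransformNode(this);
    // The pixel shader is loaded and registered here as well; see the next section.
    return hr;
}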


Registering the shader:

0. Compile the shader object file

You can register the shader under a GUID as early as ID2D1EffectImpl::Initialize. Suppose we have already written an HLSL file; right-click it in VS Express 2013 for Windows Desktop and open Properties:

"Project class" is set to "HLSL compiler". Pay attention to setting different configurations (debug/release, x86/x64)

In the "regular" sub-branch, the entry point name is random. For the color set type, select the pixel color set. For the color set model, select 5.0.

Set the object file name under Output; the author uses ShaderObject\%(Filename).cso.


1. Register the shader

You need to generate a GUID yourself; this was covered in the previous article, so it won't be repeated here.

FILE* file = _wfopen(L"ShaderObject\\InvertShader.cso", L"rb");
if (file)
{
    fseek(file, 0L, SEEK_END);
    size_t length = ftell(file);
    BYTE* pBuffer = new BYTE[length];
    fseek(file, 0L, SEEK_SET);
    if (pBuffer)
    {
        fread(pBuffer, 1, length, file);
        m_hr = context->LoadPixelShader(GUID_MyInvertShader, pBuffer, length);
        delete[] pBuffer;
    }
    else
    {
        m_hr = E_OUTOFMEMORY;
    }
    fclose(file);
}
else
{
    m_hr = E_FAIL;
}

That is all it takes to register the shader.

Note:

LoadPixelShader only needs to be called once for a given shader GUID, but this example does not bother to check and simply loads it every time the effect is created.

The downside is that the file is read again when the same effect is created a second time, which is an unnecessary inefficiency.
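One way to avoid the repeated file read (my own suggestion, not something the article does) is to ask the effect context whether this GUID is already registered before loading:

// Skip the whole file-reading block if this shader GUID was already registered.
if (!context->IsShaderLoaded(GUID_MyInvertShader))
{
    // ... open ShaderObject\InvertShader.cso and call context->LoadPixelShader(...) as above ...
}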


Writing the shader

All the preparation is done; everything is ready except the shader itself.

This time we are writing a pixel shader, which performs its computation per pixel (in parallel).

First, let's make it simple:

// Shader entry point
float4 main() : SV_Target
{
    return float4(1, 1, 0, 1);
}

main is the entry point you specified earlier. float4 means a four-component vector is returned, and the color (1, 1, 0, 1) is yellow.

To tell the compiler that we are returning a color, we must specify a semantic; SV_Target is a built-in semantic that represents the render-target color.

If you are familiar with D3D9 shaders, the color semantic there is COLOR, which is more intuitive, but it differs slightly; more on that later.

SV stands for "system value", and Target refers to the render target.


Parameters:

Most functions take parameters. The version above takes none (it still compiles and works), but by default D2D automatically passes three parameters to an effect's pixel shader. It should look like this:

// Shader entry point with the three parameters D2D passes by default
float4 main(
    float4 sceneSpaceOutput : SCENE_POSITION,
    float4 clipSpaceOutput  : SV_POSITION,
    float4 texelSpaceInput0 : TEXCOORD0
    ) : SV_Target
{
    return float4(1, 1, 0, 1);
}


Getting the image:

D2D automatically binds the first input image to texture register 0 (t0), the second to t1, and so forth.

We only need to declare the binding:

// 2D texture: the first input is stored in t0
Texture2D InputTexture : register(t0);


D2D also automatically binds the sampler states to the sampler registers (s0, s1, and so on).


// Sampler state: the first input's sampler is stored in s0
SamplerState InputSampler : register(s0);

We can also define our own sampler state directly in the shader, for example:


SamplerState MySampler
{
    Filter = MIN_MAG_MIP_POINT;
    AddressU = Wrap;
    AddressV = Wrap;
};

You can refer to the D3D11 sampler state documentation for the available options.


Then, in main, we use:

return InputTexture.Sample(InputSampler, texelSpaceInput0.xy);

to return the color of the current pixel; what the Sample function does is easy to guess.



Now for the inversion itself. It's simple:

// A simple pixel shader example: invert

// 2D texture: the first input is stored in t0
Texture2D InputTexture : register(t0);

// Sampler state: the first input's sampler is stored in s0
SamplerState InputSampler : register(s0);

// Shader entry point
float4 main(
    float4 sceneSpaceOutput : SCENE_POSITION,
    float4 clipSpaceOutput  : SV_POSITION,
    float4 texelSpaceInput0 : TEXCOORD0
    ) : SV_Target
{
    // If the image is opaque you could simply write:
    // return float4(1, 1, 1, 1) - InputTexture.Sample(InputSampler, texelSpaceInput0.xy);
    float4 color = InputTexture.Sample(InputSampler, texelSpaceInput0.xy);
    color.xyz = float3(1, 1, 1) - color.xyz;
    return color;
}

color.xyz operates on the x, y and z components as a three-component vector in one go. HLSL is strict in some places, but this part is easy to understand.
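To actually see the effect you would create it on a device context and draw it; a hypothetical usage sketch (the names deviceContext, sourceBitmap and CLSID_MyInvertEffect are assumptions here, and effect registration itself was covered in the previous article):

ID2D1Effect* pEffect = nullptr;
HRESULT hr = deviceContext->CreateEffect(CLSID_MyInvertEffect, &pEffect);
if (SUCCEEDED(hr))
{
    pEffect->SetInput(0, sourceBitmap);   // the bitmap to invert
    deviceContext->BeginDraw();
    deviceContext->DrawImage(pEffect);    // draws the inverted image
    deviceContext->EndDraw();
    pEffect->Release();
}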

The result:


Note:

It is worth noting that texelSpaceInput0.xy is not the actual pixel coordinate; it has been converted.

texelSpaceInput0.xy / texelSpaceInput0.zw gives the current position.


Note! This is the "true position", not the pixel coordinate: it is offset from the pixel coordinate by half a pixel, both horizontally and vertically. Why?

Simply put, the center of a pixel is its "true position"; for example, pixel (0, 0) actually sits at (0.5, 0.5).

For details, search for "half-pixel offset in DirectX 11".

In other words, this is one way in which SV_Target in DX10 and later differs from COLOR in DX9.


The next article will briefly cover how to debug graphics, and then present a slightly more complete pixel shader effect.


Example in this section: click here





