Unity Shader Learning Note 1 (Unity 5.6)

Macros

UNITY_UV_STARTS_AT_TOP
Usually used to detect Direct3D-like platforms, where the texture V coordinate starts at the top; when anti-aliasing is on and you sample a render texture, the image may need flipping.

UNITY_SINGLE_PASS_STEREO
Single-pass stereoscopic rendering, currently used mainly for VR.

UNITY_COLORSPACE_GAMMA
Indicates that the current color space is gamma; related to gamma/linear color-space conversion.

Matrices

The most commonly used variables; as you would guess, all of them are float4x4.

UNITY_MATRIX_MVP
Model * View * Projection

UNITY_MATRIX_MV
Model * View

UNITY_MATRIX_V
View

UNITY_MATRIX_P
Projection

UNITY_MATRIX_VP
View * Projection

UNITY_MATRIX_T_MV
Transpose of Model * View

UNITY_MATRIX_IT_MV
Inverse transpose of Model * View

_Object2World, _World2Object
Model-to-world matrix and its inverse

Camera and screen

_WorldSpaceCameraPos
float3
World-space position of the camera.

_ProjectionParams
float4
x = 1.0 or -1.0 (rarely used; -1.0 indicates rendering with a flipped projection matrix)
y = near plane
z = far plane
w = 1/far plane

_ScreenParams
float4
Dividing screen-space pixel coordinates by _ScreenParams.xy gives coordinates in viewport space.
x is the camera's render target width in pixels, y is the render target height in pixels, z is 1.0 + 1.0/width and w is 1.0 + 1.0/height.

_ZBufferParams

float4
Used to linearize the Z buffer.
x is (1 - far/near), y is (far/near), z is (x/far) and w is (y/far).
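These parameters are exactly what UnityCG.cginc's depth-linearization helpers are built from; a sketch of those two helpers, recalled from UnityCG.cginc (verify against the include file shipped with your Unity version):

```hlsl
// Converts a raw (non-linear) depth-buffer value to linear depth in the
// 0..1 range, where 1 is the far plane.
inline float Linear01Depth( float z )
{
    return 1.0 / (_ZBufferParams.x * z + _ZBufferParams.y);
}

// Converts a raw depth-buffer value to linear depth in view-space (eye) units.
inline float LinearEyeDepth( float z )
{
    return 1.0 / (_ZBufferParams.z * z + _ZBufferParams.w);
}
```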

unity_OrthoParams
float4
Parameters related to orthographic cameras.
x is the orthographic camera's width, y is the orthographic camera's height, z is unused, and w is 1.0 when the camera is orthographic, 0.0 when perspective.

unity_CameraProjection

float4x4
Projection matrix of the camera.

unity_CameraInvProjection

float4x4
Inverse of unity_CameraProjection.

unity_CameraWorldClipPlanes[6]

float4
Camera frustum planes in world space, in the order: left, right, bottom, top, near, far.

Time

The unit is seconds.

_Time

float4
(t/20, t, t*2, t*3)

_SinTime

float4
(sin(t/8), sin(t/4), sin(t/2), sin(t))

_CosTime

float4
(cos(t/8), cos(t/4), cos(t/2), cos(t))

unity_DeltaTime

float4
dt is the delta time of the previous frame; smoothDt is a smoothed delta time, which mainly prevents the frame interval from fluctuating too much.
(dt, 1/dt, smoothDt, 1/smoothDt)
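Since _Time packs several speeds of the same clock into one vector, a typical use is animating UVs in a vertex shader without extra multiplies. A minimal sketch; the _ScrollSpeed value is a hypothetical parameter of this example, not a Unity built-in:

```hlsl
// Scroll a texture over time; _Time.y is unscaled time in seconds.
float2 ScrollUV(float2 uv)
{
    // Hypothetical scroll speed in UV units per second.
    float2 _ScrollSpeed = float2(0.1, 0.0);
    return uv + _ScrollSpeed * _Time.y;
}
```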

Lights

The light parameters are more complex: they depend on the render path and on the LightMode setting in the pass tag. Path: Forward (including base and add)

_LightColor0

fixed4
Forward rendering (ForwardBase and ForwardAdd pass types)
Light color.

_WorldSpaceLightPos0

float4
Forward rendering (ForwardBase and ForwardAdd pass types)
Directional lights: (world-space direction, 0). Other lights: (world-space position, 1).
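Because w is 0 for directional lights and 1 for the others, a common idiom computes the fragment-to-light direction for both cases in a single expression. A sketch, where worldPos is assumed to be the fragment's world-space position:

```hlsl
// w == 0 (directional): reduces to normalize(_WorldSpaceLightPos0.xyz),
// i.e. the light's direction itself.
// w == 1 (point/spot): the normalized vector from the fragment to the light.
float3 LightDirection(float3 worldPos)
{
    return normalize(_WorldSpaceLightPos0.xyz - worldPos * _WorldSpaceLightPos0.w);
}
```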

_LightMatrix0
float4x4
Forward rendering (ForwardBase and ForwardAdd pass types)
World-to-light matrix. Used to sample cookie and attenuation textures (mostly used for real-time shadows).

unity_4LightPosX0, unity_4LightPosY0, unity_4LightPosZ0

float4
(ForwardBase pass only) World-space positions of the first four non-important point lights.

unity_4LightAtten0

float4
(ForwardBase pass only) Attenuation factors of the first four non-important point lights.

unity_LightColor

half4[4]
(ForwardBase pass only) Colors of the first four non-important point lights.

Path: Deferred

_LightColor

float4
Light color.

_LightMatrix0

float4x4
World-to-light matrix. Used to sample cookie and attenuation textures.

Path: Vertex-lit

In this path up to 8 lights are set up, sorted starting from the brightest; if there are fewer than 8 lights, the remaining entries are black.

unity_LightColor

half4[8]
Light colors.

unity_LightPosition

half4[8]
View-space light positions. (-direction, 0) for directional lights; (position, 1) for point/spot lights.

unity_LightAtten

half4[8]
Light attenuation factors. x is cos(spotAngle/2), or -1 for non-spot lights; y is 1/cos(spotAngle/4), or 1 for non-spot lights; z is quadratic attenuation; w is the squared light range.

unity_SpotDirection

float4[8]
View-space spot light directions; (0,0,1,0) for non-spot lights.

Fog and ambient light

unity_AmbientSky

fixed4
Sky ambient lighting color when gradient ambient lighting is used.

unity_AmbientEquator

fixed4
Equator ambient lighting color when gradient ambient lighting is used.

unity_AmbientGround

fixed4
Ground ambient lighting color when gradient ambient lighting is used.

UNITY_LIGHTMODEL_AMBIENT

fixed4
Ambient lighting color (the sky color in the gradient ambient case). A legacy variable, but you can usually just use this one.

unity_FogColor

fixed4
Fog color.

unity_FogParams

float4
Parameters required by the three fog calculation modes:
(density/sqrt(ln(2)), density/ln(2), -1/(end-start), end/(end-start)). x is used by the Exp2 fog mode, y by the Exp mode, z and w by the Linear mode.
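How the packed parameters map onto the three fog modes can be written out explicitly. A sketch, assuming z is the view-space distance used for fog (Unity's own UNITY_CALC_FOG_FACTOR macros implement these same formulas):

```hlsl
// Linear: (end - z) / (end - start) = z * (-1/(end-start)) + end/(end-start)
float FogLinear(float z) { return saturate(z * unity_FogParams.z + unity_FogParams.w); }

// Exp: exp2(-(density/ln 2) * z) = exp(-density * z)
float FogExp(float z)    { return saturate(exp2(-unity_FogParams.y * z)); }

// Exp2: exp2(-((density/sqrt(ln 2)) * z)^2) = exp(-(density * z)^2)
float FogExp2(float z)
{
    float f = unity_FogParams.x * z;
    return saturate(exp2(-f * f));
}
```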

Functions

UnityObjectToClipPos

inline float4 UnityObjectToClipPos(in float3 pos)
{
    // More efficient than computing the M*VP matrix product; Unity says it is quicker, so use this.
    return mul(UNITY_MATRIX_VP, mul(unity_ObjectToWorld, float4(pos, 1.0)));
}
inline float4 UnityObjectToClipPos(float4 pos) // overload for float4; avoids "implicit truncation" warning for existing shaders
{
    return UnityObjectToClipPos(pos.xyz);
}
ComputeGrabScreenPos

Calculates the UV (xy) for sampling the grabbed screen image; the input is a clip-space coordinate.

inline float4 ComputeGrabScreenPos(float4 pos) {
    #if UNITY_UV_STARTS_AT_TOP
    float scale = -1.0;
    #else
    float scale = 1.0;
    #endif
    float4 o = pos * 0.5f;
    o.xy = float2(o.x, o.y * scale) + o.w;
#ifdef UNITY_SINGLE_PASS_STEREO
    o.xy = TransformStereoScreenSpaceTex(o.xy, pos.w);
#endif
    o.zw = pos.zw;
    return o;
}
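The typical pairing is a GrabPass texture sampled with these coordinates via tex2Dproj, which performs the divide by w for you. A minimal sketch; _GrabTexture is the default name of the GrabPass {} texture, and the v2f struct here is illustrative:

```hlsl
sampler2D _GrabTexture; // default texture name produced by GrabPass {}

struct v2f
{
    float4 pos : SV_POSITION;
    float4 grabPos : TEXCOORD0;
};

// In the vertex shader:
//   o.pos = UnityObjectToClipPos(v.vertex);
//   o.grabPos = ComputeGrabScreenPos(o.pos);

fixed4 frag(v2f i) : SV_Target
{
    // tex2Dproj divides grabPos.xy by grabPos.w before sampling.
    return tex2Dproj(_GrabTexture, i.grabPos);
}
```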
TRANSFORM_TEX

#define TRANSFORM_TEX(tex,name) (tex.xy * name##_ST.xy + name##_ST.zw)
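The name##_ST variable is filled by Unity from the material's tiling (xy) and offset (zw) for that texture, so the usual vertex-shader pattern looks like this (assuming the shader declares a _MainTex property):

```hlsl
sampler2D _MainTex;
float4 _MainTex_ST; // xy = tiling, zw = offset, filled from the material

// In the vertex shader:
//   o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
// which expands to:
//   v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw
```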
ComputeScreenPos

Calculates screen coordinates (more accurately, viewport coordinates); the input is a clip-space coordinate. This method is worth pausing on:
1. Unlike the previous method, its main difference is added support for VR.
2. It does not perform the homogeneous divide, because it runs in the vertex shader and its result still needs to be interpolated (interpolation across clip space is non-linear once you divide), so to obtain viewport-space coordinates in the fragment shader you still have to divide by w manually.
3. The essence of the projection matrix is to scale x, y and z by varying amounts. Before it, w is 1; after it, w holds the view-space depth (in camera space Unity follows the right-handed OpenGL convention, so the farther from the camera, the more negative z is, and -z is the distance to the camera). The projection matrix is also called the clipping matrix: after applying it, a vertex is kept only if x, y and z all lie within [-w, w]. I once puzzled over why dividing xyz by w lands in [-1, 1]; in fact, after the clipping matrix, taking the near clipping plane as an example, x, y, z on it lie between -near and near, and on the far clipping plane between -far and far. So after clipping (that is, after multiplying by the MVP matrix), the homogeneous divide yields NDC coordinates with x, y, z ranging from -1 to 1 (the OpenGL convention, which Unity also uses; in D3D z runs from 0 to 1), placing everything inside a cube. From there, screen coordinates (OpenGL convention: bottom-left is (0, 0), top-right is the screen resolution) are, for x: x_clip / w_clip [the NDC coordinate] / 2 [mapping the [-1, 1] range] * pixelWidth [horizontal resolution] + pixelWidth / 2, which should be easy to follow. Here, however, we compute viewport coordinates, which run from 0 to 1 with (0, 0) at the bottom-left; mapping there from NDC is straightforward.
In short, this method only does half of the work.

Another way to get screen pixel coordinates is to use the VPOS semantic.

inline float4 ComputeScreenPos(float4 pos) {
    float4 o = ComputeNonStereoScreenPos(pos);
#if defined(UNITY_SINGLE_PASS_STEREO)
    o.xy = TransformStereoScreenSpaceTex(o.xy, pos.w);
#endif
    return o;
}
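The manual homogeneous divide described above is typically done in the fragment shader (or delegated to tex2Dproj). A sketch of the usual pattern, assuming the vertex shader passed the result of ComputeScreenPos through TEXCOORD0:

```hlsl
// Vertex shader:
//   o.screenPos = ComputeScreenPos(o.pos); // no divide yet, safe to interpolate

// Fragment shader:
fixed4 frag(float4 screenPos : TEXCOORD0) : SV_Target
{
    // Manual homogeneous divide: viewport-space coordinates in [0, 1].
    float2 viewportUV = screenPos.xy / screenPos.w;
    return fixed4(viewportUV, 0, 1); // visualize the coordinates
}
```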
ComputeNonStereoScreenPos

As the name implies, computes the screen (viewport) coordinates when stereoscopic rendering is not in use.

inline float4 ComputeNonStereoScreenPos(float4 pos) {
    float4 o = pos * 0.5f;
    o.xy = float2(o.x, o.y * _ProjectionParams.x) + o.w;
    o.zw = pos.zw;
    return o;
}
TransformStereoScreenSpaceTex

unity_StereoScaleOffset is an array of length 2 holding per-eye scale and offset values; it too is used for VR stereoscopic rendering.

float2 TransformStereoScreenSpaceTex(float2 uv, float w)
{
    float4 scaleOffset = unity_StereoScaleOffset[unity_StereoEyeIndex];
    return uv.xy * scaleOffset.xy + scaleOffset.zw * w;
}
ShadeSH9

A very common function. The input normal must be normalized, and the result is converted according to the current color space.
It computes spherical-harmonic lighting: light sources that are not treated as per-pixel or per-vertex lights, as well as light probes, are all handled here.
This is a good reminder of how forward rendering in Unity sorts lights by importance into per-pixel, per-vertex and SH groups: the brightest directional light and lights marked Important are processed per pixel (lights not marked Important can also be per-pixel while the count stays below the per-pixel light limit in the quality settings), then up to 4 lights are processed per vertex, and the rest go through SH.

half3 ShadeSH9 (half4 normal)
{
    // Linear + constant polynomial terms
    half3 res = SHEvalLinearL0L1 (normal);

    // Quadratic polynomials
    res += SHEvalLinearL2 (normal);

#   ifdef UNITY_COLORSPACE_GAMMA
        res = LinearToGammaSpace (res);
#   endif

    return res;
}
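Typical use is adding the SH contribution as ambient/probe lighting on top of the direct light. A sketch, assuming worldNormal is a normalized world-space normal:

```hlsl
half3 AmbientSH(half3 worldNormal)
{
    // ShadeSH9 expects a half4; the usual convention is w = 1.
    return ShadeSH9(half4(worldNormal, 1.0));
}

// In a ForwardBase fragment shader, something like:
//   half3 ambient = AmbientSH(i.worldNormal);
//   col.rgb = albedo * _LightColor0.rgb * ndotl + albedo * ambient;
```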
COMPUTE_EYEDEPTH

Unity uses a left-handed coordinate system in model space and world space, but a right-handed one in view space (the OpenGL tradition), hence the negation.

#define COMPUTE_EYEDEPTH(o) o = -UnityObjectToViewPos( v.vertex ).z
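A common use is storing per-vertex eye depth for depth-based effects such as soft particles. A sketch; the struct and field names are illustrative, and the macro references v.vertex, so v must be the vertex input in scope:

```hlsl
struct v2f
{
    float4 pos : SV_POSITION;
    float eyeDepth : TEXCOORD0;
};

// In the vertex shader (v is the appdata input):
//   o.pos = UnityObjectToClipPos(v.vertex);
//   COMPUTE_EYEDEPTH(o.eyeDepth); // positive distance along the view direction
```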
