The Lab Renderer study notes


Objective

The Unity Vision VR/AR Summit recently came to China (http://www.bagevent.com/event/197605), and I have lately been focusing on VR development with Unity.


It was around June that I saw the news: Valve had released the full source code of the renderer used by The Lab. I have always been curious: for a closed-source engine like Unity3D, how would you go about building your own renderer? Today I found time to read the code.

Related links:
- Official post: http://steamcommunity.com/games/250820/announcements/detail/604985915045842668
- GitHub download: https://github.com/ValveSoftware/the_lab_renderer
- Unity Asset Store download: https://www.assetstore.unity3d.com/en/#!/content/63141

Features & implementations

After downloading The Lab Renderer from GitHub, a cursory look shows there is not much content: mainly the C# code of several components and some shaders. The next step is to see how its main features are implemented.

Single-pass Forward Rendering

The Lab Renderer uses forward rendering, mainly for MSAA (multisample anti-aliasing) support and for efficiency. However, Unity's default forward path renders lights in multiple passes (each dynamic light on each object needs an extra pass for its contribution), whereas The Lab Renderer provides a solution that renders multiple lights in a single pass.


To implement single-pass forward rendering, you first need to make a few settings in the Player Settings. The "single pass" itself is achieved mainly in the shaders. The general idea is to define arrays of light parameters in the "vr_lighting.cginc" shader file:

#define MAX_LIGHTS 18
...
float4 g_vLightColor[ MAX_LIGHTS ];
float4 g_vLightPosition_flInvRadius[ MAX_LIGHTS ];
float4 g_vLightDirection[ MAX_LIGHTS ];

Then a for loop computes the lighting for all lights in one go:

...
{
    for ( int i = 0; i < g_nNumLights; i++ )
    {
        ...
    }
}
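Filled out, the loop body might look roughly like the following. This is a hedged Cg/HLSL sketch, not Valve's actual code: the array names follow the declarations above, but `vPositionWs`, `vNormalWs`, and the attenuation formula are illustrative assumptions.

```hlsl
// Sketch: accumulate diffuse lighting from all lights in one pass.
// vPositionWs / vNormalWs are the world-space position and normal of
// the point being shaded.
float3 vColor = float3( 0.0, 0.0, 0.0 );
for ( int i = 0; i < g_nNumLights; i++ )
{
    float3 vToLight = g_vLightPosition_flInvRadius[ i ].xyz - vPositionWs.xyz;
    float flDist = length( vToLight );
    vToLight /= flDist; // normalize the direction to the light

    // Simple distance falloff using the inverse radius packed into .w
    float flAtten = saturate( 1.0 - flDist * g_vLightPosition_flInvRadius[ i ].w );

    float flNDotL = saturate( dot( vNormalWs.xyz, vToLight ) );
    vColor += g_vLightColor[ i ].rgb * flNDotL * flAtten;
}
```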

The next step is to process the light information in the C # layer.

    • First, add a "ValveRealtimeLight.cs" script to every light object in Unity. The class ValveRealtimeLight maintains a static variable "List<ValveRealtimeLight> s_allLights" that keeps track of the data of all lights.
    • Then, add a "ValveCamera.cs" script to the main camera object. In the member function ValveCamera.UpdateLightConstants(), all light-related parameters are computed and set as shader constants.

That is the general idea behind The Lab Renderer's single-pass forward rendering.
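On the C# side, gathering the lights and uploading them as shader arrays might look roughly like this. This is a minimal sketch, not Valve's actual UpdateLightConstants(): the class name, the packing choices, and the use of Shader.SetGlobalVectorArray are my own illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: push the parameters of all active lights to the shader arrays
// once per frame, so a single pass can loop over them.
public static class LightConstantsSketch
{
    const int MAX_LIGHTS = 18;

    public static void Upload( List<Light> lights )
    {
        var colors = new Vector4[ MAX_LIGHTS ];
        var positionsInvRadius = new Vector4[ MAX_LIGHTS ];

        int n = Mathf.Min( lights.Count, MAX_LIGHTS );
        for ( int i = 0; i < n; i++ )
        {
            Light l = lights[ i ];
            colors[ i ] = l.color * l.intensity;
            Vector3 p = l.transform.position;
            // Pack position in xyz and inverse radius in w, matching
            // the g_vLightPosition_flInvRadius layout in the shader.
            positionsInvRadius[ i ] = new Vector4( p.x, p.y, p.z, 1.0f / l.range );
        }

        Shader.SetGlobalInt( "g_nNumLights", n );
        Shader.SetGlobalVectorArray( "g_vLightColor", colors );
        Shader.SetGlobalVectorArray( "g_vLightPosition_flInvRadius", positionsInvRadius );
    }
}
```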

Shadows

The Lab Renderer also takes over shadow rendering. You need to select "Disable Shadows" in Unity's Quality -> Shadows settings to turn off Unity's default shadows.


The Lab Renderer uses the shadow mapping algorithm to generate real-time shadows. The rough process of the algorithm is:

    1. Render a depth buffer from the light's point of view. Geometrically, this depth buffer can be understood as recording, for each pixel, the nearest distance from the light to the scene; it is also known as the shadow buffer or shadow map.
    2. When rendering the back buffer, each point to be shaded is projected into shadow-map space, and its depth is compared with the stored value to determine whether it is the closest point to the light, i.e., whether it is unoccluded or in shadow.
    3. Rendering the shadow buffer is straightforward for spot lights. For directional lights, The Lab uses an approximation: the directional light is replaced by a spot light placed "very far away". For point lights, it uses 6 fake spot lights to cover all directions.
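The depth comparison in step 2 can be sketched as follows. This is illustrative Cg/HLSL under assumed names (the real shader also remaps the UVs into the per-light region of the shadow atlas, and the depth convention and bias are placeholders):

```hlsl
// Sketch: project a world-space point into shadow-map space and compare
// depths. matShadowVP is the light's view-projection matrix.
float ComputeShadowTerm( float3 vPositionWs, float4x4 matShadowVP, sampler2D shadowMap )
{
    float4 vPosLight = mul( matShadowVP, float4( vPositionWs, 1.0 ) );
    vPosLight.xyz /= vPosLight.w;              // perspective divide to NDC
    float2 vUv = vPosLight.xy * 0.5 + 0.5;     // NDC -> [0,1] UV
    float flClosestDepth = tex2D( shadowMap, vUv ).r;

    // In shadow if something nearer to the light was recorded here;
    // the small bias avoids self-shadowing acne.
    return ( vPosLight.z > flClosestDepth + 0.001 ) ? 0.0 : 1.0;
}
```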

The control flow of the above algorithm is implemented in the ValveCamera.cs script. It needs a camera that renders from the light's point of view, a RenderTexture for the shadow map, and a shader for shadow-map rendering (Resources/vr_cast_shadows.shader):

[ExecuteInEditMode]
[RequireComponent( typeof( Camera ) )]
public class ValveCamera : MonoBehaviour
{
    ...
    [NonSerialized] private Camera m_shadowCamera = null;
    [NonSerialized] public RenderTexture m_shadowDepthTexture = null;
    [NonSerialized] public Shader m_shaderCastShadows = null;
    ...
}

ValveCamera.ValveShadowBufferRender() is called from the ValveCamera.OnPreCull() script callback to render the shadow buffer. The Lab renders the shadows of all lights into one big shadow buffer (an atlas), and stores the sub-rectangle occupied by each light's shadow map in the shader parameter "g_vShadowMinMaxUv". This is how the single-pass forward rendering described earlier can compute the lighting and shadows of all lights in a single pass.

As for the content of vr_cast_shadows.shader, it is very simple: essentially just a vertex shader that computes the projected position; UVs and the like can be omitted.
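A minimal shadow-caster shader of this kind might look like the following. This is an illustrative sketch, not the actual vr_cast_shadows.shader; the shader name and entry points are made up, and UnityObjectToClipPos is the standard helper from UnityCG.cginc.

```shaderlab
Shader "Sketch/CastShadows"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex MainVs
            #pragma fragment MainPs
            #include "UnityCG.cginc"

            // Only the projected position matters; depth is written by
            // the rasterizer automatically. No UVs, no lighting.
            float4 MainVs( float4 vPositionOs : POSITION ) : SV_POSITION
            {
                return UnityObjectToClipPos( vPositionOs );
            }

            float4 MainPs() : SV_Target
            {
                return float4( 0, 0, 0, 0 ); // color output is unused
            }
            ENDCG
        }
    }
}
```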

The shadow term is computed by the ComputeShadow_PCF_3x3_Gaussian() function in the lighting shader (vr_lighting.cginc). PCF stands for Percentage Closer Filtering, a technique that produces smooth shadow edges. This function applies a Gaussian-weighted filter over the 3x3 neighborhood around the target point.
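The idea of a Gaussian-weighted 3x3 PCF can be sketched as follows (illustrative Cg/HLSL, not Valve's exact ComputeShadow_PCF_3x3_Gaussian(); the kernel, bias, and parameter names are my own):

```hlsl
// Sketch: 3x3 percentage-closer filtering with Gaussian weights.
// Each tap does a binary depth comparison; the weighted average of the
// nine results gives a soft shadow factor in [0,1].
float ShadowPcf3x3Gaussian( sampler2D shadowMap, float2 vUv, float flDepth, float2 vTexelSize )
{
    // Normalized 3x3 Gaussian kernel (1-2-1 outer product, sums to 16).
    const float kernel[9] = { 1, 2, 1,  2, 4, 2,  1, 2, 1 };
    float flSum = 0.0;
    for ( int y = -1; y <= 1; y++ )
    {
        for ( int x = -1; x <= 1; x++ )
        {
            float flStored = tex2D( shadowMap, vUv + float2( x, y ) * vTexelSize ).r;
            float flLit = ( flDepth <= flStored + 0.001 ) ? 1.0 : 0.0;
            flSum += flLit * kernel[ ( y + 1 ) * 3 + ( x + 1 ) ];
        }
    }
    return flSum / 16.0;
}
```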

Adaptive Quality

For VR, frame rate is critical, so Valve's engineers added this feature: dynamically adjusting rendering quality to maintain stable performance. It was presented in a GDC 2016 talk: https://www.youtube.com/watch?v=eIlb688pUu4
This part is mainly about when to adjust quality and which settings to adjust (not everything can be changed arbitrarily on the fly); the specific logic is in the ValveCamera.UpdateAdaptiveQuality() function.
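The essence of the technique, stepping a quality level up or down so GPU frame time stays inside the vsync budget, can be sketched like this. This is a simplified illustration, not the logic of UpdateAdaptiveQuality(): the thresholds, the level table, and the use of XRSettings.eyeTextureResolutionScale as the adjusted knob are all assumptions.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: drop the render scale when the frame time nears the budget,
// and raise it back when there is comfortable headroom.
public class AdaptiveQualitySketch : MonoBehaviour
{
    // Quality levels: render-scale values from lowest to highest.
    static readonly float[] k_renderScales = { 0.65f, 0.8f, 1.0f, 1.2f, 1.4f };
    int m_level = 2; // start at native resolution

    void Update()
    {
        float flFrameMs = Time.unscaledDeltaTime * 1000.0f;
        float flBudgetMs = 1000.0f / 90.0f; // 90 Hz HMD refresh

        if ( flFrameMs > flBudgetMs * 0.9f && m_level > 0 )
            m_level--; // danger zone: reduce quality immediately
        else if ( flFrameMs < flBudgetMs * 0.7f && m_level < k_renderScales.Length - 1 )
            m_level++; // plenty of headroom: raise quality

        XRSettings.eyeTextureResolutionScale = k_renderScales[ m_level ];
    }
}
```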

