Unity 5.3 Official VR Tutorials - Series 7: Optimizing the VR Experience


Reposted from the column "Stupid Cat Happily Learns Programming" by Wang Han.

Introduction

For a VR application to give users a good experience, and in particular to avoid nausea and dizziness, optimization is essential to reaching the desired frame rate. Unlike development for other platforms, optimizing a VR application should begin at the start of the project and continue throughout it, rather than being left to the end as in traditional projects. Frequent testing on the target device is also essential.

VR projects are far more computationally expensive than non-VR projects, mainly because every image must be rendered separately for each eye. It therefore pays to keep these issues in mind throughout development; thinking about them before development starts will save a lot of time.

For mobile VR the optimization work is particularly important, not only because a full VR application has to run at all, but because the computing power and thermal headroom of mobile devices are far below those of a desktop computer.

Given how important it is to hit the target frame rate, every available optimization method should be considered. Project code should be optimized wherever possible; see Unity Manual: Optimizing Scripts for guidance.

Resources related to Oculus

A great deal of information on optimizing VR applications is available on the Oculus developer site, and it is strongly recommended that you read these documents carefully before continuing with this tutorial.

https://developer.oculus.com/documentation/

http://static.oculus.com/sdk-downloads/documents/Oculus_Best_Practices_Guide.pdf

https://developer.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game/

https://developer.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game-continued/

Unity Editor Optimization Tools

Unity provides a range of useful tools and methods to help us optimize VR content.

The Profiler

The Profiler helps developers understand how much time is spent on each frame of the game, broken down into CPU, rendering, memory, audio, physics, and networking. Learning to use the Profiler is essential for diagnosing game performance.
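
To narrow a CPU spike down to a specific piece of script code, custom samples can be wrapped around the suspect section so its cost shows up under its own label in the Profiler's CPU view. The sketch below is illustrative only; the class and method names are hypothetical, not from the tutorial.

using UnityEngine;

public class EnemyUpdater : MonoBehaviour
{
    void Update()
    {
        // Wrap suspect code in a named sample so its cost is listed
        // under "EnemyUpdater.Pathfinding" in the Profiler's CPU view.
        // (In Unity 5.3 the Profiler class lives directly in UnityEngine;
        // later versions moved it to UnityEngine.Profiling.)
        Profiler.BeginSample("EnemyUpdater.Pathfinding");
        RecalculatePaths();
        Profiler.EndSample();
    }

    void RecalculatePaths()
    {
        // Placeholder for the expensive work being measured.
    }
}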

For information about Profiler, refer to the following links:

Unity Manual: Profiler

Unity: Profiler Overview for Beginners

Unity: Introduction to the Profiler

Frame Debugger

The Frame Debugger lets us freeze playback on a single frame and step through the individual draw calls used to build the scene, revealing where there is room to optimize. In the process we may discover objects being rendered that never needed to be, which can significantly reduce the draw calls issued each frame.

For more information on using the Frame Debugger, see:

http://docs.unity3d.com/Manual/FrameDebugger.html

Unity: Frame Debugger

Fundamentals of VR application optimization

Application optimization is a huge topic in its own right and different platforms have different requirements, so we also provide links for further reading.

Most of the optimization techniques used for non-VR applications also apply to VR development, so that existing knowledge carries over.

Geometry

In VR applications we should remove any polygons from the geometry that the user will never see; there is no need to render what cannot be seen in the scene. For example, if a cup is placed with its back against a wall, the user will never see its back, so the model does not need those faces.

3D artists should keep models as simple as possible. Depending on the target platform, it may be worth adding detail through textures instead, perhaps with parallax mapping or tessellation, but be aware that these techniques can affect performance and may not be available at all on some platforms.

Overdraw

The Overdraw view shows which objects are being drawn on top of other objects, wasting GPU time. We should reduce overdraw as much as possible. The Scene View Control Bar (Unity Manual: Scene View Control Bar) lets us switch the Scene view into Overdraw mode.

The normal Shaded view:

The same view with Overdraw enabled:

Level of Detail (LOD)

With LOD, the number of triangles rendered for an object is reduced as its distance from the camera increases. Unless every object stays close to the camera at all times, LOD lets us reduce the load on the hardware: add a LOD Group component and supply lower-detail versions of the model for objects viewed from far away, as sketched below.
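
LOD levels are normally configured on a LOD Group component in the Inspector, but they can also be set up from script. The following is a minimal sketch under the assumption that a detailed and a simplified renderer already exist as children of the object; the script and field names are hypothetical.

using UnityEngine;

public class LodSetup : MonoBehaviour
{
    public Renderer highDetail;  // used while the object covers a large part of the screen
    public Renderer lowDetail;   // used once the object shrinks with distance

    void Start()
    {
        LODGroup lodGroup = gameObject.AddComponent<LODGroup>();

        // Each LOD entry takes the screen-height fraction at which it
        // stops being used, plus the renderers that represent it.
        LOD[] lods = new LOD[2];
        lods[0] = new LOD(0.3f, new Renderer[] { highDetail });
        lods[1] = new LOD(0.05f, new Renderer[] { lowDetail });

        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}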

LOD preparation for most assets can be automated using Simplygon (Asset Store).

Draw Call Batching

Wherever possible, draw calls should be batched through static batching and dynamic batching; this can greatly improve game performance. See Unity Manual: Draw Call Batching in the official documentation for details.
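
Static geometry is usually batched simply by ticking the Static checkbox in the Inspector. For geometry that is spawned at run time but never moves afterwards, the same effect can be achieved from script; a minimal, hedged sketch (the script name is ours, not from the tutorial):

using UnityEngine;

public class EnvironmentBatcher : MonoBehaviour
{
    void Start()
    {
        // Combine all child meshes under this root into static batches.
        // Objects sharing the same material can then be drawn together,
        // reducing draw calls; the children must not move afterwards.
        StaticBatchingUtility.Combine(gameObject);
    }
}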

Light mapping

Minimize dynamic lighting and bake lighting into lightmaps wherever possible, avoiding real-time shadows.

See Unity's official Lighting and Rendering tutorial (Unity 5) for more detail on this topic.

Light Probes

Light probes (Unity Manual: Light Probes) sample the lighting at chosen points in the scene and apply it to dynamic objects. Using light probes is usually fast and produces good visual results.
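
Light probes themselves are placed and baked in the editor via a Light Probe Group. For a dynamic object to use them, its renderer must have light-probe sampling enabled; a small illustrative sketch, assuming Unity 5.3's renderer API (the script name is hypothetical):

using UnityEngine;

public class ProbeLitObject : MonoBehaviour
{
    void Start()
    {
        // In Unity 5.3 this per-renderer flag enables sampling of the
        // baked light probes; a Light Probe Group must exist in the scene.
        GetComponent<Renderer>().useLightProbes = true;
    }
}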

Reflection Probes

Reflection probes (Unity Manual: Reflection Probe) store a cubemap of their surroundings to provide realistic reflections, but they come at a performance cost. Note in particular that real-time reflection probes can drastically reduce performance in VR.

Occlusion culling

Occlusion culling prevents objects that are not visible from being rendered at all. For example, if the player is in a room and the door to a neighbouring room is closed, nothing in that other room is visible to the player, so none of it needs to be rendered.

Depending on the project and the target platform, setting up occlusion culling can dramatically improve game performance.

An example of frustum culling:

An example of occlusion culling:

Anti-aliasing

Anti-aliasing is important in VR because it smooths the edges of the image and reduces visible jaggedness. If the project uses forward rendering, enable MSAA in the Quality Settings (Unity Manual: Quality Settings); for Gear VR projects this option should always be enabled. A small script example follows.
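
MSAA is normally configured per quality level in Edit > Project Settings > Quality, but the level can also be forced from script; a minimal sketch (the script name is ours):

using UnityEngine;

public class EnableMsaa : MonoBehaviour
{
    void Start()
    {
        // 4x multisample anti-aliasing; only takes effect with
        // forward rendering (valid values are 0, 2, 4 and 8).
        QualitySettings.antiAliasing = 4;
    }
}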

MSAA cannot be used with deferred rendering; in that case, apply anti-aliasing as a post-processing image effect (Antialiasing, Asset Store) or consider using SMAA.

A related implementation is provided here:

GitHub - Chman/SMAA-Unity: A highly customizable implementation of Subpixel Morphological Antialiasing for Unity3D.

Textures

In general, VR projects should make heavy use of texture atlases to reduce the number of individual textures and materials in use.

To simplify and speed up this process, consider using Mesh Baker (Asset Store) to bake the textures, meshes, and materials used in the game.

At the Oculus Connect 2 developer conference, Holden from Turbo Button shared their experience of optimizing applications and working with Mesh Baker (video).

Note that normal maps often do not hold up well in VR, so avoid relying on them. For more information on textures, refer to the Oculus documentation.

Shaders

In VR projects, use the simplest shaders you can. On Gear VR, consider the inexpensive Mobile > Unlit (Supports Lightmap) shader and use lightmaps to light the scene.
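
The shader is normally just assigned to the material in the Inspector. For illustration only, the sketch below swaps a renderer's material over to what we believe is the name of the built-in mobile unlit lightmapped shader; treat the exact shader name as an assumption and verify it in your Unity version.

using UnityEngine;

public class UseUnlitLightmapShader : MonoBehaviour
{
    void Start()
    {
        // Built-in mobile shader name as we understand it; Shader.Find
        // returns null if the name does not match, so nothing breaks.
        Shader unlit = Shader.Find("Mobile/Unlit (Supports Lightmap)");
        if (unlit != null)
        {
            GetComponent<Renderer>().material.shader = unlit;
        }
    }
}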

Fullscreen effects

Fullscreen image effects (Unity Manual: Image Effect Reference) are too expensive for VR and should be avoided entirely in Gear VR projects.

Quality Settings

The options in the Quality Settings (Unity Manual: Quality Settings) directly affect the visual quality of the project. Adjusting them can improve performance to some extent, at the cost of some visual fidelity.

RenderScale

Lowering VRSettings.renderScale (Unity Scripting API) sacrifices visual quality for higher performance. See the second article in this series for details.
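
A minimal sketch of adjusting the render scale from script, assuming Unity 5.3's UnityEngine.VR namespace; the component name and the 0.8 value are placeholders to tune per device.

using UnityEngine;
using UnityEngine.VR;

public class RenderScaleTuner : MonoBehaviour
{
    [Range(0.5f, 1.5f)]
    public float renderScale = 0.8f;  // below 1.0 trades resolution for speed

    void Start()
    {
        VRSettings.renderScale = renderScale;
    }
}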

Asynchronous Loading

To improve performance, we can consider splitting the game into many small scenes. Be careful, however, not to block head tracking while the next scene loads, as a frozen view quickly causes nausea.

To avoid this, we can build a loading scene that keeps head tracking active while the next scene is loaded asynchronously with SceneManager.LoadSceneAsync (Unity Scripting API), as in the sketch below.
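
A minimal sketch of such a loading scene, assuming Unity 5.3's SceneManagement API; the scene name "Flyer" and the class name are placeholders. The camera keeps rendering, so head tracking stays responsive while the next scene streams in.

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class AsyncSceneLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        AsyncOperation load = SceneManager.LoadSceneAsync("Flyer");
        load.allowSceneActivation = false;  // hold activation until we are ready

        // While activation is held back, progress stops at 0.9.
        while (load.progress < 0.9f)
        {
            yield return null;  // keep rendering; head tracking stays live
        }

        // Activate the new scene, optionally after a fade-out.
        load.allowSceneActivation = true;
    }
}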

Optimization techniques used in the sample scenes

To give users a good experience on both the DK2 and Gear VR, we applied a series of optimization techniques in the example scenes.

Because the same project has to support both platforms, we targeted the lower-powered of the two, the Gear VR. We chose a low-poly art style and used a small palette of flat colours to make objects stand out from the environment.

Because we use forward rendering, we enable 4x MSAA in Edit > Project Settings > Quality Settings (Unity Manual: Quality Settings) to achieve better visual quality:

Let's take a quick look at the optimization techniques used in these scenes:

Optimization techniques used in the menu scene:

Like every scene in the project, the menu scene uses low-poly art assets and avoids real-time lighting.

On the menu panels we used a custom shader named SeparableAlpha, which specifies a single, separate alpha channel for a series of images so that each frame does not need an alpha channel of its own. This saves file size and reduces the number of textures required.

Optimization techniques used in the Flyer scene:

We enable fog dynamically (Unity Scripting API) in the Flyer scene so that objects fade in rather than suddenly popping into the player's field of view, and to shorten the visible range, which reduces the number of objects that have to be rendered; a sketch of enabling fog from script follows.
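
A hedged sketch of enabling linear fog from script; the colour and distances here are placeholders, not the values used in the Flyer scene.

using UnityEngine;

public class EnableSceneFog : MonoBehaviour
{
    void Start()
    {
        RenderSettings.fog = true;
        RenderSettings.fogMode = FogMode.Linear;
        RenderSettings.fogColor = Color.black;
        RenderSettings.fogStartDistance = 50f;   // fully clear up to here
        RenderSettings.fogEndDistance = 300f;    // fully fogged beyond here
    }
}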

The vertex counts of objects in this scene are kept low, so draw calls can be reduced through dynamic batching (Unity Manual: Draw Call Batching).

To reuse objects such as lasers, asteroids, and gates, we created an object pool (Unity: Object Pooling); this avoids expensive instantiation calls at run time (Unity Scripting API). A generic pool is sketched below.
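
The tutorial project ships its own pooling implementation; the sketch below is only a generic illustration of the idea, with hypothetical names. Instances are created once up front and reused, so Instantiate and Destroy are not called during gameplay.

using System.Collections.Generic;
using UnityEngine;

public class SimplePool : MonoBehaviour
{
    public GameObject prefab;     // e.g. a laser or asteroid prefab
    public int initialSize = 20;

    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    void Awake()
    {
        // Pre-instantiate the pool while loading, not during gameplay.
        for (int i = 0; i < initialSize; i++)
        {
            GameObject go = (GameObject)Instantiate(prefab);
            go.SetActive(false);
            pool.Enqueue(go);
        }
    }

    public GameObject Spawn(Vector3 position, Quaternion rotation)
    {
        GameObject go = pool.Count > 0 ? pool.Dequeue() : (GameObject)Instantiate(prefab);
        go.transform.position = position;
        go.transform.rotation = rotation;
        go.SetActive(true);
        return go;
    }

    public void Despawn(GameObject go)
    {
        go.SetActive(false);
        pool.Enqueue(go);
    }
}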

We also optimized the ship texture in the Flyer scene by using the secondary UV channel and the Detail Map slot, which lets us get away with fewer blocks of colour and so keep the overall texture size down.

Optimization techniques in the maze scene

The Maze scene uses lightmaps for better runtime performance, especially on Gear VR. The scene also contains no real-time lighting or effects.

Optimizations used in the Shooter180 (Target Gallery) and Shooter360 (Target Arena) scenes

As in the other scenes, we used a low-poly art style and created an object pool for the targets. Vertex counts are kept low so that dynamic batching (Unity Manual: Draw Call Batching) can be used.

After reading this tutorial you should have an overall picture of VR optimization: how to use Unity's built-in tools to analyze game performance, and which techniques can help you reach your target frame rate.

The Oculus website has a great deal of related material:

https://developer.oculus.com/documentation/

http://static.oculus.com/sdk-downloads/documents/Oculus_Best_Practices_Guide.pdf

https://developer.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game/

https://developer.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game-continued/

In the final part of this tutorial series, we will provide a list of reference material for further study.
