Summary of Unity's Official VR Tutorials: Graphical Display in VR

Source: Internet
Author: User
Tags: virtual environment, Unity 5

01 Texel:Pixel Render Scale — Unity 5.3 Official VR Tutorial, Series 2

Depending on the complexity of your VR application's scenes and the hardware you are running on, you may need to change the render scale setting. This setting adjusts the texel:pixel ratio before lens correction, letting you trade performance for image clarity or vice versa. The setting must be changed in code; see:

http://unity3d.com/cn/VRSettings.renderScale

You can change the settings of the render scale by using the following code:

using UnityEngine;
using System.Collections;
using UnityEngine.VR;

namespace VRStandardAssets.Examples
{
    public class ExampleRenderScale : MonoBehaviour
    {
        // The render scale. Higher numbers = better quality, but worse performance.
        [SerializeField] private float m_RenderScale = 1f;

        void Start ()
        {
            VRSettings.renderScale = m_RenderScale;
        }
    }
}


For this setting, refer to our VR Samples project — see the Scenes/Examples/RenderScale scene. The setting is also used in the MainMenu scene.

Examples of the effects of changing render scale are as follows:

Unity's default render scale is 1.0, the effect is as follows:

If you set the render scale to 1.5, you can see a sharper display:

Next, set the render scale to 0.5, and you can see severe pixelation:

Depending on the scene, you can consider lowering the render scale to improve performance, or raising it to sharpen the image at the expense of performance.
02 Rendering the Reticle over Other GameObjects — Unity 5.3 Official VR Tutorials, Series 3 (Interaction in VR) and Series 4 (User Interfaces in VR)

If the reticle is placed at exactly the same position as an object, it may end up embedded in adjacent geometry.

To solve this problem, we need to make sure the reticle is rendered in front of all other objects in the scene. The VR Samples project provides a shader based on Unity's existing "UI/Unlit/Text" shader, named UIOverlay.shader. When selecting a shader for a material, it can be found under "UI/Overlay".

This shader applies to both UI elements and text, and is drawn in front of other objects in the scene.
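As a rough illustration of how such a shader works (this is a minimal fixed-function sketch with an illustrative name, not the actual UIOverlay.shader source): drawing in the Overlay queue with depth testing disabled guarantees the reticle renders on top of scene geometry.

```shaderlab
Shader "Custom/ExampleOverlay"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1,1,1,1)
    }
    SubShader
    {
        // Draw after everything else and ignore the depth buffer,
        // so the reticle is never hidden inside nearby geometry.
        Tags { "Queue" = "Overlay" "IgnoreProjector" = "True" "RenderType" = "Transparent" }
        Lighting Off
        ZWrite Off
        ZTest Always
        Blend SrcAlpha OneMinusSrcAlpha

        Pass
        {
            SetTexture [_MainTex]
            {
                constantColor [_Color]
                combine texture * constant
            }
        }
    }
}
```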

Non-diegetic UI

In traditional non-VR projects, UI is usually overlaid on top of the screen to display information such as health and score — this is often called a HUD. It is non-diegetic UI: it is not part of the game world itself, but it conveys information to the player.

Film has an analogous concept: non-diegetic sound, such as a background score, exists outside the story world. Diegetic sound, by contrast, is tied to the story — dialogue between characters, or sound effects that occur naturally in the scene.

In Unity, adding a HUD-style canvas UI is relatively easy: simply choose Screen Space - Overlay or Screen Space - Camera as the Canvas render mode.

But this kind of UI is essentially unusable in VR: our eyes cannot focus on objects that close, and Unity's VR support does not support Screen Space - Overlay at all.

Spatial UI

Unlike the above, spatial UI is placed within the environment itself, using World Space mode as the Canvas render mode. This lets the user's eyes focus on the UI. Where you place UI elements in the world requires serious thought: too close and it causes eye strain; too far and it feels like focusing on the horizon — a situation that is fine outdoors, but odd inside a small room. The UI's scale also needs adjusting to suit the product's actual requirements.

It is best to put the UI at a comfortable reading distance and scale it accordingly. The UI in the main menu scene is a good reference: it is positioned a few metres away, with correspondingly larger images and text, and is comfortable to look at. When the UI is placed some distance from the user, it may overlap with other objects. You can refer back to the previous section for how to modify a shader so the UI draws on top of other objects, or simply use the shader from the VR Samples project — it works for text as well.
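As a sketch of the placement described above (the class and field names are ours, not from the VR Samples), a world-space canvas can be positioned and scaled from code like this:

```csharp
using UnityEngine;

// Illustrative sketch: place a world-space canvas a comfortable distance
// in front of the camera and scale it so its pixel size reads well.
public class SpatialUIPlacer : MonoBehaviour
{
    [SerializeField] private Canvas m_Canvas;        // Canvas to position
    [SerializeField] private Transform m_Camera;     // The VR camera
    [SerializeField] private float m_Distance = 3f;  // A few metres away

    void Start ()
    {
        m_Canvas.renderMode = RenderMode.WorldSpace;
        m_Canvas.transform.position = m_Camera.position + m_Camera.forward * m_Distance;
        // Face the canvas towards the camera so it is not skewed.
        m_Canvas.transform.rotation = Quaternion.LookRotation (m_Canvas.transform.position - m_Camera.position);
        // World-space canvases are sized in pixels, so scale down:
        // at 0.002, a 1000px-wide canvas spans roughly 2 metres.
        m_Canvas.transform.localScale = Vector3.one * 0.002f;
    }
}
```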

Many developers want to attach the UI to the camera so that it stays in a fixed position in view as the player moves. This may be appropriate for a reticle or other small element, but for larger UI elements it is like holding a newspaper against your face — users quickly become uncomfortable or even nauseated. Look at the UI in the shooter 360 (Target Arena) scene: it moves into the field of view only after a short delay, letting users look around and get familiar with the environment, rather than being locked rigidly in view and straining their eyes.
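A minimal sketch of such a delayed-follow behaviour (illustrative only; the Target Arena scene's actual implementation may differ):

```csharp
using UnityEngine;

// Illustrative sketch of a UI panel that follows the player's view
// with a lag, instead of being rigidly locked to the camera.
public class DelayedFollowUI : MonoBehaviour
{
    [SerializeField] private Transform m_Camera;
    [SerializeField] private float m_Distance = 3f;
    [SerializeField] private float m_FollowSpeed = 2f;  // Lower = lazier follow

    void LateUpdate ()
    {
        // Where the UI would sit if it were rigidly locked to the view.
        Vector3 target = m_Camera.position + m_Camera.forward * m_Distance;
        // Ease towards that point instead of snapping, so the user can
        // glance around without the panel glued to their face.
        transform.position = Vector3.Lerp (transform.position, target, m_FollowSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation (transform.position - m_Camera.position);
    }
}
```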
Menu scene

The UI in the Menu scene uses custom textures to create a curved display that wraps around the user.

Flyer scene

The game introduction and game-over UI are placed statically in world space. To present in-game information, however, we use a world-space UI attached to the spaceship — diegetic UI.

Since the spacecraft is always in the user's field of view, it is reliable to display the relevant information around the spacecraft. The UI elements also rotate to face the camera, which avoids skewing and ensures the UI reads clearly.

Maze scene

In the Maze scene, we use spatial UI for the game's introduction and ending screens. Spatial UI can also be used to prompt the user to interact.

Text anti-aliasing in VR

To get smooth text rendering in VR, here is a small trick using a Canvas Scaler on a World Space Canvas: set "Reference Pixels Per Unit" to 1, then raise "Dynamic Pixels Per Unit" until the edges of the text soften. The illustrations show the actual effect of different settings.
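The same settings can be applied from code. This is a sketch; the exact Dynamic Pixels Per Unit value is scene-dependent and left as a tunable:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: configure a world-space Canvas Scaler for
// crisper text, as described above.
[RequireComponent (typeof (CanvasScaler))]
public class TextSharpener : MonoBehaviour
{
    [SerializeField] private float m_DynamicPixelsPerUnit = 3f;

    void Start ()
    {
        CanvasScaler scaler = GetComponent<CanvasScaler> ();
        scaler.referencePixelsPerUnit = 1f;                   // As recommended in the tutorial.
        scaler.dynamicPixelsPerUnit = m_DynamicPixelsPerUnit; // Raise until text edges soften.
    }
}
```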
03 Fades and Blinks — Unity 5.3 Official VR Tutorial, Series 5 (Movement in VR)

A common way to implement movement in a virtual environment is a fade transition: quickly fade to black, move the camera to the desired position, then fade back in. A more sophisticated alternative is a blink transition. For details, see Tom Forsyth's talk at Oculus Connect 2014 (https://www.youtube.com/watch?v=addUnJpjjv4&feature=youtu.be&t=40m5s).
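A minimal sketch of the fade-move-fade pattern (illustrative only; the VR Samples project ships its own camera-fade component). A full-screen black UI Image is faded in, the camera rig is moved while the screen is dark, and the image is faded back out:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: fade to black, teleport, fade back in.
public class FadeTeleport : MonoBehaviour
{
    [SerializeField] private Image m_FadeImage;      // Black image covering the view
    [SerializeField] private Transform m_CameraRig;  // Root of the camera hierarchy
    [SerializeField] private float m_FadeDuration = 0.3f;

    public void TeleportTo (Vector3 destination)
    {
        StartCoroutine (FadeAndMove (destination));
    }

    private IEnumerator FadeAndMove (Vector3 destination)
    {
        yield return StartCoroutine (Fade (0f, 1f));   // Fade out to black.
        m_CameraRig.position = destination;            // Move while the screen is dark.
        yield return StartCoroutine (Fade (1f, 0f));   // Fade back in.
    }

    private IEnumerator Fade (float from, float to)
    {
        for (float t = 0f; t < m_FadeDuration; t += Time.deltaTime)
        {
            Color c = m_FadeImage.color;
            c.a = Mathf.Lerp (from, to, t / m_FadeDuration);
            m_FadeImage.color = c;
            yield return null;   // Wait one frame between alpha steps.
        }
    }
}
```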
04 Optimization — Unity 5.3 Official VR Tutorial, Series 7 (Optimizing the VR Experience)

Overdraw

The Overdraw view lets developers see which objects are being drawn on top of others — wasted GPU time. We should reduce overdraw as much as possible. The Scene View control bar can switch the scene view into Overdraw mode (see Unity Manual: Scene View).

The normal shaded view is as follows:

The same view with Overdraw mode enabled:
Level of Detail (LOD)

With LOD, the number of triangles rendered for an object decreases as its distance from the camera grows. Unless all objects are at the same distance from the camera, LOD can reduce the load on the hardware: add a LOD Group component to an object and supply lower-detail models for when it is far from the camera.

Simplygon (Asset Store) can automate most of the LOD preprocessing for your assets.
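LOD Groups are normally configured in the Inspector, but as an illustration they can also be built in code (the class name and renderer fields here are placeholders, not from the tutorial):

```csharp
using UnityEngine;

// Illustrative sketch: build a two-level LOD Group in code.
public class LodSetup : MonoBehaviour
{
    [SerializeField] private Renderer m_HighDetail;  // Full-detail mesh
    [SerializeField] private Renderer m_LowDetail;   // Simplified mesh for distance

    void Start ()
    {
        LODGroup group = gameObject.AddComponent<LODGroup> ();
        LOD[] lods = new LOD[2];
        // Use the high-detail mesh while the object covers > 50% of screen
        // height, the low-detail mesh down to 10%, and cull it below that.
        lods[0] = new LOD (0.5f, new Renderer[] { m_HighDetail });
        lods[1] = new LOD (0.1f, new Renderer[] { m_LowDetail });
        group.SetLODs (lods);
        group.RecalculateBounds ();
    }
}
```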

Light Mapping

Minimize dynamic lighting: bake lighting into lightmaps wherever possible and avoid real-time shadows.
For details, see Unity's official Lighting and Rendering tutorial (Unity 5).

Light Probes
Light probes (Unity Manual: Light Probes) sample the lighting at points in the scene and apply it to dynamic objects. Using light probes is usually fast and can produce excellent visual results.

Occlusion culling

Occlusion culling avoids rendering objects that are not visible. For example, if the player is in a room and the door to the next room is closed, everything in that other room is invisible to the player and there is no need to render any of it.

Depending on the project and the target platform, we may want to implement occlusion culling to dramatically improve game performance.

The following figure shows an example of frustum culling:

The following figure shows an example of occlusion culling:

Anti-aliasing

Anti-aliasing is important for VR applications: it smooths the edges of the image and reduces jaggies. If the project uses forward rendering, enable MSAA in the Quality settings (Unity Manual: Quality Settings). For Gear VR projects, this option should always be enabled.

MSAA cannot be used with deferred rendering, however. In that case, enable anti-aliasing as a post-processing effect (e.g. the Antialiasing image effect from the Asset Store), or consider SMAA.
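For reference, MSAA can also be set from code; this mirrors the Quality settings entry mentioned above (the class name is ours):

```csharp
using UnityEngine;

// Illustrative sketch: enable 4x MSAA at runtime when using forward
// rendering. Equivalent to Edit > Project Settings > Quality > Anti Aliasing.
public class EnableMsaa : MonoBehaviour
{
    void Start ()
    {
        QualitySettings.antiAliasing = 4;   // Valid values: 0 (off), 2, 4, 8.
    }
}
```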

A related example is provided here:
GitHub — Chman/SMAA-Unity: A highly customizable implementation of Subpixel Morphological Antialiasing.

Textures

In general, VR projects should use texture atlasing as much as possible, to reduce the number of individual textures and materials used.

To simplify and speed up this process, consider using Mesh Baker (Asset Store) to combine the textures, meshes, and materials used in the game.

At the Oculus Connect 2 developer conference, Holden from Turbo Button shared his experience optimizing applications using Mesh Baker.

Https://www.youtube.com/watch?v=9vZ8SfXOlpI
Note that normal maps do not hold up well in VR, so we should avoid them. For more information on textures, see the Oculus documentation (https://developer.oculus.com/documentation/intro-vr/latest/concepts/bp_app_rendering/).

Shaders
In VR projects, we should use the most basic shaders possible. On Gear VR, consider the Mobile > Unlit (Supports Lightmap) shader, which is cheap and uses a lightmap to light the scene. In the VR Samples we chose a low-polygon art style with a small palette of flat colours, so objects stand out from the environment.

When using forward rendering, enable 4x MSAA in Edit > Project Settings > Quality (Unity Manual: Quality Settings) for better visual results:

Optimization techniques used in the Menu scene:

As with all the scenes in the project, the Menu scene uses low-polygon art assets and avoids real-time lighting.
The menu panels use a custom shader named SeparableAlpha, which lets a series of images share a single, separate alpha channel instead of each frame carrying its own. This reduces file size and eliminates some textures.
