Unity 5.3 Official VR Tutorial - Series 2

Author: Wang Han
Link: https://zhuanlan.zhihu.com/p/20485529
Source: Zhihu
Copyright belongs to the author. For commercial reprints, please contact the author for authorization; for non-commercial reprints, please credit the source.

Welcome back; let's continue where we left off.

In the early hours of January 7, 2016, Beijing time, Oculus officially opened pre-orders for CV1, the first consumer version of the Rift. The $599 price tag left many VR fans wincing, but considering that the first-generation iPhone launched at a similar price, we can only hope that later versions get cheaper.

Setup

Before we start learning to develop VR applications with Unity, we first need to check whether our computer's hardware and software meet the requirements. In short: a graphics card at or above the NVIDIA GTX 970 or AMD R9 290 level, a CPU at or above the Intel i5-4590 level, more than 8 GB of RAM, two USB 3.0 ports, one USB 2.0 port, and an HDMI 1.3 output.

The OS requirements are rather frustrating: neither Mac nor Linux is supported. Only Windows 7, Windows 8, and Windows 10 are supported.

Of course, you also need to upgrade your graphics card driver to the latest version.

Oculus official compatibility check tool:

http://oculus.us5.list-manage.com/track/click?u=88dbd06829e35d5cbf84bbc2e&id=b436d0da47&e=86f0296884

For more information on computer configuration, refer to:

Oculus Rift pre-orders open tomorrow - is your computer good enough? (new official detection tool address)

Once you've finished this money-burning step, it's time to install Unity. Be sure to connect the DK2 and turn it on before you open Unity. Before continuing, open the Oculus Configuration Utility and check that the demo scene runs. Note that you may need to create a new user in the Oculus Configuration Utility before running the demo scene.

Create the first VR project

Next we'll use Unity to create a simple VR demo: viewing a cube through a VR headset. If you want to explore more VR examples, download the VR Samples project from the Asset Store, which we mentioned in the previous tutorial.

Step 1.

Open Unity and create a new empty project.

For reference, the Unity version I'm currently using is 5.3.1f1; by the time you read this tutorial, you may already be on a newer version.

Step 2.

Select File - Build Settings in the Unity menu and choose PC, Mac & Linux Standalone.

Step 3.

Create a new cube in the scene by choosing GameObject - 3D Object - Cube from the menu, then use the Translate tool to place the cube in front of the default Main Camera, similar to the following.
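If you prefer to set this up from code rather than through the menus (purely optional), a minimal sketch might look like the following; the CreateCubeInFront class name and the two-meter distance are illustrative assumptions:

using UnityEngine;

// Hypothetical alternative to Step 3: create the cube and place it in front of the main camera from code.
public class CreateCubeInFront : MonoBehaviour
{
    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        // Place the cube two meters in front of the main camera.
        cube.transform.position = Camera.main.transform.position
                                  + Camera.main.transform.forward * 2f;
    }
}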

Step 4. Save the scene (File - Save Scene, or use the shortcut key).

Step 5. Select Edit - Project Settings - Player in the menu and check "Virtual Reality Supported" in the "Other Settings" section.

Step 6. Click on the play button on the Unity interface to enter play mode.

If the previous settings are correct, you should now be able to see the scene through the DK2 and look around, and the camera in Unity will automatically respond to the position and rotation of the DK2.
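If you want to confirm from code that VR is actually active, for example to log a warning during testing, a minimal sketch using the UnityEngine.VR API in Unity 5.3 might look like this (the VRStatusLogger class name is just an illustrative choice):

using UnityEngine;
using UnityEngine.VR;

// Hypothetical helper: logs whether VR is enabled and a headset is connected when the scene starts.
public class VRStatusLogger : MonoBehaviour
{
    void Start()
    {
        // Reflects the "Virtual Reality Supported" player setting at runtime.
        Debug.Log("VR enabled: " + VRSettings.enabled);
        // True when a headset is actually detected.
        Debug.Log("VR device present: " + VRDevice.isPresent);
        Debug.Log("VR device model: " + VRDevice.model);
    }
}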

What if something goes wrong?

If you do not see the desired scene in the DK2, then check the following items:

1. Make sure you have the DK2 connected and turned on before you open the Unity project.

2. Open the Oculus Configuration Utility and check whether the demo scene works.

3. Update your graphics driver to the latest version

4. Make sure you have Oculus Runtime 0.8 or later installed on your computer.

Of course, if the problem persists, you can join the discussion in the Virtual Reality section of the Unity forum.

Some useful information about VR development:

While VR application development is similar to standard Unity application development, there are some differences to be aware of.

1. Frame rate displayed in the editor

When you preview the project in the editor, you may notice some lag, because the computer has to render the same content twice (once in the editor window and once in the headset). When seriously testing the project, it is best to build an executable version and experience it on the target device.
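To check the frame rate inside a standalone build, where the editor's Stats window is not available, a rough sketch like the one below can be attached to any GameObject; the SimpleFpsCounter name and the smoothing factor are illustrative assumptions:

using UnityEngine;

// Hypothetical helper: displays a smoothed frames-per-second value on screen.
public class SimpleFpsCounter : MonoBehaviour
{
    private float m_SmoothedDeltaTime;

    void Update()
    {
        // Exponentially smooth the frame time so the readout does not flicker.
        m_SmoothedDeltaTime += (Time.deltaTime - m_SmoothedDeltaTime) * 0.1f;
    }

    void OnGUI()
    {
        if (m_SmoothedDeltaTime <= 0f)
            return;
        GUI.Label(new Rect(10, 10, 200, 20), "FPS: " + (1f / m_SmoothedDeltaTime).ToString("F1"));
    }
}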

2. Movement of the camera

Note that we cannot move the VR camera directly in Unity, because head tracking drives the camera's transform every frame. If you want to adjust the position and rotation of the camera, make it a child of another GameObject and move that parent object instead.

For examples of this, see the Flyer and Maze scenes in the VR Samples project.
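As a minimal sketch of this pattern, the script below could be attached to an empty "camera rig" GameObject that the VR camera has been parented to; the CameraRigMover name, the input axes, and the speed value are illustrative assumptions:

using UnityEngine;

// Hypothetical example: move the parent "rig" object, never the VR camera itself.
// The tracked camera should be a child of the GameObject this script is attached to.
public class CameraRigMover : MonoBehaviour
{
    [SerializeField] private float m_Speed = 2f;

    void Update()
    {
        // Moving the rig shifts the whole tracked space; head tracking still
        // drives the child camera's local position and rotation.
        float horizontal = Input.GetAxis("Horizontal");
        float vertical = Input.GetAxis("Vertical");
        transform.Translate(new Vector3(horizontal, 0f, vertical) * m_Speed * Time.deltaTime);
    }
}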

3. Camera node

Unity does not create separate GameObjects for the left-eye and right-eye cameras. If you need the positions of these nodes during development, you must use the InputTracking class.

If you want to get the individual eye positions in the scene (for example, for testing), use the following example script and attach it to the camera.

C# script:

using UnityEngine;
using UnityEngine.VR;

public class UpdateEyeAnchors : MonoBehaviour
{
    GameObject[] eyes = new GameObject[2];
    string[] eyeAnchorNames = { "LeftEyeAnchor", "RightEyeAnchor" };

    void Update()
    {
        for (int i = 0; i < 2; ++i)
        {
            // If the eye anchor is no longer a child of us, don't use it
            if (eyes[i] != null && eyes[i].transform.parent != transform)
            {
                eyes[i] = null;
            }

            // If we don't have an eye anchor, try to find one or create one
            if (eyes[i] == null)
            {
                Transform t = transform.Find(eyeAnchorNames[i]);
                if (t)
                    eyes[i] = t.gameObject;

                if (eyes[i] == null)
                {
                    eyes[i] = new GameObject(eyeAnchorNames[i]);
                    eyes[i].transform.parent = gameObject.transform;
                }
            }

            // Update the eye transforms from the tracked VR nodes
            eyes[i].transform.localPosition = InputTracking.GetLocalPosition((VRNode)i);
            eyes[i].transform.localRotation = InputTracking.GetLocalRotation((VRNode)i);
        }
    }
}

4. Image effects in VR (Image Effects)

Using many image effects in a VR project is a luxury. Since the same scene has to be rendered twice (once per eye), many commonly used image effects are too expensive for VR applications and can seriously hurt the game's frame rate.

Because VR places the user's eyes directly into the virtual space, some image effects simply don't make sense in VR. For example, depth of field, blur, and lens flare effects don't make sense because we don't see these effects in the real world. However, if VR headsets support eye tracking in the future, depth of field may become meaningful again.

Some effects are still worth considering: anti-aliasing is useful (especially given the relatively low resolution of some headsets), color grading is also useful (see: Color Grading with Unity and the Asset Store), and bloom can work for some games. In any case, before committing to an effect, it is best to test it in the actual game.

Unity itself offers many image effects (Assets - Import Package - Effects), and the Asset Store offers many more, such as Colorful, Chromatica, Amplify Color, and others.
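Since anti-aliasing is one of the few effects almost always worth enabling in VR, here is a minimal sketch of setting MSAA from code instead of relying on the active quality preset; the ForceAntiAliasing name and the 4x sample count are illustrative assumptions:

using UnityEngine;

// Hypothetical helper: forces a specific MSAA level at startup.
public class ForceAntiAliasing : MonoBehaviour
{
    void Start()
    {
        // Valid values are 0 (off), 2, 4, or 8 samples per pixel.
        QualitySettings.antiAliasing = 4;
    }
}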

5. Render Scale

Depending on the complexity of your VR scene and the hardware you are running on, you may need to change the render scale setting. This setting adjusts the texel-to-pixel ratio before lens correction, letting you trade game performance for on-screen sharpness, or the other way around.

This setting must be changed from code; you can refer to the Unity documentation for VRSettings.renderScale here:

http://unity3d.com/cn/vrsettings.renderscale

You can change the render scale with the following code:

using UnityEngine;
using System.Collections;
using UnityEngine.VR;

namespace VRStandardAssets.Examples
{
    public class ExampleRenderScale : MonoBehaviour
    {
        // The render scale. Higher numbers = better quality, but at a performance cost.
        [SerializeField] private float m_RenderScale = 1f;

        void Start ()
        {
            VRSettings.renderScale = m_RenderScale;
        }
    }
}

For this setting, you can refer to the VR Samples project, which includes an example in the Scenes/Examples/RenderScale scene. The setting is also used in the MainMenu scene.

An example of the effect of changing the render scale is as follows:

The default render scale in Unity is 1.0, which looks like this:

If you set the render scale to 1.5, you can see that the display looks sharper:

If you set the render scale to 0.5 instead, the pixelation becomes very obvious:

Depending on your game, you might lower the render scale to improve performance, or raise it to make the picture sharper at the cost of runtime performance.
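If you want to experiment with this trade-off at runtime rather than hard-coding a single value, a rough sketch like the one below could step the render scale up and down with hotkeys while you watch the result; the RenderScaleTester name, key bindings, step size, and clamp range are purely illustrative:

using UnityEngine;
using UnityEngine.VR;

// Hypothetical test helper: adjust VRSettings.renderScale at runtime with the = and - keys.
public class RenderScaleTester : MonoBehaviour
{
    [SerializeField] private float m_Step = 0.1f;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Equals))
            SetScale(VRSettings.renderScale + m_Step);
        if (Input.GetKeyDown(KeyCode.Minus))
            SetScale(VRSettings.renderScale - m_Step);
    }

    void SetScale(float scale)
    {
        // Clamp to a sensible range so we don't render at an extreme resolution.
        VRSettings.renderScale = Mathf.Clamp(scale, 0.5f, 2f);
        Debug.Log("Render scale: " + VRSettings.renderScale);
    }
}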

By this point, you should know how to integrate VR into a Unity project, how to set up camera movement in the game, and how the use of image effects differs from non-VR games.

In the next tutorial, we will learn how to interact with objects in the scene, so stay tuned.
