Before we start learning to use Unity to develop VR applications, we should first check whether our computer's hardware and software meet the requirements. Simply put: a graphics card at or above the NVIDIA GTX 970 or AMD R9 290, a CPU at or above the Intel i5-4590, more than 8 GB of memory, two USB 3.0 ports, one USB 2.0 port, and an HDMI 1.3 output.

Be careful to connect the DK2 and turn it on before you open Unity. Before continuing, open the Oculus Configuration Utility app and check that the demo scene works. Note: before you run the demo scene, you may need to set up a new user in the Oculus Configuration Utility.

Create the first VR project

Next we'll use Unity to create a simple VR project: a demo in which you look at a cube through a VR headset. If you want to explore more VR examples, download the VR Samples project from the Asset Store that we mentioned in the previous tutorial.

Step 01: Open Unity and create a new, empty project. The Unity version I'm currently using is 5.3.1f1.

Step 02: Select File > Build Settings in the Unity menu and select PC, Mac & Linux Standalone.
Step 03: Create a new cube in the scene by choosing GameObject > 3D Object > Cube from the menu, and place the cube in front of the default Main Camera with the Translate tool, similar to the following.
Step 04: Save your scene.

Step 05: Go to Edit > Project Settings > Player and check that Virtual Reality Supported is ticked, as shown below:
Step 06: Click the Play button in the Unity editor to enter Play mode.
If there was no problem with the previous settings, you should now be able to see the scene through the DK2 and look around, and the camera in Unity will automatically respond to the position and rotation of the DK2.
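One practical tip while in Play mode: if you started the scene while facing away from the tracking sensor, the seated origin may feel off. As a minimal sketch of my own (this helper and its key binding are not part of the official tutorial), you can reset the tracking origin with InputTracking.Recenter:

using UnityEngine;
using UnityEngine.VR;

// Hypothetical helper, not from the official tutorial: resets the seated
// head-tracking origin so "forward" matches where the user is currently looking.
public class RecenterOnKey : MonoBehaviour
{
    void Update()
    {
        // Press R to recenter tracking (useful if you sat down at an angle)
        if (Input.GetKeyDown(KeyCode.R))
            InputTracking.Recenter();
    }
}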
What if something goes wrong?
If you do not see the expected scene in the DK2, check the following items:
1. Make sure you connected and turned on the DK2 before opening the Unity project.
2. Open the Oculus Configuration Utility and check that the demo scene works.
3. Update your graphics driver to the latest version.
4. Make sure you have Oculus Runtime 0.8 or later installed on your computer.
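You can also confirm from a script that Unity itself detects the headset. Here is a minimal sketch of my own (not from the official tutorial) that logs the detected device on startup:

using UnityEngine;
using UnityEngine.VR;

// Hypothetical diagnostic helper: logs whether Unity has detected a VR
// device when the scene starts, which helps narrow down setup problems.
public class VRDeviceCheck : MonoBehaviour
{
    void Start()
    {
        // VRSettings.enabled mirrors the "Virtual Reality Supported" checkbox at runtime
        Debug.Log("VR enabled: " + VRSettings.enabled);
        // VRDevice.isPresent is true when a headset such as the DK2 is connected
        Debug.Log("VR device present: " + VRDevice.isPresent);
        Debug.Log("VR device model: " + VRDevice.model);
    }
}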
Some useful information about VR development:
While VR application development is similar to standard Unity application development, there are some differences to be aware of.
1. Frame rate displayed in the editor
When you view the project through the editor, note that the experience may show some lag, because the computer has to render the same content twice: once for the editor window and once for the headset. So when actually testing the project, it is best to build an executable version and experience it on the target device. A simple frame-rate logger such as the sketch below can help you compare the two.
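This logger is a rough aid of my own, not part of the official tutorial; it prints a smoothed frame rate once per second so you can compare editor and standalone performance:

using UnityEngine;

// Hypothetical helper for profiling: logs the average frame rate over
// one-second windows using unscaled time, so pausing/slow-motion won't skew it.
public class FrameRateLogger : MonoBehaviour
{
    int frames;
    float timer;

    void Update()
    {
        frames++;
        timer += Time.unscaledDeltaTime;

        if (timer >= 1f)
        {
            Debug.Log("Average FPS: " + (frames / timer).ToString("F1"));
            frames = 0;
            timer = 0f;
        }
    }
}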
2. Movement of the camera
Note that we cannot move the VR camera directly in Unity. If you want to adjust the position and rotation of the camera, make it a child of another GameObject and then move that parent object instead.
For this, you can look at the Flyer and Maze scenes in the VR Samples project; a minimal version of the idea is sketched below.
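In this sketch of my own (the "CameraRig" name and keyboard controls are assumptions, not from the VR Samples), you attach the script to an empty parent object that has the Main Camera as its child, and move the rig rather than the camera:

using UnityEngine;

// Minimal sketch: attach this to an empty parent GameObject (e.g. "CameraRig")
// that has the Main Camera as its child. Moving the rig moves the VR camera
// while Unity keeps applying head tracking to the camera itself.
public class CameraRigMover : MonoBehaviour
{
    [SerializeField] private float m_Speed = 2f;  // movement speed in units per second

    void Update()
    {
        // Simple keyboard movement for testing; the camera child is never touched directly
        float h = Input.GetAxis("Horizontal");
        float v = Input.GetAxis("Vertical");
        transform.Translate(new Vector3(h, 0f, v) * m_Speed * Time.deltaTime);
    }
}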
3. Camera node
The cameras for the left and right eyes are not created by Unity. If you need to get the positions of these nodes during development, you must use the InputTracking class.
If you want to get the positions of the individual eyes in the scene (for example, when testing), use the following example script and attach it to the camera.
using UnityEngine;
using UnityEngine.VR;

public class UpdateEyeAnchors : MonoBehaviour
{
    GameObject[] eyes = new GameObject[2];
    string[] eyeAnchorNames = { "LeftEyeAnchor", "RightEyeAnchor" };

    void Update()
    {
        for (int i = 0; i < 2; ++i)
        {
            // If the eye anchor is no longer a child of us, don't use it
            if (eyes[i] != null && eyes[i].transform.parent != transform)
            {
                eyes[i] = null;
            }

            // If we don't have an eye anchor, try to find one or create one
            if (eyes[i] == null)
            {
                Transform t = transform.Find(eyeAnchorNames[i]);
                if (t)
                    eyes[i] = t.gameObject;

                if (eyes[i] == null)
                {
                    eyes[i] = new GameObject(eyeAnchorNames[i]);
                    eyes[i].transform.parent = gameObject.transform;
                }
            }

            // Update the eye transform
            eyes[i].transform.localPosition = InputTracking.GetLocalPosition((VRNode)i);
            eyes[i].transform.localRotation = InputTracking.GetLocalRotation((VRNode)i);
        }
    }
}
4. Image effects in VR (Image Effects)
Using many image effects in a VR project is a luxury. Given that the same scene has to be rendered twice (once per eye), many commonly used image effects are wasteful in VR applications and can seriously hurt the game's frame rate.
Because VR places the user's eyes into a virtual space, some image effects make no sense for VR. For example, depth of field, blur, and lens flare effects make no sense because we don't see these effects in the real world. However, if VR headsets support eye tracking in the future, depth of field may become meaningful.
However, some effects are worth considering: anti-aliasing is useful (especially given the low resolution of some head-mounted devices), and color grading can also be useful (see this link: Color Grading with Unity and the Asset Store), as can certain game-specific effects. Before using any effect, though, it is best to test it in the actual game.
Unity itself offers many image effects (Assets > Import Package > Effects), and the Asset Store also offers many, such as Colorful, Chromatica, Amplify Color, and more. Anti-aliasing can even be set from code, as sketched below.
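If you prefer to control anti-aliasing from a script rather than the Quality Settings window, a minimal sketch of my own (not from the tutorial) looks like this:

using UnityEngine;

// Hypothetical sketch: enables 4x MSAA at startup. Valid values for
// QualitySettings.antiAliasing are 0 (off), 2, 4 and 8.
public class EnableAntiAliasing : MonoBehaviour
{
    void Start()
    {
        QualitySettings.antiAliasing = 4;
    }
}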
5. Render Scale
Depending on the complexity of your VR scene and the hardware you are running on, you may want to change the render scale setting. This setting adjusts the texel-to-pixel ratio before lens correction, letting you trade game performance for on-screen sharpness, or vice versa.
You can change the render scale with the following code:
using UnityEngine;
using System.Collections;
using UnityEngine.VR;

namespace VRStandardAssets.Examples
{
    public class ExampleRenderScale : MonoBehaviour
    {
        // The render scale. Higher numbers = better quality, but trades performance.
        [SerializeField] private float m_RenderScale = 1f;

        void Start()
        {
            VRSettings.renderScale = m_RenderScale;
        }
    }
}
For this setting, you can refer to our VR Samples: the Scenes/Examples/RenderScale scene demonstrates it, and the setting is also applied in the MainMenu scene.
An example of the effect of changing the render scale is as follows:
Unity's default render scale is 1.0, which looks like this:
If you set the render scale to 1.5, you can see that the display looks sharper:
If you then set the render scale to 0.5, you can see that the image becomes noticeably pixelated:
Depending on the game, you might lower the render scale to improve performance, or raise it to make the picture sharper at the cost of running performance. A sketch of adjusting it dynamically follows.
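As an illustration of this trade-off, here is a hypothetical sketch (the adaptive logic and all thresholds are my own, not from the official samples) that nudges VRSettings.renderScale at runtime based on the measured frame time:

using UnityEngine;
using UnityEngine.VR;

// Hypothetical sketch: lowers the render scale when the smoothed frame time
// exceeds a budget, trading sharpness for performance. Thresholds are arbitrary.
public class AdaptiveRenderScale : MonoBehaviour
{
    [SerializeField] private float m_MinScale = 0.5f;
    [SerializeField] private float m_MaxScale = 1.5f;
    [SerializeField] private float m_TargetFrameTime = 1f / 75f;  // the DK2 refreshes at 75 Hz

    float m_SmoothedFrameTime;
    float m_Timer;

    void Update()
    {
        // Exponentially smooth the frame time to avoid reacting to single spikes
        m_SmoothedFrameTime = Mathf.Lerp(m_SmoothedFrameTime, Time.unscaledDeltaTime, 0.05f);
        m_Timer += Time.unscaledDeltaTime;

        // Changing renderScale reallocates the eye render textures, which is
        // itself expensive, so only adjust it occasionally rather than every frame
        if (m_Timer < 2f)
            return;
        m_Timer = 0f;

        float scale = VRSettings.renderScale;
        if (m_SmoothedFrameTime > m_TargetFrameTime * 1.1f)
            scale -= 0.05f;   // over budget: render fewer texels
        else if (m_SmoothedFrameTime < m_TargetFrameTime * 0.9f)
            scale += 0.025f;  // comfortably under budget: sharpen the image

        VRSettings.renderScale = Mathf.Clamp(scale, m_MinScale, m_MaxScale);
    }
}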
At this point, you should know how to integrate VR into a Unity project, how to set up camera movement in a VR game, and how the use of image effects differs from non-VR games.