Oculus Code Analysis: A VR Introductory Series Tutorial

Original Author: Tony Parisi
So how exactly does Unity support the Oculus VR headset? First, let's look at how the Unity scene is built. The Oculus Unity Integration package ships with a camera prefab that provides the most basic VR capabilities, including stereo rendering and head tracking. Let's take a concrete look.
Locate the OVRCameraRig object in the Hierarchy panel, then click the arrow on its left to expand its children. The rig contains a child object called TrackingSpace, and TrackingSpace in turn contains four children: LeftEyeAnchor, CenterEyeAnchor, RightEyeAnchor, and TrackerAnchor. Of these, the left and right eye anchors are the key ones: each carries a Camera component used to render that eye's view separately. Both cameras show default values in the Inspector; their parameters are updated as the program runs.
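If you want to poke at this hierarchy from code, here is a minimal sketch (a hypothetical helper, not part of the Oculus package) that locates the eye anchors by name and logs them:

using UnityEngine;

// Hypothetical helper: walks the OVRCameraRig hierarchy described above
// and logs the per-eye anchor transforms. Not part of the Oculus package.
public class RigInspector : MonoBehaviour
{
    void Start()
    {
        // Assumes this component is attached to the OVRCameraRig object.
        Transform trackingSpace = transform.Find("TrackingSpace");
        if (trackingSpace == null)
            return;

        foreach (string name in new[] { "LeftEyeAnchor", "RightEyeAnchor" })
        {
            Transform anchor = trackingSpace.Find(name);
            Camera cam = (anchor != null) ? anchor.GetComponent<Camera>() : null;
            if (cam != null)
                Debug.Log(name + " local position: " + anchor.localPosition);
        }
    }
}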
Locate the OVRCameraRig object again. Attached to it is a script component, also called OVRCameraRig; double-click it and the Unity editor opens the script's source file, OVRCameraRig.cs, in MonoDevelop.
Search the source code in MonoDevelop for the LateUpdate function, which looks like this:
#if !UNITY_ANDROID || UNITY_EDITOR
private void LateUpdate()
#else
private void Update()
#endif
{
    EnsureGameObjectIntegrity();

    if (!Application.isPlaying)
        return;

    UpdateCameras();
    UpdateAnchors();
}


We're not building for Android, so the #if condition is true and the LateUpdate function is used. Unity calls a number of functions on scripts at run time, including Update and LateUpdate. LateUpdate is better suited to camera updates because the engine guarantees that all Update calls have finished before LateUpdate runs. That guarantee matters here: we need the head-tracking data to be final before we update the cameras.
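To see that ordering guarantee in isolation, here is a minimal sketch, assuming a hypothetical FollowTarget script that trails a player object moved during some other script's Update:

using UnityEngine;

// Minimal sketch of the Update/LateUpdate ordering the rig relies on.
// FollowTarget is a hypothetical example, not Oculus code.
public class FollowTarget : MonoBehaviour
{
    public Transform target; // e.g. a player moved in some Update()

    void LateUpdate()
    {
        // By the time LateUpdate runs, every script's Update() has finished,
        // so 'target' is guaranteed to be at its final position this frame.
        transform.position = target.position + new Vector3(0f, 1.6f, -2f);
        transform.LookAt(target);
    }
}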
Inside LateUpdate we first call EnsureGameObjectIntegrity to make sure that the objects we depend on (that is, the children of the OVRCameraRig prefab instance) actually exist in the scene. This protects against the case where the script is included in a scene but the OVRCameraRig hierarchy was never instantiated.
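The real implementation is more thorough, but a simplified, hypothetical sketch of such an integrity check might recreate any missing anchors like this:

using UnityEngine;

// Simplified, hypothetical sketch of an integrity check in the spirit of
// EnsureGameObjectIntegrity(); the real method in OVRCameraRig.cs does more.
public class IntegrityCheckSketch : MonoBehaviour
{
    private Transform EnsureChild(Transform parent, string name)
    {
        Transform child = parent.Find(name);
        if (child == null)
        {
            // Recreate the missing anchor so later code can rely on it.
            child = new GameObject(name).transform;
            child.parent = parent;
            child.localPosition = Vector3.zero;
            child.localRotation = Quaternion.identity;
        }
        return child;
    }

    void Awake()
    {
        Transform trackingSpace = EnsureChild(transform, "TrackingSpace");
        EnsureChild(trackingSpace, "LeftEyeAnchor");
        EnsureChild(trackingSpace, "CenterEyeAnchor");
        EnsureChild(trackingSpace, "RightEyeAnchor");
        EnsureChild(trackingSpace, "TrackerAnchor");
    }
}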
After checking that the application is actually running, we get to work. First, we call UpdateCameras to update the two cameras' parameters:
private void UpdateCameras()
{
    if (needsCameraConfigure)
    {
        leftEyeCamera = ConfigureCamera(OVREye.Left);
        rightEyeCamera = ConfigureCamera(OVREye.Right);
#if !UNITY_ANDROID || UNITY_EDITOR
        needsCameraConfigure = false;
#endif
    }
}

On the desktop this function does its work only once: it reads the configuration parameters from the Oculus configuration utility, then clears the needsCameraConfigure flag so that subsequent calls know the cameras are already configured.
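The pattern itself is simple. Here is a minimal sketch of the same configure-once idiom, with illustrative names not taken from the SDK:

using UnityEngine;

// Minimal sketch of the "configure once" pattern UpdateCameras() uses:
// expensive setup runs on the first frame only, then a flag short-circuits
// every later call. Names here are illustrative, not from the SDK.
public class LazyConfigureSketch : MonoBehaviour
{
    private bool needsConfigure = true;

    void LateUpdate()
    {
        if (needsConfigure)
        {
            Debug.Log("Running one-time camera configuration...");
            needsConfigure = false; // subsequent frames skip the setup
        }
    }
}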
Below is the ConfigureCamera function, which sets the parameters of each eye camera:
private Camera ConfigureCamera(OVREye eye)
{
    Transform anchor = (eye == OVREye.Left) ? leftEyeAnchor : rightEyeAnchor;
    Camera cam = anchor.GetComponent<Camera>();

    OVRDisplay.EyeRenderDesc eyeDesc = OVRManager.display.GetEyeRenderDesc(eye);
    cam.fieldOfView = eyeDesc.fov.y;
    cam.aspect = eyeDesc.resolution.x / eyeDesc.resolution.y;
    cam.rect = new Rect(0f, 0f, OVRManager.instance.virtualTextureScale,
                        OVRManager.instance.virtualTextureScale);
    cam.targetTexture = OVRManager.display.GetEyeTexture(eye);
    cam.hdr = OVRManager.instance.hdr;
    ...
    return cam;
}

The OVRManager class is the main interface to the Oculus SDK; it is responsible for many things, including talking to the native Oculus runtime. If you're curious about this script, go back to the Unity editor, find the OVRCameraRig object, and look for the OVRManager script among its components. At this point we have configured both cameras with their essential parameters: field of view, aspect ratio, viewport rect, render target, and whether HDR is enabled.
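As a sanity check, you could dump the parameters the rig actually set. This hypothetical debugging snippet uses only standard Unity API and assumes it is attached to the OVRCameraRig object:

using UnityEngine;

// Hypothetical debugging snippet: log the parameters that
// ConfigureCamera() set on each eye camera under the rig.
public class EyeCameraDump : MonoBehaviour
{
    void LateUpdate()
    {
        foreach (Camera cam in GetComponentsInChildren<Camera>())
        {
            Debug.Log(string.Format("{0}: fov={1:F1}, aspect={2:F2}, rect={3}",
                cam.name, cam.fieldOfView, cam.aspect, cam.rect));
        }
    }
}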
The basic camera parameters are now set, but we still have to adjust each camera's position and orientation from the HMD tracking data. That is done by the UpdateAnchors function:
private void UpdateAnchors()
{
    bool monoscopic = OVRManager.instance.monoscopic;

    OVRPose tracker = OVRManager.tracker.GetPose();
    OVRPose hmdLeftEye = OVRManager.display.GetEyePose(OVREye.Left);
    OVRPose hmdRightEye = OVRManager.display.GetEyePose(OVREye.Right);

    trackerAnchor.localRotation = tracker.orientation;
    centerEyeAnchor.localRotation = hmdLeftEye.orientation; // using left eye for now
    leftEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : hmdLeftEye.orientation;
    rightEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : hmdRightEye.orientation;

    trackerAnchor.localPosition = tracker.position;
    centerEyeAnchor.localPosition = 0.5f * (hmdLeftEye.position + hmdRightEye.position);
    leftEyeAnchor.localPosition = monoscopic ? centerEyeAnchor.localPosition : hmdLeftEye.position;
    rightEyeAnchor.localPosition = monoscopic ? centerEyeAnchor.localPosition : hmdRightEye.position;

    if (UpdatedAnchors != null)
    {
        UpdatedAnchors(this);
    }
}

This function obtains the HMD's current position and orientation through OVRTracker and OVRDisplay and assigns them to the corresponding anchor transforms. The left and right eye anchors drive the actual rendering; the center eye anchor exists as a marker so an application can locate the center of the view without recalculating it; and the tracker anchor stores the pose of the positional-tracking camera. So simply by adding one prefab instance, we get Oculus Rift stereo rendering and positional tracking. The prefab may look like a black box, but digging into it, as we just did, reveals how it works.
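As a usage example, the center eye anchor makes gaze-based interaction straightforward. The following sketch is hypothetical and uses only standard Unity raycasting; centerEyeAnchor would be wired to the rig's CenterEyeAnchor transform in the Inspector:

using UnityEngine;

// Example of why the center eye anchor is convenient: cast a gaze ray
// from the view center without averaging the eye poses yourself.
public class GazeRay : MonoBehaviour
{
    public Transform centerEyeAnchor; // assign the rig's CenterEyeAnchor

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(centerEyeAnchor.position,
                            centerEyeAnchor.forward, out hit, 100f))
        {
            Debug.Log("Looking at: " + hit.collider.name);
        }
    }
}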
