How to shoot mixed reality (MR) videos with the HTC Vive

Source: Internet
Author: User
Tags: ARToolKit, AMD Radeon

https://www.vive.com/cn/forum/1706?extra=page%3D1

Maybe you're a developer who wants to create a cool promotional video for your HTC Vive game, maybe you're a streamer who wants to bring a high-quality VR live experience to your audience, or maybe you're simply a VR enthusiast. Either way, if you're interested in mixed reality (MR), read on. This article describes how to use the HTC Vive to capture high-quality MR video.

========================================================
What is a mixed reality (MR) video?
========================================================

A mixed reality video composites live footage of the player with the VR game view. Compared with simply recording the headset view, a mixed reality video is far better suited to game livestreams and promotional videos.
If that is still not clear, take a look at the following video, which was produced with this mixed reality technique.
http://v.qq.com/x/page/x0194621p8b.html

========================================================
Equipment needed to make mixed reality videos
========================================================

1.) Green screen environment

2.) Camera or camcorder (recommended: 1080p/60 FPS)

3.) 4K monitor: to composite a 1080p video, the screen needs to fit several 1080p windows at once (recommended: 4K/60 Hz)

4.) HDMI splitter

5.) One HTC Vive kit

6.) A third HTC Vive controller (purchase link: https://www.htcvive.com/cn/accessory/controller/)

7.) VR-ready PC (recommended: OS: Windows 7 SP1, Windows 8.1, or Windows 10; processor: Intel i5-4590/AMD FX 8350 equivalent or better; memory: 4 GB RAM; graphics: NVIDIA GeForce GTX 970/AMD Radeon R9 290 equivalent or better)

--------------------------------------------------------------------------------------------

If you plan to stream the composited view to an audience in real time, you may also need the following:
(See Figure 1 for the hardware configuration diagram used when shooting MR video.)

1.) Video capture card: captures the camera or camcorder signal into the computer; if you are using a webcam that feeds its image directly into the computer, you do not need this device

2.) 4K monitor: to composite a 1080p video, the screen needs to fit several 1080p windows at once (recommended: 4K/60 Hz)

3.) Monitor (for the photographer): to make the MR footage more interesting, a moving camera that follows the player is more engaging than a fixed shot, so the photographer needs a monitor that plays back the composited result to help with framing. If you use a fixed shot, as in a typical webcast, you do not need this device

4.) Monitor (for the audience): plays the composited MR view in real time, for example at public showings

========================================================
MR screen output description (see Figure 1)
========================================================

Normally, while a VR game is running, its desktop window shows the headset view. Once MR output is enabled, the window instead shows a four-quadrant view for compositing. The four quadrants are:

. Foreground (upper left)
The foreground quadrant shows the virtual camera's view of the objects in the game scene that lie between the headset and the virtual camera.

. Foreground alpha mask (upper right)
This quadrant is the alpha mask of the foreground and is mainly used as foreground matte material when editing the video in post-production.

. Background (lower left)
The background quadrant shows the virtual camera's view of the objects in the game scene that lie between the headset and the distant background.

. Game view (lower right)
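
For post-production (rather than live compositing), these quadrants can be recovered from a recording of the window simply by cropping. Below is a minimal sketch, assuming a 3840x2160 frame grabbed from the quadrant window and OpenCV/NumPy installed; the file names are placeholders, not part of the original workflow:

-------------------------------------------------------------------------------------
import cv2  # pip install opencv-python

# Load one 3840x2160 frame captured from the four-quadrant window (placeholder name).
frame = cv2.imread("mr_quadrant_frame.png")
h, w = frame.shape[:2]           # expected 2160 x 3840
half_h, half_w = h // 2, w // 2

foreground       = frame[:half_h, :half_w]   # upper left
foreground_alpha = frame[:half_h, half_w:]   # upper right
background       = frame[half_h:, :half_w]   # lower left
game_view        = frame[half_h:, half_w:]   # lower right

# Save the foreground and its alpha mask as matte material for editing.
cv2.imwrite("foreground.png", foreground)
cv2.imwrite("foreground_alpha.png", foreground_alpha)
cv2.imwrite("background.png", background)
-------------------------------------------------------------------------------------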

========================================================
How to record MR footage
========================================================

To activate the mixed reality screen output, the following three conditions must all be met:

A.) The content must be built with the Unity engine using SteamVR plug-in version 1.0.8 (or newer)

B.) Place the file externalcamera.cfg (a plain text file with the .cfg extension) in the same directory as the content's executable

C.) Connect the third controller (via USB to the computer)

The following is an example externalcamera.cfg:

-------------------------------------------------------------------------------------
x=0
y=0
z=0
rx=0
ry=0
rz=0
fov=60
near=0.1
far=100
sceneResolutionScale=0.5

-------------------------------------------------------------------------------------

When the above three conditions are met, the content creates a virtual camera in the scene, and externalcamera.cfg defines that camera's parameters (see Figure 3):

. x, y, z (unit: meters): the positional offset of the virtual camera relative to the third controller

. rx, ry, rz (unit: degrees): the rotational offset of the virtual camera relative to the third controller

. fov: the vertical FOV of the virtual camera (this must match the vertical FOV of the physical camera)

In addition to the virtual camera parameters, the file also defines some rendering parameters:

. far (unit: meters): the maximum distance rendered in the background quadrant; if the game scene is very large, it is recommended to increase this value

. sceneResolutionScale: the rendering quality of the game view; lower it to reduce the load on the computer
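
Conceptually, the virtual camera simply rides on the third controller: its pose is the controller's tracked pose combined with the offsets above. The NumPy sketch below illustrates the idea only; the exact rotation order and axis conventions used internally by the SteamVR plug-in are assumptions here:

-------------------------------------------------------------------------------------
import numpy as np

def euler_to_matrix(rx_deg, ry_deg, rz_deg):
    """Rotation matrix from the cfg angles (degrees).
    The rotation order actually used by the SteamVR plug-in is an assumption here."""
    rx, ry, rz = np.radians([rx_deg, ry_deg, rz_deg])
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def virtual_camera_pose(controller_pose, cfg):
    """controller_pose: 4x4 world transform of the third controller (from tracking).
    cfg: dict with keys x, y, z (meters) and rx, ry, rz (degrees) from externalcamera.cfg."""
    offset = np.eye(4)
    offset[:3, :3] = euler_to_matrix(cfg["rx"], cfg["ry"], cfg["rz"])
    offset[:3, 3] = [cfg["x"], cfg["y"], cfg["z"]]
    return controller_pose @ offset   # the virtual camera rides on the controller
-------------------------------------------------------------------------------------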

========================================================
How to calibrate the camera parameters (calculate the vertical FOV)
========================================================

When shooting a mixed reality video, the most important step is to obtain the correct FOV value and the offset from the camera to the third controller, so that the rendered foreground lines up with the real-world footage. Here we use the ARToolKit tools to measure the camera and lens characteristics and obtain the correct vertical FOV. (This step only needs to be done once, unless you change the camera or lens.)
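
If you know your camera's sensor size and lens focal length, you can sanity-check the calibrated value with the standard pinhole relation fov = 2*atan(h / 2f). This is only a rough estimate (it ignores lens distortion), so the ARToolKit calibration below remains the recommended method; the example numbers are illustrative only:

-------------------------------------------------------------------------------------
import math

def vertical_fov_deg(sensor_height_mm, focal_length_mm):
    """Pinhole-camera approximation of the vertical field of view, in degrees."""
    return math.degrees(2 * math.atan(sensor_height_mm / (2 * focal_length_mm)))

# Example: a full-frame sensor (24 mm high) with a 24 mm lens gives roughly 53 degrees.
print(vertical_fov_deg(24.0, 24.0))
-------------------------------------------------------------------------------------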

1.) Download ARToolKit for Unity (http://artoolkit.org/dist/arunity5/5.3/ARUnity5-5.3.2-tools-win.zip)

2.) Locate the calibration template "Calibration chessboard (A4).pdf" in ARToolKit for Unity and print it at its original size (path: [downloaded ARToolKit root directory]/doc/patterns)

3.) From a command prompt, run calib_camera.exe (path: [downloaded ARToolKit root directory]/bin) as follows:

-----------------------------------------------------------------------------------------------------

> calib_camera.exe --vconf "-devNum=1 -flipV -showDialog"

The calibration tool must capture the physical camera's image; if it picks up the headset's front-facing camera instead, change the devNum parameter to 1 or 2

-----------------------------------------------------------------------------------------------------

4.) Follow the online calibration procedure (http://artoolkit.org/documentation/doku.php?id=2_Configuration:config_camera_calibration) to generate the camera's parameter file camera_para.bytes (see Figure 4)

"If the calibration is good, the error for each image should be within one pixel; if the error exceeds two pixels, the calibration was unsuccessful and should be redone."

5.) Copy the file camera_para.bytes generated by the calibration into the displacement calculation tool Externalcamera_cfg_gen, under:

$PATH\externalcamera_cfg_gen\externalcamera_cfg_gen_Data\StreamingAssets\ARData\

========================================================
How to mount the third controller on the camera
========================================================

Next, we need to determine the offset from the third controller to the physical camera. How hard this is depends on how the controller is mounted: whenever the relative position changes, externalcamera.cfg has to be updated, so we recommend a firm, repeatable mounting method, such as fixing the controller to the camera's hot shoe, which is quick to attach and detach and keeps the relative position stable (see Figure 5).


To keep the controller level with the physical camera, we built a controller bracket that stands vertically on the camera, so that the three rotation values in externalcamera.cfg stay very close to 0 degrees. Download the 3D printing file and, after printing, glue on one or two hex nuts (with 1/4-inch threads) (see Figure 6).
(3D print file download link: https://drive.google.com/file/d/0B9XEEDfLPxmjTkc2RF85OVo4clk/view?pref=2&pli=1)

Once the third controller is firmly fixed to the tripod, camera mount, or hot shoe, the next step is to determine the displacement (x, y, z) and rotation (rx, ry, rz) values in externalcamera.cfg. We recommend one of the following two approaches:

Scenario A:

If you use the HTC Vive controller bracket and attach it to the camera's hot shoe, the controller should sit level with the camera lens and all relative rotations (rx, ry, rz) will be close to 0, so only the displacement needs to be measured:

1.) Locate the center of the controller (see Figure 7): the center point of the controller lies at the upper edge of the tracking ring; the rectangle in the figure marks this center point.


2.) Measure the displacement (x, y, z): use a tape measure to measure the distance from the controller's center point to the imaging position of the camera lens; the lower right of the figure shows how the x, y, z axes relate to the controller.
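
With the displacement measured and the FOV calibrated, externalcamera.cfg can be filled in by hand, or with a tiny helper like the sketch below. The numbers are made-up examples, and the sign convention of the measured offsets relative to the controller is an assumption you should verify in-game:

-------------------------------------------------------------------------------------
# Hypothetical helper: write externalcamera.cfg from tape-measured offsets.
# Example values only; the sign convention relative to the controller is an
# assumption, so verify the result in-game and adjust.
values = {
    "x": 0.02, "y": -0.05, "z": 0.00,        # measured displacement (meters)
    "rx": 0, "ry": 0, "rz": 0,               # close to 0 with the hot-shoe bracket
    "fov": 53.1,                             # vertical FOV from the calibration step
    "near": 0.1, "far": 100,
    "sceneResolutionScale": 0.5,
}

with open("externalcamera.cfg", "w") as f:
    for key, value in values.items():
        f.write(f"{key}={value}\n")
-------------------------------------------------------------------------------------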


Scenario B:

If your controller does not sit level with the camera, here is a tool for calculating the displacement (download link: https://drive.google.com/file/d/0B9XEEDfLPxmjZjhYaEZrNE9mLVE/view)

1.) Download ARToolKit for Unity (http://www.artoolkit.org/dist/arunity5/5.3/ARUnity5-5.3.2-tools-win.zip)

2.) Locate the calibration template "Multi pattern 4x3 (A4).pdf" in ARToolKit for Unity and print it at its original size (path: [downloaded ARToolKit root directory]/doc/patterns)

3.) Copy the file camera_para.bytes generated by the calibration into the displacement calculation tool Externalcamera_cfg_gen, under:

$PATH\externalcamera_cfg_gen\externalcamera_cfg_gen_Data\StreamingAssets\ARData\

4.) Connect the camera to the computer

5.) Run $PATH\externalcamera_cfg_gen\externalcamera_cfg_gen.exe

------------------------------------------------------------------------------------------------------------------------

. Make sure the calculation tool reads the correct calibration parameter file; you can specify its name in the configuration file videopara.cfg:

VideoCParamName0 = camera_para

. The calibration tool must capture the physical camera's image; if it picks up the headset's front-facing camera instead, change the parameters in videopara.cfg to 1 or 2:

VideoConfigurationWindows0 = -devNum=1 -showDialog -flipV

. Make sure the physical camera's aspect ratio is 16:9, for example 1920 x 1080

------------------------------------------------------------------------------------------------------------------------
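
Putting the two settings above together, a minimal videopara.cfg for the displacement tool might look like the sketch below (key names as given above; treat it as an example to adapt rather than the tool's canonical file):

-------------------------------------------------------------------------------------
VideoCParamName0 = camera_para
VideoConfigurationWindows0 = -devNum=1 -showDialog -flipV
-------------------------------------------------------------------------------------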

6.) Put on the headset

7.) Place the two controllers at the positions of the blue virtual controllers (see Figure 8)

8.) Take off the headset

9.) Place the calibration template "Multi pattern 4x3 (A4).pdf" at the center point between the two controllers, oriented to match the virtual scene (see Figure 9)

10.) Aim the camera at the calibration template "Multi pattern 4x3 (A4).pdf", preferably on a tripod (see Figure 10)

11.) Put on the headset

12.) Press the grip button on the side of one controller to move the virtual orange controller to the position of the third controller. The relative position of the orange controller and the virtual camera now matches the physical setup (see Figure 11)

13.) Press the trigger button on one controller to generate the file externalcamera00.cfg

14.) Rename the file to externalcamera.cfg and copy it into the game's executable folder

This tool also provides a floor correction function: if the floor appears slightly tilted (the real controllers do not line up with the blue virtual controllers), the calculation results may be affected; use this function to calibrate the floor height.

========================================================
Compositing the footage with OBS Studio
========================================================

Next, we will composite the footage with OBS Studio. Before you begin, decide on the quality of the output video; this depends on your subject and purpose. The settings used in this example are listed below and can be adjusted as needed:

. Resolution: 1080p, 30 FPS

. File size: 100-200 MB (3-5 minutes)
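
The file-size estimate follows directly from the recording bitrate chosen in step 1 below: at 6000 Kbps, the video stream alone is roughly 45 MB per minute, so a 3-5 minute clip lands near the 100-200 MB range (audio adds a little more). A quick back-of-the-envelope check:

-------------------------------------------------------------------------------------
def estimated_size_mb(bitrate_kbps, minutes):
    """Approximate size of the video stream (MB, 1000-based) at a constant bitrate."""
    return bitrate_kbps / 8 / 1000 * 60 * minutes

# 6000 Kbps for 3 and 5 minutes: about 135 MB and 225 MB of video data.
print(estimated_size_mb(6000, 3), estimated_size_mb(6000, 5))
-------------------------------------------------------------------------------------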

1.) In OBS Studio > Settings > Output > "Recording" tab (see Figure 12):

. Output mode: Advanced

. Type: Standard

. Recording format: mp4

. Bitrate: 6000 Kbps

2.) In OBS Studio > Settings > Video (see Figure 13):

. Base (Canvas) Resolution: 1920x1080

. Output (Scaled) Resolution: 1920x1080

. Common FPS Values: 29.97

3.) To obtain a 1080p composited video, the game must run full screen on the 4K monitor (see Figure 14):

. Hold the Shift key while launching the game's executable (.exe)

. Untick "Windowed"

. Screen resolution: 3840 x 2160


4.) In OBS Studio, add three new sources in the following order: "Foreground", "Video capture", and "Background" (see Figure 15):

A.) Foreground settings (see Figure 15)

The foreground quadrant shows the virtual camera's view of the objects in the game scene that lie between the headset and the virtual camera. Since the foreground objects need to be superimposed over the player, the foreground layer must be at the top.

. Add a "Window Capture" source

. Window: select the game's executable window

. Untick "Capture cursor"

Add two effect filters to the foreground layer:

. Crop (right: 1920, bottom: 1080)

. Color Key: keys out the black parts of the foreground layer

B.) Video capture settings [resolution: 1920x1080] (see Figure 17)

This source captures the image from the camera, so add a new "Video Capture Device" source:

. Device: select the webcam or video capture card

. Resolution/FPS Type: Custom

. Resolution: 1920x1080

Add one effect filter:

. Chroma Key (green): keys out the green screen behind the player; adjust "Similarity" and "Smoothness" for a cleaner key

C.) Background settings

The background quadrant shows the virtual camera's view of the objects in the game scene that lie between the headset and the distant background.

Add one effect filter to the background layer:

. Crop (right: 1920, top: 1080)
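
For reference, the crop filters in steps A and C simply select one 1920x1080 quadrant out of the 3840x2160 game window; below is a small sketch of that mapping (plain data for reference, not an OBS API call):

-------------------------------------------------------------------------------------
# OBS crop filter values (pixels cropped from each edge of the 3840x2160 window).
# Each entry leaves exactly one 1920x1080 quadrant, matching the layout described earlier.
crops = {
    "foreground (upper left)": {"left": 0, "top": 0,    "right": 1920, "bottom": 1080},
    "background (lower left)": {"left": 0, "top": 1080, "right": 1920, "bottom": 0},
}
-------------------------------------------------------------------------------------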

------------------------------------------------------------------------------------------------------------------------

Start making your own HTC Vive mixed reality (MR) videos now!





