Kinect 3D video capture


Program source code: http://idav.ucdavis.edu/~okreylos/resdev/Kinect/download.html

To run this program, you need to install the Vrui Toolkit, available at http://idav.ucdavis.edu/~okreylos/resdev/vrui/download.html

Kinect 2.2 requires Vrui-2.4-001 or later.

Install the Vrui Toolkit

1. Install the necessary tools first.

 
sudo aptitude update

sudo aptitude install build-essential

sudo aptitude install zlib1g-dev mesa-common-dev libgl1-mesa-dev libglu1-mesa-dev

2. Install some optional libraries.

 
sudo aptitude install libusb-1.0-0-dev libpng12-dev libjpeg62-dev libtiff4-dev
sudo aptitude install libdc1394-22-dev libspeex-dev libogg-dev libtheora-dev libbluetooth-dev libopenal-dev
 
These libraries make it possible to connect to the Kinect over USB, save images in multiple formats, record and play back stereo sound, capture video, and stream video or audio over the network.
 
3. Download and install Vrui.
 
Decompress the downloaded installation package into the ~/src directory, then run make and make install to complete the installation.
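As a concrete sketch of that step (the archive file name and version number here are assumptions; substitute the file you actually downloaded):

cd ~/src
tar xfz Vrui-2.4-001.tar.gz    # assumed archive name
cd Vrui-2.4-001
make
make install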
 
4. Test
 
Go to the ExamplePrograms folder and run make.
 
Run ShowEarthModel in the bin folder.
 
If the installation succeeded, a model of the Earth is displayed.
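A minimal sketch of the test, assuming Vrui was unpacked into ~/src/Vrui-2.4-001 (adjust the path to your version):

cd ~/src/Vrui-2.4-001/ExamplePrograms
make
./bin/ShowEarthModel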
 
 
 
 
 
Install Kinect 2.2
 
Decompress the installation package in ~/src

make

make install
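The full sequence might look like this, assuming the package unpacks into a Kinect-2.2 directory (the archive name is an assumption):

cd ~/src
tar xfz Kinect-2.2.tar.gz    # assumed archive name
cd Kinect-2.2
make
make install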

 
 
 
 
  

Using Kinect 2.2

For a single Kinect

./bin/KinectUtil can be used to list the connected Kinect devices.

./bin/RawKinectViewer <camera index>

This calibrates the color and depth streams against each other and generates a binary IntrinsicParameters-<serial number>.dat file.

./bin/KinectViewer -c <camera index> displays the reconstructed result.
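Putting the single-camera workflow together, a sketch using camera index 0 as an example (the exact KinectUtil arguments may differ in your version):

./bin/KinectUtil list    # list connected Kinect devices
./bin/RawKinectViewer 0    # calibrate; writes IntrinsicParameters-<serial number>.dat
./bin/KinectViewer -c 0    # view the reconstructed 3D video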

For multiple Kinect devices

To merge the 3D reconstruction results of multiple Kinect cameras, first calibrate each camera individually with RawKinectViewer, and then calibrate all Kinect devices against a common world coordinate system.

1. Place the calibration target in the field of view of all the devices to be calibrated.

2. For each Kinect device:

a. Run RawKinectViewer and fit a grid to the image of the calibration target in the depth stream.

b. Save the grid, without projecting it, to obtain a series of 3D points.

c. Copy the points into a file KinectPoints-<serial number>.csv

3. Create a new TargetPoints.csv file containing the 3D positions of the calibration target's interior corners.

4. For each Kinect device:

a. Run AlignPoints:

./bin/AlignPoints -OG KinectPoints-<serial number>.csv TargetPoints.csv

b. Observe the mismatched points displayed by AlignPoints; if the error is too large, repeat step 2.

c. Save the best-fit transformation printed at the end of AlignPoints' output to a file ProjectorTransform-<serial number>.txt

5. Run KinectViewer with all Kinect devices. All 3D video streams are displayed, aligned more or less closely depending on the quality of the preceding calibration.
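As a hedged sketch of the whole loop for two cameras with indices 0 and 1 and serial numbers SERIAL0 and SERIAL1 (the serial numbers are placeholders, and passing multiple -c options to KinectViewer is an assumption):

./bin/RawKinectViewer 0    # fit and save the grid; copy points to KinectPoints-SERIAL0.csv
./bin/AlignPoints -OG KinectPoints-SERIAL0.csv TargetPoints.csv    # save result as ProjectorTransform-SERIAL0.txt
./bin/RawKinectViewer 1    # repeat for the second camera; copy points to KinectPoints-SERIAL1.csv
./bin/AlignPoints -OG KinectPoints-SERIAL1.csv TargetPoints.csv    # save result as ProjectorTransform-SERIAL1.txt
./bin/KinectViewer -c 0 -c 1    # view both calibrated 3D video streams together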

There are also video demos on YouTube, but my results do not quite match the videos; I suspect this may be because one of the libraries is not fully compatible with my NVIDIA graphics card.
