Kinect and Arduino learning notes (2): Depth images and depth maps in real-world coordinate systems

Please credit the source when reprinting: http://blog.csdn.net/lxk7280
First, get the KinectOrbit camera library. A download link and a brief introduction are given in this article: http://blog.csdn.net/lxk7280/article/details/38184355. Put the downloaded library into the libraries folder of your Processing sketchbook.
Mouse and keyboard controls in the KinectOrbit library:
1. Right-click and drag: pan the camera.
2. Left-click and drag: rotate around the object.
3. Scroll wheel: zoom in and out.
4. Press the P key to save the current viewpoint and the O key to exit. The next time the program runs, it starts from the saved viewpoint. The viewpoint parameters are exported to a file named orbitSet_0.csv in the data folder; if that file is deleted, the default viewpoint is used.
First, three screenshots of the point cloud at different distances:
1. Close-range view

The Kinect's depth range and field of view are listed below (the short sketch after the three views shows how to turn the field-of-view angles into an area):
Color and depth: 1.2 ~ 3.6 meters
Skeleton tracking: 1.2 ~ 3.6 meters
Field of view: 57 degrees horizontal, 43 degrees vertical
Parts of my body fall outside the range the Kinect depth camera can capture, so they do not appear in the picture.
2. Medium-range view

At this distance you can clearly see the large shadows behind the table behind me and behind my arm, the regions the depth camera cannot see because they are occluded.
3. Long-range view
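As a small aside of my own (not from the original post), the 57-degree horizontal and 43-degree vertical field-of-view angles quoted above let you estimate how large an area the Kinect sees at a given distance, using width = 2 * d * tan(angle / 2):

// My own sketch, using the 57°/43° field-of-view angles quoted above.
// width = 2 * d * tan(57°/2), height = 2 * d * tan(43°/2);
// at 2 m the Kinect covers roughly 2.2 m x 1.6 m.
void printViewSize(float distanceMeters) {
  float w = 2 * distanceMeters * tan(radians(57.0 / 2));  // horizontal extent in meters
  float h = 2 * distanceMeters * tan(radians(43.0 / 2));  // vertical extent in meters
  println("At " + distanceMeters + " m the Kinect sees about "
          + nf(w, 1, 2) + " m x " + nf(h, 1, 2) + " m");
}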



Code. Step 1: import the three libraries you need
import processing.opengl.*;
import SimpleOpenNI.*;
import kinectOrbit.*;
Step 2: declare the KinectOrbit object myOrbit and the SimpleOpenNI object kinect
KinectOrbit myOrbit;
SimpleOpenNI kinect;
Step 3: initialize the objects and enable the depth camera
void setup() {
  size(800, 600, OPENGL);              // 3D sketch window
  myOrbit = new KinectOrbit(this, 0);  // create orbit camera number 0
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();                // start the depth stream
}
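If the Kinect is not connected, the sketch above simply shows an empty window. As a variant of my own (based on the standard SimpleOpenNI examples, where enableDepth() returns false when the depth stream cannot be opened), you could stop with a message instead:

void setup() {
  size(800, 600, OPENGL);
  myOrbit = new KinectOrbit(this, 0);
  kinect = new SimpleOpenNI(this);
  // enableDepth() should return false when no depth stream can be opened,
  // e.g. because the Kinect is not plugged in (assumption based on the
  // standard SimpleOpenNI examples).
  if (kinect.enableDepth() == false) {
    println("Could not open the depth stream - is the Kinect connected?");
    exit();
  }
}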
Step 4: in the draw() loop, render the point cloud and the camera frustum (the frustum is the 3D region visible to a camera; the Kinect's frustum marks the part of the room the Kinect can see).
void draw() {
  kinect.update();           // grab a new depth frame
  background(0);

  myOrbit.pushOrbit(this);   // everything until popOrbit() is drawn from the orbit camera's viewpoint
  drawPointCloud();
  kinect.drawCamFrustum();   // draw the Kinect's viewing frustum
  myOrbit.popOrbit(this);
}
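To see how the point cloud is oriented in the Kinect's real-world coordinate system, I sometimes add a small axis helper of my own (not part of the original code) and call it between pushOrbit() and popOrbit(), for example drawAxes(500); right after drawPointCloud():

// My own helper: draw the X (red), Y (green) and Z (blue) axes of the
// real-world coordinate system, each 'len' millimeters long.
void drawAxes(float len) {
  stroke(255, 0, 0);
  line(0, 0, 0, len, 0, 0);   // X axis
  stroke(0, 255, 0);
  line(0, 0, 0, 0, len, 0);   // Y axis
  stroke(0, 0, 255);
  line(0, 0, 0, 0, 0, len);   // Z axis
}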
Step 5: implement the point-cloud function
void drawPointCloud() {
  int[] depthMap = kinect.depthMap();   // raw depth values, one per pixel
  int steps = 3;                        // draw every third pixel to keep the frame rate up
  int index;
  PVector realWorldPoint;

  stroke(255);
  for (int y = 0; y < kinect.depthHeight(); y += steps) {
    for (int x = 0; x < kinect.depthWidth(); x += steps) {
      stroke(kinect.depthImage().get(x, y));   // color the point with the depth image's gray value
      index = x + y * kinect.depthWidth();
      if (depthMap[index] > 0) {               // 0 means no depth reading for this pixel
        realWorldPoint = kinect.depthMapRealWorld()[index];   // position in real-world coordinates
        point(realWorldPoint.x, realWorldPoint.y, realWorldPoint.z);
      }
    }
  }
}
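To make the "real-world coordinates" in the title concrete, here is a small helper of my own (assuming, as the SimpleOpenNI examples suggest, that depthMapRealWorld() returns positions in millimeters). Calling it from draw() while standing about two meters from the sensor should print a z value of roughly 2000:

// My own helper, not from the original post: print the real-world position
// of the pixel in the center of the depth image.
void printCenterPoint() {
  int x = kinect.depthWidth() / 2;
  int y = kinect.depthHeight() / 2;
  int index = x + y * kinect.depthWidth();
  if (kinect.depthMap()[index] > 0) {          // 0 means no reading at this pixel
    PVector p = kinect.depthMapRealWorld()[index];
    println("Center pixel in real-world coordinates (mm): " + p);
  }
}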









