Early development experience of Kinect for Windows SDK (3): skeleton tracking

Tags: Silverlight

Article directory
    • Initialization code
    • Add code
    • Written to the end

http://www.cnblogs.com/aawolf/archive/2011/06/21/2086139.html

Author: Ma Ning

Our exploration of the Kinect SDK is getting deeper and deeper. Skeleton tracking is the core technology of Kinect; with it, many interesting applications can be built.

First, let's look at how skeleton tracking is implemented. Kinect can track up to 20 joints, and at present only the human body can be tracked; other objects and animals are not supported. The figure in the original post (not reproduced here) shows how the Kinect skeleton joints are distributed.
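For orientation, a quick way to see which joints the SDK exposes is to enumerate the JointID values. This snippet is my own addition; the member names listed in the comment are from memory of the beta SDK and may differ slightly in your version:

// Prints every joint ID the SDK knows about (20 joints plus a trailing Count member).
foreach (JointID id in Enum.GetValues(typeof(JointID)))
{
    System.Diagnostics.Debug.WriteLine(id);
}
// The list should include (from memory): Head, ShoulderCenter, Spine, HipCenter,
// ShoulderLeft/Right, ElbowLeft/Right, WristLeft/Right, HandLeft/Right,
// HipLeft/Right, KneeLeft/Right, AnkleLeft/Right, FootLeft/Right.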

Initialization Code

Next, let's see how the skeleton-tracking code is written. First, create a new Visual C# project named "SkeletonTracking" and add references to the Kinect assembly and the Coding4Fun assembly. For how to do this, refer to the previous article, "Early development experience of Kinect for Windows SDK (2): operating the camera"; it will not be repeated here.
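Throughout the snippets below, the following using directives are assumed at the top of the code-behind file. The last two namespaces are my recollection of the Kinect beta SDK and the Coding4Fun Kinect WPF toolkit; adjust them if your assembly versions differ:

using System;
using System.Linq;
using System.Windows;
using System.Windows.Controls;        // Canvas
using Microsoft.Research.Kinect.Nui;  // Runtime, SkeletonFrame, Joint, JointID, ...
using Coding4Fun.Kinect.Wpf;          // ScaleTo extension method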

First, create a Runtime object. During initialization, specify RuntimeOptions.UseSkeletalTracking, and then attach a handler to the SkeletonFrameReady event.

Runtime nui;

private void Window_Loaded(object sender, RoutedEventArgs e)
{
    nui = new Runtime();
    nui.Initialize(RuntimeOptions.UseSkeletalTracking);
    nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
}

Next, add the event handler for when the window is closed:

 
private void Window_Closed(object sender, EventArgs e)
{
    nui.Uninitialize();
}

At this point, add a breakpoint in the empty nui_SkeletonFrameReady event handler:

 
void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
}

The SkeletonFrameReady event fires once the Kinect device is connected correctly and someone stands in front of it so that the sensor can recognize the human body.

When the breakpoint is hit, we can inspect the event arguments; the SkeletonFrame property and its Skeletons collection are the important parts.
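As a quick sanity check when the breakpoint is hit, you can count how many of the returned skeletons are actually being tracked. This temporary snippet is my own addition, not part of the original article:

void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    SkeletonFrame frame = e.SkeletonFrame;

    // The Skeletons collection has a fixed number of slots; only entries whose
    // TrackingState is Tracked correspond to recognized players.
    int tracked = frame.Skeletons.Count(s => s.TrackingState == SkeletonTrackingState.Tracked);
    System.Diagnostics.Debug.WriteLine("Tracked skeletons: " + tracked);
}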

Add code

Next, we prepare the WPF interface and add five balls that will track the positions of the head, the two hands, and the two knees. In MainPage.xaml, add the following code:

<Canvas Name="mainCanvas">
    <Ellipse Canvas.Left="0"   Canvas.Top="0" Width="50" Height="50" Name="headEllipse"      Stroke="Black" Fill="Orange" />
    <Ellipse Canvas.Left="50"  Canvas.Top="0" Width="50" Height="50" Name="rightEllipse"     Stroke="Black" Fill="SlateGray" />
    <Ellipse Canvas.Left="100" Canvas.Top="0" Width="50" Height="50" Name="leftEllipse"      Stroke="Black" Fill="SpringGreen" />
    <Ellipse Canvas.Left="150" Canvas.Top="0" Width="50" Height="50" Name="kneeRightEllipse" Stroke="Black" Fill="Salmon" />
    <Ellipse Canvas.Left="200" Canvas.Top="0" Width="50" Height="50" Name="kneeLeftEllipse"  Stroke="Black" Fill="Navy" />
</Canvas>

Then, in the SkeletonFrameReady event handler, we add code to capture the SkeletonData:

 
void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    SkeletonFrame allSkeletons = e.SkeletonFrame;

    // Get the first tracked skeleton
    SkeletonData skeleton = (from s in allSkeletons.Skeletons
                             where s.TrackingState == SkeletonTrackingState.Tracked
                             select s).FirstOrDefault();
}

We use LINQ to obtain the SkeletonData whose TrackingState equals Tracked. The Joints collection of the SkeletonData object stores the information for all of the skeleton joints. Each joint is a Joint object whose Position exposes X, Y, and Z, its three-dimensional coordinates. X and Y range from -1 to 1, while Z is the distance from the Kinect to the recognized object.
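For instance (a small example of my own, not from the original article), the raw, unscaled position of the head joint can be read straight from the Joints collection:

// skeleton is the SkeletonData obtained above; it is null when nobody is tracked.
if (skeleton != null)
{
    Joint head = skeleton.Joints[JointID.Head];

    // X and Y lie roughly in -1..1; Z is the distance from the sensor to the player.
    System.Diagnostics.Debug.WriteLine(string.Format(
        "Head: X={0:F2} Y={1:F2} Z={2:F2}",
        head.Position.X, head.Position.Y, head.Position.Z));
}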

We can use the following code to scale a joint to a suitable range:

 
Joint j = skeleton.Joints[JointID.HandRight].ScaleTo(640, 480, .5f, .5f);

The last two parameters specify the extremes of the original range to map from; the statement above stretches X from the range -0.5..0.5 onto 0..640 (and Y from -0.5..0.5 onto 0..480).
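To make the arithmetic concrete, here is a hypothetical helper of my own that performs the same kind of linear stretch that ScaleTo is described as doing; it only illustrates the mapping and is not the Coding4Fun implementation:

// Maps a raw value in [-max, +max] linearly onto [0, size], clamping outliers.
// With size = 640 and max = 0.5f, a raw X of 0.25 maps to (0.25 + 0.5) / 1.0 * 640 = 480.
static float ScaleLinear(float value, int size, float max)
{
    float scaled = (value + max) / (2 * max) * size;
    return Math.Max(0, Math.Min(size, scaled));
}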

We wrap this in a helper function that converts a joint from the obtained SkeletonData into the position of a circle on the screen:

 
private void SetEllipsePosition(FrameworkElement ellipse, Joint joint)
{
    var scaledJoint = joint.ScaleTo(640, 480, .5f, .5f);
    Canvas.SetLeft(ellipse, scaledJoint.Position.X);
    Canvas.SetTop(ellipse, scaledJoint.Position.Y);
}

Finally, the complete SkeletonFrameReady handler looks like this:

void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    SkeletonFrame allSkeletons = e.SkeletonFrame;

    // Get the first tracked skeleton
    SkeletonData skeleton = (from s in allSkeletons.Skeletons
                             where s.TrackingState == SkeletonTrackingState.Tracked
                             select s).FirstOrDefault();

    // Nothing to draw if no skeleton is currently tracked
    if (skeleton == null)
        return;

    SetEllipsePosition(headEllipse, skeleton.Joints[JointID.Head]);
    SetEllipsePosition(leftEllipse, skeleton.Joints[JointID.HandLeft]);
    SetEllipsePosition(rightEllipse, skeleton.Joints[JointID.HandRight]);
    SetEllipsePosition(kneeLeftEllipse, skeleton.Joints[JointID.KneeLeft]);
    SetEllipsePosition(kneeRightEllipse, skeleton.Joints[JointID.KneeRight]);
}

Here is the program running (the screenshot is in the original post); knee recognition still seems to have some problems.

When the program runs, we notice that the balls jitter. To reduce this, set the SkeletonEngine's TransformSmooth property to true and supply a TransformSmoothParameters value; these parameters should also be fine-tuned for the specific application.

The Window_Loaded function, with this code added, looks like this:

private void Window_Loaded(object sender, RoutedEventArgs e)
{
    nui = new Runtime();
    nui.Initialize(RuntimeOptions.UseSkeletalTracking);

    // Must be set to true, and only after the call to Initialize
    nui.SkeletonEngine.TransformSmooth = true;

    // Parameters used to smooth the skeleton data and reduce jitter
    var parameters = new TransformSmoothParameters
    {
        Smoothing = 0.75f,
        Correction = 0.0f,
        Prediction = 0.0f,
        JitterRadius = 0.05f,
        MaxDeviationRadius = 0.04f
    };
    nui.SkeletonEngine.SmoothParameters = parameters;

    nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
}
Written to the end

With that, we have covered skeleton tracking, the most exciting part of Kinect; you can already write some interesting applications with it. Next, we will introduce depth data, another core feature of Kinect.

 

OpenXLive Cup Windows Phone game development competition

The OpenXLive Cup Windows Phone game development competition is a Windows Phone game development contest jointly organized by OpenXLive and well-known developer communities in China: DevDiv, the Smart Machine Network, WPMind, Silverlight China, and the XNA Game World.

http://www.openxlive.net/posts/news/40

