This is the second example. In it, Xiao Jin will show how to use OpenNI to make the Kinect recognize gestures and display the result. The current version of OpenNI supports four gesture types: RaiseHand, Wave, Click, and MovingHand, corresponding to raising the hand, waving, pushing forward, and moving the hand. It is worth mentioning that the current official Microsoft Kinect SDK does not support gesture recognition, so this can be counted as one of the advantages of using OpenNI.
With gesture recognition, you can make the Kinect do genuinely practical things, such as taking over some of the work of the mouse and keyboard to control applications.
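Before diving in, note that the exact gesture names depend on the installed middleware. If you want to check at runtime which gestures your build supports, OpenNI's GestureGenerator provides EnumerateAllGestures. Here is a minimal sketch, assuming an already-created generator; the buffer sizes are arbitrary choices, not values from the original program:

    // List every gesture the installed middleware can recognize.
    void listGestures( xn::GestureGenerator &gestureGenerator )
    {
        const XnUInt32 nNameLength  = 64;   // assumed max name length
        const XnUInt32 nMaxGestures = 20;   // assumed upper bound on gesture count
        XnUInt32 nGestures = nMaxGestures;  // in: capacity, out: actual count
        XnChar** astrGestures = new XnChar*[nMaxGestures];
        for( XnUInt32 i = 0; i < nMaxGestures; ++i )
            astrGestures[i] = new XnChar[nNameLength];

        if( gestureGenerator.EnumerateAllGestures( astrGestures, nNameLength, nGestures ) == XN_STATUS_OK )
        {
            for( XnUInt32 i = 0; i < nGestures; ++i )
                cout << "Supported gesture: " << astrGestures[i] << endl;
        }

        for( XnUInt32 i = 0; i < nMaxGestures; ++i )
            delete[] astrGestures[i];
        delete[] astrGestures;
    }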
It's time to get started. Part of the code is based on work by heresy (a fellow blogger), whose gesture handling is well written. Standing on the shoulders of giants, Xiao Jin uses OpenCV to display the gestures: red dots mark the hand's position and track, a solid blue circle marks the forward-push position, and a yellow line shows the hand-wave trajectory (from start point to end point).
#include <stdlib.h>
#include <iostream>
#include "opencv/cv.h"
#include "opencv/highgui.h"
#include <XnCppWrapper.h>

using namespace std;
using namespace cv;

// output operator for XnPoint3D
ostream& operator<<( ostream& out, const XnPoint3D& rPoint )
{
    out << "(" << rPoint.X << "," << rPoint.Y << "," << rPoint.Z << ")";
    return out;
}

//【4】callback function invoked when a gesture is recognized
void XN_CALLBACK_TYPE gestureRecog( xn::GestureGenerator &generator,
                                    const XnChar *strGesture,
                                    const XnPoint3D *pIDPosition,
                                    const XnPoint3D *pEndPosition,
                                    void *pCookie )
{
    cout << strGesture << " from " << *pIDPosition << " to " << *pEndPosition << endl;

    char locationinfo[100];
    // crude mirror mapping from gesture coordinates into the 640x480 pad
    int imgStartX = (int)( 640/2 - pIDPosition->X );
    int imgStartY = (int)( 480/2 - pIDPosition->Y );
    int imgEndX   = (int)( 640/2 - pEndPosition->X );
    int imgEndY   = (int)( 480/2 - pEndPosition->Y );

    IplImage* refimage = (IplImage*)pCookie;
    if( strcmp(strGesture, "RaiseHand") == 0 )
    {
        // red dot marks the raised hand's position
        cvCircle( refimage, cvPoint(imgStartX, imgStartY), 1, CV_RGB(255,0,0), 2 );
    }
    else if( strcmp(strGesture, "Wave") == 0 )
    {
        // yellow line from the wave's start point to its end point
        cvLine( refimage, cvPoint(imgStartX, imgStartY), cvPoint(imgEndX, imgEndY), CV_RGB(255,255,0), 6 );
    }
    else if( strcmp(strGesture, "Click") == 0 )
    {
        // solid blue circle marks the forward-push position
        cvCircle( refimage, cvPoint(imgStartX, imgStartY), 6, CV_RGB(0,0,255), 12 );
    }

    // print the start/end coordinates in a white strip at the bottom of the pad
    // (ROI width trimmed to 600 so 40+600 stays inside the 640-pixel image)
    cvSetImageROI( refimage, cvRect(40, 450, 600, 30) );
    CvFont font;
    cvInitFont( &font, CV_FONT_VECTOR0, 1, 1, 0, 3, 5 );
    cvSet( refimage, cvScalar(255,255,255) );
    sprintf( locationinfo, "From: %d,%d to %d,%d",
             (int)pIDPosition->X, (int)pIDPosition->Y,
             (int)pEndPosition->X, (int)pEndPosition->Y );
    cvPutText( refimage, locationinfo, cvPoint(30, 30), &font, CV_RGB(0,0,0) );
    cvResetImageROI( refimage );
}

// reset the drawing pad to white and redraw the legend
void clearImg( IplImage* inputimg )
{
    CvFont font;
    cvInitFont( &font, CV_FONT_VECTOR0, 1, 1, 0, 3, 5 );
    memset( inputimg->imageData, 255, 640*480*3 );
    cvPutText( inputimg, "Hand Raise!", cvPoint(20, 20), &font, CV_RGB(255,0,0) );
    cvPutText( inputimg, "Hand Wave!",  cvPoint(20, 50), &font, CV_RGB(255,255,0) );
    cvPutText( inputimg, "Hand Push!",  cvPoint(20, 80), &font, CV_RGB(0,0,255) );
}

//【5】callback function invoked while a gesture is in progress
void XN_CALLBACK_TYPE gestureProgress( xn::GestureGenerator &generator,
                                       const XnChar *strGesture,
                                       const XnPoint3D *pPosition,
                                       XnFloat fProgress,
                                       void *pCookie )
{
    cout << strGesture << ":" << fProgress << " at " << *pPosition << endl;
}

int main( int argc, char** argv )
{
    IplImage* drawPadImg = cvCreateImage( cvSize(640,480), IPL_DEPTH_8U, 3 );
    IplImage* cameraImg  = cvCreateImage( cvSize(640,480), IPL_DEPTH_8U, 3 );

    cvNamedWindow( "Gesture", 1 );
    cvNamedWindow( "Camera", 1 );
    clearImg( drawPadImg );

    XnStatus res;
    char key = 0;

    // context
    xn::Context context;
    res = context.Init();
    xn::ImageMetaData imgMD;

    // create the image generator (for the camera view)
    xn::ImageGenerator imageGenerator;
    res = imageGenerator.Create( context );

    //【1】create the gesture generator the same way, passing in the context
    xn::GestureGenerator gestureGenerator;
    res = gestureGenerator.Create( context );

    //【2】tell the generator which gestures to recognize
    gestureGenerator.AddGesture( "Wave", NULL );
    gestureGenerator.AddGesture( "Click", NULL );
    gestureGenerator.AddGesture( "RaiseHand", NULL );
    //gestureGenerator.AddGesture( "MovingHand", NULL );

    //【3】register the callback functions of the gesture generator
    XnCallbackHandle handle;
    gestureGenerator.RegisterGestureCallbacks( gestureRecog, gestureProgress, (void*)drawPadImg, handle );

    // start generating data
    context.StartGeneratingAll();

    // loop until ESC is pressed or updating fails; 'c' clears the pad
    while( (key != 27) && !(res = context.WaitAndUpdateAll()) )
    {
        if( key == 'c' )
        {
            clearImg( drawPadImg );
        }
        imageGenerator.GetMetaData( imgMD );
        memcpy( cameraImg->imageData, imgMD.Data(), 640*480*3 );
        cvCvtColor( cameraImg, cameraImg, CV_RGB2BGR );
        cvShowImage( "Gesture", drawPadImg );
        cvShowImage( "Camera", cameraImg );
        key = cvWaitKey(20);
    }

    cvDestroyWindow( "Gesture" );
    cvDestroyWindow( "Camera" );
    cvReleaseImage( &drawPadImg );
    cvReleaseImage( &cameraImg );
    context.StopGeneratingAll();
    context.Shutdown();
    return 0;
}
[1] Anyone who read the previous article (portal) should already understand how generators are used. For gesture recognition, Xiao Jin uses a GestureGenerator, which is created the same way as an ImageGenerator: call its Create method with the context as the parameter.
[2] After a GestureGenerator is created, it only does something once we tell it which gestures to recognize. Here, the AddGesture method adds each gesture.
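As a defensive variation (not in the original program), you can ask the generator whether a gesture is available before adding it. The second argument of AddGesture is a bounding box that restricts where the gesture is recognized; NULL means "anywhere":

    // Sketch: only add gestures the middleware actually supports.
    const XnChar* wanted[] = { "RaiseHand", "Wave", "Click" };
    for( int i = 0; i < 3; ++i )
    {
        if( gestureGenerator.IsGestureAvailable( wanted[i] ) )
        {
            // NULL bounding box = recognize the gesture anywhere in view
            gestureGenerator.AddGesture( wanted[i], NULL );
        }
        else
        {
            cout << "Gesture not available: " << wanted[i] << endl;
        }
    }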
[3] This step registers the callback functions of the GestureGenerator:
XnStatus xn::GestureGenerator::RegisterGestureCallbacks( GestureRecognized RecognizedCB, GestureProgress ProgressCB, void *pCookie, XnCallbackHandle &hCallback )
RecognizedCB is the callback invoked once a gesture has been recognized; ProgressCB is the callback invoked while a gesture is still in progress.
pCookie is a pointer handed through to the callbacks, in which you can stash user data. In this code, Xiao Jin passes the program's drawing-pad image pointer to the callback, so the callback can draw on it directly.
hCallback is the handle of the registered callbacks. You can pass it to UnregisterGestureCallbacks( XnCallbackHandle hCallback ) to unregister them.
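For completeness, here is how the register/unregister pair fits together in this program (handle and drawPadImg are the variables from the main function above; the original program never unregisters because it simply shuts the context down):

    XnCallbackHandle handle;
    gestureGenerator.RegisterGestureCallbacks( gestureRecog, gestureProgress, (void*)drawPadImg, handle );

    // ... main loop runs here ...

    // when gesture input is no longer needed:
    gestureGenerator.UnregisterGestureCallbacks( handle );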
[4] This gestureRecog() is the actual body of the RecognizedCB callback. It has five parameters:
generator identifies the generator that detected the gesture.
strGesture tells us which gesture was recognized.
pIDPosition and pEndPosition are the start and end positions of the gesture. For a RaiseHand gesture the two are the same, but for Click and Wave gestures they differ, because the hand's position has changed.
pCookie is the user data passed in at registration.
In [4], Xiao Jin draws each gesture differently and prints the hand's current pIDPosition and pEndPosition.
[5] The gestureProgress() callback is similar to gestureRecog(), except that it is called while the gesture is still in progress. It therefore receives only one position, the hand's current position, plus an extra fProgress parameter indicating how far along the gesture is.
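As a sketch of how fProgress might be put to use (this variant is not in the original program), the progress callback could draw a marker that grows as the gesture nears completion, reusing the same pCookie image and mirror mapping as gestureRecog():

    // Hypothetical alternative ProgressCB: green ring grows with progress.
    void XN_CALLBACK_TYPE gestureProgressDraw( xn::GestureGenerator &generator,
                                               const XnChar *strGesture,
                                               const XnPoint3D *pPosition,
                                               XnFloat fProgress,
                                               void *pCookie )
    {
        // same pad image and mirror mapping as gestureRecog() above
        IplImage* refimage = (IplImage*)pCookie;
        int x = (int)( 640/2 - pPosition->X );
        int y = (int)( 480/2 - pPosition->Y );
        // radius grows from 1 as fProgress approaches 1.0
        int radius = 1 + (int)( fProgress * 10 );
        cvCircle( refimage, cvPoint(x, y), radius, CV_RGB(0,255,0), 1 );
    }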
After compiling and running the program, Xiao Jin's test results were as follows: RaiseHand is the easiest to trigger, arriving essentially only through RecognizedCB, while MovingHand never triggered at all. Click and Wave gestures trigger both RecognizedCB and ProgressCB. The Wave trajectory is not necessarily accurate, but the wave track is of little use anyway.
If you combine RaiseHand, Click, and Wave, and add some position checking and smoothing, implementing partial control of an application is not much of a problem. Give your imagination free rein, and if you come up with good ideas, let's discuss them!
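One hedged starting point for that kind of position handling: OpenNI reports gesture positions in real-world millimeters, so if you also create a DepthGenerator you can convert them to pixel coordinates with ConvertRealWorldToProjective and then scale them to the screen. The 1920x1080 screen size below is an assumption; query the real value on your system:

    // Sketch (assumes a DepthGenerator created alongside the gesture
    // generator): map a real-world hand position to screen coordinates.
    void handToScreen( xn::DepthGenerator &depthGenerator,
                       const XnPoint3D &handRealWorld,
                       int &screenX, int &screenY )
    {
        // millimeters -> 640x480 pixel coordinates of the camera image
        XnPoint3D proj;
        depthGenerator.ConvertRealWorldToProjective( 1, &handRealWorld, &proj );
        // scale to an assumed 1920x1080 screen
        screenX = (int)( proj.X / 640.0f * 1920 );
        screenY = (int)( proj.Y / 480.0f * 1080 );
    }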
----------------------------------
Author: Chen Jin
This is an original article. If you repost or quote it, please credit the original author and link back. Thank you.