Kinect-based human motion recognition system (algorithm and code released)
First of all, the development environment used by this system is Windows 10, Visual Studio 2013, OpenCV 3.0, and the Kinect SDK v2.0. All of these can be found via Baidu; just download and install them.
For the Kinect environment configuration, skeleton data acquisition, and so on, refer to my earlier posts in this Kinect series (http://blog.csdn.net/baolinq/article/details/52373574).
The complete project code is on GitHub (https://github.com/baolinhu/kinect-gesture); take a look if you are interested, and leave a star while you're there~~~

1. Human posture feature extraction

1.1 Computing the relative distance coefficients of the joint points
Because the CSDN editor's support for formulas is too unfriendly, I can only use screenshots~~
2. The core algorithm of human motion recognition
Human actions must be identified accurately from the acquired data source. Here I use a relatively simple but quite general method: since Kinect directly provides the body's 3D joint coordinates, the relative positions of those coordinates can be used to judge and identify the body's movements and posture accurately.
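As context, here is a minimal sketch of how those per-joint 3D coordinates come out of the Kinect SDK v2, assuming pBody is an already-tracked IBody* obtained as in the skeleton-acquisition post linked above (this snippet is my own illustration, not an excerpt from the released project):

#include <Kinect.h>

// pBody is assumed to be a tracked IBody*; positions are in meters, in camera space.
Joint joints[JointType_Count];
if (SUCCEEDED(pBody->GetJoints(JointType_Count, joints)))
{
    const CameraSpacePoint &spineMid = joints[JointType_SpineMid].Position;
    // spineMid.X / spineMid.Y / spineMid.Z feed the relative-position tests described below.
}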
The algorithm flowchart is given below. I was too lazy to draw it on the computer, so I sketched it by hand on paper; the handwriting is ugly and the figure is ugly too, so just bear with it~~~
Fig. 5-1 The core algorithm flowchart of the system
According to the flowchart, the system determines which behavior is occurring from the human skeleton data, using the distances between skeleton joints and the angles formed among them. The system only considers a single, simple behavior of a single target person; multiple targets and compound actions are not considered.

First, changes in the position of the center of gravity are checked, generally up-down and left-right changes. If the center of gravity shifts left and the distance exceeds a given threshold, the target is judged to have moved left; likewise, if it shifts right by more than the threshold, the target is judged to have moved right. If the center of gravity does not move, there is certainly no left or right movement.

For squat detection, by definition the legs are bent, so the system mainly tests whether the legs bend far enough: if the angle formed by the hip, knee, and ankle drops below 160 degrees (an empirical value), a squat is reported, because under normal standing the angle among those three points is approximately 180 degrees. Alternatively, if the sum of the hip-knee and knee-ankle distances is greater than 1.15 times the straight-line distance from hip to ankle, a squat is also reported.

Finally there is jump detection. By definition, a jump means both feet leave the ground and the body moves up slightly. The system reports a jump when both feet rise off the ground by more than a given threshold, or when the body's center of gravity rises above its normal standing height by more than a given threshold. All recognition runs in real time, and results are output in real time.
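To make the angle test concrete, here is a small sketch of how the hip-knee-ankle angle can be computed from three camera-space joints. JointAngleDeg is my own illustrative helper, not a function from the released code:

#include <Kinect.h>
#include <cmath>

// Sketch: angle at joint b (in degrees) formed by joints a-b-c,
// e.g. a = hip, b = knee, c = ankle.
double JointAngleDeg(const Joint &a, const Joint &b, const Joint &c)
{
    double ux = a.Position.X - b.Position.X;  // vector from b to a
    double uy = a.Position.Y - b.Position.Y;
    double uz = a.Position.Z - b.Position.Z;
    double vx = c.Position.X - b.Position.X;  // vector from b to c
    double vy = c.Position.Y - b.Position.Y;
    double vz = c.Position.Z - b.Position.Z;
    double dot = ux * vx + uy * vy + uz * vz;
    double nu = std::sqrt(ux * ux + uy * uy + uz * uz);
    double nv = std::sqrt(vx * vx + vy * vy + vz * vz);
    return std::acos(dot / (nu * nv)) * 180.0 / 3.14159265358979;
}
// Normal standing gives roughly 180 degrees; below about 160 degrees the system reports a squat.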
Part of the core code follows; a download link for the complete code is given at the end of the post.
// Detection function: for accurate detection, try to stand in the right place
// so that Kinect can capture the skeleton joints of the whole body.
// frameNumber, flag, the spine/foot variables, spineHeightIn/Out, spineTemp and
// the thresholds are class members maintained elsewhere in the project.
void CBodyBasics::Detection(Joint joints[])
{
    static double tIn, tOut;
    double tFrame;
    CMFC_Demo01Dlg *pDlg0 = CMFC_Demo01Dlg::s_pDlg;  // pointer to the MFC dialog

    // Compute the height difference over every 10 adjacent frames (1, 11, 21, ...)
    // to estimate speed. At roughly 30 fps, 10 frames is about 0.33 s.
    if (frameNumber % 10 == 1)  // frameNumber is a frame counter defined elsewhere
    {
        tIn = static_cast<double>(getTickCount());  // OpenCV tick counter
        cout << "tIn is " << tIn << endl;
        spineMid_xIn   = joints[JointType_SpineMid].Position.X;
        spineMid_yIn   = joints[JointType_SpineMid].Position.Y;
        rightFoot_yIn  = joints[JointType_KneeRight].Position.Y;
        leftFoot_yIn   = joints[JointType_KneeLeft].Position.Y;
        spineBase_yIn  = joints[JointType_SpineBase].Position.Y;
        rightAnkle_yIn = joints[JointType_AnkleRight].Position.Y;
        base_foot_in   = spineBase_yIn - rightAnkle_yIn;
        cout << "base_foot_in is: " << base_foot_in << endl;
        cout << "Current spineHeightIn is " << spineHeightIn << " m" << endl;
    }
    if (!(frameNumber % 10))
    {
        tOut = static_cast<double>(getTickCount());
        cout << frameNumber << endl;
        cout << "tOut is " << tOut << endl;
        cout << "Compute a descent speed once every 10 frames" << endl;
        spineMid_xOut   = joints[JointType_SpineMid].Position.X;
        spineMid_yOut   = joints[JointType_SpineMid].Position.Y;
        rightFoot_yOut  = joints[JointType_KneeRight].Position.Y;
        leftFoot_yOut   = joints[JointType_KneeLeft].Position.Y;
        rightAnkle_yOut = joints[JointType_AnkleRight].Position.Y;
        spineBase_yOut  = joints[JointType_SpineBase].Position.Y;
        base_foot_out   = spineBase_yOut - rightAnkle_yOut;
        cout << "base_foot_out is: " << base_foot_out << endl;
        cout << "***********************************" << endl;
        cout << "Current spineHeightIn is " << spineHeightIn << " m" << endl;
        tFrame = (tOut - tIn) / getTickFrequency();  // elapsed time in seconds
        cout << tFrame << endl;
        cout << getTickFrequency() << endl;
        cout << "Current spineHeightOut is " << spineHeightOut << " m" << endl;
        spineV = (spineHeightIn - spineHeightOut) / tFrame;  // vertical speed of the spine

        spineMid_x  = spineMid_xOut - spineMid_xIn;
        spineMid_y  = spineMid_yOut - spineMid_yIn;
        rightFoot_y = rightFoot_yOut - rightFoot_yIn;
        leftFoot_y  = leftFoot_yOut - leftFoot_yIn;
        base_foot   = base_foot_out - base_foot_in;
        cout << "spineMid_x is " << spineMid_x << endl;
        cout << "spineMid_y is " << spineMid_y << endl;

        // Jump detection: both feet rise more than 0.15 m off the ground, or the
        // center of gravity rises more than 0.15 m above its normal standing
        // height. The Y axis points up.
        if ((leftFoot_y > 0.15 && rightFoot_y > 0.15)
            || (spineTemp > 0.01 && spineTemp + 0.15 < joints[JointType_SpineMid].Position.Y))
        {
            string str1 = "jump\r\n";  // "\r\n" so the MFC edit box breaks the line
            CString cStr = str1.c_str();
            // To clear the edit box: GetDlgItem(IDC_EDIT1)->SetWindowText("");
            // or, with a control variable: m_outEdit.SetWindowText("");
            pDlg0->m_outEdit.SetSel(-1);        // move the caret to the end
            pDlg0->m_outEdit.ReplaceSel(cStr);  // append the result text
            cout << str1;  // also echoed to the console for easier debugging
        }
        // else if (base_foot < -THRESH_Y)
        // Squat detection: the legs bend, so hip-knee plus knee-ankle exceeds the
        // straight hip-to-ankle distance by more than about 15%.
        else if (Distance(joints[JointType_HipLeft], joints[JointType_AnkleLeft]) * (1 + 0.15)
                 < Distance(joints[JointType_HipLeft], joints[JointType_KneeLeft])
                   + Distance(joints[JointType_KneeLeft], joints[JointType_AnkleLeft]))
        {
            flag++;
            if (flag == 2)  // a squat takes time: require two consecutive hits, like a timer, to avoid duplicate reports
            {
                flag = 0;
                // Alternative squat tests: the hip-knee-ankle angle drops below
                // about 160 degrees, or the sum of the two leg segments exceeds
                // the third side by about 1.15 times.
                string str1 = "squat\r\n";
                CString cStr = str1.c_str();
                pDlg0->m_outEdit.SetSel(-1);
                pDlg0->m_outEdit.ReplaceSel(cStr);
                cout << str1;
            }
        }

        // The X axis is positive to the right: a center-of-gravity shift beyond
        // THRESH_X is reported as a right move.
        if (spineMid_x > THRESH_X)
        {
            string str1 = "move right\r\n";
            CString cStr = str1.c_str();
            pDlg0->m_outEdit.SetSel(-1);
            pDlg0->m_outEdit.ReplaceSel(cStr);
            cout << str1;
        }
        else if (spineMid_x < -THRESH_X)  // a shift beyond THRESH_X to the left is a left move
        {
            string str1 = "move left\r\n";
            CString cStr = str1.c_str();
            pDlg0->m_outEdit.SetSel(-1);
            pDlg0->m_outEdit.ReplaceSel(cStr);
            cout << str1;
        }

        /* By the triangle inequality, compare hip-knee + knee-ankle against the
           straight hip-to-ankle distance; 0.15 is an empirical margin that can
           be tuned to the actual situation.
        if (Distance(joints[JointType_HipLeft], joints[JointType_AnkleLeft]) * (1 + 0.15)
            < Distance(joints[JointType_HipLeft], joints[JointType_KneeLeft])
              + Distance(joints[JointType_KneeLeft], joints[JointType_AnkleLeft]))
        {
            flag++;
            if (flag == 2)
            {
                flag = 0;
                cout << "squat 1111111\n";
            }
        }
        */
    }
}
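The Distance helper called above is not shown in this excerpt. Here is a minimal sketch of it, under the assumption that it is a plain Euclidean distance between two camera-space joints (the actual implementation is in the full project download):

#include <Kinect.h>
#include <cmath>

// Assumed implementation of the Distance helper used by Detection():
// straight-line distance between two joints in camera space, in meters.
double Distance(const Joint &a, const Joint &b)
{
    double dx = a.Position.X - b.Position.X;
    double dy = a.Position.Y - b.Position.Y;
    double dz = a.Position.Z - b.Position.Z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}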
Summary: although this human action recognition system is very simple, it offers some guidance and inspiration. Accurate recognition of human actions is already widely used in human-computer interaction [1], intelligent surveillance [2], autonomous robot navigation [3], animation and games, medical rehabilitation, and other fields, so it is well worth pursuing; interested students can continue with more in-depth research.
References:
[15] Yu Tao. Kinect Application Development in Action: Dialogue with Machines in the Most Natural Way. Beijing: Machinery Industry Press, 2012: 1-337.
[16] Shu Jingshu, Zhou National. Research on gesture recognition based on the Kinect skeleton tracking technique. Journal of Anhui Agricultural Sciences, 2014, 42(11): 3444-3446.
[17] Xie Liang, Liao Hongjian, Yang Yubao. Research on posture recognition and application based on Kinect. Computer Technology and Development, 2013, 23(5): 258-260.
[18] Battle Shade, Yuchi branch, Cai Jun. A posture recognition algorithm based on Kinect angle measurement. Sensors and Microsystems, 2014, 33(7): 129-132.
[19] Liu Kaiyu, Xia. Real-time human posture recognition based on Kinect. Electronic Design Engineering, 2014(19): 31-34.
Series blog posts:
Part 1: Overview of the Kinect v2-based fall detection system
http://blog.csdn.net/baolinq/article/details/52356863
Part 2: Getting started with Kinect v2 plus OpenCV, with related learning materials
http://blog.csdn.net/baolinq/article/details/52356947
Part 3: The principle and method of Kinect v2 skeleton acquisition, with source code
http://blog.csdn.net/baolinq/article/details/52373574
Part 4: Using Kinect for background matting and an automatic photo-taking program
http://blog.csdn.net/baolinq/article/details/52388095
Part 5: Analysis of the fall detection algorithm
http://blog.csdn.net/baolinq/article/details/52400040
Part 6: Kinect v2: displaying and processing image data with MFC (part 1)
http://blog.csdn.net/baolinq/article/details/52401116
Part 7: Kinect v2: displaying and processing image data with MFC (part 2)
http://blog.csdn.net/baolinq/article/details/52422206
Part 8: Summary of the Kinect v2-based fall detection system
http://blog.csdn.net/baolinq/article/details/52440447
Well, this article comes to an end here. It mainly introduced how to use Kinect to recognize human actions accurately; you can use your imagination to define more actions and apply the idea to a wider range of scenarios. A download link for the more complete, cleaned-up source code is attached below, priced at 1 point (because apparently resources cannot be uploaded for 0 points, sorry~).
http://download.csdn.net/download/baolinq/10003879
I'll see you in the next article.
Dashing off at super speed~