From the Shenzhen Wentao Software Studio blog
There are some open-source Kinect gesture recognition libraries for WPF, and the Kinect for Windows SDK 1.7 Toolkit also ships a number of gesture-enabled UI controls that are quite handy.
However, for efficiency reasons our project had to be developed in C++ (an earlier version built with WPF did not run smoothly).
The Kinect C++ API only provides a few simple gestures, which we could not use directly in practice. So, working from the Kinect skeleton data, we implemented our own recognition for raise-hand, button-press, face-change, and left/right swipe gestures.
This article only provides recognition and tracking ideas for these gestures.
1. Raise Your Hand
Check that the distance between the hand and the wrist along each of the x, y, and z axes stays within a certain threshold.
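The per-axis check above can be sketched as a small predicate. This is a minimal illustration, not the studio's actual code: the `Vec3` struct stands in for the SDK's joint position type, and the 0.10 m threshold is an assumed, untuned value.

```cpp
#include <cassert>
#include <cmath>

// Stand-in for the Kinect SDK's joint position type (e.g. Vector4).
struct Vec3 { float x, y, z; };

// Raise-hand test from the article: the distance between the hand and
// wrist joints along every axis must stay within a threshold.
// The threshold value is illustrative, not tuned.
bool isHandRaised(const Vec3& hand, const Vec3& wrist,
                  float threshold = 0.10f)  // metres, assumed
{
    return std::fabs(hand.x - wrist.x) < threshold &&
           std::fabs(hand.y - wrist.y) < threshold &&
           std::fabs(hand.z - wrist.z) < threshold;
}
```

In practice the threshold would be tuned per axis against real skeleton streams, since y-axis noise differs from depth (z) noise.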
2. Button Press
The hand's x,y coordinates overlap the button's coordinates, and the hand's z value tends to grow larger. It is worth mentioning that the skeleton data's x,y coordinates must be transformed twice before they can be compared with the button's coordinates. The two transform interfaces are:
NuiTransformSkeletonToDepthImage
NuiImageGetColorPixelCoordinatesFromDepthPixel
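Putting the two conditions together, a press detector might look like the sketch below. This is an assumed design, not the article's implementation: it takes the hand position already mapped to color space by the two SDK calls above, checks the rectangle overlap, and watches for the z growth the article describes over a short frame window. The class name, window size, and push distance are all illustrative.

```cpp
#include <cassert>
#include <cstddef>
#include <deque>

struct Rect { float left, top, right, bottom; };

// Sketch of the button-press test: the mapped hand point must lie inside
// the button rectangle while the hand's z value trends larger across the
// last few frames. All names and thresholds here are assumptions.
class ButtonPressDetector {
public:
    // Feed one frame: x,y are the hand coordinates after the two SDK
    // transforms (skeleton -> depth -> color space); z is the raw depth.
    bool update(float x, float y, float z, const Rect& button) {
        zHistory_.push_back(z);
        if (zHistory_.size() > kWindow) zHistory_.pop_front();

        bool inside = x >= button.left && x <= button.right &&
                      y >= button.top  && y <= button.bottom;
        return inside && zHistory_.size() == kWindow && zTrendsLarger();
    }

private:
    bool zTrendsLarger() const {
        // z must grow by at least kMinPush over the window.
        return zHistory_.back() - zHistory_.front() > kMinPush;
    }
    static constexpr std::size_t kWindow = 5;  // frames, assumed
    static constexpr float kMinPush = 0.05f;   // metres, assumed
    std::deque<float> zHistory_;
};
```

A real detector would also reset its history when the hand leaves the rectangle, so a press started over one button cannot fire over another.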
3. Face Change
The hand's x,y coordinates overlap the face's rectangular region, and the overlap lasts longer than a time threshold.
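This overlap-plus-duration rule is a classic dwell detector. A minimal sketch, under the assumption that frame times arrive as per-frame deltas and with an illustrative 1.5 s hold threshold:

```cpp
#include <cassert>

struct Rect { float left, top, right, bottom; };

// Dwell detector for the face-change gesture: fires once the hand has
// stayed inside the face rectangle longer than a time threshold.
// The threshold and timing scheme are assumptions for illustration.
class DwellDetector {
public:
    explicit DwellDetector(float holdSeconds = 1.5f)  // assumed threshold
        : holdSeconds_(holdSeconds) {}

    bool update(float x, float y, const Rect& face, float dtSeconds) {
        bool inside = x >= face.left && x <= face.right &&
                      y >= face.top  && y <= face.bottom;
        elapsed_ = inside ? elapsed_ + dtSeconds : 0.0f;  // reset on exit
        return elapsed_ >= holdSeconds_;
    }

private:
    float holdSeconds_;
    float elapsed_ = 0.0f;
};
```

Resetting the timer the moment the hand leaves the rectangle keeps brief passes over the face from triggering the gesture.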
4. Swipe Left / Swipe Right (Page Turning)
The hand's x coordinate changes sharply within a short time.
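A "sharp change in a short time" can be detected by comparing the hand's x position at the start and end of a sliding frame window. The sketch below is one possible realization, with an assumed window (~0.25 s at 30 fps) and travel distance, and a history reset after each swipe so one motion fires only once:

```cpp
#include <cassert>
#include <cstddef>
#include <deque>

// Swipe sketch: a page-turn is reported when the hand's x coordinate
// travels farther than a distance threshold within a short frame window.
// Window size and threshold are illustrative, not tuned values.
enum class Swipe { None, Left, Right };

class SwipeDetector {
public:
    Swipe update(float handX) {
        xHistory_.push_back(handX);
        if (xHistory_.size() > kWindow) xHistory_.pop_front();
        if (xHistory_.size() < kWindow) return Swipe::None;

        float dx = xHistory_.back() - xHistory_.front();
        if (dx >  kMinTravel) { xHistory_.clear(); return Swipe::Right; }
        if (dx < -kMinTravel) { xHistory_.clear(); return Swipe::Left; }
        return Swipe::None;
    }

private:
    static constexpr std::size_t kWindow = 8;   // ~0.25 s at 30 fps, assumed
    static constexpr float kMinTravel = 0.35f;  // metres, assumed
    std::deque<float> xHistory_;
};
```

Clearing the history after a hit also suppresses the slower return motion of the hand, which would otherwise register as a swipe in the opposite direction.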