Leap Motion, as a gesture recognition device, has the advantage of accuracy over Kinect.
During the development of my graduation project "Scene Rover", gesture control with Leap Motion was an important part. This article discusses how gesture recognition with Leap Motion was implemented during development, and the points that need attention.
I. Assessing the capabilities of Leap Motion
Before designing the gestures, we need to know what Leap Motion can actually do, to avoid choosing a scheme that later turns out to be very difficult to implement. This assessment relies on hands-on experience with the device, drawn mainly from three sources:
1. The visual gesture-recognition interface provided by Leap Motion
2. The SDK documentation
3. Apps in the Leap store
From these we can basically conclude:
1. Leap Motion recognizes horizontal, or mostly horizontal, gestures well.
2. Fists and vertical movements are recognized with some error, and the size of the error depends on the details of the gesture.
3. Do not depend too much on high accuracy. Leap Motion can detect movement at the millimeter level, but it will sometimes recognize your straight fingers as bent, so prepare for the worst.
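Point 3 above suggests building some tolerance for misrecognition into the code itself. A minimal sketch of one way to do this, not from the original source: only accept a finger state once it has been reported identically for several consecutive frames. The class name `StableState` and the frame count are illustrative choices.

```csharp
// Sketch: since a straight finger is sometimes reported as bent for a
// frame or two, require a raw state to persist for STABLE_FRAMES
// consecutive frames before accepting it as the current state.
// The class name and the constant are illustrative, not from the source.
public class StableState
{
    const int STABLE_FRAMES = 4;    // illustrative value; tune by testing
    bool m_lastRaw;
    int m_count;

    public bool Value { get; private set; }

    // Feed the raw per-frame result; Value only changes after
    // STABLE_FRAMES identical raw results in a row.
    public void Feed(bool raw)
    {
        if (raw == m_lastRaw) { m_count++; }
        else { m_lastRaw = raw; m_count = 1; }
        if (m_count >= STABLE_FRAMES) Value = raw;
    }
}
```

A filter like this trades a few frames of latency for robustness against single-frame recognition glitches.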
II. The actual requirements
The main functional requirements are: moving, rotating, clicking buttons, zooming and rotating objects, closing the program, and pausing.
There are some principles:
1. Gestures used in the same environment should be close to each other and easy to switch between; for example, the transition between rotation and movement should feel very natural.
2. Avoid gesture conflicts; gestures that are too similar are not a good thing. For example, three straight fingers and four straight fingers should not be designed as two different gestures. Of course, this is not absolute: if the motion is performed slowly and in full view of the Leap Motion camera, it may still be distinguishable, but at the very least run a separate test for such a gesture.
III. Designing the main data structures and the outline of the algorithms
We already browsed the Leap Motion SDK in section I, so we at least know what information Leap Motion can provide, and it is quite rich. Since we are designing our own gestures, it is best not to rely on the fancy built-in gestures of the SDK; they are very likely only meant for official demos and showcases. Designing our own basic data structures has another advantage: if the motion-sensing device is later replaced with a similar one, we only need to change the way data is obtained (from one SDK to another) without altering the algorithms.
The outline of the algorithms depends heavily on the basic data, so the data structures must be as concise as possible and open to change (an algorithm may eventually become the deciding factor, but that is not a concern at the beginning).
```csharp
public class HandAndFingersPoint : MonoBehaviour
{
    const int BUFFER_MAX = 5;
    Controller m_leapCtrl;
    public E_HandInAboveView m_aboveView = E_HandInAboveView.None;

    // Finger data; [0] is the left hand, [1] is the right hand
    private Dictionary<Finger.FingerType, FingerData>[] m_fingerDatas =
        new Dictionary<Finger.FingerType, FingerData>[2];

    // Buffer; [0] is the left hand, [1] is the right hand,
    // [,n] is the n-th cached frame (n = 0 .. BUFFER_MAX - 1)
    private Dictionary<Finger.FingerType, FingerData>[,] m_fingerDatasBuffer =
        new Dictionary<Finger.FingerType, FingerData>[2, BUFFER_MAX];
    private int m_curBufIndex = 0;

    // Palm data; [0] is the left hand, [1] is the right hand
    private PointData[] m_palmDatas = new PointData[2];

    private readonly PointData m_defaultPointData =
        new PointData(Vector.Zero, Vector.Zero);
    private readonly FingerData m_defaultFingerData =
        new FingerData(Vector.Zero, Vector.Zero, Vector.Zero);
}
```
The remainder of the HandAndFingersPoint class consists of methods that fill, clear, and refresh the data. E_HandInAboveView records which hand entered Leap Motion's field of view first, and is used to set priority.
The other two main data structures are PointData and FingerData:
```csharp
// Data for one finger: fingertip point data plus the position of the
// finger's root bone
public struct FingerData
{
    public PointData m_point;   // fingertip position and direction
    public Vector m_position;   // root-bone position; for the thumb this is
                                // the position of the proximal phalanx

    public FingerData(PointData pointData, Vector pos)
    {
        m_point = pointData;
        m_position = pos;
    }

    public FingerData(Vector pointPos, Vector pointDir, Vector pos)
    {
        m_point.m_position = pointPos;
        m_point.m_direction = pointDir;
        m_position = pos;
    }

    public void Set(FingerData fd)
    {
        m_point = fd.m_point;
        m_position = fd.m_position;
    }
}

// Data for one point: a position and a direction
public struct PointData
{
    public Vector m_position;   // position
    public Vector m_direction;  // direction

    public PointData(Vector pos, Vector dir)
    {
        m_position = pos;
        m_direction = dir;
    }

    public void Set(PointData pd)
    {
        m_position = pd.m_position;
        m_direction = pd.m_direction;
    }

    public void Set(Vector pos, Vector dir)
    {
        m_position = pos;
        m_direction = dir;
    }
}

// Which hand entered the field of view first
public enum E_HandInAboveView { None, Left, Right }
```
After the basic data is defined, it is a good idea to confirm how the data gets populated. In practice, the latest tracking data is obtained through `Frame frame = m_leapCtrl.Frame();`.
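The data-filling step can be sketched roughly as follows, assuming the Leap v2 C# API (`Frame`, `Hand`, `Finger`, `Bone`); exact member names such as `Finger.Type()` vary between SDK versions, and the dictionaries are assumed to have been initialized elsewhere.

```csharp
// Sketch: filling the palm and finger data from the latest frame each
// Update(). Assumes the Leap v2 C# API and that m_fingerDatas[0] and
// m_fingerDatas[1] were allocated in Start(); member names outside the
// document's own types (FingerData, PointData) may differ by SDK version.
void Update()
{
    Frame frame = m_leapCtrl.Frame();       // latest tracking data
    foreach (Hand hand in frame.Hands)
    {
        int idx = hand.IsLeft ? 0 : 1;      // [0] left hand, [1] right hand
        m_palmDatas[idx] = new PointData(hand.PalmPosition, hand.PalmNormal);

        foreach (Finger finger in hand.Fingers)
        {
            // Root bone: the proximal phalanx (also used for the thumb here)
            Bone root = finger.Bone(Bone.BoneType.TYPE_PROXIMAL);
            m_fingerDatas[idx][finger.Type()] = new FingerData(
                finger.TipPosition,   // fingertip position
                finger.Direction,     // fingertip direction
                root.PrevJoint);      // root-bone position
        }
    }
}
```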
There is no rush to write these data-related methods yet. What matters most now is whether the gesture algorithms are reasonable, and the best way to judge that is to write one first.
The simplest gesture is the outstretched hand: a horizontal palm is used to control roaming, and a vertically extended palm is used for pausing. I found that palm gestures depend on the fingers, and a finger has two states: straight and bent. The other gestures likewise accumulate their effects from whether each finger is straight or bent, plus a judgment on direction. So, as a matter of course, the straight and bent detection for a finger should be written as separate algorithms:
```csharp
/// <summary>
/// Algorithms for matching the state of a single finger, such as straight
/// or bent. Possible later change: different scenes may require different
/// thresholds.
/// </summary>
public class FingerMatch
{
    // Angle threshold for the bent state
    static readonly float FingerBendState_Radian = Mathf.PI * 4f / 18;  // 40 degrees
    // Angle threshold for the straight state
    static readonly float FingerStrightState_Radian = Mathf.PI / 12;    // 15 degrees

    /// <summary>
    /// Straight-finger state: the finger is judged straight when the angle
    /// between the root-to-tip vector and the fingertip direction is below
    /// the threshold. A zero fingertip direction marks invalid data.
    /// </summary>
    /// <param name="adjustBorder">fine-tuning of the threshold</param>
    public static bool StrightState(FingerData fingerData, float adjustBorder = 0f)
    {
        bool isStright = false;
        Vector distalDir = fingerData.m_point.m_direction;
        // A zero fingertip direction indicates invalid data
        if (!distalDir.Equals(Vector.Zero))
        {
            // Fingertip position minus root position: the vector from the
            // finger root to the fingertip
            Vector fingerDir = fingerData.m_point.m_position - fingerData.m_position;
            float radian = fingerDir.AngleTo(distalDir);
            if (radian < FingerStrightState_Radian + adjustBorder)
            {
                isStright = true;
            }
        }
        return isStright;
    }

    /// <summary>
    /// Judges whether a finger is bent.
    /// </summary>
    /// <param name="fingerData">finger data to be judged</param>
    /// <param name="adjustBorder">fine-tuning of the bend threshold</param>
    public static bool BendState(FingerData fingerData, float adjustBorder = 0f)
    {
        bool isBend = false;
        Vector distalDir = fingerData.m_point.m_direction;
        if (!distalDir.Equals(Vector.Zero))
        {
            Vector fingerDir = fingerData.m_point.m_position - fingerData.m_position;
            float radian = fingerDir.AngleTo(distalDir);
            // When the angle exceeds the threshold, the finger is
            // considered bent
            if (radian > FingerBendState_Radian + adjustBorder)
            {
                isBend = true;
            }
        }
        return isBend;
    }
}
```
The code above involves an important concept: the threshold. It describes to what extent a finger counts as straight and to what extent it counts as bent. The threshold values are determined through actual testing.
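One simple way to run that test, not from the original source: log the measured root-to-tip angle while deliberately holding the finger straight, then bent, and read the thresholds off the logged ranges. The helper name is illustrative; `Debug.Log` is Unity's standard logging call.

```csharp
// Sketch: log the measured angle between the root-to-tip vector and the
// fingertip direction, to pick thresholds empirically. The method name is
// illustrative; it reuses the document's FingerData structure.
public static void LogFingerAngle(Finger.FingerType type, FingerData fingerData)
{
    Vector distalDir = fingerData.m_point.m_direction;
    if (distalDir.Equals(Vector.Zero)) return;   // invalid data, skip

    Vector fingerDir = fingerData.m_point.m_position - fingerData.m_position;
    float degrees = fingerDir.AngleTo(distalDir) * 180f / Mathf.PI;
    UnityEngine.Debug.Log(type + " angle: " + degrees + " deg");
}
```

Holding each pose for a few seconds gives a range of angles per state; thresholds placed between the two ranges (here 15 and 40 degrees) leave a dead zone that absorbs noise.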
Having written this much, it is also time to do a simple test; after all, the outline of the algorithm has been determined. Even though I had not yet written the matching algorithm for the whole outstretched hand, the finger-state detection could already be verified.
Operations related to the basic data structures, the HandAndFingersPoint class: source GitHub link.
This class consumes the basic data. Running it in the Unity editor shows the outline of a palm: blue indicates the direction of the fingers, red indicates the lines from each finger's root bone to the palm and fingertip, and yellow indicates the lines from the palm to the fingertips:
IV. Brief summary of the gesture implementation
The remaining code is available in my GitHub repository "Leap Motion in Unity3D". The gesture implementation also includes some small tricks. For example, action matching should guard against errors caused by finger trembling, so the data is sampled discretely: one sample every fixed interval rather than every frame.
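The discrete sampling mentioned above can be sketched as follows; the interval value and field names are illustrative, not taken from the repository.

```csharp
// Sketch: sample tracking data only every SAMPLE_INTERVAL frames, so
// small finger trembling between samples does not reach the matchers.
// The interval and field names are illustrative choices.
const int SAMPLE_INTERVAL = 5;
int m_frameCount = 0;

void Update()
{
    m_frameCount++;
    if (m_frameCount % SAMPLE_INTERVAL != 0)
        return;                          // skip in-between frames

    Frame frame = m_leapCtrl.Frame();    // sample the latest data
    // ... fill the finger-data buffers and run the gesture matchers ...
}
```

Combined with the BUFFER_MAX ring of cached frames in HandAndFingersPoint, this keeps the matchers working on a slower, steadier stream of data.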
How to use and observe these scripts: attach them to a GameObject. When using Leap Motion, you can see the scripts' properties change when a match succeeds. The scripts also support event registration; in other words, external code can register an event with any gesture, so that extra processing runs when the gesture finishes matching or reaches a certain matching state. These scripts do not yet directly satisfy our requirements, such as pausing. We need to place further restrictions on the gesture states or actions, for example treating a vertical palm facing forward as pause, a horizontal palm as panning, and so on.
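The event-registration pattern described above can be sketched with a plain C# event; the class and event names here are illustrative, not the repository's actual API.

```csharp
// Sketch: exposing a C# event so external code can react when a gesture
// matches. Names are illustrative, not the repository's actual API.
public class PalmPauseGesture : MonoBehaviour
{
    // Raised when the vertical-palm gesture finishes matching
    public event System.Action OnMatched;

    // Called by the matching logic when the gesture state is reached
    void NotifyMatched()
    {
        if (OnMatched != null) OnMatched();
    }
}
```

External code would then subscribe once, for example `pauseGesture.OnMatched += () => { Time.timeScale = 0f; };`, so the gesture script stays decoupled from what pausing actually does.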