Brief introduction
Apple provides us with a handful of simple gesture recognizers, but recognizing drawn shapes such as five-pointed stars or triangles has to be implemented ourselves. By recognizing such gestures you can trigger specific actions, enter formulas, cast spells, and so on, which can add some flavor to your app.
Download and use
The framework has been uploaded to GitHub; feel free to download it, and a star would be appreciated!
How to use the framework is already explained on GitHub and is not repeated here; this article focuses on the recognition principle and the structure of the graphic gesture recognition framework.
Structure of the framework
A graphical gesture is a curve that can be described by sample points. Sample points are stored in SGGesturePoint, an object-oriented wrapper around CGPoint that makes point operations more convenient.
The set of sample points is SGGestureSet, which records all the sample points, the gesture's name, and the normalized gesture vector.
Gesture vectors are stored in SGGestureVector: the (x, y) coordinates of all sample points are appended in order and normalized, and the resulting vector is used to compute cosine similarity.
SGGestureManager, a singleton, is the class that standardizes sets, generates vectors, and saves, loads, and recognizes gestures. Callers only need to deal with the manager and configure a couple of its properties; every other object is managed by the manager.
Procedure for gesture sampling
1. Sampling
In the demo, sampling is done with a UIPanGestureRecognizer: each sample is a CGPoint, which is wrapped in an NSValue and stored in an array.
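As a rough sketch (not the demo's actual code), the sampling might look like the following; handlePan:, samplePoints, and the view passed to locationInView: are placeholders:

// Collect one CGPoint per pan callback and box it in an NSValue.
// self.samplePoints is an assumed NSMutableArray<NSValue *> property.
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint location = [pan locationInView:self.view];
    if (pan.state == UIGestureRecognizerStateBegan) {
        self.samplePoints = [NSMutableArray array];
    }
    [self.samplePoints addObject:[NSValue valueWithCGPoint:location]];
    if (pan.state == UIGestureRecognizerStateEnded) {
        // self.samplePoints now holds the raw sample points for this stroke
    }
}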
2. Generating a gesture set
Using SGGestureSet's gestureSetWithName:points: method, pass in the gesture's name and the sample points (an array of NSValue) to initialize a set from them.
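For example (the factory method name comes from the text above; the gesture name and points array are just placeholders):

// Build a gesture set named "triangle" from the NSValue-wrapped points collected above.
SGGestureSet *set = [SGGestureSet gestureSetWithName:@"triangle" points:self.samplePoints];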
3. Resampling
All the sample points in the set are treated as a polyline. Based on the desired sample density, a sampling interval is determined and uniformly spaced sample points are generated along the original curve. The code that generates the resampled points is shown below, interleaved with figures illustrating each case.
// To resample the curve, first compute the total length of the polyline
SGGestureSet *tempSet = *set;
double sumLength = 0;
for (int i = 1; i < tempSet.countPoints; i++) {
    SGGesturePoint *pt1 = [tempSet pointAtIndex:i];
    SGGesturePoint *pt2 = [tempSet pointAtIndex:i - 1];
    sumLength += [pt1 distanceTo:pt2];
}

// Resample with uniformly distributed points
SGGestureSet *resampleSet = [SGGestureSet gestureSetWithName:tempSet.name];
double interval = sumLength / self.samplePointCount;
double D = 0;
SGGesturePoint *p1 = [tempSet pointAtIndex:0];
[resampleSet addGesturePoint:p1];
for (int i = 1; i < tempSet.countPoints;) {
    SGGesturePoint *p2 = [tempSet pointAtIndex:i];
    double d = [p1 distanceTo:p2];
    if ((d + D) >= interval) {
        // The next resampled point lies on the segment p1-p2, one interval away
        // from the previous resampled point (D of which was already covered)
        double k = (interval - D) / d;
        double x = p1.x + k * (p2.x - p1.x);
        double y = p1.y + k * (p2.y - p1.y);
        SGGesturePoint *p = [SGGesturePoint gesturePointWithCGPoint:CGPointMake(x, y)];
        [resampleSet addGesturePoint:p];
        D = 0;
        p1 = p;            // continue measuring from the new point
    } else {
        D += d;            // carry the remaining distance over the inflection point
        p1 = p2;
        i++;               // move on to the next original point
    }
}
Here d is the distance between p1 and the next point p2 of the original set, while D accumulates the distance already covered since the last resampled point; D is what places the next sample correctly after a polyline inflection. The accompanying figure shows a local stretch of a graphic gesture's polyline.
The first time through the loop, p1 is the first sample point of the original set (and also the first point of the resampled set) and p2 is its second sample point. In the figure the distance between them is greater than the resampling interval, so with D = 0 we have d + D = d > interval and the if branch is taken.
Next, the x and y increments are computed from the ratio k = (interval - D) / d, giving the coordinates of the next resampled point, which then becomes the new p1.
And so on: because this segment is much longer than interval, many new sample points are distributed along it until p1 gets close enough to p2 that d < interval, as shown in the figure.
At this point the next sample should fall on the following segment. To keep the spacing uniform, its distance from the inflection point must be shortened by the distance still remaining between the current p1 and p2; this is exactly the role of D, as the figure illustrates.
This time the else branch is taken: p1 is moved onto p2, d is added to D, and i++ advances p2 to the next point of the original set. (Note that p1 was previously a computed point rather than one of the original samples, which is why the index only advances here.) If the curve contains enough short segments, execution keeps taking the else branch and accumulating D until d + D >= interval, and only then is a new sample point generated. In the situation shown in the figure, D is accumulated just once before the loop reaches the next, longer segment; there a new point is placed, D is reset to 0, and sample points are distributed along that segment, as shown.
After enough of these steps, the resampling with uniform spacing is complete, yielding the set of resampled points.
4. Standardizing the curve's position
By averaging the x and y coordinates of all points in the set, the centroid of the curve is obtained; the whole curve is then translated by that centroid so that it sits at the coordinate origin, giving the curve its standard position.
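A minimal sketch of this step, assuming SGGestureSet exposes countPoints and pointAtIndex: as in the resampling code and that SGGesturePoint's x and y are writable:

// Translate the curve so that its centroid coincides with the coordinate origin.
- (void)translateSetToOrigin:(SGGestureSet *)set {
    double cx = 0, cy = 0;
    for (int i = 0; i < set.countPoints; i++) {
        SGGesturePoint *p = [set pointAtIndex:i];
        cx += p.x;
        cy += p.y;
    }
    cx /= set.countPoints;   // centroid x
    cy /= set.countPoints;   // centroid y
    for (int i = 0; i < set.countPoints; i++) {
        SGGesturePoint *p = [set pointAtIndex:i];
        p.x -= cx;
        p.y -= cy;
    }
}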
5. Standardizing the curve's size
Each point on the curve is scaled according to the ratio between the curve's bounding rectangle and a standard size, which brings the curve to the standard size.
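A sketch of the scaling step under the same assumptions; standardSize stands in for the framework's standard edge length, and each axis is scaled independently here (the framework may scale uniformly instead):

// Scale the curve so its bounding rectangle becomes standardSize x standardSize.
- (void)scaleSet:(SGGestureSet *)set toSize:(double)standardSize {
    double minX = CGFLOAT_MAX, minY = CGFLOAT_MAX;
    double maxX = -CGFLOAT_MAX, maxY = -CGFLOAT_MAX;
    for (int i = 0; i < set.countPoints; i++) {
        SGGesturePoint *p = [set pointAtIndex:i];
        minX = MIN(minX, p.x); maxX = MAX(maxX, p.x);
        minY = MIN(minY, p.y); maxY = MAX(maxY, p.y);
    }
    double width = maxX - minX;    // bounding rectangle of the curve
    double height = maxY - minY;
    for (int i = 0; i < set.countPoints; i++) {
        SGGesturePoint *p = [set pointAtIndex:i];
        p.x *= standardSize / width;
        p.y *= standardSize / height;
    }
}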
6. Standardizing the curve's rotation
The rotation is standardized using the angle between the horizontal axis and the line from the centroid to the curve's first sample point: the current value of this angle is rotated to a fixed target angle r by a coordinate transform, moving the curve to the position shown in red in the accompanying figure.
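A sketch of the rotation step; targetAngle stands in for the reference angle r, the helper name is assumed, and the centroid is taken to already sit at the origin after step 4:

// Rotate the curve about the origin (its centroid) so that the line from the
// centroid to the first sample point ends up at targetAngle.
- (void)rotateSet:(SGGestureSet *)set toAngle:(double)targetAngle {
    SGGesturePoint *first = [set pointAtIndex:0];
    double initialAngle = atan2(first.y, first.x);   // current angle of centroid -> first point
    double delta = targetAngle - initialAngle;       // rotation needed to reach the target
    double cosD = cos(delta), sinD = sin(delta);
    for (int i = 0; i < set.countPoints; i++) {
        SGGesturePoint *p = [set pointAtIndex:i];
        double x = p.x * cosD - p.y * sinD;          // standard 2D rotation
        double y = p.x * sinD + p.y * cosD;
        p.x = x;
        p.y = y;
    }
}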
7. Generating vectors
To make the later comparison possible, the two-dimensional sample points are flattened into a one-dimensional sequence, i.e. a multidimensional vector: the x and y coordinates of each sample point are appended in order and the resulting vector is normalized. Each such vector represents one gesture and is what the subsequent comparison operates on.
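A sketch of the flattening and normalization; since SGGestureVector's exact API isn't shown here, an NSArray of NSNumber is used in its place:

// Interleave x and y into one long vector and scale it to unit length.
- (NSArray<NSNumber *> *)vectorFromSet:(SGGestureSet *)set {
    NSMutableArray<NSNumber *> *vector = [NSMutableArray array];
    double squaredSum = 0;
    for (int i = 0; i < set.countPoints; i++) {
        SGGesturePoint *p = [set pointAtIndex:i];
        [vector addObject:@(p.x)];   // components are ordered x0, y0, x1, y1, ...
        [vector addObject:@(p.y)];
        squaredSum += p.x * p.x + p.y * p.y;
    }
    double norm = sqrt(squaredSum);
    NSMutableArray<NSNumber *> *unitVector = [NSMutableArray arrayWithCapacity:vector.count];
    for (NSNumber *component in vector) {
        [unitVector addObject:@(component.doubleValue / norm)];
    }
    return unitVector;
}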
The process of gesture recognition
To recognize a gesture, first run it through the steps above to obtain its gesture vector, then compute the cosine similarity between that vector and each vector in the gesture library. Cosine similarity compares the angle between two vectors: the smaller the angle, the more similar the gestures. Candidates are filtered by a threshold, and once the whole library has been traversed, the best one (the smallest distance) is returned as the match.
The calculation code for the cosine similarity is as follows:
- (double)cosDistanceWithVector1:(SGGestureVector *)vec1 vector2:(SGGestureVector *)vec2 {
    double a = 0;
    double b = 0;
    // The vector stores x and y interleaved, so step through it two components at a time
    for (int i = 0; i <= vec1.length - 1 && i <= vec2.length - 1; i += 2) {
        a += [vec1 doubleAtIndex:i] * [vec2 doubleAtIndex:i] + [vec1 doubleAtIndex:i + 1] * [vec2 doubleAtIndex:i + 1];
        b += [vec1 doubleAtIndex:i] * [vec2 doubleAtIndex:i + 1] - [vec1 doubleAtIndex:i + 1] * [vec2 doubleAtIndex:i];
    }
    // Optimal rotation angle between the two vectors, then the angular distance at that rotation
    double angle = atan(b / a);
    return acos(a * cos(angle) + b * sin(angle));
}
The code that recognizes a gesture is as follows: standardize the gesture set and obtain its vector, then filter the gesture library and pick the best result.
- (NSString *)recognizeGestureSet:(SGGestureSet *)set {
    [self standardizeSet:&set];
    SGGestureVector *vec1 = [set getVector];
    SGGestureSet *bestSet = nil;
    double minD = CGFLOAT_MAX;
    for (int i = 0; i < self.gestureSets.count; i++) {
        SGGestureSet *libSet = self.gestureSets[i];
        SGGestureVector *vec2 = [libSet getVector];
        double d = [self cosDistanceWithVector1:vec1 vector2:vec2];
        // Keep the candidate only if it passes the threshold and beats the best match so far
        if (d <= self.threshold && d < minD) {
            minD = d;
            bestSet = libSet;
        }
    }
    return bestSet.name;
}
Saving and loading gestures
Each of the storage classes described above conforms to the NSCoding protocol. Every standardized SGGestureSet is put into an array, and the array is archived to disk with NSKeyedArchiver; when the gestures are needed again, the array is unarchived with NSKeyedUnarchiver.
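For example (the file path and the gestureSets property are placeholders; the older archiveRootObject:toFile: / unarchiveObjectWithFile: API is shown for brevity):

// Save: every SGGestureSet conforms to NSCoding, so the whole array can be archived in one call.
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"gestures.archive"];
[NSKeyedArchiver archiveRootObject:self.gestureSets toFile:path];

// Load: unarchive the array again before recognizing gestures.
NSArray *loadedSets = [NSKeyedUnarchiver unarchiveObjectWithFile:path];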