Analysis of the CMT tracking algorithm for computer vision (3)


1 Preface

In the previous post we analyzed the overall flow of the CMT algorithm and the implementation of its first few steps; in this post we continue with the steps that follow.

2 Steps 4, 5, 6: Feature point matching and data fusion

These steps obtain the feature points of the current frame in two ways, by tracking and by feature matching, and then combine the two sets.

The previous article analyzed the optical-flow tracking; here we analyze the feature matching. The source code is as follows:

    //Detect keypoints, compute descriptors
    // Compute the keypoints of the current image
    vector<KeyPoint> keypoints;
    detector->detect(im_gray, keypoints);

    // Compute the descriptors of the current image's keypoints
    Mat descriptors;
    descriptor->compute(im_gray, keypoints, descriptors);

    //Match keypoints globally
    // Match the keypoints against the database built at initialization,
    // producing the globally matched points and their classes
    vector<Point2f> points_matched_global;
    vector<int> classes_matched_global;
    matcher.matchGlobal(keypoints, descriptors, points_matched_global, classes_matched_global);
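For context, here is a minimal sketch of how such components could be set up (an assumption for illustration, using OpenCV 2.x-style factory calls; CMT's defaults are a FAST detector and BRISK binary descriptors, matched with the Hamming norm):

    #include <opencv2/features2d/features2d.hpp>

    using namespace cv;

    // Sketch only: FAST finds keypoints, BRISK computes binary descriptors,
    // and a brute-force matcher compares them with the Hamming norm
    Ptr<FeatureDetector> detector = FeatureDetector::create("FAST");
    Ptr<DescriptorExtractor> descriptor = DescriptorExtractor::create("BRISK");
    Ptr<DescriptorMatcher> bfmatcher = DescriptorMatcher::create("BruteForce-Hamming");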

The main work happens in the matchGlobal function, analyzed as follows:

    void Matcher::matchGlobal(const vector<KeyPoint> & keypoints, const Mat descriptors,
            vector<Point2f> & points_matched, vector<int> & classes_matched)
    {
        if (keypoints.size() == 0)
        {
            return;
        }

        vector<vector<DMatch> > matches;

        // Use knnMatch for feature matching: each descriptor is matched
        // against the database, keeping the best 2 candidates
        bfmatcher->knnMatch(descriptors, database, matches, 2);

        for (size_t i = 0; i < matches.size(); i++)
        {
            vector<DMatch> m = matches[i];

            // The distance here is between two feature descriptors, not
            // between point coordinates; the larger the distance, the
            // lower the matching quality
            float distance1 = m[0].distance / desc_length;
            float distance2 = m[1].distance / desc_length;

            int matched_class = classes[m[0].trainIdx];

            // If the best match is a background point, skip it
            if (matched_class == -1) continue;

            // The distance must stay below the threshold (0.25) for a
            // good match; otherwise skip
            if (distance1 > thr_dist) continue;

            // The ratio must also stay below the threshold (0.8), i.e.
            // match 1 must be clearly better than match 2 to be accepted
            if (distance1 / distance2 > thr_ratio) continue;

            points_matched.push_back(keypoints[i].pt);
            classes_matched.push_back(matched_class);
        }
    }

The distance above is the Hamming distance: the number of bit positions in which two binary descriptors differ. A smaller distance indicates a better match.
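As a standalone sketch of the idea (an illustrative helper, not CMT's code), the Hamming distance between two binary descriptors can be computed by XOR-ing their bytes and counting the set bits:

    #include <cstdint>
    #include <cstdio>

    // Hamming distance between two binary descriptors of `len` bytes:
    // XOR the bytes, then count the differing bits
    int hammingDistance(const uint8_t * a, const uint8_t * b, int len)
    {
        int distance = 0;
        for (int i = 0; i < len; i++)
        {
            uint8_t x = a[i] ^ b[i];    // differing bits of this byte
            while (x)
            {
                x &= x - 1;             // clear the lowest set bit
                distance++;
            }
        }
        return distance;
    }

    int main()
    {
        uint8_t d1[4] = {0xFF, 0x00, 0xAA, 0x0F};
        uint8_t d2[4] = {0xFF, 0x00, 0x55, 0x0F};
        printf("distance=%d\n", hammingDistance(d1, d2, 4));   // prints 8
        return 0;
    }

In matchGlobal above, m[0].distance is this kind of descriptor distance; dividing by desc_length presumably normalizes it so that the 0.25 threshold does not depend on the descriptor size.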

Next comes the fusion of the tracked points and the matched points. The code is analyzed as follows:

    //Fuse tracked and globally matched points
    // Fuse the tracked and matched points: put both kinds into one set,
    // without duplicates
    vector<Point2f> points_fused;
    vector<int> classes_fused;
    fusion.preferFirst(points_tracked, classes_tracked, points_matched_global, classes_matched_global,
            points_fused, classes_fused);

The core code is in the preferFirst function. Its purpose is to avoid adding the same feature point twice, and it is easy to follow:

    void Fusion::preferFirst(const vector<Point2f> & points_first, const vector<int> & classes_first,
            const vector<Point2f> & points_second, const vector<int> & classes_second,
            vector<Point2f> & points_fused, vector<int> & classes_fused)
    {
        points_fused = points_first;
        classes_fused = classes_first;

        // Add a point from the second set only if its class does not
        // already appear in the first set, so no feature point is repeated
        for (size_t i = 0; i < points_second.size(); i++)
        {
            int class_second = classes_second[i];

            bool found = false;
            for (size_t j = 0; j < points_first.size(); j++)
            {
                int class_first = classes_first[j];
                if (class_first == class_second) found = true;
            }

            if (!found)
            {
                points_fused.push_back(points_second[i]);
                classes_fused.push_back(class_second);
            }
        }
    }
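To illustrate the behavior with hypothetical data (not from CMT): when a class appears in both sets, the tracked point from the first set wins and the matched duplicate is dropped.

    // Hypothetical example: class 7 appears in both sets, so the tracked
    // point (1,1) is kept and the matched point (5,5) is discarded
    Fusion fusion;

    vector<Point2f> points_tracked  = { Point2f(1, 1), Point2f(2, 2) };
    vector<int>     classes_tracked = { 7, 8 };
    vector<Point2f> points_matched  = { Point2f(5, 5), Point2f(3, 3) };
    vector<int>     classes_matched = { 7, 9 };

    vector<Point2f> points_fused;
    vector<int> classes_fused;
    fusion.preferFirst(points_tracked, classes_tracked, points_matched, classes_matched,
            points_fused, classes_fused);
    // points_fused  = { (1,1), (2,2), (3,3) }, classes_fused = { 7, 8, 9 }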

3 Steps 8, 9: Estimating the scale and rotation

First, how is this computed? The principle is actually very simple: at initialization we stored the original feature points in normalized form, points_normalized, and computed the pairwise relative distances and relative angles between them (for the idea, see the diagram in the previous post). The initialization code is as follows:

    for (int i = 0; i < num_points; i++)
    {
        for (int j = 0; j < num_points; j++)
        {
            // Relative position of each pair of normalized points
            Point2f v = points_normalized[i] - points_normalized[j];

            float distance = norm(v);
            float angle = atan2(v.y, v.x);

            distances_pairwise.at<float>(i, j) = distance;
            angles_pairwise.at<float>(i, j) = angle;
        }
    }
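To make this concrete, here is a standalone sketch with made-up coordinates (illustrative only, not CMT code) showing what one entry of each matrix contains:

    #include <opencv2/core/core.hpp>
    #include <cmath>
    #include <cstdio>

    using namespace cv;

    int main()
    {
        // Two hypothetical normalized keypoint positions
        Point2f a(0.0f, 0.0f);
        Point2f b(1.0f, 1.0f);

        Point2f v = a - b;              // relative position, as in the loop above
        float distance = norm(v);       // sqrt(2) ~= 1.414
        float angle = atan2(v.y, v.x);  // atan2(-1, -1) = -3*pi/4 ~= -2.356

        printf("distance = %f, angle = %f\n", distance, angle);
        return 0;
    }

If the tracked object later appears twice as large, the distance of the same pair doubles, so the ratio computed in the next step is 2; if the object rotates, the pair's angle shifts by the rotation angle.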

Then the same pairwise relative distances and angles are computed for the new feature points. Each new distance is divided by the corresponding initial distance, and each initial angle is subtracted from the new one, giving the changes in scale and rotation. Finally, the median of these changes is taken as the overall scale and rotation. The code is as follows:

    void Consensus::estimateScaleRotation(const vector<Point2f> & points, const vector<int> & classes,
            float & scale, float & rotation)
    {
        //Compute pairwise changes in scale/rotation
        // Collect the pairwise changes in scale and rotation
        vector<float> changes_scale;
        if (estimate_scale) changes_scale.reserve(points.size() * points.size());
        vector<float> changes_angles;
        if (estimate_rotation) changes_angles.reserve(points.size() * points.size());

        for (size_t i = 0; i < points.size(); i++)
        {
            for (size_t j = 0; j < points.size(); j++)
            {
                if (classes[i] != classes[j])
                {
                    // Relative position of any two feature points
                    Point2f v = points[i] - points[j];

                    if (estimate_scale)
                    {
                        // Current distance of this pair
                        float distance = norm(v);
                        // Initial distance of this pair
                        float distance_original = distances_pairwise.at<float>(classes[i], classes[j]);
                        // The ratio of the two gives the change in scale
                        float change_scale = distance / distance_original;
                        changes_scale.push_back(change_scale);
                    }

                    if (estimate_rotation)
                    {
                        // Current relative angle
                        float angle = atan2(v.y, v.x);
                        // Initial angle of this pair
                        float angle_original = angles_pairwise.at<float>(classes[i], classes[j]);
                        // The difference gives the change in angle
                        float change_angle = angle - angle_original;

                        // Fix long angles: wrap the difference back into [-pi, pi]
                        if (fabs(change_angle) > M_PI)
                        {
                            change_angle = sgn(change_angle) * -2 * M_PI + change_angle;
                        }

                        changes_angles.push_back(change_angle);
                    }
                }
            }
        }

        // Do not use changes_scale and changes_angles after this point,
        // as their order is changed by median()
        // Take the median as the result
        if (changes_scale.size() < 2) scale = 1;
        else scale = median(changes_scale);

        if (changes_angles.size() < 2) rotation = 0;
        else rotation = median(changes_angles);
    }
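The "do not use after this point" comment exists because median() reorders its input in place. A plausible helper consistent with that comment (a sketch built on std::nth_element, not copied verbatim from CMT):

    #include <algorithm>
    #include <limits>
    #include <vector>

    using namespace std;

    // Median via partial sort: nth_element reorders A in place, which is
    // exactly why the caller must not rely on the vector's order afterwards
    float median(vector<float> & A)
    {
        if (A.size() == 0) return numeric_limits<float>::quiet_NaN();
        nth_element(A.begin(), A.begin() + A.size() / 2, A.end());
        return A[A.size() / 2];
    }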

Due to time constraints, this post stops at the scale and rotation step; the next article analyzes the last few steps of CMT.

This is an original article; please indicate the source when reprinting: 47830463
