OpenCV2: Feature matching and its optimization


The earlier article on simple feature matching with OpenCV2 briefly introduced the steps of feature matching, but the raw matching result is quite rough. In this article we give a short summary of refining the matches with OpenCV2. The main contents include the following:

    • DescriptorMatcher
    • DMatch
    • knnMatch
    • Computing the fundamental matrix F of the two views and refining the matches
    • Computing the homography matrix H of the two views and refining the matches
DescriptorMatcher and DMatch

DescriptorMatcher is the abstract class for matching feature descriptors; the feature matching classes in OpenCV2 (for example BFMatcher and FlannBasedMatcher) inherit from it. The class provides two groups of matching methods: matching between an image pair, and matching between an image and an image set.
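
For example, either concrete matcher can be created through the abstract interface (a minimal sketch; "BruteForce" and "FlannBased" are the factory names registered in OpenCV2):

// Create a matcher through the common interface; "FlannBased" would select FlannBasedMatcher instead
cv::Ptr<cv::DescriptorMatcher> matcher = cv::DescriptorMatcher::create("BruteForce");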

Declarations of the methods used for matching between an image pair:

// Find one best match for each query descriptor (if mask is empty).
CV_WRAP void match( const Mat& queryDescriptors, const Mat& trainDescriptors,
                    CV_OUT vector<DMatch>& matches, const Mat& mask=Mat() ) const;

// Find k best matches for each query descriptor (in increasing order of distances).
// compactResult is used when mask is not empty. If compactResult is false, the matches
// vector has the same size as queryDescriptors rows. If compactResult is true, the
// matches vector won't contain matches for fully masked-out query descriptors.
CV_WRAP void knnMatch( const Mat& queryDescriptors, const Mat& trainDescriptors,
                       CV_OUT vector<vector<DMatch> >& matches, int k,
                       const Mat& mask=Mat(), bool compactResult=false ) const;

// Find best matches for each query descriptor which have distance less than
// maxDistance (in increasing order of distances).
void radiusMatch( const Mat& queryDescriptors, const Mat& trainDescriptors,
                  vector<vector<DMatch> >& matches, float maxDistance,
                  const Mat& mask=Mat(), bool compactResult=false ) const;

Overloaded declarations of the same methods for matching between an image and an image set:

CV_WRAP void match( const Mat& queryDescriptors, CV_OUT vector<DMatch>& matches,
                    const vector<Mat>& masks=vector<Mat>() );

CV_WRAP void knnMatch( const Mat& queryDescriptors,
                       CV_OUT vector<vector<DMatch> >& matches, int k,
                       const vector<Mat>& masks=vector<Mat>(), bool compactResult=false );

void radiusMatch( const Mat& queryDescriptors, vector<vector<DMatch> >& matches,
                  float maxDistance, const vector<Mat>& masks=vector<Mat>(),
                  bool compactResult=false );

DMatch stores a single match result and mainly contains the following fields:

int queryIdx;    // query descriptor index
int trainIdx;    // train descriptor index
int imgIdx;      // train image index
float distance;  // distance between the two descriptors

When matching, there are two sets of descriptors: the lookup set (query set) and the training set (train set). For each query descriptor, the best-matching train descriptor is recorded in a DMatch. In addition, each train image can produce several train descriptors.

For matching between an image pair, all train descriptors come from a single train image, so imgIdx is the same in every resulting DMatch, namely 0.
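
For illustration, a minimal sketch of matching an image pair and reading the DMatch fields; the descriptor matrices queryDescriptors and trainDescriptors are assumed to have been computed beforehand (for example with SURF):

// One best train descriptor is returned for each query descriptor
cv::BFMatcher matcher(cv::NORM_L2);
std::vector<cv::DMatch> matches;
matcher.match(queryDescriptors, trainDescriptors, matches);

for (size_t i = 0; i < matches.size(); i++) {
    const cv::DMatch& m = matches[i];
    // queryIdx/trainIdx index into the two descriptor sets; imgIdx is always 0 for an
    // image pair; distance is the distance between the two descriptors
    std::cout << m.queryIdx << " -> " << m.trainIdx
              << " (img " << m.imgIdx << "), distance " << m.distance << std::endl;
}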

knnMatch

The matching process inevitably contains errors, mainly of two kinds: feature points that are matched to the wrong point, and feature points in one image that have no true counterpart in the other. Common ways to remove incorrect matches are:

    • Cross Filter

      If a feature point of the first image matches a feature point of the second image, the check is also made in the opposite direction, matching the feature point of the second image back to the first image. Only if both directions agree on the same pair is the match considered correct.

      The BFMatcher in OpenCV already contains this filter: construct it as BFMatcher matcher(NORM_L2, true), i.e. set the second (crossCheck) parameter of the BFMatcher constructor to true, as shown in the sketch after this list.

    • Ratio test
      Call knnMatch with k = 2, which returns the two nearest-neighbor descriptors for each query descriptor. A match is accepted only if the ratio of the distance of the first match to the distance of the second match is small enough.
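
As mentioned under the cross filter above, a minimal sketch of a cross-checking brute-force matcher (NORM_L2 is assumed because SURF descriptors are used later in this article; leftPattern and rightPattern are the Pattern objects introduced in the code description below):

// The second constructor argument (crossCheck = true) keeps a match (i, j) only if
// train descriptor j is the best match for query i AND query i is the best match for j
cv::BFMatcher matcher(cv::NORM_L2, true);
std::vector<cv::DMatch> matches;
matcher.match(leftPattern->descriptors, rightPattern->descriptors, matches);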

The knnMatch method is declared in the abstract base class DescriptorMatcher and is used as follows:

void FeatureMatchTest::knnMatch(vector<DMatch>& matches) {
    const float minRatio = 1.f / 1.5f;
    const int k = 2;

    vector<vector<DMatch> > knnMatches;
    matcher->knnMatch(leftPattern->descriptors, rightPattern->descriptors, knnMatches, k);

    for (size_t i = 0; i < knnMatches.size(); i++) {
        const DMatch& bestMatch = knnMatches[i][0];
        const DMatch& betterMatch = knnMatches[i][1];

        float distanceRatio = bestMatch.distance / betterMatch.distance;
        if (distanceRatio < minRatio)
            matches.push_back(bestMatch);
    }
}

Computing the fundamental matrix F with RANSAC and refining the matches

If the correspondences of several points between two views (images) are already known, the fundamental matrix F can be computed. In OpenCV2 the findFundamentalMat method is used, declared as follows:

//! finds fundamental matrix from a set of corresponding 2D points
CV_EXPORTS_W Mat findFundamentalMat( InputArray points1, InputArray points2,
                                     int method=FM_RANSAC,
                                     double param1=3., double param2=0.99,
                                     OutputArray mask=noArray() );

Parameter description:

points1, points2: the matched points of the two images; the point coordinates must be floating point (float or double).

The third parameter, method, selects the concrete algorithm used to compute the fundamental matrix; it is an enumeration value (FM_7POINT, FM_8POINT, FM_LMEDS or FM_RANSAC).

param1 and param2 can be kept at their default values.

The mask parameter is the most interesting here: if n matched point pairs are used to compute the fundamental matrix, mask has n elements, each of which is 0 or 1. A value of 0 means the corresponding match is an outlier; the mask is only filled when the RANSAC or LMedS method is used.

This value can be used to remove incorrect matches.

In addition, before the matched points are used to compute the fundamental matrix, the keypoints must first be aligned according to the matches and then converted from KeyPoint to 2D points, which is implemented as follows:

// Align all points
vector<KeyPoint> alignedKps1, alignedKps2;
for (size_t i = 0; i < matches.size(); i++) {
    alignedKps1.push_back(leftPattern->keypoints[matches[i].queryIdx]);
    alignedKps2.push_back(rightPattern->keypoints[matches[i].trainIdx]);
}

// Keypoints to points
vector<Point2f> ps1, ps2;
for (unsigned i = 0; i < alignedKps1.size(); i++)
    ps1.push_back(alignedKps1[i].pt);
for (unsigned i = 0; i < alignedKps2.size(); i++)
    ps2.push_back(alignedKps2[i].pt);

Computing the fundamental matrix with the RANSAC method yields a status vector that can be used to remove the wrong matches:
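
For completeness, a minimal sketch of the call that produces this status vector, assuming the ps1/ps2 point vectors built above:

// RANSAC marks each match in 'status' as inlier (1) or outlier (0)
vector<uchar> status(ps1.size());
Mat fundamental = findFundamentalMat(ps1, ps2, FM_RANSAC, 3., 0.99, status);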

// Optimize match results
vector<KeyPoint> leftInlier;
vector<KeyPoint> rightInlier;
vector<DMatch> inlierMatch;

int index = 0;
for (unsigned i = 0; i < matches.size(); i++) {
    if (status[i] != 0) {
        leftInlier.push_back(alignedKps1[i]);
        rightInlier.push_back(alignedKps2[i]);
        matches[i].trainIdx = index;
        matches[i].queryIdx = index;
        inlierMatch.push_back(matches[i]);
        index++;
    }
}
leftPattern->keypoints = leftInlier;
rightPattern->keypoints = rightInlier;
matches = inlierMatch;

Computing the homography matrix H and refining the matches

Similar to the fundamental matrix, the homography matrix can also be computed once the matched feature points have been obtained.

//! computes the best-fit perspective transformation mapping srcPoints to dstPoints.
CV_EXPORTS_W Mat findHomography( InputArray srcPoints, InputArray dstPoints,
                                 int method=0, double ransacReprojThreshold=3,
                                 OutputArray mask=noArray() );

Parameter description:

srcPoints, dstPoints: the matched points in the two views.

method is an enumeration value that selects how the homography matrix is computed (0 uses all points in a regular least-squares fashion; CV_RANSAC and CV_LMEDS are the robust alternatives).

ransacReprojThreshold is the maximum allowed reprojection error, and is only used with the RANSAC method.

mask plays the same role as in findFundamentalMat: it indicates whether each matched point is an inlier, and is used to refine the matching result.

void FeatureMatchTest::refineMatchesWithHomography(vector<DMatch>& matches,
                                                   double reprojectionThreshold,
                                                   Mat& homography) {
    const int minNumberMatchesAllowed = 8;
    if (matches.size() < minNumberMatchesAllowed)
        return;

    // Prepare data for findHomography
    vector<Point2f> srcPoints(matches.size());
    vector<Point2f> dstPoints(matches.size());
    for (size_t i = 0; i < matches.size(); i++) {
        srcPoints[i] = rightPattern->keypoints[matches[i].trainIdx].pt;
        dstPoints[i] = leftPattern->keypoints[matches[i].queryIdx].pt;
    }

    // Find homography matrix and get inliers mask
    vector<uchar> inliersMask(srcPoints.size());
    homography = findHomography(srcPoints, dstPoints, CV_FM_RANSAC,
                                reprojectionThreshold, inliersMask);

    vector<DMatch> inliers;
    for (size_t i = 0; i < inliersMask.size(); i++) {
        if (inliersMask[i])
            inliers.push_back(matches[i]);
    }
    matches.swap(inliers);
}

Comparison of matching results
[Figure: matching results after the cross filter, knnMatch ratio test, fundamental-matrix filtering, and homography filtering]

Code description

The Pattern structure is defined to hold the data needed during the matching process:

struct Pattern {
    cv::Mat image;
    std::vector<cv::KeyPoint> keypoints;
    cv::Mat descriptors;

    Pattern(cv::Mat& img) :
        image(img) {}
};

The various matching methods are encapsulated in a class; the data required for matching is prepared in its constructor (keypoint detection and descriptor computation):

FeatureMatchTest::FeatureMatchTest(std::shared_ptr<Pattern> left,
                                   std::shared_ptr<Pattern> right,
                                   std::shared_ptr<cv::DescriptorMatcher> matcher)
    : leftPattern(left), rightPattern(right), matcher(matcher) {
    // Step 1: create detector
    int minHessian = 400;  // SURF Hessian threshold; the original constant was lost, 400 is a typical choice
    SurfFeatureDetector detector(minHessian);

    // Step 2: detect keypoints
    detector.detect(leftPattern->image, leftPattern->keypoints);
    detector.detect(rightPattern->image, rightPattern->keypoints);

    // Step 3: compute descriptors
    detector.compute(leftPattern->image, leftPattern->keypoints, leftPattern->descriptors);
    detector.compute(rightPattern->image, rightPattern->keypoints, rightPattern->descriptors);
}
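
To put the pieces together, a possible usage sketch, under the assumption that the knnMatch and refineMatchesWithHomography methods shown above are public members of FeatureMatchTest (the image file names are illustrative):

// Load the two views (hypothetical file names)
cv::Mat leftImg  = cv::imread("left.jpg",  CV_LOAD_IMAGE_GRAYSCALE);
cv::Mat rightImg = cv::imread("right.jpg", CV_LOAD_IMAGE_GRAYSCALE);

std::shared_ptr<Pattern> left  = std::make_shared<Pattern>(leftImg);
std::shared_ptr<Pattern> right = std::make_shared<Pattern>(rightImg);
std::shared_ptr<cv::DescriptorMatcher> matcher = std::make_shared<cv::BFMatcher>(cv::NORM_L2);

FeatureMatchTest test(left, right, matcher);

// Ratio-test matching, then homography-based refinement
std::vector<cv::DMatch> matches;
test.knnMatch(matches);

cv::Mat homography;
test.refineMatchesWithHomography(matches, 3.0, homography);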
