Summary of Image stitching algorithm (II.)

Source: Internet
Author: User


2. Feature extraction and matching



The SURF-related part of OpenCV usually involves three classes: SURF, SurfFeatureDetector, and SurfDescriptorExtractor.



In the features2d.hpp header file there are: typedef SURF SurfFeatureDetector; and typedef SURF SurfDescriptorExtractor;. A typedef declaration creates a new name, a type alias, for an existing type; that is, the SURF class has two additional names, SurfFeatureDetector and SurfDescriptorExtractor.



In other words, the SurfFeatureDetector class and the SurfDescriptorExtractor class are in fact the SURF class; the three are equivalent.
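As quoted from features2d.hpp, the aliases are literally:

typedef SURF SurfFeatureDetector;
typedef SURF SurfDescriptorExtractor;

Any of the three names therefore constructs the same detector/extractor object.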



Therefore, the necessary header files are (the nonfree and legacy modules exist only in OpenCV 2.x):

#include <iostream>
#include <stdio.h>
#include "opencv2/highgui/highgui.hpp"   // imread, imshow, waitKey
#include "opencv2/imgproc/imgproc.hpp"   // cvtColor, warpPerspective
#include "opencv2/calib3d/calib3d.hpp"   // findHomography
#include "opencv2/nonfree/nonfree.hpp"   // SURF
#include "opencv2/legacy/legacy.hpp"     // BruteForceMatcher

using namespace cv;
using namespace std;






2.1 Loading the images to be stitched



Example:

Mat imageRGBLeft = imread("img_left_1.jpg");
Mat imageRGBRight = imread("img_right_1.jpg");






Grayscale conversion: the feature extraction and matching modules work on grayscale images, so the images must first be converted to grayscale.

Mat imageGrayLeft, imageGrayRight;
cvtColor(imageRGBLeft, imageGrayLeft, CV_RGB2GRAY);
cvtColor(imageRGBRight, imageGrayRight, CV_RGB2GRAY);






2.2 Extracting feature points

Example:

int minHessian = 800; // Hessian matrix threshold, a parameter of the SURF algorithm; it determines how many feature points are ultimately extracted
SurfFeatureDetector surfDetector(minHessian);
vector<KeyPoint> keyPointLeft, keyPointRight; // two KeyPoint vectors to store the detected feature points
surfDetector.detect(imageGrayLeft, keyPointLeft);
surfDetector.detect(imageGrayRight, keyPointRight);






2.3 drawKeypoints()

Purpose: draws keypoints.



Form: void drawKeypoints(const Mat& image, const vector<KeyPoint>& keypoints, Mat& outImage, const Scalar& color=Scalar::all(-1), int flags=DrawMatchesFlags::DEFAULT);

Parameters:

image: const Mat&, the source input image;

keypoints: const vector<KeyPoint>&, the feature points obtained from the source image;

outImage: Mat&, the output image; its content depends on the fifth parameter, flags;

color: const Scalar&, the color of the keypoints; default value Scalar::all(-1), meaning a random color for each point;

flags: int, flags controlling how the keypoints are drawn; default value DrawMatchesFlags::DEFAULT;
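For reference, DrawMatchesFlags also defines DRAW_RICH_KEYPOINTS, which additionally renders each keypoint's size and orientation. A variant of the call in the example below (reusing that example's variable names, not a new API):

drawKeypoints(imageRGBLeft, keyPointLeft, imageLeftPoint, Scalar::all(-1), DrawMatchesFlags::DRAW_RICH_KEYPOINTS); // also draws scale and orientation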



Example:

Mat imageLeftPoint, imageRightPoint;
drawKeypoints(imageRGBLeft, keyPointLeft, imageLeftPoint, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
imshow("imageLeftPoint", imageLeftPoint);
drawKeypoints(imageRGBRight, keyPointRight, imageRightPoint, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
imshow("imageRightPoint", imageRightPoint);
waitKey(0);



The feature point extraction results are shown below:







Figure 9-3: Left image feature point extraction result







Figure 9-4: Right image feature point extraction result



2.4 Feature point description, in preparation for feature point matching



Example:

SurfDescriptorExtractor surfDescriptor;
Mat imageDescLeft, imageDescRight;
surfDescriptor.compute(imageGrayLeft, keyPointLeft, imageDescLeft);
surfDescriptor.compute(imageGrayRight, keyPointRight, imageDescRight); // note: the grayscale image is used here, not the RGB one






2.5 Obtaining the matched feature points and extracting the best pairs



Matching the feature vectors of the two images with the match function of the BruteForceMatcher class:



Example:

BruteForceMatcher< L2<float> > matcher; // the legacy BruteForceMatcher is templated on the distance measure
vector<DMatch> matchePoints;
matcher.match(imageDescLeft, imageDescRight, matchePoints, Mat());



BruteForceMatcher is a class derived from DescriptorMatcher, and DescriptorMatcher defines a common interface for the different matching strategies. Calling the match method outputs a vector of cv::DMatch in its third argument, which is why matchePoints is defined as a std::vector<DMatch>.
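Because of this common interface, a matcher can also be created polymorphically through the DescriptorMatcher factory; a minimal sketch under OpenCV 2.x ("BruteForce" and "FlannBased" are the standard factory keys):

Ptr<DescriptorMatcher> descMatcher = DescriptorMatcher::create("BruteForce"); // or "FlannBased"
descMatcher->match(imageDescLeft, imageDescRight, matchePoints);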



Matching the feature vectors of the two images with the match function of the FlannBasedMatcher class:



Example:

FlannBasedMatcher matcher;
vector<DMatch> matchePoints;
matcher.match(imageDescLeft, imageDescRight, matchePoints, Mat());



The FlannBasedMatcher algorithm is faster, but it finds approximate nearest-neighbour matches. It is therefore the usual choice when a reasonably good match is needed rather than the strictly best one. Its parameters can be tuned to raise matching accuracy or to raise speed, but improving one comes at the expense of the other.
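For tuning, FlannBasedMatcher can be constructed with explicit index and search parameters; a sketch (the values 8 trees and 64 checks are illustrative, not from the original): raising either improves accuracy but slows the matcher down.

FlannBasedMatcher tunedMatcher(new flann::KDTreeIndexParams(8), // more KD-trees: higher accuracy, slower index build
                               new flann::SearchParams(64));    // more leaf checks: higher accuracy, slower queries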






2.6 drawMatches()

Purpose: draws the matched keypoints found between two images.



Form: void drawMatches(const Mat& img1, const vector<KeyPoint>& keypoints1, const Mat& img2, const vector<KeyPoint>& keypoints2, const vector<DMatch>& matches1to2, Mat& outImg, const Scalar& matchColor=Scalar::all(-1), const Scalar& singlePointColor=Scalar::all(-1), const vector<char>& matchesMask=vector<char>(), int flags=DrawMatchesFlags::DEFAULT);



Parameters:

img1, img2: the two source images;

keypoints1, keypoints2: the keypoints from the two source images;

matches1to2: the matches from the first image to the second, i.e. for each keypoints1[i] the corresponding point found in keypoints2;

outImg: the output image; its content depends on flags, i.e. on what is drawn in the image;

matchColor: the color of matches (lines and connected points); if matchColor==Scalar::all(-1), colors are generated randomly;

singlePointColor: the color of single keypoints, i.e. keypoints without a match; if singlePointColor==Scalar::all(-1), colors are generated randomly;

matchesMask: the mask determining which matches are drawn; if the mask is empty, all matches are drawn;

flags: drawing settings; the possible flag values are defined by DrawMatchesFlags;



Example:

Mat firstMatches;
drawMatches(imageRGBLeft, keyPointLeft, imageRGBRight, keyPointRight,
            matchePoints, firstMatches, Scalar::all(-1), Scalar::all(-1),
            vector<char>(), DrawMatchesFlags::NOT_DRAW_SINGLE_POINTS);
imshow("first_matches", firstMatches);
waitKey(0);



The drawn matching result is as follows:







Figure 9-5: Feature Matching results



2.7 Sorting the matched feature points



The sort method orders the matched point pairs by distance from smallest to largest. Before sorting, the distance of each matched pair (a measure of match robustness) is distributed randomly; after sorting, the distances are in ascending order, so the earlier a pair appears, the better the match, and the best matches can be extracted from the front of the list. In this example, the best 20 matches are extracted.



Example:

sort(matchePoints.begin(), matchePoints.end()); // sort the matches by distance

Get the best matching feature points ranked in the first N:

vector<Point2f> imagePointsLeft, imagePointsRight;
for (int i = 0; i < (int)matchePoints.size() && i < 20; i++) // keep the 20 best matches (the original counter logic kept 21)
{
    imagePointsLeft.push_back(keyPointLeft[matchePoints[i].queryIdx].pt);
    imagePointsRight.push_back(keyPointRight[matchePoints[i].trainIdx].pt);
}
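std::sort can be applied directly to a vector<DMatch> because cv::DMatch defines operator<, which compares the distance field. An equivalent call with an explicit comparator (a sketch, not part of the original; matchByDistance is a hypothetical helper name) makes the sorting criterion visible:

// hypothetical helper, equivalent to DMatch::operator<
static bool matchByDistance(const DMatch &a, const DMatch &b)
{
    return a.distance < b.distance; // smaller distance = better match
}

// usage: sort(matchePoints.begin(), matchePoints.end(), matchByDistance);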



3. Image Registration



3.1 findHomography() and RANSAC removal of false matches



Purpose: finds the perspective transformation between the matched keypoints on two planes.



Form: Mat findHomography(InputArray srcPoints, InputArray dstPoints, int method=0, double ransacReprojThreshold=3, OutputArray mask=noArray());



Parameters:

srcPoints: the coordinates of the points in the original plane, i.e. a CV_32FC2 matrix or a vector<Point2f>;

dstPoints: the coordinates of the points in the target plane, i.e. a CV_32FC2 matrix or a vector<Point2f>;

method: the method used to compute the homography. 0: a regular method using all the points;

CV_RANSAC: a robust method based on RANSAC;

CV_LMEDS: a robust least-median method;

ransacReprojThreshold: when the CV_RANSAC method is used, the maximum reprojection error (a double) allowed for a point pair to be treated as an inlier;

mask: optional output mask set by the robust method;



Get the projection mapping matrix from image 1 to image 2, of size 3*3:

vector<unsigned char> inliersMask(imagePointsLeft.size());
Mat homo = findHomography(imagePointsLeft, imagePointsRight, CV_RANSAC, 5, inliersMask); // use CV_RANSAC to remove false matches



vector<DMatch> matches_ransac;

Manually retain the match point pairs that survived the RANSAC filtering:

for (int i = 0; i < (int)inliersMask.size(); i++)
{
    // cout << (int)inliersMask[i] << endl; // debug: print each mask value
    if (inliersMask[i])
    {
        matches_ransac.push_back(matchePoints[i]);
    }
}



Mat secondMatches;
drawMatches(imageRGBLeft, keyPointLeft, imageRGBRight, keyPointRight,
            matches_ransac, secondMatches, Scalar::all(-1), Scalar::all(-1),
            vector<char>(), DrawMatchesFlags::NOT_DRAW_SINGLE_POINTS); // draw the matches left after RANSAC
imshow("secondMatches", secondMatches);
waitKey(0);



The matching result is as follows:







Figure 9-6: Feature matching result after RANSAC false-match removal



H-matrix correction: if the H matrix is used directly for stitching, the x coordinates of the left image's pixels in the non-overlapping region become negative. This is because the left image is transformed into the coordinate system of the right image, and the non-overlapping part of the left image lies to the left of that system's origin, so its horizontal coordinates come out negative; the matrix therefore needs a correction. In the H matrix, the first row of data affects the offset in the horizontal (x) direction and the second row affects the offset in the vertical (y) direction, so a horizontal offset is corrected through the first row of H and a vertical offset through the second row. The correction can be applied by left-multiplying H by a 3*3 matrix. The correction matrix for the horizontal direction:

Mat adjustMat = (Mat_<double>(3, 3) << 1.0, 0, adjustValue, 0, 1.0, 0, 0, 0, 1.0);

The correction matrix for the vertical direction:

Mat adjustMat = (Mat_<double>(3, 3) << 1.0, 0, 0, 0, 1.0, adjustValue, 0, 0, 1.0);

The correction matrix for both the horizontal and vertical directions:

Mat adjustMat = (Mat_<double>(3, 3) << 1.0, 0, adjustValue1, 0, 1.0, adjustValue2, 0, 0, 1.0);
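To see the effect of left-multiplying by such a matrix, apply the horizontal correction to a homogeneous point; a minimal sketch with illustrative values:

Mat shiftMat = (Mat_<double>(3, 3) << 1.0, 0, 10.0, 0, 1.0, 0, 0, 0, 1.0); // horizontal correction, adjustValue = 10 (illustrative)
Mat pt = (Mat_<double>(3, 1) << 5.0, 7.0, 1.0);  // the homogeneous point (5, 7)
cout << shiftMat * pt << endl;                   // prints [15; 7; 1]: x shifted by adjustValue, y unchanged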



Example:

To compute the correction matrix, first define the 4 vertex coordinates of the original (left) image rectangle, then apply the perspective transformation to them and read off the registered coordinates. Because the coordinates of the left image's non-overlapping region become negative, the absolute value of the x coordinate of the transformed (0, 0) corner is the width of the non-overlapping region.

vector<Point2f> obj_corners(4);
obj_corners[0] = Point(0, 0); obj_corners[1] = Point(imageRGBLeft.cols, 0);
obj_corners[2] = Point(imageRGBLeft.cols, imageRGBLeft.rows); obj_corners[3] = Point(0, imageRGBLeft.rows);
vector<Point2f> scene_corners(4);
perspectiveTransform(obj_corners, scene_corners, homo);
Mat adjustMat = (Mat_<double>(3, 3) << 1.0, 0, abs(scene_corners[0].x), 0, 1.0, 0, 0, 0, 1.0);
Mat adjustHomo = adjustMat * homo;






3.2 Perspective transformation

Form: void warpPerspective(InputArray src, OutputArray dst,
                           InputArray M, Size dsize,
                           int flags=INTER_LINEAR,
                           int borderMode=BORDER_CONSTANT,
                           const Scalar& borderValue=Scalar());

src: the input image;

dst: the output image, of size dsize and the same type as the input;

M: the 3x3 floating-point transformation matrix;

dsize: the size of the output image, here the size of the completed stitched image;

flags: the interpolation method, default INTER_LINEAR;

borderMode: the pixel extrapolation method, default BORDER_CONSTANT;

borderValue: the value used with a constant border, default 0;



Mat imageTransformLeft;
warpPerspective(imageRGBLeft, imageTransformLeft, adjustHomo, Size(imageRGBRight.cols + abs(scene_corners[0].x), imageRGBRight.rows));



The result is as follows:







Figure 9-7: Original images before registration







Figure 9-8: Image registration result



4. Image Fusion



4.1 Locating the overlapping area



In the overlapping region, the two images are blended with a weight that varies with position, giving a smooth transition and eliminating abrupt changes at the seam.



Mat image1Overlap, image2Overlap; // the overlapping parts of image 1 and image 2
image1Overlap = imageTransformLeft(Rect(Point(abs(scene_corners[0].x), 0), Point(abs(scene_corners[0].x) + scene_corners[2].x, imageRGBRight.rows)));
image2Overlap = imageRGBRight(Rect(0, 0, image1Overlap.cols, image1Overlap.rows));
Mat image1ROICopy = image1Overlap.clone(); // copy the overlapping part of image 1



4.2 Weighted fusion of the overlapping area

for (int i = 0; i < image1Overlap.rows; i++)
{
    for (int j = 0; j < image1Overlap.cols; j++)
    {
        double weight;
        weight = (double)j / image1Overlap.cols; // blending coefficient, varying with distance from the left edge of the overlap
        image1Overlap.at<Vec3b>(i, j)[0] = (1 - weight) * image1ROICopy.at<Vec3b>(i, j)[0] + weight * image2Overlap.at<Vec3b>(i, j)[0];
        image1Overlap.at<Vec3b>(i, j)[1] = (1 - weight) * image1ROICopy.at<Vec3b>(i, j)[1] + weight * image2Overlap.at<Vec3b>(i, j)[1];
        image1Overlap.at<Vec3b>(i, j)[2] = (1 - weight) * image1ROICopy.at<Vec3b>(i, j)[2] + weight * image2Overlap.at<Vec3b>(i, j)[2];
    }
}
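The per-pixel loop above can also be written column by column with cv::addWeighted; a sketch of the same linear blend (an alternative formulation, not the original author's code):

for (int j = 0; j < image1Overlap.cols; j++)
{
    double weight = (double)j / image1Overlap.cols;
    Mat blendedCol = image1Overlap.col(j); // header sharing image1Overlap's pixels, so the blend lands in place
    addWeighted(image1ROICopy.col(j), 1.0 - weight, image2Overlap.col(j), weight, 0.0, blendedCol);
}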



4.3 Fusion of the non-overlapping area



Mat ROIMat = imageRGBRight(Rect(Point(image1Overlap.cols, 0), Point(imageRGBRight.cols, imageRGBRight.rows))); // the non-overlapping part of image 2
ROIMat.copyTo(Mat(imageTransformLeft, Rect(abs(scene_corners[0].x) + scene_corners[2].x, 0, ROIMat.cols, imageRGBRight.rows))); // the non-overlapping part is joined on directly
namedWindow("stitching result", 0);
imshow("stitching result", imageTransformLeft);
imwrite("d:\\stitching_result.jpg", imageTransformLeft);
waitKey(0);



The result after image fusion is as follows:







Figure 9-9: Image Fusion Results










