Template matching takes an image and a template (the template is usually much smaller than the image) and finds the location in the image that is most similar to the template. For example:
The first picture is the given image, the second is the template, and the third shows the matched location.
I did not look for this search process in the source code, but according to the tutorial it uses a sliding-window approach: the template is matched against the image pixel by pixel. That is, for each anchor pixel, the similarity is computed between the template and the image region extending to the right of and below that pixel, within the width and height of the template.
This is similar to the sliding-window method commonly used in pedestrian detection. A minimal sketch of this pixel-wise matching is given below.
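To make the idea concrete, here is a minimal sketch of pixel-wise sliding-window matching using the squared-difference measure. This is only my illustration of the idea, not OpenCV's actual (heavily optimized) implementation; the function name naiveMatchSqdiff and the grayscale-input assumption are mine.

#include <opencv2/opencv.hpp>

// Naive sliding-window template matching with the squared-difference measure.
// Assumes single-channel 8-bit input; returns one score per anchor position.
cv::Mat naiveMatchSqdiff(const cv::Mat& img, const cv::Mat& templ)
{
    CV_Assert(img.type() == CV_8UC1 && templ.type() == CV_8UC1);
    int resultRows = img.rows - templ.rows + 1;
    int resultCols = img.cols - templ.cols + 1;
    cv::Mat result(resultRows, resultCols, CV_32FC1);

    for (int y = 0; y < resultRows; ++y)        // anchor row
        for (int x = 0; x < resultCols; ++x)    // anchor column
        {
            float sum = 0.f;
            for (int ty = 0; ty < templ.rows; ++ty)
                for (int tx = 0; tx < templ.cols; ++tx)
                {
                    float d = (float)templ.at<uchar>(ty, tx)
                            - (float)img.at<uchar>(y + ty, x + tx);
                    sum += d * d;               // squared difference
                }
            result.at<float>(y, x) = sum;       // smaller = better match
        }
    return result;
}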
The matching degree, that is, the similarity, is generally calculated with one of the following six methods:
method = CV_TM_SQDIFF
method = CV_TM_SQDIFF_NORMED
method = CV_TM_CCORR
method = CV_TM_CCORR_NORMED
method = CV_TM_CCOEFF
method = CV_TM_CCOEFF_NORMED
For the first two methods, the smaller the result, the better the match; for the other four, the larger the better. Note that the third one (CV_TM_CCORR) is generally not used, because it can incorrectly match a region simply because that region has large pixel values. You can see this from its formula.
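For reference, the formulas for the plain squared-difference and cross-correlation measures (as given in the OpenCV documentation; the other four are normalized or mean-subtracted variants) show why:

\[
R_{\mathrm{sqdiff}}(x,y) = \sum_{x',y'} \bigl(T(x',y') - I(x+x',\,y+y')\bigr)^2
\]
\[
R_{\mathrm{ccorr}}(x,y) = \sum_{x',y'} T(x',y') \cdot I(x+x',\,y+y')
\]

Since CV_TM_CCORR is just a sum of products, a bright image region yields a large response regardless of how well it actually matches the template.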
In addition, note that the result image has width = (image width − template width + 1) and height = (image height − template height + 1). Why? Because the result stores one score for each possible top-left anchor position at which the template still fits entirely inside the image. For example, a 20×20 template matched against a 100×100 image yields an 81×81 result.
Once we have the result, we find its maximum or minimum value (depending on the similarity method used) to locate the matching position.
Descriptions of the corresponding functions in OpenCV:
void matchTemplate(InputArray image, InputArray templ, OutputArray result, int method)
The method argument takes values from 0 to 5, corresponding in order to the six methods above.
void minMaxLoc(InputArray src, double* minVal, double* maxVal=0, Point* minLoc=0, Point* maxLoc=0, InputArray mask=noArray())
This function finds the minimum and maximum values in an array, together with their positions.
Corresponding usage examples:
/// Source image to display
Mat img_display;
img.copyTo(img_display);

/// Create the result matrix
int result_cols = img.cols - templ.cols + 1;
int result_rows = img.rows - templ.rows + 1;

result.create(result_rows, result_cols, CV_32FC1);

/// Do the Matching and Normalize
matchTemplate(img, templ, result, match_method);
normalize(result, result, 0, 1, NORM_MINMAX, -1, Mat());

/// Localizing the best match with minMaxLoc
double minVal; double maxVal; Point minLoc; Point maxLoc;
Point matchLoc;

minMaxLoc(result, &minVal, &maxVal, &minLoc, &maxLoc, Mat());

/// For SQDIFF and SQDIFF_NORMED, the best matches are lower values. For all the other methods, the higher the better
if (match_method == CV_TM_SQDIFF || match_method == CV_TM_SQDIFF_NORMED)
  { matchLoc = minLoc; }
else
  { matchLoc = maxLoc; }

/// Show me what you got
rectangle(img_display, matchLoc, Point(matchLoc.x + templ.cols, matchLoc.y + templ.rows), Scalar::all(0), 2, 8, 0);
rectangle(result, matchLoc, Point(matchLoc.x + templ.cols, matchLoc.y + templ.rows), Scalar::all(0), 2, 8, 0);

imshow(image_window, img_display);
imshow(result_window, result);
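For completeness, here is a sketch of how such a snippet might be wrapped into a small standalone program. The file names, window name, and the choice of CV_TM_CCOEFF_NORMED are my own assumptions, not part of the original example; I keep the 2.x-era CV_TM_* constants used above (newer OpenCV versions spell them cv::TM_*).

#include <opencv2/opencv.hpp>
using namespace cv;

int main()
{
    Mat img   = imread("scene.jpg");      // assumed input image path
    Mat templ = imread("template.jpg");   // assumed template path
    if (img.empty() || templ.empty()) return -1;

    int match_method = CV_TM_CCOEFF_NORMED;  // value 5, larger = better

    // Result holds one score per possible top-left anchor position
    Mat result;
    int result_cols = img.cols - templ.cols + 1;
    int result_rows = img.rows - templ.rows + 1;
    result.create(result_rows, result_cols, CV_32FC1);

    matchTemplate(img, templ, result, match_method);
    normalize(result, result, 0, 1, NORM_MINMAX, -1, Mat());

    double minVal, maxVal;
    Point minLoc, maxLoc, matchLoc;
    minMaxLoc(result, &minVal, &maxVal, &minLoc, &maxLoc, Mat());

    // SQDIFF variants: smaller is better; otherwise larger is better
    matchLoc = (match_method == CV_TM_SQDIFF || match_method == CV_TM_SQDIFF_NORMED)
                   ? minLoc : maxLoc;

    rectangle(img, matchLoc,
              Point(matchLoc.x + templ.cols, matchLoc.y + templ.rows),
              Scalar::all(0), 2, 8, 0);
    imshow("Match result", img);
    waitKey(0);
    return 0;
}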