Target tracking algorithm based on Meanshift and its implementation

First, Introduction

Let us begin with the theory of non-parametric density estimation. Non-parametric estimation is a branch of mathematical statistics; together with parametric density estimation it is one of the standard approaches to estimating a probability density. Parametric density estimation requires the feature space to obey a known probability density function, which is hard to guarantee in practical applications. Non-parametric density estimation makes the fewest assumptions about prior knowledge: the density is estimated purely from the training data, so it can approximate a density of arbitrary shape. In other words, the value of the density function at any point can be estimated without specifying the structural form of the probability density function in advance. The common non-parametric density estimation methods are the histogram method, the nearest-neighbor method, and kernel density estimation.

The Meanshift algorithm is a kernel density estimation method: it requires no prior knowledge and relies entirely on the density values computed from the sample points in feature space. Given a set of sampled data, the histogram method divides the range of the data into several equal intervals, groups the data by interval, and takes the ratio of the count in each group to the total number of samples as the probability of that bin. Kernel density estimation is similar to the histogram method, except that it adds a kernel function that smooths the data. With sufficiently many samples, a kernel density estimate converges to the underlying density function, that is, it can estimate the density of data drawn from an arbitrary distribution.
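
A minimal 1-D sketch of the difference (illustrative only, not part of the original article; the data, bandwidth, and bin count are arbitrary): the histogram assigns every sample to a fixed bin, while the kernel estimate sums a smooth kernel centered on every sample.

    x = randn(1, 1000);                            % 1000 samples from an assumed distribution
    xs = linspace(-4, 4, 200);                     % points at which to estimate the density
    h = 0.4;                                       % kernel bandwidth (hand-picked)
    f = zeros(size(xs));
    for k = 1:length(xs)
        u = (xs(k) - x) / h;                       % scaled distances to every sample
        K = 0.75 * (1 - u.^2) .* (abs(u) <= 1);    % Epanechnikov kernel values
        f(k) = sum(K) / (length(x) * h);           % kernel density estimate at xs(k)
    end
    [n, c] = hist(x, 30);                          % histogram estimate with 30 equal bins
    figure; bar(c, n / (sum(n) * (c(2) - c(1)))); hold on;   % normalize counts to a density
    plot(xs, f, 'LineWidth', 2);                   % smooth kernel estimate on top of the bars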

Next, the basic idea and physical meaning of Meanshift:
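
The formula referred to below as Equation 1 was an image in the original article and did not survive; the standard basic form of the mean shift vector is reproduced here for reference. Given sample points x_i and a high-dimensional sphere S_h of radius h centered at x that contains k of the samples, the basic mean shift vector at x is

    M_h(x) = \frac{1}{k} \sum_{x_i \in S_h} (x_i - x)        (1)

that is, the average of the offsets from the center x to the sample points falling inside the window. Repeatedly moving x along M_h(x) shifts the window toward the region where the samples are densest.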

In addition, as Equation 1 shows, every sample point that falls inside S_h makes the same contribution to M_h(x), no matter how far it lies from the center x. In real-time tracking, however, when the target is partially occluded or disturbed by the background, the outer pixels are affected first, so the pixels near the center of the target model are more reliable than the outer ones. The sample points should therefore not all be weighted equally: the farther a point is from the center, the smaller its weight should be. For this reason a kernel function and weight coefficients are introduced, which improves the robustness of the tracking algorithm and strengthens its search and tracking ability.
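
The corresponding formula was also an image; the commonly used extended (weighted) form of the mean shift vector, with kernel profile G and per-sample weights w(x_i), is

    M_h(x) = \frac{\sum_{i=1}^{n} G\!\left(\frac{x_i - x}{h}\right) w(x_i)\,(x_i - x)}{\sum_{i=1}^{n} G\!\left(\frac{x_i - x}{h}\right) w(x_i)}

so that samples near the center dominate the shift while distant, less reliable samples are down-weighted. This is the form the tracking iteration below actually computes.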

Next, talk about the kernel function:

Kernel functions, also called window functions, play a smoothing role in kernel estimation. Commonly used kernels include the Uniform, Epanechnikov, and Gaussian kernels. The algorithm here uses only the Epanechnikov kernel, which is defined as follows:
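
The definition was originally an image; the standard d-dimensional Epanechnikov kernel is

    K_E(x) = \begin{cases} \frac{1}{2}\, c_d^{-1} (d + 2) \left(1 - \lVert x \rVert^2\right), & \lVert x \rVert \le 1 \\ 0, & \text{otherwise} \end{cases}

where c_d is the volume of the unit d-dimensional sphere. Its profile k(r) = 1 - r for 0 <= r <= 1 is what the code below implements when it sets m_wei(i,j) = 1 - dist/h.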

Second, the target tracking algorithm based on Meanshift

The mean shift based tracking algorithm describes the target model and the candidate model by computing the feature-value (color) probabilities of the pixels in the target region and in the candidate region, and then uses a similarity function to measure the similarity between the target model of the initial frame and the candidate model of the current frame. The candidate model that maximizes the similarity function is selected, which yields the Meanshift vector of the target model: the vector along which the target moves from its initial position toward its correct position. Because the mean shift algorithm converges quickly, iterating the computation of the Meanshift vector drives the algorithm to the true position of the target, which achieves the goal of tracking.
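
The probability-density and similarity formulas were images in the original; in the commonly used formulation they read as follows, where b(x_i) is the histogram-bin index of pixel x_i, \delta is the Kronecker delta, k is the kernel profile, and C, C_h are normalization constants:

    q_u = C \sum_{i=1}^{n} k\!\left( \left\lVert \frac{x_i^{\ast}}{h} \right\rVert^2 \right) \delta\!\left[ b(x_i^{\ast}) - u \right]            (target model)

    p_u(y) = C_h \sum_{i=1}^{n_h} k\!\left( \left\lVert \frac{y - x_i}{h} \right\rVert^2 \right) \delta\!\left[ b(x_i) - u \right]            (candidate model)

    \rho(y) = \sum_{u=1}^{m} \sqrt{p_u(y)\, q_u}            (Bhattacharyya similarity)

Maximizing \rho(y) with respect to the candidate position y leads to the weighted mean shift iteration used in the code below.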

The following diagram illustrates the basic principle of the Meanshift tracking algorithm. Tracking starts at the data point x_i^0 (the hollow dots x_i^0, x_i^1, ..., x_i^N are the successive center points, the superscript being the iteration number; the surrounding black dots are the window sample points, and the dashed circle is the density estimation window). The arrows are the drift vectors of the sample points relative to the kernel center, and the mean drift vector points toward the densest region of the samples, i.e. along the gradient direction. Because the Meanshift algorithm is convergent, iteratively searching for the densest region of the feature space in the current frame makes the search point "drift" along the direction of increasing sample density until it reaches the local density maximum x_i^N, which is taken as the target position; the Meanshift tracking step for that frame is then complete.

The concrete algorithm for tracking a moving target is as follows:
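
(The step list was an image in the original post; the outline below is a reconstruction consistent with the code in the next section.)

1. In the initial frame, select the target region and compute its kernel-weighted color histogram: the target model q_u.
2. In the current frame, center the candidate region at the previous target position and compute its kernel-weighted color histogram: the candidate model p_u(y).
3. For every histogram bin compute the weight w_u = sqrt(q_u / p_u(y)), and give each pixel of the candidate region the weight of its bin.
4. Compute the Meanshift vector as the weighted mean of the pixel offsets from the region center, and shift the candidate region by that vector.
5. Repeat steps 2-4 until the shift is below a threshold (0.5 in the code) or the iteration count reaches a limit (20); the resulting center is the target position in this frame.
6. Move to the next frame and return to step 2.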

Third, the code implementation

Description

1. RGB color space quantization, using a 16*16*16-bin histogram (a short sketch of the bin-index computation follows this list)

2. The probability-density formulas for the target model and the candidate model are the ones given above

3. OpenCV version operation: press 'p' to pause, draw a box around the target with the mouse, then press 'p' again to start single-target tracking

4. MATLAB version: the video is first converted into an image sequence; the program stops at the first frame so the target can be calibrated manually (drag a box around the target area and double-click it), then performs single-target tracking.
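
A minimal sketch of the bin-index computation from point 1 (illustrative; 'pixel' is a hypothetical 1x3 uint8 RGB triple, not a variable from the listings below):

    pixel = uint8([200 35 90]);                % example RGB value
    q_r = fix(double(pixel(1)) / 16);          % red quotient, 0..15
    q_g = fix(double(pixel(2)) / 16);          % green quotient, 0..15
    q_b = fix(double(pixel(3)) / 16);          % blue quotient, 0..15
    bin = q_r * 256 + q_g * 16 + q_b;          % single index into the 16*16*16 = 4096 bins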

MATLAB version:

function [] = select()
close all; clear all;
%%% Select the tracking target from an image in which the target is fully visible %%%
I = imread('Result72.jpg');
figure(1);
imshow(I);
[temp, rect] = imcrop(I);
[a, b, c] = size(temp);              % a: rows, b: cols

%%% Compute the weight matrix of the target image %%%
y(1) = a/2;
y(2) = b/2;
tic_x = rect(1) + rect(3)/2;
tic_y = rect(2) + rect(4)/2;
m_wei = zeros(a, b);                 % weight matrix
h = y(1)^2 + y(2)^2;                 % bandwidth

for i = 1:a
    for j = 1:b
        dist = (i - y(1))^2 + (j - y(2))^2;
        m_wei(i, j) = 1 - dist/h;    % Epanechnikov profile
    end
end
C = 1 / sum(sum(m_wei));             % normalization factor

% Compute the weighted target histogram q_u
% hist1 = C * wei_hist(temp, m_wei, a, b);   % target model
hist1 = zeros(1, 4096);
for i = 1:a
    for j = 1:b
        % quantize the RGB color space into 16*16*16 bins
        q_r = fix(double(temp(i, j, 1)) / 16);   % fix rounds toward zero
        q_g = fix(double(temp(i, j, 2)) / 16);
        q_b = fix(double(temp(i, j, 3)) / 16);
        q_temp = q_r*256 + q_g*16 + q_b;         % bin index built from the R, G, B components
        hist1(q_temp + 1) = hist1(q_temp + 1) + m_wei(i, j);   % accumulate the pixel weight into the histogram
    end
end
hist1 = hist1 * C;
rect(3) = ceil(rect(3));
rect(4) = ceil(rect(4));

%%% Read the image sequence %%%
myfile = dir('D:\matlab7\work\mean shift\image\*.jpg');
lengthfile = length(myfile);

for l = 1:lengthfile
    Im = imread(myfile(l).name);
    num = 0;
    Y = [2, 2];
    q_temp1 = zeros(a, b);

    %%% mean shift iteration %%%
    while ((Y(1)^2 + Y(2)^2 > 0.5) && (num < 20))   % iteration condition
        num = num + 1;
        temp1 = imcrop(Im, rect);
        % compute the candidate-region histogram
        % hist2 = C * wei_hist(temp1, m_wei, a, b);  % target candidate p_u
        hist2 = zeros(1, 4096);
        for i = 1:a
            for j = 1:b
                q_r = fix(double(temp1(i, j, 1)) / 16);
                q_g = fix(double(temp1(i, j, 2)) / 16);
                q_b = fix(double(temp1(i, j, 3)) / 16);
                q_temp1(i, j) = q_r*256 + q_g*16 + q_b;
                hist2(q_temp1(i, j) + 1) = hist2(q_temp1(i, j) + 1) + m_wei(i, j);
            end
        end
        hist2 = hist2 * C;

        figure(2);
        subplot(1, 2, 1);
        plot(hist2);
        hold on;

        w = zeros(1, 4096);
        for i = 1:4096
            if (hist2(i) ~= 0)       % avoid division by zero
                w(i) = sqrt(hist1(i) / hist2(i));
            else
                w(i) = 0;
            end
        end

        % variable initialization
        sum_w = 0;
        xw = [0, 0];
        for i = 1:a
            for j = 1:b
                sum_w = sum_w + w(uint32(q_temp1(i, j)) + 1);
                xw = xw + w(uint32(q_temp1(i, j)) + 1) * [i - y(1) - 0.5, j - y(2) - 0.5];
            end
        end
        Y = xw / sum_w;
        % update the center position
        rect(1) = rect(1) + Y(2);
        rect(2) = rect(2) + Y(1);
    end

    %%% tracking trajectory matrix %%%
    tic_x = [tic_x; rect(1) + rect(3)/2];
    tic_y = [tic_y; rect(2) + rect(4)/2];
    v1 = rect(1);
    v2 = rect(2);
    v3 = rect(3);
    v4 = rect(4);

    %%% display the tracking result %%%
    subplot(1, 2, 2);
    imshow(uint8(Im));
    title('Target tracking result and trajectory');
    hold on;
    plot([v1, v1+v3], [v2, v2], [v1, v1], [v2, v2+v4], [v1, v1+v3], [v2+v4, v2+v4], [v1+v3, v1+v3], [v2, v2+v4], 'LineWidth', 2, 'Color', 'r');
    plot(tic_x, tic_y, 'LineWidth', 2, 'Color', 'b');
end


OpenCV version:

#include "stdafx.h" #include "cv.h" #include "highgui.h" #define U_char unsigned char#define DIST 0.5#define NUM 20//global variable B Ool Pause = False;bool is_tracking = false;   Cvrect drawing_box;iplimage *current;double *hist1, *hist2;double *m_wei;//weight matrix double C = 0.0; Normalization coefficient void init_target (double *hist1, double *m_wei, Iplimage *current) {iplimage *pic_hist = 0;int t_h, T_w, t_x, T_y;dou Ble h, dist;int I, J;int q_r, Q_g, q_b, q_temp;t_h = Drawing_box.height;t_w = drawing_box.width;t_x = Drawing_box.x;t_y = Drawing_box.y;h = Pow ((double) t_w)/2,2) + POW (((double) t_h)/2,2);//Bandwidth pic_hist = Cvcreateimage (Cvsize (300,200), Ipl_     depth_8u,3); Generate histogram image//Initialize weights matrix and target histogram for (i = 0;i < t_w*t_h;i++) {m_wei[i] = 0.0;} for (i=0;i<4096;i++) {hist1[i] = 0.0;}  for (i = 0;i < T_h; i++) {for (j = 0;j < T_w; J + +) {dist = POW (i-(double) t_h/2,2) + POW (J-(double) t_w/2,2); m_wei[i * T_w + j] = 1-dist/h; printf ("%f\n", M_wei[i * t_w + j]); C + = M_wei[i * T_w + j];}} Calculate target weights straight for (i = t_y;i < t_y + T_H i++) {for (j = t_x;j < t_x + T_w; j + +) {//rgb color space quantization to 16*16*16 Binsq_r = ((U_char) current->imagedata[i * Current->wid Thstep + J * 3 + 2])/16;q_g = ((U_char) current->imagedata[i * current->widthstep + J * 3 + 1])/16;q_b = ((U_char ) Current->imagedata[i * Current->widthstep + J * 3 + 0])/16;q_temp = Q_r * + q_g * + q_b;hist1[q_temp] = h Ist1[q_temp] + m_wei[(i-t_y) * t_w + (j-t_x)];}} Normalized histogram for (i=0;i<4096;i++) {hist1[i] = Hist1[i]/c;//printf ("%f\n", Hist1[i]);} Generate the target histogram double temp_max=0.0;for (i = 0;i < 4096;i++)//For histogram maximum, in order to normalized {//printf ("%f\n", val_hist[i]); if (Temp_max < Hist1[i]) {Temp_max = Hist1[i];}} Picture histogram cvpoint p1,p2;double bin_width= (double) pic_hist->width/4096;double bin_unith= (double) pic_hist->height /temp_max;for (i = 0;i < 4096; i++) {p1.x = i * bin_width;p1.y = pic_hist->height;p2.x = (i + 1) *bin_width;p2.y = pic _hist->height-hist1[i] * bin_unith;//printf ("%d,%d,%d,%d\n", p1.x,p1.y,p2.x,p2.y); Cvrectangle (pic_hist,p1,p2,cvscalar (0,255,0), -1,8,0);} Cvsaveimage ("Hist1.jpg", pic_hist); Cvreleaseimage (&pic_hist);}  void Meanshift_tracking (Iplimage *current) {int num = 0, i = 0, j = 0;int T_w = 0, T_h = 0, t_x = 0, t_y = 0;double *w = 0,  *hist2 = 0;double sum_w = 0, x1 = 0, x2 = 0,y1 = 2.0, y2 = 2.0;int q_r, q_g, q_b;int *q_temp;iplimage *pic_hist = 0;t_w =     Drawing_box.width;t_h = Drawing_box.height;pic_hist = Cvcreateimage (Cvsize (300,200), ipl_depth_8u,3); Generate histogram image hist2 = (double *) malloc (sizeof (double) *4096); w = (double *) malloc (sizeof (double) *4096); q_temp = (int *) malloc (sizeof (int) *t_w*t_h), while (POW (y2,2) + POW (y1,2) > 0.5) && (num < num)) {num++;t_x = Drawing_box.x;t_y = Dr Awing_box.y;memset (q_temp,0,sizeof (int) *t_w*t_h); for (i = 0;i<4096;i++) {W[i] = 0.0;hist2[i] = 0.0;} for (i = t_y;i < T_h + t_y;i++) {for (j = t_x;j < T_w + t_x;j++) {//rgb color space quantization to 16*16*16 Binsq_r = ((U_char) current-> Imagedata[i * Current->widthstep + J * 3 + 2])/16;q_g = ((U_char) current->imagedAta[i * Current->widthstep + J * 3 + 1])/16;q_b = ((U_char) current->imagedata[i * current->widthstep + J * 3 + 0]/16;q_temp[(i-t_y) *t_w + j-t_x] = Q_r * + q_g * + q_b;hist2[q_temp[(i-t_y) *t_w + j-t_x]] = hist2[q_ temp[(i-t_y) *t_w + j-t_x]] + m_wei[(i-t_y) * t_w + j-t_x];}} Normalized histogram for (i=0;i<4096;i++) 
{hist2[i] = Hist2[i]/c;//printf ("%f\n", Hist2[i]);} Generate the target histogram double temp_max=0.0;for (i=0;i<4096;i++)//For histogram maximum, in order to normalization {if (Temp_max < hist2[i]) {Temp_max = Hist2[i];}} Picture histogram cvpoint p1,p2;double bin_width= (double) pic_hist->width/(4368);d ouble bin_unith= (double) pic_hist->  Height/temp_max;for (i = 0;i < 4096; i++) {p1.x = i * bin_width;p1.y = pic_hist->height;p2.x = (i + 1) *bin_width;p2.y = pic_hist->height-hist2[i] * Bin_unith;cvrectangle (Pic_hist,p1,p2,cvscalar (0,255,0), -1,8,0);} Cvsaveimage ("Hist2.jpg", pic_hist); for (i = 0;i < 4096;i++) {if (hist2[i]! = 0) {W[i] = sqrt (Hist1[i]/hist2[i]);} Else{w[i] = 0;}} Sum_w = 0.0;x1 = 0.0;x2= 0.0;for (i = 0;i < T_h; i++) {for (j = 0;j < T_w; J + +) {//printf ("%d\n", Q_temp[i * t_w + j]); sum_w = Sum_w + W[q_tem P[i * t_w + j]];x1 = x1 + w[q_temp[i * t_w + j]] * (I-T_H/2); x2 = x2 + w[q_temp[i * t_w + j]] * (J-T_W/2);}} Y1 = x1/sum_w;y2 = x2/sum_w;//Center Point location Update drawing_box.x + = y2;drawing_box.y + y1;//printf ("%d,%d\n", Drawing_box.x,drawing_ BOX.Y);} Free (HIST2), Free (w), Free (q_temp),//Display trace results Cvrectangle (Current,cvpoint (drawing_box.x,drawing_box.y), Cvpoint ( Drawing_box.x+drawing_box.width,drawing_box.y+drawing_box.height), Cv_rgb (255,0,0), 2); CvShowImage ("Meanshift", //cvsaveimage ("Result.jpg", current); Cvreleaseimage (&pic_hist);} void Onmouse (int event, int x, int y, int flags, void *param) {if (pause) {switch (event) {case Cv_event_lbuttondown://the Rectdrawing_box.x=x;drawing_box.y=y;break;case Cv_event_lbuttonup://finish drawing the rect (use col or green for finish) Drawing_box.width=x-drawing_box.x;drawing_box.height=y-drawing_box.y;cvrectangle (cUrrent,cvpoint (DRAWING_BOX.X,DRAWING_BOX.Y), Cvpoint (drawing_box.x+drawing_box.width,drawing_box.y+drawing_ Box.height), Cv_rgb (255,0,0), 2); Cvshowimage ("Meanshift", current);//target initialization hist1 = (double *) malloc (sizeof (double) *16 *16*16); M_wei = (double *) malloc (sizeof (double) *drawing_box.height*drawing_box.width); Init_target (Hist1, M_wei, current); is_tracking = True;break;} return;}} int _tmain (int argc, _tchar* argv[]) {cvcapture *capture=cvcreatefilecapture ("Test.avi"); current = Cvqueryframe ( capture); char res[20];int nframe = 0;while (1) {/*sprintf (res, "result%d.jpg", Nframe); Cvsaveimage (res,current); nframe ++;*/if (is_tracking) {meanshift_tracking (current);} int C=cvwaitkey (1);//Pause if (c = = ' P ') {pause = True;cvsetmousecallback ("Meanshift", Onmouse, 0);} while (pause) {if (Cvwaitkey (0) = = ' P ') pause = false;} Cvshowimage ("Meanshift", current); current = Cvqueryframe (capture); Grab a Frame}cvnamedwindow ("Meanshift", 1), Cvreleasecapture (&capture); Cvdestroywindow ("Meanshift"); return 0;}
