SMO algorithm for support vector machines (MATLAB code)

Source: Internet
Author: User
Tags: abs, svm

Create SMO.m:

% function [alpha, bias] = SMO(K, Y, C, tol)
function model = SMO(x, Y, C, tol)
% SMO: SMO algorithm for SVM
%
% Implementation of the Sequential Minimal Optimization (SMO)
% training algorithm for Vapnik's Support Vector Machine (SVM).
%
% This is a modified code from Gavin Cawley's MATLAB Support
% Vector Machine Toolbox (c) September 2000.
%
% Diego Andres Alvarez.
%
% USAGE: [alpha, bias] = SMO(K, Y, C, tol)
%
% INPUT:
%
%   K:   n x n kernel matrix
%   Y:   1 x n vector of labels, -1 or 1
%   C:   a regularization parameter such that 0 <= alpha_i <= C/n
%   tol: tolerance for terminating criterion
%
% OUTPUT:
%
%   alpha: 1 x n Lagrange multiplier coefficients
%   bias:  scalar bias (offset) term
%
% Input/output arguments modified by JooSeuk Kim and Clayton Scott, 2007

global SMO;

y = Y';
ntp = size(x, 1);
% recompute C
% C = C/ntp;

% initialize
ii0 = find(y == -1);
ii1 = find(y == 1);

i0 = ii0(1);
i1 = ii1(1);

alpha_init = zeros(ntp, 1);
alpha_init(i0) = C;
alpha_init(i1) = C;
bias_init = C*(x(i0,:)*x(i1,:)' - x(i0,:)*x(i1,:)') + 1;

% initializing the variables
SMO.epsilon = 10^(-6);
SMO.tolerance = tol;
SMO.y = y';
SMO.C = C;
SMO.alpha = alpha_init;
SMO.bias = bias_init;
SMO.ntp = ntp;  % number of training points

% CACHES:
SMO.Kcache = x*x';              % kernel evaluations
SMO.error = zeros(SMO.ntp, 1);  % error cache

numChanged = 0;
examineAll = 1;

% When all data were examined and no changes were made, the loop reaches
% its end. Otherwise, passes over all data and passes over the likely
% support vectors are alternated until all support vectors are found.
while (numChanged > 0) || examineAll
    numChanged = 0;
    if examineAll
        % loop over all points
        for i = 1:ntp
            numChanged = numChanged + examineExample(i);
        end
    else
        % loop only over the points that violate the KKT conditions
        for i = 1:ntp
            if (SMO.alpha(i) > SMO.epsilon) && (SMO.alpha(i) < (SMO.C - SMO.epsilon))
                numChanged = numChanged + examineExample(i);
            end
        end
    end

    if (examineAll == 1)
        examineAll = 0;
    elseif (numChanged == 0)
        examineAll = 1;
    end
end

alpha = SMO.alpha';
alpha(alpha < SMO.epsilon) = 0;
alpha(alpha > C - SMO.epsilon) = C;
bias = -SMO.bias;

model.w = (y.*alpha)*x;
model.b = bias;
return;


function RESULT = fwd(n)
global SMO;
LN = length(n);
RESULT = -SMO.bias + (sum(repmat(SMO.y, 1, LN).*repmat(SMO.alpha, 1, LN).*SMO.Kcache(:, n)))';
return;


function RESULT = examineExample(i2)
% First heuristic: selects i2 and asks examineExample to find a
% second point (i1) in order to do an optimization step with the
% Lagrange multipliers.
global SMO;
alpha2 = SMO.alpha(i2);
y2 = SMO.y(i2);

if (alpha2 > SMO.epsilon) && (alpha2 < (SMO.C - SMO.epsilon))
    e2 = SMO.error(i2);
else
    e2 = fwd(i2) - y2;
end

% r2 < 0 if point i2 is placed between the margins (-1)-(+1);
% otherwise r2 > 0. r2 = f2*y2 - 1
r2 = e2*y2;

% KKT conditions:
%   r2 > 0 and alpha2 == 0 (well classified)
%   r2 == 0 and 0 < alpha2 < C (support vectors at the margins)
%   r2 < 0 and alpha2 == C (support vectors between the margins)
%
% Test the KKT conditions for the current i2 point: if a point is well
% classified its alpha must be 0; if it is out of its margin its alpha
% must be C; if it is at the margin its alpha must be between 0 and C.

% take action only if i2 violates the Karush-Kuhn-Tucker conditions
if ((r2 < -SMO.tolerance) && (alpha2 < (SMO.C - SMO.epsilon))) || ...
        ((r2 > SMO.tolerance) && (alpha2 > SMO.epsilon))
    % If it doesn't violate the KKT conditions then exit, otherwise continue.

    % Try i2 three ways; if successful, immediately return 1.
    RESULT = 1;

    % First the routine tries to find an i1 Lagrange multiplier that
    % maximizes the measure |E1 - E2|: the larger this value, the bigger
    % the increase of the dual objective function becomes.
    % In this first test, only support vectors are tried.
    POS = find((SMO.alpha > SMO.epsilon) & (SMO.alpha < (SMO.C - SMO.epsilon)));
    [maxerr, i1] = max(abs(e2 - SMO.error(POS)));
    if ~isempty(i1)
        i1 = POS(i1);  % map the index back to the full data set
        if takeStep(i1, i2, e2), return;
        end
    end

    % The second heuristic chooses any Lagrange multiplier that is a SV
    % and tries to optimize
    for i1 = randperm(SMO.ntp)
        if (SMO.alpha(i1) > SMO.epsilon) && (SMO.alpha(i1) < (SMO.C - SMO.epsilon))
            % if a good i1 is found, optimize
            if takeStep(i1, i2, e2), return;
            end
        end
    end

    % if both heuristics above fail, iterate over the whole data set
    for i1 = randperm(SMO.ntp)
        if ~((SMO.alpha(i1) > SMO.epsilon) && (SMO.alpha(i1) < (SMO.C - SMO.epsilon)))
            if takeStep(i1, i2, e2), return;
            end
        end
    end
end

% no progress possible
RESULT = 0;
return;


function RESULT = takeStep(i1, i2, e2)
% For a pair of alpha indexes, verify whether it is possible to execute
% the optimization step described by Platt.
global SMO;
RESULT = 0;
if (i1 == i2), return;
end

% compute upper and lower constraints, L and H, on multiplier a2
alpha1 = SMO.alpha(i1);
alpha2 = SMO.alpha(i2);
y1 = SMO.y(i1);
y2 = SMO.y(i2);
C = SMO.C;
K = SMO.Kcache;

s = y1*y2;
if (y1 ~= y2)
    L = max(0, alpha2 - alpha1);
    H = min(C, alpha2 - alpha1 + C);
else
    L = max(0, alpha1 + alpha2 - C);
    H = min(C, alpha1 + alpha2);
end

if (L == H), return;
end

if (alpha1 > SMO.epsilon) && (alpha1 < (C - SMO.epsilon))
    e1 = SMO.error(i1);
else
    e1 = fwd(i1) - y1;
end

% if (alpha2 > SMO.epsilon) && (alpha2 < (C - SMO.epsilon))
%     e2 = SMO.error(i2);
% else
%     e2 = fwd(i2) - y2;
% end

% compute eta
k11 = K(i1, i1);
k12 = K(i1, i2);
k22 = K(i2, i2);
eta = 2.0*k12 - k11 - k22;

% recompute the Lagrange multiplier for pattern i2
if (eta < 0.0)
    a2 = alpha2 - y2*(e1 - e2)/eta;

    % constrain a2 to lie between L and H
    if (a2 < L)
        a2 = L;
    elseif (a2 > H)
        a2 = H;
    end
else
    % When eta is not negative, the objective function W should be
    % evaluated at each end of the line segment. Only those terms of the
    % objective function that depend on alpha2 need be evaluated.
    ind = find(SMO.alpha > 0);

    aa2 = L;
    aa1 = alpha1 + s*(alpha2 - aa2);
    Lobj = aa1 + aa2 + sum((-y1*aa1/2).*SMO.y(ind).*K(ind, i1) + (-y2*aa2/2).*SMO.y(ind).*K(ind, i2));

    aa2 = H;
    aa1 = alpha1 + s*(alpha2 - aa2);
    Hobj = aa1 + aa2 + sum((-y1*aa1/2).*SMO.y(ind).*K(ind, i1) + (-y2*aa2/2).*SMO.y(ind).*K(ind, i2));

    if (Lobj > Hobj + SMO.epsilon)
        a2 = L;
    elseif (Lobj < Hobj - SMO.epsilon)
        a2 = H;
    else
        a2 = alpha2;
    end
end

% (the listing was truncated at this point in the original post; the
% remainder of takeStep is reconstructed from Platt's SMO pseudocode)
if abs(a2 - alpha2) < SMO.epsilon*(a2 + alpha2 + SMO.epsilon), return;
end

% compute the new Lagrange multiplier for pattern i1
a1 = alpha1 + s*(alpha2 - a2);

w1 = y1*(a1 - alpha1);
w2 = y2*(a2 - alpha2);

% update the threshold so that an unbound multiplier has zero error
bold = SMO.bias;
b1 = bold + e1 + w1*k11 + w2*k12;
if (a1 > SMO.epsilon) && (a1 < (C - SMO.epsilon))
    bnew = b1;
else
    b2 = bold + e2 + w1*k12 + w2*k22;
    if (a2 > SMO.epsilon) && (a2 < (C - SMO.epsilon))
        bnew = b2;
    else
        bnew = (b1 + b2)/2;
    end
end
SMO.bias = bnew;

% update the error cache using the new Lagrange multipliers
SMO.error = SMO.error + w1*K(:, i1) + w2*K(:, i2) + bold - bnew;
SMO.error(i1) = 0.0;
SMO.error(i2) = 0.0;

% store the new Lagrange multipliers and report progress
SMO.alpha(i1) = a1;
SMO.alpha(i2) = a2;
RESULT = 1;
return;
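The pair update that takeStep performs can be summarized compactly outside MATLAB. Below is a minimal Python/NumPy sketch of one Platt-style step on a pair (i1, i2); the function name smo_pair_update is my own, and the degenerate eta >= 0 branch is skipped for brevity, so this is an illustration, not the toolbox code:

```python
import numpy as np

def smo_pair_update(K, y, alpha, b, i1, i2, C):
    """One Platt-style SMO step on the pair (i1, i2).

    K is the kernel (Gram) matrix, y the +/-1 labels, alpha the current
    multipliers and b the bias, with decision values f = K @ (alpha*y) - b.
    Returns the updated (alpha, b), or None if no progress is possible.
    Illustrative sketch only -- not the MATLAB implementation above.
    """
    if i1 == i2:
        return None
    f = K @ (alpha * y) - b                        # decision values f(x_i)
    e1, e2 = f[i1] - y[i1], f[i2] - y[i2]          # prediction errors
    s = y[i1] * y[i2]
    # box constraints L <= a2 <= H keep both alphas inside [0, C]
    if y[i1] != y[i2]:
        L, H = max(0.0, alpha[i2] - alpha[i1]), min(C, C + alpha[i2] - alpha[i1])
    else:
        L, H = max(0.0, alpha[i1] + alpha[i2] - C), min(C, alpha[i1] + alpha[i2])
    if L == H:
        return None
    eta = 2.0 * K[i1, i2] - K[i1, i1] - K[i2, i2]  # curvature along the pair
    if eta >= 0:
        return None                 # degenerate case skipped for brevity
    a2 = float(np.clip(alpha[i2] - y[i2] * (e1 - e2) / eta, L, H))
    a1 = alpha[i1] + s * (alpha[i2] - a2)          # keeps sum(y*alpha) constant
    # new bias: put the updated point i1 exactly on its margin
    b_new = b + e1 + y[i1] * (a1 - alpha[i1]) * K[i1, i1] \
              + y[i2] * (a2 - alpha[i2]) * K[i1, i2]
    alpha = alpha.copy()
    alpha[i1], alpha[i2] = a1, a2
    return alpha, b_new
```

On a two-point toy problem x = [[1], [-1]], y = [1, -1] (so K = [[1, -1], [-1, 1]]) with alpha = 0 and b = 0, a single call moves both multipliers to 0.5 and leaves both points exactly on their margins.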

Plotting file: start_SMOforSVM.m (click to interactively generate two-dimensional data and draw the result; only the linear case is handled here, but it can be modified for the nonlinear case)

clear
x = []; y = [];
figure;

% Initialize training data to empty; will get points from user
% Obtain points from the user:
trainPoints = x;
trainLabels = y;
clf;
axis([-5 5 -5 5]);

if isempty(trainPoints)
    % Define the symbols and colors we'll use in the plots later
    symbols = {'o', 'x'};
    classvals = [-1 1];
    trainLabels = [];
    hold on;  % allow for overwriting existing plots
    xlim([-5 5]);
    ylim([-5 5]);

    for c = 1:2
        title(sprintf('Click to create points from class %d. Press ENTER when finished.', c));
        [x, y] = getpts;

        plot(x, y, symbols{c}, 'LineWidth', 2, 'Color', 'black');

        % Grow the data and label matrices
        trainPoints = vertcat(trainPoints, [x y]);
        trainLabels = vertcat(trainLabels, repmat(classvals(c), numel(x), 1));
    end
end

% C = 10; tol = 0.001;
% par = SMOforSVM(trainPoints, trainLabels, C, tol);
% p = length(par.b); m = size(trainPoints, 2);
% if m == 2
%     k = -par.w(1)/par.w(2);
%     b0 = -par.b/par.w(2);
%     bdown = (-par.b - 1)/par.w(2);
%     bup = (-par.b + 1)/par.w(2);
%     for i = 1:p
%         hold on
%         h = refline(k, b0(i));
%         set(h, 'Color', 'r')
%         hdown = refline(k, bdown(i));
%         set(hdown, 'Color', 'b')
%         hup = refline(k, bup(i));
%         set(hup, 'Color', 'b')
%     end
% end
% xlim([-5 5]); ylim([-5 5]);
% pause

C = 10;
tol = 0.001;
par = SMO(trainPoints, trainLabels, C, tol);
p = length(par.b);
m = size(trainPoints, 2);

if m == 2
    k = -par.w(1)/par.w(2);
    b0 = -par.b/par.w(2);
    bdown = (-par.b - 1)/par.w(2);
    bup = (-par.b + 1)/par.w(2);

    for i = 1:p
        hold on
        h = refline(k, b0(i));
        set(h, 'Color', 'r')
        hdown = refline(k, bdown(i));
        set(hdown, 'Color', 'b')
        hup = refline(k, bup(i));
        set(hup, 'Color', 'b')
    end
end
xlim([-5 5]);
ylim([-5 5]);
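The refline calls in the script rely on rewriting the decision boundary w·x + b = 0 and the margins w·x + b = ±1 in slope/intercept form x2 = k*x1 + b0. A quick Python check of that algebra (margin_lines is a hypothetical helper, not part of the script):

```python
def margin_lines(w, b):
    """Slope and intercepts of the decision line and the two margins.

    For a 2-D linear SVM f(x) = w[0]*x1 + w[1]*x2 + b, solving f(x) = t
    for x2 gives x2 = -(w[0]/w[1])*x1 + (t - b)/w[1]; the script uses
    t = 0 (boundary), t = -1 and t = +1 (margins).
    margin_lines is a hypothetical helper, not part of the MATLAB script.
    """
    k = -w[0] / w[1]           # common slope of the three parallel lines
    b0 = -b / w[1]             # intercept of the decision boundary f = 0
    bdown = (-b - 1) / w[1]    # intercept of the lower margin f = -1
    bup = (-b + 1) / w[1]      # intercept of the upper margin f = +1
    return k, b0, bdown, bup
```

For w = [0, 2] and b = 4 this gives a horizontal boundary x2 = -2 with margins at x2 = -2.5 and x2 = -1.5.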

  
