# Paper 36: [Tutorial] optimization of SVM parameters based on Gridsearch

Source: Internet
Author: User
Tags: svm, rbf, kernel

Please respect the original author ~ ~ ~

A brief introduction to the ideas behind cross-validation (CV):
Http://www.matlabsky.com/forum-v ...-fromuid-18677.html

The following is excerpted from Chapter 13 of MATLAB Neural Network 30 Case Studies:

There is no universally accepted best method for selecting the SVM parameters. The commonly used approach is to let c and g vary over some range; for each candidate pair (c, g), run K-fold cross-validation (K-CV) on the training set and record the resulting CV classification accuracy; finally, take the pair (c, g) that yields the highest CV accuracy as the best parameters. One problem remains: several pairs (c, g) may achieve the same highest CV accuracy. How is this handled?

The approach adopted here is to select, among all pairs that reach the highest CV accuracy, the pair with the smallest penalty parameter c; if that smallest c corresponds to more than one g, the first such pair (c, g) encountered is chosen. The rationale is that an overly large c leads to an over-learning state, i.e. very high classification accuracy on the training set but low accuracy on the test set (the classifier's generalization ability is reduced). Therefore, among all pairs (c, g) attaining the highest CV accuracy, a smaller penalty parameter c is considered the better choice.
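The selection rule described above (highest CV accuracy, ties broken by the smallest c, remaining ties by the first pair encountered) can be sketched independently of the toolbox. The following Python sketch uses a made-up list of precomputed CV accuracies and a hypothetical helper `pick_best`; it illustrates only the tie-breaking logic, not the toolbox's actual API:

```python
# Sketch of the tie-breaking rule: among all (c, g) pairs, keep the one
# with the highest CV accuracy; on a (near-)tie, prefer the smaller c;
# if several g share that smallest c, keep the first pair seen.
EPS = 1e-4  # tolerance for treating two accuracies as equal

def pick_best(grid):
    """grid: list of (c, g, cv_accuracy) triples in scan order."""
    best_c, best_g, best_acc = None, None, -1.0
    for c, g, acc in grid:
        if acc > best_acc + EPS:                          # strictly better accuracy wins
            best_c, best_g, best_acc = c, g, acc
        elif abs(acc - best_acc) <= EPS and c < best_c:   # tie: smaller c wins
            best_c, best_g, best_acc = c, g, acc
        # equal accuracy with equal (or larger) c: keep the first pair seen
    return best_acc, best_c, best_g

# Hypothetical CV accuracies: two pairs tie at 95%, so the smaller c is chosen.
grid = [(2**4, 2**-2, 95.0), (2**1, 2**0, 95.0), (2**2, 2**3, 90.0)]
print(pick_best(grid))  # -> (95.0, 2, 1)
```

Note that a strict `>` comparison alone would keep the first pair found regardless of c; the explicit tie branch is what implements the "smallest c" preference.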

The above idea is implemented in the libsvm-mat-2.89-3[FarutoUltimate3.0] toolbox as SVMcgForClass.m (parameter optimization for classification problems) and SVMcgForRegress.m (parameter optimization for regression problems):

The function interfaces are as follows.

Grid-search parameter optimization (classification): SVMcgForClass

    [bestCVaccuracy,bestc,bestg] = SVMcgForClass(train_label,train,cmin,cmax,gmin,gmax,v,cstep,gstep,accstep)

Inputs:
- train_label: labels of the training set; same format as required by svmtrain.
- train: the training set; same format as required by svmtrain.
- cmin, cmax: search range of the penalty parameter c, i.e. the best c is searched for in [2^cmin, 2^cmax]. Defaults are cmin = -8, cmax = 8, so the default range for c is [2^(-8), 2^8].
- gmin, gmax: search range of the RBF kernel parameter g, i.e. the best g is searched for in [2^gmin, 2^gmax]. Defaults are gmin = -8, gmax = 8, so the default range for g is [2^(-8), 2^8].
- v: the fold count for cross-validation, i.e. v-fold CV is performed on the training set. Defaults to 3, i.e. 3-fold CV.
- cstep, gstep: step sizes for the exponents of c and g, i.e. c takes the values 2^cmin, 2^(cmin+cstep), ..., 2^cmax, and g takes the values 2^gmin, 2^(gmin+gstep), ..., 2^gmax. Defaults are cstep = 1, gstep = 1.
- accstep: step size (a number in [0, 100]) used to discretize the accuracy contours displayed in the result plot. Defaults to 4.5.

Outputs:
- bestCVaccuracy: the best cross-validation classification accuracy found.
- bestc: the best parameter c.
- bestg: the best parameter g.

Grid-search parameter optimization (regression): SVMcgForRegress

    [bestCVmse,bestc,bestg] = SVMcgForRegress(train_label,train,cmin,cmax,gmin,gmax,v,cstep,gstep,msestep)

Its inputs and outputs are analogous to those of SVMcgForClass and are not repeated here.
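To make the role of cmin/cmax/cstep (and likewise gmin/gmax/gstep) concrete, the sketch below shows how the exponent range and step expand into the candidate parameter values the grid search actually tries. The helper name `candidate_values` is hypothetical, not part of the toolbox:

```python
# Sketch: the exponents are stepped linearly from lo to hi;
# the candidate parameter values themselves are powers of 2.
def candidate_values(lo, hi, step, base=2):
    """Return [base**lo, base**(lo+step), ..., base**hi]."""
    vals, e = [], lo
    while e <= hi + 1e-12:   # small tolerance for fractional steps like 0.8
        vals.append(base ** e)
        e += step
    return vals

# With the documented defaults cmin = -8, cmax = 8, cstep = 1,
# there are 17 candidate c values, from 2^(-8) up to 2^8.
cs = candidate_values(-8, 8, 1)
print(len(cs), cs[0], cs[-1])  # -> 17 0.00390625 256
```

Halving cstep and gstep quadruples the number of (c, g) pairs to cross-validate, so finer steps trade run time for a finer search.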
SVMcgForClass.m source code:

```matlab
function [bestacc,bestc,bestg] = SVMcgForClass(train_label,train,cmin,cmax,gmin,gmax,v,cstep,gstep,accstep)
%SVMcg cross validation by faruto

%%
% by faruto
% email: [email protected]  qq: 516667408  http://blog.sina.com.cn/faruto  BNU

%% If you reproduce this, please cite:
% faruto and liyang, LIBSVM-farutoUltimateVersion
% a toolbox with implements for support vector machines based on libsvm, 2009.
%
% Chih-Chung Chang and Chih-Jen Lin, LIBSVM: a library for
% support vector machines, 2001. Software available at
% http://www.csie.ntu.edu.tw/~cjlin/libsvm

%% about the parameters of SVMcg
if nargin < 10
    accstep = 4.5;
end
if nargin < 8
    cstep = 0.8;
    gstep = 0.8;
end
if nargin < 7
    v = 5;
end
if nargin < 5
    gmax = 8;
    gmin = -8;
end
if nargin < 3
    cmax = 8;
    cmin = -8;
end
% X:c Y:g cg:CVaccuracy
[X,Y] = meshgrid(cmin:cstep:cmax,gmin:gstep:gmax);
[m,n] = size(X);
cg = zeros(m,n);

eps = 10^(-4);

%% record acc with different c & g, and find the bestacc with the smallest c
bestc = 1;
bestg = 0.1;
bestacc = 0;
basenum = 2;
for i = 1:m
    for j = 1:n
        % -v makes svmtrain return the CV accuracy instead of a model
        cmd = ['-v ',num2str(v),' -c ',num2str(basenum^X(i,j)),' -g ',num2str(basenum^Y(i,j))];
        cg(i,j) = svmtrain(train_label, train, cmd);

        if cg(i,j) <= 55   % skip clearly poor parameter pairs
            continue;
        end

        if cg(i,j) > bestacc
            bestacc = cg(i,j);
            bestc = basenum^X(i,j);
            bestg = basenum^Y(i,j);
        end

        % on a (near-)tie in accuracy, prefer the smaller c
        if abs(cg(i,j)-bestacc) <= eps && bestc > basenum^X(i,j)
            bestacc = cg(i,j);
            bestc = basenum^X(i,j);
            bestg = basenum^Y(i,j);
        end

    end
end
%% to draw the acc with different c & g
figure;
[C,h] = contour(X,Y,cg,70:accstep:100);
clabel(C,h,'Color','r');
xlabel('log2c','FontSize',12);
ylabel('log2g','FontSize',12);
firstline = 'SVC parameter selection result (contour map) [GridSearchMethod]';
secondline = ['Best c=',num2str(bestc),' g=',num2str(bestg), ...
    ' CVAccuracy=',num2str(bestacc),'%'];
title({firstline;secondline},'Fontsize',12);
grid on;

figure;
meshc(X,Y,cg);
% mesh(X,Y,cg);
% surf(X,Y,cg);
axis([cmin,cmax,gmin,gmax,30,100]);
xlabel('log2c','FontSize',12);
ylabel('log2g','FontSize',12);
zlabel('Accuracy(%)','FontSize',12);
firstline = 'SVC parameter selection result (3D view) [GridSearchMethod]';
secondline = ['Best c=',num2str(bestc),' g=',num2str(bestg), ...
    ' CVAccuracy=',num2str(bestacc),'%'];
title({firstline;secondline},'Fontsize',12);
```

libsvm-mat-2.89-3[FarutoUltimate3.0] download:
Http://www.matlabsky.com/forum-v ...-fromuid-18677.html

For usage instructions, see:

Miscellaneous notes on SVM [long-term updates compiled by faruto]
Http://www.matlabsky.com/forum-v ...-fromuid-18677.html

