LIBSVM Parameter Description for Machine Learning

Source: Internet
Author: User
Tags: svm, rbf, kernel

Because I needed to use SVM for regression, I read up on LIBSVM; this post summarizes my notes for future reference.

When training a model, LIBSVM exposes the following parameters. Default values exist, but in a concrete application the defaults usually perform noticeably worse than tuned values.

Options: the available options and their meanings are as follows.

-s svm_type: type of SVM to train (default 0)
    0 -- C-SVC
    1 -- nu-SVC
    2 -- one-class SVM
    3 -- epsilon-SVR
    4 -- nu-SVR


-t kernel_type: type of kernel function (default 2)
    0 -- linear: u'*v
    1 -- polynomial: (gamma*u'*v + coef0)^degree
    2 -- RBF: exp(-gamma*|u-v|^2)
    3 -- sigmoid: tanh(gamma*u'*v + coef0)


-d degree: degree in the kernel function (for the polynomial kernel) (default 3)
-g gamma: gamma in the kernel function (for the polynomial/RBF/sigmoid kernels) (default 1/k)
-r coef0: coef0 in the kernel function (for the polynomial/sigmoid kernels) (default 0)
-c cost: the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu: the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon: the epsilon in the loss function of epsilon-SVR (default 0.1)
-m cachesize: cache memory size in MB (default 40)
-e epsilon: tolerance of the termination criterion (default 0.001)
-h shrinking: whether to use the shrinking heuristics, 0 or 1 (default 1)
-wi weight: set the parameter C of class i to weight*C in C-SVC (default 1)
-v n: n-fold cross-validation mode; n must be greater than or equal to 2
Here k in the -g option is the number of attributes (features) in the input data, and the -v option randomly splits the data into n parts.
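As a reference for the four kernel types listed under -t, here is a plain-Python sketch of each formula. This is an illustration of the math only, not LIBSVM's actual C implementation:

```python
import math

def linear(u, v):
    # -t 0: u' * v
    return sum(ui * vi for ui, vi in zip(u, v))

def polynomial(u, v, gamma, coef0, degree):
    # -t 1: (gamma * u' * v + coef0) ^ degree
    return (gamma * linear(u, v) + coef0) ** degree

def rbf(u, v, gamma):
    # -t 2: exp(-gamma * |u - v|^2)
    return math.exp(-gamma * sum((ui - vi) ** 2 for ui, vi in zip(u, v)))

def sigmoid(u, v, gamma, coef0):
    # -t 3: tanh(gamma * u' * v + coef0)
    return math.tanh(gamma * linear(u, v) + coef0)
```

Note that gamma, coef0, and degree correspond directly to the -g, -r, and -d options above.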

Once the model type is chosen, appropriate values still have to be picked for the parameters above. The main method is grid search (the alternatives never felt worth using); grid search is essentially an exhaustive, brute-force search over the parameter grid.


Grid parameter optimization function (classification): SVMcgForClass

[bestCVaccuracy,bestc,bestg] =
    SVMcgForClass(train_label,train,
        cmin,cmax,gmin,gmax,v,cstep,gstep,accstep)

Input:
train_label: labels of the training set, in the same format required by svmtrain.
train: the training set, in the same format required by svmtrain.
cmin,cmax: range of the penalty parameter C; the best C is searched for in [2^cmin, 2^cmax]. Defaults are cmin = -8, cmax = 8, i.e. the default range of C is [2^(-8), 2^8].
gmin,gmax: range of the RBF kernel parameter g; the best g is searched for in [2^gmin, 2^gmax]. Defaults are gmin = -8, gmax = 8, i.e. the default range of g is [2^(-8), 2^8].
v: the fold count for cross-validation, i.e. v-fold cross-validation is run on the training set; the default is 3, i.e. 3-fold CV.
cstep,gstep: step sizes for the exponents of C and g, i.e. C takes the values 2^cmin, 2^(cmin+cstep), ..., 2^cmax and g takes the values 2^gmin, 2^(gmin+gstep), ..., 2^gmax. Defaults are cstep = 1, gstep = 1.
accstep: step size (a number in [0, 100]) used to discretize the accuracy contours displayed in the result plot; the default is 4.5.

Output:
bestCVaccuracy: the best cross-validation classification accuracy found.
bestc: the best parameter C.
bestg: the best parameter g.
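The exhaustive search that SVMcgForClass performs can be sketched in Python. Here `cv_accuracy` is a hypothetical stand-in for running v-fold cross-validation at a given (C, g) pair and returning the accuracy:

```python
def grid_search(cv_accuracy, cmin=-8, cmax=8, gmin=-8, gmax=8, cstep=1, gstep=1):
    """Try every C = 2^cmin, 2^(cmin+cstep), ..., 2^cmax and
    g = 2^gmin, 2^(gmin+gstep), ..., 2^gmax, keeping the best pair."""
    best_acc, best_c, best_g = -1.0, None, None
    c_exp = cmin
    while c_exp <= cmax:
        g_exp = gmin
        while g_exp <= gmax:
            c, g = 2.0 ** c_exp, 2.0 ** g_exp
            acc = cv_accuracy(c, g)   # v-fold CV accuracy for this (C, g)
            if acc > best_acc:
                best_acc, best_c, best_g = acc, c, g
            g_exp += gstep
        c_exp += cstep
    return best_acc, best_c, best_g
```

The real MATLAB function additionally plots the accuracy contours (the accstep parameter controls that plot); this sketch keeps only the search itself.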


Grid parameter optimization function (regression): SVMcgForRegress

[bestCVmse,bestc,bestg] =
    SVMcgForRegress(train_label,train,
        cmin,cmax,gmin,gmax,v,cstep,gstep,msestep)

The inputs and outputs are analogous to SVMcgForClass and are not repeated here.


Once training is complete, you should understand what the returned model contains and what each field means before using it for classification or regression.

Below is the model obtained by training LIBSVM on the heart data set.

model =
      Parameters: [5x1 double]
        nr_class: 2
         totalSV: 259              % total number of support vectors
             rho: 0.0514           % bias term (b = -rho in the decision function)
           Label: [2x1 double]     % the class labels
           ProbA: []
           ProbB: []
             nSV: [2x1 double]     % number of support vectors per class
         sv_coef: [259x1 double]   % coefficients of the support vectors
             SVs: [259x13 double]  % the 259 support vectors themselves (13 features each)

The entries of model.Parameters, from top to bottom:

-s svm_type: type of SVM (default 0)
-t kernel_type: type of kernel function (default 2)
-d degree: degree in the kernel function (for the polynomial kernel) (default 3)
-g gamma: gamma in the kernel function (for the polynomial/RBF/sigmoid kernels) (default 1/k, where k is the number of features)
-r coef0: coef0 in the kernel function (for the polynomial/sigmoid kernels) (default 0)
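To make the model fields concrete: for an RBF-kernel classifier, prediction combines sv_coef, SVs, and rho as in the following pure-Python sketch. The numbers here are a toy two-support-vector example, not the real heart model:

```python
import math

def rbf(u, v, gamma):
    # RBF kernel: exp(-gamma * |u - v|^2)
    return math.exp(-gamma * sum((ui - vi) ** 2 for ui, vi in zip(u, v)))

def decision_value(x, SVs, sv_coef, rho, gamma):
    # f(x) = sum_i sv_coef[i] * K(SV_i, x) - rho
    # the predicted label is the sign of f(x)
    return sum(c * rbf(sv, x, gamma) for c, sv in zip(sv_coef, SVs)) - rho

# toy model with two support vectors in 2-D
SVs = [[0.0, 0.0], [1.0, 1.0]]
sv_coef = [1.0, -1.0]
f = decision_value([0.0, 0.0], SVs, sv_coef, rho=0.0, gamma=1.0)
```

For the heart model above, the same computation would run over all 259 rows of SVs with the 259 entries of sv_coef and rho = 0.0514.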


How to get good results with SVM

1. Normalize the data (simple scaling).

2. Use the RBF kernel.

3. Use cross-validation and grid search to obtain the optimal C and g.

4. Train on the full training set with the optimal C and g.

5. Test.
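Step 1 is what LIBSVM's svm-scale tool does. A minimal per-feature min-max scaling to [-1, 1], written as a sketch rather than a reproduction of svm-scale, looks like this:

```python
def scale_fit(data, lower=-1.0, upper=1.0):
    """Learn per-feature min/max on the training set and return a transform."""
    n_feat = len(data[0])
    mins = [min(row[j] for row in data) for j in range(n_feat)]
    maxs = [max(row[j] for row in data) for j in range(n_feat)]

    def transform(row):
        out = []
        for x, lo, hi in zip(row, mins, maxs):
            if hi == lo:                  # constant feature: leave unchanged
                out.append(x)
            else:
                out.append(lower + (upper - lower) * (x - lo) / (hi - lo))
        return out

    return transform

train = [[0.0, 10.0], [5.0, 20.0], [10.0, 30.0]]
transform = scale_fit(train)
scaled = [transform(row) for row in train]
```

The important detail (as with svm-scale) is that the min/max are learned on the training set only, and the same transform is then applied to the test set.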

Copyright notice: this is the blogger's original article; do not reproduce it without the blogger's permission.
