Analysis of SVM training parameters in OpenCV 3.0

Source: Internet
Author: User
Tags svm

The OpenCV 3.0 and 2.4 SVM interfaces are different; in 3.0 the workflow looks like this:

ml::SVM::Params params;
params.svmType = ml::SVM::C_SVC;
params.kernelType = ml::SVM::POLY;
params.gamma = 3;
Ptr<ml::SVM> svm = ml::SVM::create(params);
Mat trainData; // each row is one sample
Mat labels;
svm->train(trainData, ml::ROW_SAMPLE, labels);
// ...

svm->save("..."); // the model is written as an XML file (YAML is also supported)
Ptr<SVM> svm = StatModel::load<SVM>("...");

Mat query; // input, 1 channel
Mat res;   // output
svm->predict(query, res);

Note that if you run into errors, it is best to consult the OpenCV 3.0 documentation, which contains the function prototypes and explanations. In actual use I also had to make a number of changes.

1) Set parameters

SVM has many parameters, but with C_SVC and the RBF kernel only gamma and C matter, so setting these two is enough. The termination criteria can be left at the default or set from experience (in fact, with a lot of data, setting gamma to 0.01 makes training converge much faster).

Ptr<SVM> svm = SVM::create();
svm->setType(SVM::C_SVC);
svm->setKernel(SVM::RBF);
svm->setGamma(0.01);
svm->setC(10.0);
svm->setTermCriteria(TermCriteria(CV_TERMCRIT_EPS, 1000, FLT_EPSILON));

svm_type – specifies the type of SVM; possible values:

CvSVM::C_SVC – C-Support Vector Classification. n-class classification (n ≥ 2); imperfect separation of classes is allowed, with the outlier penalty factor C.
CvSVM::NU_SVC – ν-Support Vector Classification. n-class classification with possible imperfect separation. The parameter ν is used instead of C (its value lies in the interval (0, 1]; the larger ν is, the smoother the decision boundary).
CvSVM::ONE_CLASS – one-class classifier. All the training data come from the same class, and the SVM builds a boundary that separates the region occupied by that class in feature space from the rest of the feature space.
CvSVM::EPS_SVR – ε-Support Vector Regression. The distance between feature vectors from the training set and the fitted hyperplane must be less than p; outliers are handled with the penalty factor C.
CvSVM::NU_SVR – ν-Support Vector Regression. ν is used instead of p.

kernel_type – the SVM kernel type; possible values:

CvSVM::LINEAR – linear kernel. No mapping to a higher-dimensional space is done; the linear discrimination (or regression) is carried out in the original feature space. This is the fastest option. K(x_i, x_j) = x_i^T x_j.
CvSVM::POLY – polynomial kernel: K(x_i, x_j) = (γ x_i^T x_j + coef0)^degree, γ > 0.
CvSVM::RBF – radial basis function kernel, a good choice for most situations: K(x_i, x_j) = e^{-γ ||x_i - x_j||^2}, γ > 0.
CvSVM::SIGMOID – sigmoid kernel: K(x_i, x_j) = tanh(γ x_i^T x_j + coef0).

degree – parameter degree of the kernel function (POLY).

gamma – parameter γ of the kernel function (POLY / RBF / SIGMOID).

coef0 – parameter coef0 of the kernel function (POLY / SIGMOID).

Cvalue – parameter C of the SVM optimization problem (C_SVC / EPS_SVR / NU_SVR).

nu – parameter ν of the SVM optimization problem (NU_SVC / ONE_CLASS / NU_SVR).

p – parameter ε of the SVM optimization problem (EPS_SVR).

class_weights – optional weights in C_SVC, assigned to particular classes; each is multiplied by C to become class_weights_i * C. These weights affect the misclassification penalty for the different classes: the larger the weight, the larger the penalty for misclassifying data from that class.

term_crit – termination criteria of the iterative SVM training procedure, which solves a partial case of a constrained quadratic optimization problem. You can specify a tolerance and/or a maximum number of iterations.

2) Training

Mat trainData;
Mat labels;
trainData = read_mnist_image(trainImage);
labels = read_mnist_label(trainLabel);

svm->train(trainData, ROW_SAMPLE, labels);

3) Save

svm->save("mnist_dataset/mnist_svm.xml");

4) Test and compare the results

(FLT_EPSILON here is a very small number; 1.0 - FLT_EPSILON != 1.0.)

Mat testData;
Mat tLabel;
testData = read_mnist_image(testImage);
tLabel = read_mnist_label(testLabel);

float count = 0;
for (int i = 0; i < testData.rows; i++) {
    Mat sample = testData.row(i);
    float res = svm->predict(sample);
    res = std::abs(res - tLabel.at<unsigned int>(i, 0)) <= FLT_EPSILON ? 1.f : 0.f;
    count += res;
}
cout << "number of correct predictions: count = " << count << endl;
cout << "error rate: " << (10000 - count + 0.0) / 10000 * 100.0 << "%\n";

svm->predict(query, res) is not used here. Checking the OpenCV documentation shows that when the input data is a Mat (rather than a CvMat), the float return value of predict can be used directly to judge whether the prediction is correct.
