Machine Learning Algorithms in OpenCV 3

Source: Internet
Author: User
Tags: svm

OpenCV 3.0 provides an ml module containing its machine learning algorithms. In total it offers the following:

1. Normal Bayes: normal Bayes classifier. I have introduced it in another blog post: Machine Learning in OpenCV 3: Classification with the Normal Bayes Classifier

2. K Nearest Neighbors: K nearest neighbors classifier

3. Support Vector Machine: support vector machines. Please see my other post: Machine Learning in OpenCV 3: Classification with SVM (Support Vector Machine)

4. Decision Trees: decision trees

5. AdaBoost: adaptive boosting

6. Gradient Boosted Trees: gradient boosted decision trees

7. Random Forest: random forests

8. Artificial Neural Networks: artificial neural networks

9. EM: expectation-maximization

Any machine learning book describes these algorithms. Their classification workflows are broadly similar, consisting of three steps:

1. Collect the sample data (sampleData)

2. Train the classifier (model)

3. Predict on the test data (testData)

What differs from algorithm to algorithm is the parameter setup in OpenCV. In the following, assume the training data is trainingDataMat, its labels are labelsMat, and the data to classify is testMat.

1. Normal Bayes

// create a normal Bayes classifier
Ptr<NormalBayesClassifier> model = NormalBayesClassifier::create();
// set up the training data
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
// train the classifier
model->train(tdata);
// predict on the test data
float response = model->predict(testMat);

2. K Nearest Neighbors

// create a KNN classifier
Ptr<KNearest> knn = KNearest::create();
// set the K value
knn->setDefaultK(K);
knn->setIsClassifier(true);
// set up the training data
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
knn->train(tdata);
float response = knn->predict(testMat);

3. Support Vector Machine

// create a classifier
Ptr<SVM> svm = SVM::create();
// set the SVM type
svm->setType(SVM::C_SVC);
// set the kernel function
svm->setKernel(SVM::POLY);
svm->setDegree(0.5);
svm->setGamma(1);
svm->setCoef0(1);
svm->setNu(0.5);
svm->setP(0);
// the iteration count below was garbled in the original post; 1000 is a placeholder
svm->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER + TermCriteria::EPS, 1000, 0.01));
svm->setC(C);
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
svm->train(tdata);
float response = svm->predict(testMat);

4. Decision Trees

// create a classifier
Ptr<DTrees> dtree = DTrees::create();
// set the maximum depth
dtree->setMaxDepth(8);
dtree->setMinSampleCount(2);
dtree->setUseSurrogates(false);
// cross-validation folds
dtree->setCVFolds(0);
dtree->setUse1SERule(false);
dtree->setTruncatePrunedTree(false);
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
dtree->train(tdata);
float response = dtree->predict(testMat);

5. AdaBoost

Ptr<Boost> boost = Boost::create();
boost->setBoostType(Boost::DISCRETE);
boost->setWeakCount(100);
boost->setWeightTrimRate(0.95);
boost->setMaxDepth(2);
boost->setUseSurrogates(false);
boost->setPriors(Mat());
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
boost->train(tdata);
float response = boost->predict(testMat);

6. Gradient Boosted Trees

For unknown reasons this algorithm is commented out in OpenCV 3.0, so the older API is shown here instead.

GBTrees::Params params(GBTrees::DEVIANCE_LOSS,  // loss_function_type
                       100,                     // weak_count (original value garbled; 100 is a placeholder)
                       0.1f,                    // shrinkage
                       1.0f,                    // subsample_portion
                       2,                       // max_depth
                       false);                  // use_surrogates
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
Ptr<GBTrees> gbtrees = StatModel::train<GBTrees>(tdata, params);
float response = gbtrees->predict(testMat);

7. Random Forest

Ptr<RTrees> rtrees = RTrees::create();
rtrees->setMaxDepth(4);
rtrees->setMinSampleCount(2);
rtrees->setRegressionAccuracy(0.f);
rtrees->setUseSurrogates(false);
// the value below was garbled in the original post; 16 is a placeholder
rtrees->setMaxCategories(16);
rtrees->setPriors(Mat());
rtrees->setCalculateVarImportance(false);
rtrees->setActiveVarCount(1);
rtrees->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER, 5, 0));
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
rtrees->train(tdata);
float response = rtrees->predict(testMat);

8. Artificial Neural Networks

Ptr<ANN_MLP> ann = ANN_MLP::create();
ann->setLayerSizes(layer_sizes);
// several parameter values below were garbled in the original post;
// the values shown are typical defaults from OpenCV's sample code
ann->setActivationFunction(ANN_MLP::SIGMOID_SYM, 1, 1);
ann->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER + TermCriteria::EPS, 300, FLT_EPSILON));
ann->setTrainMethod(ANN_MLP::BACKPROP, 0.001);
Ptr<TrainData> tdata = TrainData::create(trainingDataMat, ROW_SAMPLE, labelsMat);
ann->train(tdata);
float response = ann->predict(testMat);

9. EM: expectation-maximization

The EM algorithm differs slightly from the previous ones: it needs to create several models, splitting trainingDataMat into per-class modelSamples and training one model on each.

The core training code is:

// one EM model per class; nclasses (the number of distinct labels) is assumed
// to be known
int nmodels = nclasses;
vector<Ptr<EM> > em_models(nmodels);
Mat modelSamples;
for (int i = 0; i < nmodels; i++)
{
    const int componentCount = 3;
    modelSamples.release();
    for (int j = 0; j < labelsMat.rows; j++)
    {
        if (labelsMat.at<int>(j, 0) == i)
            modelSamples.push_back(trainingDataMat.row(j));
    }
    // learn models
    if (!modelSamples.empty())
    {
        Ptr<EM> em = EM::create();
        em->setClustersNumber(componentCount);
        em->setCovarianceMatrixType(EM::COV_MAT_DIAGONAL);
        em->trainEM(modelSamples, noArray(), noArray(), noArray());
        em_models[i] = em;
    }
}

Prediction:

Mat logLikelihoods(1, nmodels, CV_64FC1, Scalar(-DBL_MAX));
for (int i = 0; i < nmodels; i++)
{
    if (!em_models[i].empty())
        logLikelihoods.at<double>(i) = em_models[i]->predict2(testMat, noArray())[0];
}

Among all these machine learning algorithms, my understanding from actual use is that you really only need to master the SVM algorithm.

In OpenCV the ANN algorithm is also called the multi-layer perceptron (MLP), so the network must be divided into multiple layers for training.

The EM algorithm needs to create a model for each class.
