The AdaBoost Classifier


Source: http://www.cnblogs.com/hrhguanli/p/3932488.html
A Brief Introduction to Boosting

Ensemble methods combine multiple weak classifiers into a strong classifier. A simple approach that predates boosting is bagging: train each weak classifier on a sample drawn from the training set, let the weak classifiers vote, and take the winner of the poll as the final result. Such a simple voting strategy rarely works very well on its own; it was not until boosting was invented that the power of combining weak classifiers was really brought out. Boosting means to strengthen or promote, that is, to promote weak classifiers into a strong classifier, and the boosting algorithm we hear about most often, and the most representative one, is AdaBoost. AdaBoost, short for adaptive boosting, adaptively adjusts the sample weights according to the error rate fed back by each round of learning, so it needs no prior knowledge and trains itself. Breiman praised AdaBoost in his paper as the best off-the-shelf method.

Discrete AdaBoost Algorithm Flow

Broadly speaking, the AdaBoost family includes Discrete AdaBoost, Real AdaBoost, LogitBoost, and Gentle AdaBoost. The overall training framework is similar for all of them. Taking Discrete AdaBoost as an example, its training process is as follows:


First, initialize every sample with the same weight (step 2). Then, in each round, train a weak classifier on the weighted samples (step 3.1), compute the weighted training error rate and the corresponding scale factor (step 3.2), increase the weights of the misclassified samples, and normalize the changed weights again (step 3.3). When the training loop ends, the scale factors are used to combine the weak classifiers into the final strong classifier.
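Written out explicitly, these quantities take the following standard form (assuming labels $y_i \in \{-1, +1\}$, weak classifiers $h_t$, and sample weights $w_i$):

$$
\varepsilon_t = \sum_{i=1}^{N} w_i\,\mathbf{1}\!\left[h_t(x_i) \neq y_i\right],
\qquad
\alpha_t = \frac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t},
$$

$$
w_i \leftarrow \frac{w_i \exp\!\bigl(-\alpha_t\, y_i\, h_t(x_i)\bigr)}{Z_t},
\qquad
H(x) = \operatorname{sign}\!\left(\sum_{t=1}^{T} \alpha_t\, h_t(x)\right),
$$

where $Z_t$ normalizes the weights so that they sum to one.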
The following picture shows this more graphically: multiple weak classifiers are combined, and the result is as follows:



Increasing the weights of the misclassified samples is an effective way to accelerate training: weak classifiers with a high accuracy get larger scale factors, correctly classified samples end up with smaller and smaller weights, and samples with small weights play a smaller role in the next round. In other words, each new round of training concentrates on the samples that were misclassified before. A minimal sketch of the whole training loop is given below.
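To make the loop concrete, here is a minimal sketch of Discrete AdaBoost in Python with NumPy. It is illustrative only, not the code of any particular library: labels are assumed to be in {-1, +1} and the weak learner is a simple one-dimensional decision stump.

```python
import numpy as np

def train_stump(X, y, w):
    """Pick the feature, threshold and polarity with the lowest weighted error."""
    n_samples, n_features = X.shape
    best = None
    for j in range(n_features):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, polarity)
    return best  # (weighted error, feature index, threshold, polarity)

def stump_predict(X, j, thr, polarity):
    return np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)

def adaboost_train(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                        # step 2: equal initial weights
    ensemble = []
    for _ in range(n_rounds):
        err, j, thr, pol = train_stump(X, y, w)    # step 3.1: train on weighted data
        err = max(err, 1e-10)                      # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)      # step 3.2: scale factor
        pred = stump_predict(X, j, thr, pol)
        w *= np.exp(-alpha * y * pred)             # step 3.3: raise misclassified weights
        w /= w.sum()                               # ...and re-normalize
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(alpha * stump_predict(X, j, thr, pol)
                for alpha, j, thr, pol in ensemble)
    return np.sign(score)
```

On a small dataset you would call `ensemble = adaboost_train(X, y)` and then `adaboost_predict(ensemble, X_test)`; the brute-force stump search is only meant to show where the sample weights enter the training.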

In actual training the weak classifiers all have the same form; what differs is the (weighted) training data they see, and usually each dimension of the feature vector is used to build one weak classifier. The famous Haar + AdaBoost face detection method builds one weak classifier from each Haar-like feature. Block-based Haar features carry much more information than simple pixel-based features and usually give a better detection result, and the integral-image technique gives them a big advantage in computation speed. Interested readers can refer to "Face recognition based on AdaBoost and Haar-like features".

Real AdaBoost and Gentle AdaBoost

Discrete AdaBoost is the simplest variant: the weak classifiers output only the two class labels. Real AdaBoost (also known as AdaBoost.MH) can be seen as a generalization of Discrete AdaBoost in which each weak classifier outputs a confidence for its classification, so every weak classifier is less "arbitrary" (a small sketch of such a confidence-rated weak classifier is given below). Gentle AdaBoost, in turn, changes how the weights of misclassified samples are updated during iterative training: it puts less emphasis on the hard-to-classify samples, which avoids the situation in the original AdaBoost where "atypical" positive samples get very large weights and drag down the performance of the classifier.

AdaBoost Matlab Toolbox

The GML_AdaBoost_Matlab_Toolbox implements Real AdaBoost, Gentle AdaBoost, and Modest AdaBoost, and comes with an introductory overview (see the manual inside the toolbox; you can also refer to "CART and GML AdaBoost Matlab Toolbox").
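As a sketch of what "confidence-rated" means in Real AdaBoost, the stump below follows the standard domain-partitioning formulation; the feature index `j`, threshold `thr`, and smoothing constant `eps` are illustrative choices, not from the original post.

```python
import numpy as np

def real_stump(X, y, w, j, thr, eps=1e-6):
    """Confidence-rated stump on feature j: each side of the threshold
    outputs 0.5 * ln(W+ / W-) computed from the sample weights."""
    side = X[:, j] >= thr
    out = np.empty(len(y))
    for mask in (side, ~side):
        w_pos = w[mask & (y == 1)].sum() + eps   # weighted positive mass on this side
        w_neg = w[mask & (y == -1)].sum() + eps  # weighted negative mass on this side
        out[mask] = 0.5 * np.log(w_pos / w_neg)
    return out  # real-valued confidences instead of hard {-1, +1} labels
```

Where Discrete AdaBoost returns a hard ±1 on each side of the threshold, this stump returns the half log-ratio of the weighted positive and negative mass on that side, so a side that is almost pure speaks with more confidence.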

As for LogitBoost, I honestly do not understand it yet; for details you can refer to "Some notes on AdaBoost in OpenCV".


[Figures: pseudo-code of several improvements of AdaBoost: Real AdaBoost, Gentle AdaBoost, LogitBoost, and Modest AdaBoost]

