First, ensemble methods
1. What is an ensemble method?
An ensemble method, also known as a meta-algorithm, combines multiple learning algorithms. Ensemble methods take many forms: they can combine different algorithms, or combine the same algorithm trained under different settings.
2. Why use ensemble methods?
The intuition is the proverb "three cobblers with their wits combined equal one Zhuge Liang": for classification, we combine the opinions of multiple classifiers, typically by voting.
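The "combine many opinions" idea can be illustrated with a toy majority vote (a minimal sketch; `majority_vote` is a hypothetical helper, not from the original notes):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the class votes of several classifiers (ties broken arbitrarily)."""
    return Counter(predictions).most_common(1)[0][0]

# Two correct "cobblers" out-vote one wrong opinion:
print(majority_vote([1, 1, -1]))  # -> 1
```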
3. Weak classifiers and strong classifiers
A weak classifier performs only slightly better than random guessing, while a strong classifier achieves high accuracy; boosting builds a strong classifier out of weak ones.
Second, boosting
1. Boosting is a kind of meta-algorithm
2. What are its characteristics?
(1) All of the classifiers in the ensemble are of the same type.
(2) The classifiers carry unequal weights; each weight reflects how well the corresponding classifier performed in the previous round of iteration.
(3) Each new classifier focuses on the data misclassified by the previous classifiers.
3. AdaBoost, a boosting algorithm
AdaBoost: adaptive boosting.
Third, AdaBoost
1. AdaBoost algorithm flow
(1) Assign an equal weight to every sample in the training set.
(2) Train a weak classifier.
(3) Adjust the sample weights: increase the weights of misclassified samples and decrease the weights of correctly classified ones.
Also assign each classifier a weight: the larger its error rate, the smaller its weight.
(4) Repeat (2)-(3) until the training error rate drops to 0 or the number of weak classifiers reaches the user-specified limit.
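One round of steps (2)-(3) can be sketched as follows, using the standard AdaBoost formulas: the classifier weight is alpha = 0.5 * ln((1 - err) / err), and sample weights are scaled by exp(-alpha * y * prediction). This is a minimal sketch; `adaboost_round` is a hypothetical helper name, and labels are assumed to be in {-1, +1}:

```python
import numpy as np

def adaboost_round(D, y_true, y_pred):
    """One AdaBoost weight update for sample weights D (labels in {-1, +1})."""
    err = np.sum(D[y_true != y_pred])                    # weighted error of this weak classifier
    alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-16))  # classifier weight: large err -> small alpha
    D = D * np.exp(-alpha * y_true * y_pred)             # raise weights of misclassified samples
    return D / D.sum(), alpha                            # renormalize to a distribution

D = np.ones(5) / 5                        # step (1): equal initial weights
y = np.array([1, 1, -1, -1, 1])
pred = np.array([1, 1, -1, -1, -1])       # only the last sample is misclassified
D, alpha = adaboost_round(D, y, pred)
print(D)                                  # the misclassified sample now carries the largest weight
```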
2. Weak classifiers
Any classification algorithm can serve as the weak classifier, but simple classifiers tend to work better; these notes use a single-layer decision tree (a decision stump).
Decision stump: a tree that makes its decision based on a single feature, with only one split.
Fourth, implementation
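The full flow above, with a decision stump as the weak classifier, can be sketched as below. This is a minimal illustrative implementation under the stated assumptions (labels in {-1, +1}; `best_stump`, `adaboost_train`, etc. are hypothetical names, not from the original notes):

```python
import numpy as np

def stump_predict(X, j, thresh, sign):
    """Decision stump: split on feature j at thresh; sign says which side predicts +1."""
    return np.where(X[:, j] <= thresh, sign, -sign)

def best_stump(X, y, D):
    """Pick the stump with minimum weighted error under sample weights D."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for sign in (1, -1):
                err = D[stump_predict(X, j, thresh, sign) != y].sum()
                if err < best_err:
                    best, best_err = (j, thresh, sign), err
    return best, best_err

def adaboost_train(X, y, max_rounds=20):
    """AdaBoost with decision stumps; labels must be in {-1, +1}."""
    n = len(y)
    D = np.ones(n) / n                                   # (1) equal initial weights
    ensemble, agg = [], np.zeros(n)
    for _ in range(max_rounds):
        (j, t, s), err = best_stump(X, y, D)             # (2) train a weak classifier
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-16))
        pred = stump_predict(X, j, t, s)
        D *= np.exp(-alpha * y * pred)                   # (3) reweight the samples
        D /= D.sum()
        ensemble.append((alpha, j, t, s))
        agg += alpha * pred
        if np.all(np.sign(agg) == y):                    # (4) stop at zero training error
            break
    return ensemble

def adaboost_predict(ensemble, X):
    """Weighted vote of all stumps."""
    agg = sum(a * stump_predict(X, j, t, s) for a, j, t, s in ensemble)
    return np.sign(agg)

X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([1, 1, -1, -1, 1, 1])        # no single stump can separate this
model = adaboost_train(X, y)
print(adaboost_predict(model, X))         # the ensemble recovers y exactly
```

Note that the training labels here are not separable by any single stump, which is exactly why several weighted stumps are needed.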
Reference: Statistical Learning Methods, Chapter 8: Boosting