The routine of Kaggle competitions


Image data: convolutional neural networks are king; a handful of common pre-trained frameworks get swapped in and fine-tuned by competitors.
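
As a minimal sketch of that routine, assuming TensorFlow/Keras is available (the ResNet50 backbone and the 10-class head below are illustrative choices, not from the original article), the common pattern is to take a pre-trained convolutional backbone, freeze it, and fine-tune a small head on the competition's images:

```python
import tensorflow as tf

# Pre-trained convolutional backbone (weights learned on ImageNet).
base = tf.keras.applications.ResNet50(weights="imagenet",
                                      include_top=False,
                                      pooling="avg")
base.trainable = False  # freeze the backbone; train only the new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 classes, illustrative
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(train_images, train_labels, ...) on the competition data
```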

Non-image (tabular) feature data, by category:

Boosting-family algorithms: typically implemented with the XGBoost framework.

The AdaBoost algorithm trains the same base classifier (weak classifier) on a series of reweighted training sets, then combines the classifiers obtained on those sets into a stronger final classifier (strong classifier). Theory shows that as long as each weak classifier performs better than random guessing, the error rate of the strong classifier tends to zero as the number of weak classifiers tends to infinity.

The different training sets in AdaBoost are produced by adjusting the weight assigned to each sample. Initially every sample has the same weight, and a base classifier H1(x) is trained under this distribution. The weights of samples that H1(x) misclassifies are then increased, while the weights of correctly classified samples are reduced; this highlights the hard samples and yields a new sample distribution. At the same time, H1(x) itself is given a weight that reflects its importance: the fewer its errors, the larger its weight. Under the new sample distribution the base classifier is trained again, producing H2(x) and its weight. Repeating this for T rounds yields T base classifiers and T corresponding weights. Finally, the T base classifiers are combined in a weighted sum to obtain the desired strong classifier.
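
The reweighting loop is easier to follow in code. Below is a minimal from-scratch sketch of binary AdaBoost with decision stumps as the weak classifiers, assuming scikit-learn and NumPy are available (the function names and the stump choice are illustrative, not from the original article):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification

def adaboost_fit(X, y, T=50):
    """Train up to T weak classifiers; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)           # initial uniform sample weights
    stumps, alphas = [], []
    for _ in range(T):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])    # weighted error of this round
        if err >= 0.5:                # no better than random guessing: stop
            break
        # Classifier weight: the fewer its errors, the larger alpha.
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
        # Raise weights of misclassified samples (y * pred = -1),
        # lower weights of correctly classified ones (y * pred = +1).
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                  # renormalise to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    """Final strong classifier: weighted vote of the base classifiers."""
    score = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(score)

X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1                         # map {0, 1} labels to {-1, +1}
stumps, alphas = adaboost_fit(X, y)
print("training accuracy:", (adaboost_predict(X, stumps, alphas) == y).mean())
```

The update w ← w · exp(−α · y · H(x)) is exactly the step described above: it increases the weight of misclassified samples and decreases it for correct ones before the next round.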

XGBoost, ExtraTrees, GradientBoosting, and RandomForest classifiers
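
A hedged sketch of how these four classifier families are commonly instantiated for tabular data, assuming scikit-learn and the xgboost package are installed (the hyperparameters below are illustrative defaults, not from the original article):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (ExtraTreesClassifier,
                              GradientBoostingClassifier,
                              RandomForestClassifier)
from xgboost import XGBClassifier

# The four classifier families named above.
models = {
    "xgboost": XGBClassifier(n_estimators=200),
    "extra_trees": ExtraTreesClassifier(n_estimators=200),
    "gradient_boosting": GradientBoostingClassifier(n_estimators=200),
    "random_forest": RandomForestClassifier(n_estimators=200),
}

# A common routine: fit each model, then blend or stack the predictions.
X, y = make_classification(n_samples=200, random_state=0)
for name, model in models.items():
    model.fit(X, y)
    print(name, model.score(X, y))
```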
