Xgboost Series
Ubuntu 14.04 Installation
pip install xgboost
Error
sudo apt-get update
Running the update still produced the same error.
Workaround:
sudo -H pip install --pre xgboost
Successfully installed xgboost
Cleaning up ...
It worked!
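To confirm the installation, a quick import check is enough (a minimal sketch; the printed version simply depends on what pip installed):

import xgboost as xgb
print(xgb.__version__)   # prints the installed version if the package imported correctly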
Overfitting
When you observe that training accuracy is high but accuracy on held-out (test) data is low, you are most likely running into an overfitting problem.
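With xgboost, one common way to watch for this is to evaluate the metric on both the training set and a held-out set at every boosting round; the sketch below is a minimal illustration, assuming synthetic data and illustrative parameter values rather than settings from this post:

import numpy as np
import xgboost as xgb

# Illustrative synthetic data; in practice use your own train/validation split.
rng = np.random.RandomState(0)
X_train, X_valid = rng.rand(800, 10), rng.rand(200, 10)
y_train = (X_train[:, 0] + 0.1 * rng.randn(800) > 0.5).astype(int)
y_valid = (X_valid[:, 0] > 0.5).astype(int)

dtrain = xgb.DMatrix(X_train, label=y_train)
dvalid = xgb.DMatrix(X_valid, label=y_valid)

params = {"objective": "binary:logistic", "max_depth": 6, "eta": 0.3, "eval_metric": "error"}

# The watchlist prints train vs. validation error each round; a widening gap
# between the two signals overfitting. early_stopping_rounds halts boosting
# once the validation metric stops improving.
bst = xgb.train(params, dtrain, num_boost_round=200,
                evals=[(dtrain, "train"), (dvalid, "valid")],
                early_stopping_rounds=10)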
Xgboost is a boosting model that is both accurate and fast.
A boosting classifier is an ensemble learning model. The basic idea is to combine hundreds of tree models, each with low classification accuracy on its own, into a single highly accurate model. The model is built iteratively, adding a new tree at each iteration. Many methods have been proposed for generating a reasonable tree at each step; here we briefly introduce the gradient boosting machine proposed by Friedman. It applies the idea of gradient descent when generating each tree: building on all previously generated trees, each new tree takes a step in the direction that minimizes the given objective function. With reasonable parameter settings, we usually need to generate a fair number of trees to reach a satisfactory accuracy. When the dataset is large and complex, thousands of iterations may be needed, and if each tree takes several seconds to build, that many iterations becomes painfully time-consuming.
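To make the gradient-descent view concrete, here is a minimal sketch of gradient boosting for squared-error regression, where the negative gradient at each step is simply the residual. This is a didactic illustration, not the xgboost implementation; the function names and parameter values are assumptions for the example.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    # Start from a constant prediction (the mean of y).
    base = y.mean()
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_trees):
        residual = y - pred                      # negative gradient of the squared-error loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residual)                    # each new tree fits the current residuals
        pred += learning_rate * tree.predict(X)  # take a small step toward the minimum
        trees.append(tree)
    return base, trees

def gradient_boost_predict(base, trees, X, learning_rate=0.1):
    # Sum the contributions of all trees on top of the constant base prediction.
    pred = np.full(X.shape[0], base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred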
Now we want to solve this problem better with the xgboost tool. The full name of Xgboost is eXtreme Gradient Boosting. As the name suggests, it is a C++ implementation of the gradient boosting machine. Its author, Tianqi Chen, is studying machine learning at the University of Washington. In his research he felt constrained by the speed and accuracy of existing libraries, so he started the Xgboost project a year ago, and it gradually took shape last summer. The most important features of Xgboost are that it automatically uses multiple CPU threads for parallelism and that it improves accuracy through algorithmic refinements. Its debut was the Higgs boson signal recognition contest on Kaggle, where its outstanding efficiency and high prediction accuracy attracted attention on the contest forum, earning it a place in a fierce competition of more than 1,700 teams. As its popularity in the Kaggle community has grown, a team recently won first place in a competition using Xgboost.
To make it easy for everyone to use, Tianqi Chen wrapped xgboost as a Python library. I was fortunate enough to work with him to create the R-language interface for Xgboost and submit it to CRAN. Other users have wrapped it as a Julia library. The functionality of the Python and R interfaces is constantly being updated; you can learn the general features below and choose the language you are most familiar with.
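As a quick taste of the Python interface, a minimal classification example might look like the sketch below (the dataset and parameter values are illustrative assumptions, not recommendations from this post):

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Illustrative dataset; substitute your own features and labels.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The scikit-learn-style wrapper exposes the familiar fit/predict interface.
clf = XGBClassifier(n_estimators=100, max_depth=4, learning_rate=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))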
IPython Notebook Usage
Enter directly on the command line:
ipython notebook