XGBoost has been one of the most sought-after machine learning algorithms in recent years, proposed by Tianqi Chen of the University of Washington. I have recently been learning machine learning, so I needed to install this tool. However, among the many XGBoost installation tutorials on the Internet, some are overly complicated and require installing extra software, while others look simple but still fail the final test even when you follow the steps exactly.
Recently I found a super simple method, built around this magical URL:
https://www.lfd.uci.edu/~gohlke/pythonlibs/
It hosts many Python libraries as .whl files. To locate a package quickly, append #<package_name> to the URL; for example, to jump straight to the XGBoost entry, enter https://www.lfd.uci.edu/~gohlke/pythonlibs/#xgboost. The result is shown in the following illustration:
Select the file that matches your system. Mine is 64-bit Windows with Anaconda3's default Python 3.6 environment, so I chose the last one. After downloading, save the file in a folder, such as D:\Anaconda3\mywhl (mywhl is a folder you create yourself under the Anaconda3 directory). Next, start Anaconda Prompt and change to the directory containing the .whl file.
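If you are not sure which wheel tag matches your interpreter, a small sketch like the following (not from the original post, just a convenience) prints the details that the file name encodes, e.g. cp36-cp36m-win_amd64 means CPython 3.6 on 64-bit Windows:

```python
# Print the interpreter details needed to pick the right .whl file.
import platform
import struct

print(platform.python_version())   # e.g. 3.6.x -> pick a cp36 wheel
print(struct.calcsize("P") * 8)    # 64 -> pick win_amd64, 32 -> pick win32
print(platform.system())           # 'Windows', 'Linux', or 'Darwin'
```

Run this in the same environment (e.g. the Anaconda Prompt) where you intend to install the wheel, since different environments can have different Python versions.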
Then enter pip install xgboost-0.6+20171121-cp36-cp36m-win_amd64.whl. The long string after install is the full file name, so be careful not to mistype it. The results are as follows:
The last line shows that the installation succeeded. You can also run conda list or pip list to check whether xgboost appears in the list of installed libraries.
Finally, test it: open Spyder, create a .py file, and enter the following test code:
import numpy as np
import xgboost as xgb

data = np.random.rand(5, 10)  # 5 entities, each with 10 features
label = np.random.randint(2, size=5)  # binary target
dtrain = xgb.DMatrix(data, label=label)
dtest = dtrain
param = {'max_depth': 2, 'eta': 1, 'silent': 1, 'objective': 'binary:logistic'}
param['nthread'] = 4
param['eval_metric'] = 'auc'
evallist = [(dtest, 'eval'), (dtrain, 'train')]
num_round = 10  # number of boosting rounds
bst = xgb.train(param, dtrain, num_round, evallist)
bst.dump_model('dump.raw.txt')
It runs successfully. The results are as follows: