# Python machine learning: gradient boosted trees

Unlike random forests, gradient boosted trees are built sequentially: each tree tries to correct the mistakes of the one before it, and the individual trees are kept very shallow (max_depth < 5). The important parameters are n_estimators and learning_rate; their main effect is to control how much the model overfits, which in turn improves its ability to generalize.
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

cancer = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(cancer.data, cancer.target, random_state=0)

gbrt = GradientBoostingClassifier()  # model with default parameters, no tuning
gbrt.fit(X_train, y_train)
print(gbrt.score(X_train, y_train))  # accuracy on the training set
print(gbrt.score(X_test, y_test))    # accuracy on the test set
# pre-prune the model
gbrt = GradientBoostingClassifier(n_estimators=100, learning_rate=0.01)
# n_estimators mainly controls the number of trees; learning_rate controls how strongly each tree
# corrects the mistakes of the previous ones (a lower learning_rate yields a simpler model)
gbrt.fit(X_train, y_train)
print(gbrt.score(X_train, y_train))
print(gbrt.score(X_test, y_test))
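Besides n_estimators and learning_rate, the intro notes that the individual trees are kept very shallow (max_depth < 5). The lines below are a minimal sketch, not part of the original code, showing how limiting max_depth can also rein in overfitting on the same cancer split; the variable name gbrt_shallow and the choice max_depth=1 are illustrative assumptions.

# illustrative sketch: stronger pre-pruning via shallow trees, reusing X_train/X_test from above
gbrt_shallow = GradientBoostingClassifier(max_depth=1, random_state=0)  # trees of depth 1 (stumps)
gbrt_shallow.fit(X_train, y_train)
print(gbrt_shallow.score(X_train, y_train))
print(gbrt_shallow.score(X_test, y_test))

With depth-1 trees each tree can make only a weak correction, so the training accuracy usually drops while the test accuracy often stays the same or improves.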