Decision Tree (branch depth of decision tree and important feature detection)

import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets
from sklearn.tree import DecisionTreeClassifier

n_features = 200
X, y = datasets.make_classification(750, n_features=n_features, n_informative=5)

# p gives the probabilities of True/False, i.e. the training/test proportions
training = np.random.choice([True, False], p=[.75, .25], size=len(y))

# count how many samples landed in the training set
c = 0
for is_train in training:
    if is_train:
        c += 1
print(c, c / 750)   # roughly 75% of the 750 samples

# measure held-out accuracy for every max_depth from 1 to n_features
accuracies = []
for depth in np.arange(1, n_features + 1):
    dt = DecisionTreeClassifier(max_depth=depth)
    dt.fit(X[training], y[training])
    preds = dt.predict(X[~training])
    accuracies.append((preds == y[~training]).mean())

f, ax = plt.subplots(figsize=(7, 5))
ax.plot(range(1, n_features + 1), accuracies, color='k')
ax.set_title("Decision Tree Accuracy")
ax.set_ylabel("% Correct")
ax.set_xlabel("Max Depth")
f.show()

# zoom in on the first N depths, where the accuracy actually changes
N = 15
f, ax = plt.subplots(figsize=(7, 5))
ax.plot(range(1, n_features + 1)[:N], accuracies[:N], color='k')
ax.set_title("Decision Tree Accuracy")
ax.set_ylabel("% Correct")
ax.set_xlabel("Max Depth")
f.show()

# compute_importances exists only in older scikit-learn releases (the parameter
# was later removed, and feature_importances_ is now always computed by fit).
# It is very handy for checking which features matter.
dt_ci = DecisionTreeClassifier(compute_importances=True)
dt_ci.fit(X, y)

ne0 = dt_ci.feature_importances_ != 0   # keep only features with non-zero importance
y_comp = dt_ci.feature_importances_[ne0]
x_comp = np.arange(len(dt_ci.feature_importances_))[ne0]

f, ax = plt.subplots(figsize=(7, 5))
ax.bar(x_comp, y_comp)
f.show()
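On current scikit-learn releases the compute_importances argument above raises a TypeError, because the parameter no longer exists; feature_importances_ is populated automatically by fit. A minimal sketch of the same importance bar chart on a modern scikit-learn (reusing the X, y, and n_features defined above):

dt = DecisionTreeClassifier()          # no compute_importances needed any more
dt.fit(X, y)

ne0 = dt.feature_importances_ != 0     # keep only features with non-zero importance
f, ax = plt.subplots(figsize=(7, 5))
ax.bar(np.arange(n_features)[ne0], dt.feature_importances_[ne0])
ax.set_xlabel("Feature Index")
ax.set_ylabel("Importance")
f.show()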
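A side note on the split: the boolean mask built with np.random.choice only hits the 75/25 ratio in expectation, so the counting loop prints a slightly different number on each run. If an exact split is preferred, scikit-learn's train_test_split does the same job; a sketch assuming a recent scikit-learn (older releases import it from sklearn.cross_validation instead of sklearn.model_selection):

from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
dt = DecisionTreeClassifier(max_depth=5)
dt.fit(X_train, y_train)
print((dt.predict(X_test) == y_test).mean())   # held-out accuracy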
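A single random split also makes the accuracy-versus-depth curve noisy, so the best depth read off the plot can shift between runs. Averaging over cross-validation folds gives a steadier pick; a minimal sketch using GridSearchCV (not part of the original recipe, and the import path again assumes a recent scikit-learn):

from sklearn.model_selection import GridSearchCV

search = GridSearchCV(DecisionTreeClassifier(),
                      param_grid={"max_depth": list(range(1, 16))},
                      cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)   # best depth and its mean CV accuracy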