# -*- coding: utf-8 -*-
"""
Created on Thu June 17:16:19 2018

@author: Zhen
"""
from sklearn.model_selection import train_test_split
import mglearn
import matplotlib.pyplot as plt

X, y = mglearn.datasets.make_forge()                                        # Generate the forge dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)  # Split into training and test sets

from sklearn.neighbors import KNeighborsClassifier
clf = KNeighborsClassifier(n_neighbors=3)   # Build the k-nearest-neighbors classifier
clf.fit(X_train, y_train)                   # Fit it on the training data

print("Test set predictions: {}".format(clf.predict(X_test)))        # Predict on the test set
print("Test set accuracy: {:.2f}".format(clf.score(X_test, y_test)))

fig, axes = plt.subplots(1, 3, figsize=(10, 3))   # Use matplotlib to plot the decision boundaries
for n_neighbors, ax in zip([1, 3, 9], axes):
    # The fit method returns the object itself, so instantiation and fitting can go on one line
    clf = KNeighborsClassifier(n_neighbors=n_neighbors).fit(X, y)
    mglearn.plots.plot_2d_separator(clf, X, fill=True, eps=0.5, ax=ax, alpha=0.4)
    mglearn.discrete_scatter(X[:, 0], X[:, 1], y, ax=ax)
    ax.set_title("{} neighbor(s)".format(n_neighbors))
    ax.set_xlabel("feature 0")
    ax.set_ylabel("feature 1")
axes[0].legend(loc=3)
plt.show()   # Display the figure when running as a script
Results: (output of the two print statements, plus a three-panel figure showing the decision boundaries for 1, 3, and 9 neighbors)
Summary: As you can see, the decision boundary drawn with a single neighbor closely follows the training data. As the number of neighbors increases, the decision boundary becomes smoother, and a smoother boundary corresponds to a simpler model. In other words, fewer neighbors correspond to higher model complexity, and more neighbors correspond to lower model complexity.
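To make the link between n_neighbors and model complexity concrete, the short sketch below (not part of the original post) reuses the same forge dataset and train/test split and compares training and test accuracy for the three n_neighbors values shown in the plot. The exact scores depend on the random split, so treat them as illustrative only.

from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
import mglearn

# Same data and split as in the code above
X, y = mglearn.datasets.make_forge()
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fewer neighbors -> more complex model (training accuracy tends to be higher);
# more neighbors -> smoother, simpler model.
for n_neighbors in [1, 3, 9]:
    clf = KNeighborsClassifier(n_neighbors=n_neighbors).fit(X_train, y_train)
    print("n_neighbors={}: train accuracy {:.2f}, test accuracy {:.2f}".format(
        n_neighbors, clf.score(X_train, y_train), clf.score(X_test, y_test)))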