Iris flower classification is the classic showcase for logistic regression, and its code packs in many core processing patterns of the Python scientific libraries. This article dissects that code.
```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

iris = load_iris()

# Take the petal length and petal width: the features labeled 2 and 3.
# The first index ":" selects all rows; the second selects the column range.
X = iris["data"][:, (2, 3)]
# Turn the classification target into numbers: True (class 2) becomes 1,
# False becomes 0.
y = (iris["target"] == 2).astype(int)

# C is the inverse of the regularization strength: the larger the value, the
# weaker the regularization, so the model fits the class edge more tightly.
log_reg = LogisticRegression(C=10**10)
log_reg.fit(X, y)  # learn from the data to obtain coef_, intercept_, etc.

# np.meshgrid turns the two axis vectors into coordinate matrices (detailed below).
# np.linspace(2.9, 7, 500) splits [2.9, 7] into 500 equal points; reshape(-1, 1)
# means "one column, infer the number of rows".  The second point count was lost
# from the original listing; 200 is an assumed value.
x0, x1 = np.meshgrid(np.linspace(2.9, 7, 500).reshape(-1, 1),
                     np.linspace(0.8, 2.7, 200).reshape(-1, 1))

# ravel and flatten are similar, except that ravel returns a view -- modifying
# its return value affects the original data -- while flatten returns a copy
# that is independent of the original.
# np.c_ concatenates arrays column-wise (specific examples below).
X_new = np.c_[x0.ravel(), x1.ravel()]

# predict returns only the winning class (the one with the largest score);
# predict_proba returns the predicted probability of every class.
y_proba = log_reg.predict_proba(X_new)

plt.figure(figsize=(10, 4))
# X[y == 0, 0] combines a boolean mask with a column index: X[y == 0] keeps the
# rows of X whose label is 0, and the trailing 0 takes their first feature;
# X[y == 0, 1] takes their second feature.  X is a collection of feature pairs,
# the first being the petal length and the second the petal width.
plt.plot(X[y == 0, 0], X[y == 0, 1], "bs")
plt.plot(X[y == 1, 0], X[y == 1, 1], "g^")

zz = y_proba[:, 1].reshape(x0.shape)
# contour draws contour lines of the probability surface (detailed below)
contour = plt.contour(x0, x1, zz, cmap=plt.cm.brg)
plt.clabel(contour, inline=1, fontsize=12)

left_right = np.array([2.9, 7])
# The decision boundary is the set of points where the model is undecided:
# coef_[0][0]*x0 + coef_[0][1]*x1 + intercept_[0] = 0.  Solving that equation
# for x1 gives the formula below.
boundary = -(log_reg.coef_[0][0] * left_right + log_reg.intercept_[0]) / log_reg.coef_[0][1]
plt.plot(left_right, boundary, "k--", linewidth=3)
plt.text(3.2, 1.5, "Not Iris", fontsize=14, color="b", ha="center")
plt.text(6.5, 2.25, "Iris", fontsize=14, color="g", ha="center")
plt.axis([2.9, 7, 0.8, 2.7])
plt.show()
```
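The ravel-versus-flatten distinction noted above can be verified directly. A minimal sketch (the array values are arbitrary):

```python
import numpy as np

a = np.array([[1, 2], [3, 4]])

r = a.ravel()    # a view when possible: writes propagate back to a
f = a.flatten()  # always a copy: writes never affect a

r[0] = 99  # modifies a as well, through the view
f[1] = 77  # leaves a untouched

print(a[0, 0])  # 99 -- changed via the ravel view
print(a[0, 1])  # 2  -- unchanged despite the flatten write
```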
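The boundary formula is less mysterious than it looks: logistic regression predicts probability 0.5 exactly where coef_[0][0]\*x0 + coef_[0][1]\*x1 + intercept_[0] = 0, and solving that equation for x1 yields the expression in the code. A sketch with made-up coefficients (w0, w1, b are assumptions standing in for the fitted coef_ and intercept_, not real values):

```python
import numpy as np

# hypothetical parameters standing in for log_reg.coef_ / log_reg.intercept_
w0, w1, b = 4.0, 5.0, -30.0

left_right = np.array([2.9, 7.0])
# same formula as in the article: solve w0*x0 + w1*x1 + b = 0 for x1
boundary = -(w0 * left_right + b) / w1

# every point on that line sits on the 0.5-probability surface
for x0, x1 in zip(left_right, boundary):
    z = w0 * x0 + w1 * x1 + b  # the linear score is (numerically) zero here
    p = 1 / (1 + np.exp(-z))   # the sigmoid turns the score into a probability
    print(round(p, 6))         # 0.5 on the boundary
```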
Sample Demo for data contours:
```python
import numpy as np
import matplotlib.pyplot as plt

def height(x, y):
    # a bumpy 2-D surface whose contours we will draw
    return (1 - x / 2 + x**5 + y**3) * np.exp(-x**2 - y**2)

x = np.linspace(-3, 3, 300)
y = np.linspace(-3, 3, 300)
X, Y = np.meshgrid(x, y)
plt.contourf(X, Y, height(X, Y), 10, alpha=0.75, cmap=plt.cm.hot)  # 10 filled levels
C = plt.contour(X, Y, height(X, Y), colors="black")  # contour lines on top
plt.clabel(C, inline=True, fontsize=10)  # label each line with its height
plt.xticks(())  # hide the tick marks
plt.yticks(())
plt.show()
```
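Both listings rely on np.meshgrid; what it actually returns is easiest to see on a tiny grid (the values here are chosen arbitrarily):

```python
import numpy as np

x = np.array([0, 1, 2])
y = np.array([10, 20])
X, Y = np.meshgrid(x, y)  # X repeats x along the rows, Y repeats y along the columns

print(X)  # [[0 1 2]
          #  [0 1 2]]
print(Y)  # [[10 10 10]
          #  [20 20 20]]
# paired element-wise, (X[i, j], Y[i, j]) enumerates every point of the 2x3 grid
```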
numpy.c_ Example
```python
>>> np.c_[np.array([1, 2, 3]), np.array([4, 5, 6])]
array([[1, 4],
       [2, 5],
       [3, 6]])
>>> np.c_[np.array([[1, 2, 3]]), 0, 0, np.array([[4, 5, 6]])]
array([[1, 2, 3, 0, 0, 4, 5, 6]])
```
Reference
Data Contour Line
78450
About meshgrid
https://www.cnblogs.com/sunshinewang/p/6897966.html
https://docs.scipy.org/doc/numpy/reference/generated/numpy.meshgrid.html
About ravel
78220080
https://docs.scipy.org/doc/numpy/reference/generated/numpy.ravel.html
About numpy.c_
https://docs.scipy.org/doc/numpy/reference/generated/numpy.c_.html
About coef_ and intercept_ (though I don't fully understand them)
52933430?utm_source=itdadao&utm_medium=referral
Logistic regression for iris flowers and its implementation