Code:
import numpy as np
from sklearn import datasets
from sklearn.linear_model import LogisticRegression
import matplotlib.pyplot as plt

__author__ = 'zhen'

iris = datasets.load_iris()

for i in range(0, 4):
    x = iris['data'][:, i:i+1]  # get training data: one feature column at a time
    y = iris['target']

    # candidate hyperparameters (defined but not used here; see the sketch below)
    param_grid = {"tol": [1e-4, 1e-3, 1e-2], "C": [0.4, 0.6, 0.8]}

    # ovr: one-vs-rest, one binary classifier per class
    log_reg = LogisticRegression(multi_class='ovr', solver='sag', max_iter=1000)
    log_reg.fit(x, y)

    # reshape(rows, columns); rows=-1 lets NumPy infer the number of rows
    x_new = np.linspace(0, 3, 1000).reshape(-1, 1)

    y_proba = log_reg.predict_proba(x_new)
    y_hat = log_reg.predict(x_new)

    print("y_proba:\n{}\ny_hat:\n{}".format(y_proba, y_hat[::10]))
    print("=" * 60)

    # drawing
    plt.subplot(2, 2, i+1)
    plt.plot(x_new, y_proba[:, 2], 'g-', label='Iris-virginica')
    plt.plot(x_new, y_proba[:, 1], 'r-', label='Iris-versicolour')
    plt.plot(x_new, y_proba[:, 0], 'b--', label='Iris-setosa')

    if i == 3:
        plt.show()
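Note that param_grid is declared but never used in the listing; presumably it was meant for a hyperparameter search. A minimal sketch of how it could be wired up with sklearn's GridSearchCV follows; the cv=5 setting and the choice of petal width (column 3) are my assumptions, not part of the original code.

from sklearn import datasets
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

iris = datasets.load_iris()
x = iris['data'][:, 3:4]  # petal width, the feature that trains best below
y = iris['target']

param_grid = {"tol": [1e-4, 1e-3, 1e-2], "C": [0.4, 0.6, 0.8]}
grid = GridSearchCV(
    LogisticRegression(multi_class='ovr', solver='sag', max_iter=1000),
    param_grid,
    cv=5,  # 5-fold cross-validation: an assumed choice, not in the original
)
grid.fit(x, y)
print(grid.best_params_)  # the tol/C combination with the best CV accuracy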
Results:
Training 1: (probability curves for sepal length)
Training 2: (probability curves for sepal width)
Training 3: (probability curves for petal length)
Training 4: (probability curves for petal width)
Analysis:
The results show that training 4 (petal width) is the most reasonable: along this feature the three classes separate cleanly, so the predicted probability curves give a clear classification.
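To back up "most reasonable" with a number rather than eyeballing the plots, here is a minimal sketch that scores each single-feature model with cross-validated accuracy; the use of cross_val_score and cv=5 is my assumption, not part of the original post. Petal width typically scores highest, matching the visual impression.

from sklearn import datasets
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

iris = datasets.load_iris()
y = iris['target']
for i, name in enumerate(iris['feature_names']):
    x = iris['data'][:, i:i+1]  # single-feature training set, as in the loop above
    model = LogisticRegression(multi_class='ovr', solver='sag', max_iter=1000)
    scores = cross_val_score(model, x, y, cv=5)  # 5-fold CV accuracy
    print("{}: mean accuracy {:.3f}".format(name, scores.mean()))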