Implementing a Linear Fit
We use Python 2.7 to implement the derivation from the previous article. Please install the matplotlib and NumPy packages first.
The specific code is as follows:
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import matplotlib.pyplot as plt
from numpy import *

# Create the dataset
def load_dataset():
    n = 100
    X = [[1, 0.005 * xi] for xi in range(1, n)]
    Y = [2 * xi[1] for xi in X]
    return X, Y

# Solve the linear regression by gradient descent
def grad_descent(X, Y):
    X = mat(X)
    Y = mat(Y)
    row, col = shape(X)
    alpha = 0.001
    maxIter = 5000  # vary this (5/50/500/1000/5000) to reproduce the plots below
    W = ones((1, col))
    for k in range(maxIter):
        W = W + alpha * (Y - W * X.transpose()) * X
    return W

def main():
    X, Y = load_dataset()
    W = grad_descent(X, Y)
    print "W = ", W
    # Plot the data and the fitted line
    x = [xi[1] for xi in X]
    y = Y
    plt.plot(x, y, marker="*")
    xM = mat(X)
    y2 = W * xM.transpose()
    y22 = [y2[0, i] for i in range(y2.shape[1])]
    plt.plot(x, y22, marker="o")
    plt.show()

if __name__ == "__main__":
    main()
```
The code is quite simple: the load_dataset function creates a y = 2x dataset, and the grad_descent function solves the optimization problem.
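As a quick sanity check (not part of the article's code), the same noiseless y = 2x dataset can also be solved in closed form with NumPy's least-squares routine; gradient descent should converge toward the same weights. The sample count n = 100 here is an assumption for illustration:

```python
import numpy as np

# The y = 2x dataset in the same [1, x] design-matrix form as the article
# (n = 100 samples is an assumed value).
X = np.array([[1.0, 0.005 * i] for i in range(1, 100)])
Y = np.array([2.0 * row[1] for row in X])

# Closed-form least-squares fit: W[0] is the intercept, W[1] the slope.
W, residuals, rank, sv = np.linalg.lstsq(X, Y, rcond=None)
# For noiseless y = 2x data we expect W to be approximately [0, 2].
```

This gives a reference answer to compare the iterative solution against.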
Two small things about grad_descent: alpha is the learning rate, usually taken in the range 0.001–0.01; if it is too large, the iteration may oscillate and fail to converge stably. maxIter is the maximum number of iterations, which determines the accuracy of the result. Larger is generally better, but also more time-consuming, so you often need to experiment with a few values. Alternatively, you can add a convergence criterion, for example: stop iterating once the decrease in ‖Y − WXᵀ‖ between iterations falls below some threshold.
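A minimal sketch of such a convergence criterion, using plain NumPy arrays instead of the matrix type above; the tolerance value 1e-8 and the sample count n = 100 are arbitrary choices for illustration:

```python
import numpy as np

def grad_descent_tol(X, Y, alpha=0.001, max_iter=10000, tol=1e-8):
    """Gradient descent that stops early once the residual norm
    ||Y - W X^T|| decreases by less than `tol` between iterations."""
    X = np.asarray(X, dtype=float)   # shape (n, 2): rows are [1, x]
    Y = np.asarray(Y, dtype=float)   # shape (n,)
    W = np.ones(X.shape[1])          # start from all-ones weights, as above
    prev_err = float("inf")
    for k in range(max_iter):
        residual = Y - X.dot(W)          # per-sample fitting error
        err = np.linalg.norm(residual)   # ||Y - W X^T||
        if prev_err - err < tol:         # progress has stalled: stop
            break
        prev_err = err
        W = W + alpha * residual.dot(X)  # same update rule as in the article
    return W, err

# The same y = 2x dataset (assuming n = 100 samples)
X = [[1, 0.005 * i] for i in range(1, 100)]
Y = [2 * row[1] for row in X]
W, err = grad_descent_tol(X, Y)
```

With this criterion the loop exits on its own once further iterations no longer improve the fit, instead of always running to maxIter.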
Let's take a look at the effect:
When maxIter = 5, the fit looks like this:
With maxIter = 50, the fit looks like this:
With maxIter = 500, the fit looks like this:
With maxIter = 1000, the fit looks like this:
With maxIter = 5000, the fit looks like this:
The result after 5,000 iterations is almost perfect: the two curves coincide. And that's it.
That concludes this article. Next, we will add the logistic function and derive logistic regression.
Logistic regression Tutorial 1