Logistic regression

Review
--------------------------------------------------------------------------------------------------------------------------
Training set: \{(x^{(1)}, y^{(1)}), (x^{(2)}, y^{(2)}), \ldots, (x^{(m)}, y^{(m)})\}

Each sample has n features and a single output y. If the output is assumed to depend linearly on the features,

    h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n = \theta^T x,

then this is the linear regression model being reviewed here.
--------------------------------------------------------------------------------------------------------------------------
Logistic function (sigmoid function):

    g(z) = \frac{1}{1 + e^{-z}}

Let the hypothesis be the sigmoid applied to the linear combination of the features:

    h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}
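As a minimal Octave/MATLAB sketch (the variable names here are illustrative, not taken from the original post), the sigmoid and the one-feature hypothesis used later can be written as:

    g = @(z) 1 ./ (1 + exp(-z));                 % logistic (sigmoid) function
    h = @(theta, x) g(theta(1) + theta(2) .* x); % hypothesis for a single feature
    g(0)                                         % returns 0.5: the sigmoid crosses 0.5 at z = 0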
Cost function:

    J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right]

where m is the number of training samples and h_\theta is the sigmoid hypothesis defined above.
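A sketch of this cost for the one-feature case, using the same data as the experiment further down (the starting theta is an arbitrary choice for illustration):

    x = [-3; -2; -1; 0; 1; 2; 3];                            % same data as the experiment below
    y = [0.01; 0.1; 0.3; 0.45; 0.8; 0.8; 0.99];
    theta = [0; 0];
    m = length(y);
    h = 1 ./ (1 + exp(-(theta(1) + theta(2) * x)));          % predictions for every sample
    J = -(1/m) * sum(y .* log(h) + (1 - y) .* log(1 - h))    % cost at this theta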
The cost is minimized by gradient descent.

Process steps: repeat until convergence, updating every \theta_j simultaneously:

    \theta_j := \theta_j - \alpha \frac{\partial J(\theta)}{\partial \theta_j}

where \alpha is the learning rate.
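A vectorized sketch of this update, assuming a design matrix X with a leading column of ones (the learning rate and iteration count here are assumed values, not from the post):

    x = [-3; -2; -1; 0; 1; 2; 3]; y = [0.01; 0.1; 0.3; 0.45; 0.8; 0.8; 0.99];
    m = length(y); theta = [0; 0]; alpha = 0.1;   % alpha is an assumed learning rate
    X = [ones(m, 1), x];                          % design matrix: bias column plus the feature
    for k = 1:500
        h = 1 ./ (1 + exp(-(X * theta)));         % sigmoid hypothesis for all samples
        grad = (1/m) * X' * (h - y);              % gradient of the cost
        theta = theta - alpha * grad;             % simultaneous update of all theta_j
    end
    theta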
We now show that the gradient of the logistic regression cost has the same form as the gradient of the linear regression cost:
Gradient of linear regression (squared-error cost, with h_\theta(x) = \theta^T x):

    \frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

Gradient of the logistic cost (differentiating J(\theta) above):

    \frac{\partial J(\theta)}{\partial \theta_j} = -\frac{1}{m} \sum_{i=1}^{m} \left( \frac{y^{(i)}}{h_\theta(x^{(i)})} - \frac{1 - y^{(i)}}{1 - h_\theta(x^{(i)})} \right) \frac{\partial h_\theta(x^{(i)})}{\partial \theta_j}

And because the sigmoid satisfies

    g'(z) = g(z)\left(1 - g(z)\right), \qquad \frac{\partial h_\theta(x)}{\partial \theta_j} = h_\theta(x)\left(1 - h_\theta(x)\right) x_j,

the bracketed factor collapses to y^{(i)} - h_\theta(x^{(i)}), so:

    \frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

where z = \theta^T x and g(z) is the h_\theta in the formula. The gradient therefore has exactly the same form as in linear regression; only the hypothesis h_\theta differs.
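The derivation can be checked numerically by comparing the analytic gradient \frac{1}{m}\sum (h_\theta(x^{(i)}) - y^{(i)}) x_j^{(i)} against a finite-difference approximation of J; this check is an illustrative sketch (the test theta is arbitrary), not part of the original post:

    x = [-3; -2; -1; 0; 1; 2; 3]; y = [0.01; 0.1; 0.3; 0.45; 0.8; 0.8; 0.99];
    m = length(y); X = [ones(m, 1), x]; theta = [0.3; -0.2];
    J = @(t) -(1/m) * sum(y .* log(1 ./ (1 + exp(-(X * t)))) + (1 - y) .* log(1 - 1 ./ (1 + exp(-(X * t)))));
    h = 1 ./ (1 + exp(-(X * theta)));
    analytic = (1/m) * X' * (h - y);              % gradient from the derivation above
    d = 1e-6; numeric = zeros(2, 1);
    for j = 1:2
        e = zeros(2, 1); e(j) = d;
        numeric(j) = (J(theta + e) - J(theta - e)) / (2 * d);   % central difference
    end
    [analytic, numeric]                           % the two columns should agree closely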
Experiment:
% Logistic regression
% initial data
x = [-3; -2; -1; 0; 1; 2; 3];
y = [0.01; 0.1; 0.3; 0.45; 0.8; 0.8; 0.99];
plot(x, y, 'ro');
hold on

% fit by per-sample gradient descent (theta is updated after each training point)
m = length(y);
theta = [0 0];
a = 0.005;                % learning rate
loss = 1;
iters = 1;
eps = 0.0001;             % convergence threshold
while loss > eps && iters < 100
    loss = 0;
    for i = 1:length(y)
        h = 1 / (1 + exp(-(theta(1) + theta(2) * x(i))));   % current prediction
        theta(1) = theta(1) + a * (y(i) - h);                % update the bias term
        theta(2) = theta(2) + a * (y(i) - h) * x(i);         % update the weight
        err = h - y(i);
        loss = loss + err * err / m;      % mean squared error, used only as a stopping criterion
    end
    iters = iters + 1;
end
iters
theta

% drawing contrast: plot the fitted curve over the data points
xs = -3:0.01:3;
hs = 1 ./ (1 + exp(-(theta(1) + theta(2) * xs)));
plot(xs, hs);
hold off
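With the fitted theta from the experiment, the curve value h can be read as the predicted probability that y = 1; a short usage sketch (the 0.5 threshold is the usual convention, not something stated in the post):

    x_new = 0.5;
    p = 1 / (1 + exp(-(theta(1) + theta(2) * x_new)))   % predicted probability that y = 1
    label = p >= 0.5                                     % classify as 1 when the probability is at least 0.5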