1. epsilon = 0.0001: what is this for? Neither minimize J(theta) nor the cost-function formula contains any epsilon symbol.
In fact, look at the fifth and sixth lines from the bottom:
if abs(error1 - error0) < epsilon: break
Epsilon is the convergence condition: the loop stops once the change in the cost between iterations falls below it.
theta0, theta1, theta2 are the fitted coefficients of the polynomial curve (the optimal ones, so to speak). The annoying part is that since you made this data up yourself, the result doesn't mean much.
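A minimal sketch of what that loop looks like in context (the made-up quadratic data, the learning rate alpha, and the exact gradient update are my assumptions for illustration; only the epsilon break mirrors the question's code):

```python
import numpy as np

# Made-up quadratic data, just for illustration (as noted above,
# self-made data means the fitted result has no real value)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x + 3.0 * x ** 2

theta = np.zeros(3)    # theta0, theta1, theta2
alpha = 0.1            # learning rate (assumed)
epsilon = 0.0001       # convergence threshold
error0 = 0.0

for step in range(10000):
    pred = theta[0] + theta[1] * x + theta[2] * x ** 2
    residual = pred - y
    # Cost J(theta): mean squared error with the conventional 1/2 factor
    error1 = (residual ** 2).sum() / (2 * len(x))
    # The epsilon check: stop once the cost barely changes between iterations
    if abs(error1 - error0) < epsilon:
        break
    error0 = error1
    # Gradient of J(theta) with respect to each coefficient
    grad = np.array([
        residual.sum(),
        (residual * x).sum(),
        (residual * x ** 2).sum(),
    ]) / len(x)
    theta -= alpha * grad

print(theta)  # approaches [1, 2, 3] as the loop converges
```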
I also have a gradient descent implementation you can look at:
https://github.com/racaljk/pyann/blob/master/network/perceptron.py
As the commenter above said, this code has a lot of problems.
The "Data Science from Scratch" book is devoted to a chapter on Python's gradient descent and random gradient descent algorithm, which is worth learning.
I wrote this chapter up on my personal blog with some added theoretical explanation, for reference only.
Reference link: Python implementation of the gradient descent algorithm
Epsilon is the error threshold that you set to end the iteration.
Theta is the set of parameters you get from fitting the original function, which you then use at test time; a gradient descent workflow typically has two datasets: the training set and the test set.
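For example, a common pattern is to fit theta on a training split and check the error on a held-out test split. A minimal sketch (the split ratio, noise level, and the use of np.polyfit for the fit are assumptions; a gradient descent fit like the one above would serve the same role):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 1.0 + 2.0 * x + 3.0 * x ** 2 + rng.normal(0, 0.1, 100)

# Hold out 20% of the data as a test set
cut = int(0.8 * len(x))
x_train, x_test = x[:cut], x[cut:]
y_train, y_test = y[:cut], y[cut:]

# Fit theta on the training set only
# (np.polyfit returns coefficients from highest degree to lowest)
theta2, theta1, theta0 = np.polyfit(x_train, y_train, 2)

# Evaluate the fitted theta on data it has never seen
pred = theta0 + theta1 * x_test + theta2 * x_test ** 2
test_mse = ((pred - y_test) ** 2).mean()
print(theta0, theta1, theta2, test_mse)
```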
As K pointed out earlier, this is something a beginner wrote on their own, and the flaws are significant. K has already covered the implementation defects, so I'll mention two more: 1. It isn't written with matrices. For loops are relatively inefficient, and matrix operations are also clearer (see the sketch after this paragraph). 2. The dataset is self-made, so the iteration may not actually converge at the end.
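To illustrate point 1, here is the same gradient descent update written with a design matrix instead of per-coefficient loops. This is a sketch under the same assumed data and hyperparameters as above, not the original poster's code:

```python
import numpy as np

x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x + 3.0 * x ** 2

# Design matrix: one column per polynomial term [1, x, x^2]
X = np.column_stack([np.ones_like(x), x, x ** 2])
theta = np.zeros(3)
alpha, epsilon, error0 = 0.1, 0.0001, 0.0

for step in range(10000):
    residual = X @ theta - y                  # all predictions at once
    error1 = residual @ residual / (2 * len(y))
    if abs(error1 - error0) < epsilon:
        break
    error0 = error1
    # One matrix product replaces the per-coefficient for loop
    theta -= alpha * (X.T @ residual) / len(y)

print(theta)
```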
My suggestion for learning is to find an open online course; if you want to read code, look for it on GitHub. Reading code with this many errors won't teach you much. Epsilon is the flag for breaking out of the loop, i.e., for judging whether the iteration has converged.