Octave usage is skipped for now; I'll come back to it when it's needed.
Week Three:
Logistic Regression:
For classification problems where $y \in \{0, 1\}$.
Hypothesis representation:
$h_\theta(x) = g(\theta^T x)$, where $g(z) = \frac{1}{1 + e^{-z}}$ is the sigmoid (logistic) function; $h_\theta(x)$ is read as the probability that $y = 1$ given $x$.
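A minimal NumPy sketch of the hypothesis (the names sigmoid and hypothesis are mine, not the course's):

```python
import numpy as np

def sigmoid(z):
    # Logistic function g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    # h_theta(x) = g(theta^T x) for every row of X
    # (X is assumed to include the leading x0 = 1 column)
    return sigmoid(X @ theta)
```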
Decision Boundary:
Predict $y = 1$ when $\theta^T x \ge 0$; the decision boundary is the set of points where $\theta^T x = 0$. For example, with $\theta = (-3, 1, 1)^T$ the boundary is the line $x_1 + x_2 = 3$.
Non-linear decision boundaries are possible by constructing polynomial features of $x$ (e.g., $x_1^2$, $x_2^2$, $x_1 x_2$).
Cost function:
$\mathrm{Cost}(h_\theta(x), y) = -\log(h_\theta(x))$ if $y = 1$, and $-\log(1 - h_\theta(x))$ if $y = 0$.
Simplified cost function and gradient descent:
Because $y$ takes only the two values 0 and 1, the two cases merge into
$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + (1 - y^{(i)})\log(1 - h_\theta(x^{(i)}))\right]$.
Taking partial derivatives gives the gradient-descent update
$\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)}$,
identical in form to the linear-regression update, only with a different $h_\theta$. (The constant factor $\frac{1}{m}$ can be absorbed into the learning rate $\alpha$.)
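A rough NumPy sketch of the merged cost, its gradient, and plain gradient descent (function and variable names are my own):

```python
import numpy as np

def cost_and_gradient(theta, X, y):
    # X: (m, n+1) design matrix with a leading column of ones; y: (m,) of 0/1 labels
    m = len(y)
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))   # h_theta(x) for each example
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m                 # same form as linear regression
    return J, grad

def gradient_descent(X, y, alpha=0.1, iters=1000):
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        _, grad = cost_and_gradient(theta, X, y)
        theta -= alpha * grad                # theta_j := theta_j - alpha * dJ/dtheta_j
    return theta
```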
Advanced Optimization:
Conjugate gradient, BFGS, L-BFGS (to look into later): more sophisticated optimizers that converge faster than plain gradient descent and pick the step size automatically; you only supply $J(\theta)$ and its gradient.
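The course does this with Octave's fminunc; a loose SciPy analogue (assuming the cost_and_gradient sketch above, with toy data just to make it runnable) might look like:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])  # leading x0 = 1 column
y = (X[:, 1] + X[:, 2] > 0).astype(float)                      # toy 0/1 labels

# jac=True tells minimize the objective returns (J, grad) as a pair
res = minimize(cost_and_gradient, np.zeros(X.shape[1]), args=(X, y),
               jac=True, method='L-BFGS-B')
theta = res.x
```

No learning rate is supplied; the optimizer chooses step sizes itself.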
Multi-Class Classification: one-vs-all:
Train a separate logistic-regression classifier for each class; after fitting the parameters, classify a new $x$ by running all the classifiers and picking the class with the maximum $h_\theta(x)$. This is called one-vs-all (a one-versus-the-rest method).
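A sketch of one-vs-all built on the gradient_descent function above (classes assumed to be labeled 0..K-1):

```python
import numpy as np

def one_vs_all(X, y, num_classes, alpha=0.1, iters=1000):
    # Train one binary classifier per class k, using (y == k) as the 0/1 labels
    thetas = [gradient_descent(X, (y == k).astype(float), alpha, iters)
              for k in range(num_classes)]
    return np.array(thetas)              # shape (num_classes, n+1)

def predict_one_vs_all(thetas, X):
    # Sigmoid is monotonic, so the class with the largest theta^T x
    # is also the class with the largest h_theta(x)
    return np.argmax(X @ thetas.T, axis=1)
```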
Regularization: the Problem of Overfitting
To address overfitting: reduce the number of features, or use regularization.
Linear regression (regularized):
$J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})^2 + \lambda\sum_{j=1}^{n}\theta_j^2\right]$
Gradient Descent:
$\theta_0 := \theta_0 - \frac{\alpha}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_0^{(i)}$
$\theta_j := \theta_j\left(1 - \alpha\frac{\lambda}{m}\right) - \frac{\alpha}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)}$ for $j \ge 1$
Normal equation:
$\theta = (X^T X + \lambda M)^{-1} X^T y$, where $M$ is the identity matrix with its top-left entry set to 0 (so $\theta_0$ is not penalized); with $\lambda > 0$, $X^T X + \lambda M$ is always invertible.
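A sketch of the regularized normal equation (my own naming; M is the identity with its top-left entry zeroed):

```python
import numpy as np

def regularized_normal_equation(X, y, lam):
    # theta = (X'X + lam * M)^(-1) X'y, with M[0, 0] = 0 so theta_0 is not penalized
    n = X.shape[1]
    M = np.eye(n)
    M[0, 0] = 0.0
    # Solve the linear system instead of forming the inverse explicitly
    return np.linalg.solve(X.T @ X + lam * M, X.T @ y)
```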
Regularized logistic regression: as with linear regression, add the extra term $\frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$ to $J(\theta)$.
Note: the regularization sum starts at $j = 1$; $\theta_0$ is not penalized.
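A sketch of the regularized logistic cost and gradient; note how both the penalty and its gradient skip theta[0]:

```python
import numpy as np

def regularized_cost_and_gradient(theta, X, y, lam):
    m = len(y)
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    # The regularization sum starts at j = 1: theta[0] is left out
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m \
        + (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += (lam / m) * theta[1:]    # no penalty gradient for theta_0
    return J, grad
```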
Machine Learning Notes (II), based on Andrew Ng's lecture videos.