Logistic Regression Notes and Understanding
Logistic Regression
Hypothesis h_θ(x)

h_θ(x) = g(z)

where g(z) is the logistic function (the sigmoid), defined as:

g(z) = 1 / (1 + e^(-z))

(Figure: the S-shaped curve of g(z), rising from 0 to 1.)

This is an S-shaped function whose output ranges over 0~1. It can be understood as follows: for any point a on the curve, the vertical coordinate at a's horizontal position z is the probability that the object with score z belongs to class "1".
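As a quick illustration, here is a minimal sigmoid in Python (the note itself refers to MATLAB; Python is used here only as a sketch):

```python
import math

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z)); output always lies in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))    # 0.5: the curve crosses the vertical axis at 0.5
print(sigmoid(6))    # close to 1
print(sigmoid(-6))   # close to 0
```

Note that g(0) = 0.5, so the usual 0.5 classification threshold corresponds exactly to z = 0.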
In logistic regression, the argument z of g(z) is:

a linear or nonlinear function of the features, written in vectorized form.

The curve where this function equals zero is called the decision boundary. Examples of the two kinds of decision boundary:

Linear: e.g. θ_0 + θ_1·x_1 + θ_2·x_2 = 0

Nonlinear: e.g. θ_0 + θ_1·x_1² + θ_2·x_2² = 0
For convenience, only the linear boundary case is discussed here. A linear boundary is represented as

z = θ^T x (in MATLAB notation, X' * theta)

So the logistic regression hypothesis is defined as:

h_θ(x) = g(θ^T x) = 1 / (1 + e^(-θ^T x))

As stated above, the hypothesis gives the probability that the result is 1, so for an input x the probabilities that it is classified as class 1 and as class 0 are:

P(y = 1 | x; θ) = h_θ(x)
P(y = 0 | x; θ) = 1 - h_θ(x)

That is the understanding of the logistic regression hypothesis.
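A small sketch of the hypothesis and the two class probabilities in Python (the parameter values theta and the input x below are made up for illustration, not taken from the note):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = g(theta^T x): the probability that x belongs to class 1."""
    z = sum(t * xi for t, xi in zip(theta, x))
    return sigmoid(z)

theta = [-3.0, 1.0, 1.0]   # hypothetical parameters of a linear boundary
x = [1.0, 2.0, 2.0]        # x[0] = 1 is the intercept term, so z = -3 + 2 + 2 = 1
p1 = hypothesis(theta, x)  # P(y = 1 | x; theta) = sigmoid(1) ≈ 0.731
p0 = 1.0 - p1              # P(y = 0 | x; theta) ≈ 0.269
print(p1, p0)
```

The two probabilities always sum to 1, which is what lets the two-class cost below be folded into a single expression.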
Logistic Regression Cost Function, J(θ)

The main job of the cost function is to measure the difference between h_θ(x) and the answer y. In linear regression this can be done with the squared error, but a logistic problem has only two possible answers (and the squared error combined with the sigmoid would make J(θ) non-convex), so the cost function of logistic regression should be:

Cost(h_θ(x), y) = -log(h_θ(x))      if y = 1
Cost(h_θ(x), y) = -log(1 - h_θ(x))  if y = 0

Integrated into a single function:

Cost(h_θ(x), y) = -y·log(h_θ(x)) - (1 - y)·log(1 - h_θ(x))

Averaging this over all m training examples, we get the cost function of logistic regression:

J(θ) = (1/m) · Σ_{i=1..m} Cost(h_θ(x^(i)), y^(i))
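The combined cost can be sketched like this in Python (unvectorized for clarity; the toy data X, y is a made-up example):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(theta, X, y):
    """J(theta) = (1/m) * sum_i [ -y_i*log(h_i) - (1 - y_i)*log(1 - h_i) ]."""
    m = len(y)
    total = 0.0
    for xi, yi in zip(X, y):
        h = sigmoid(sum(t * v for t, v in zip(theta, xi)))
        total += -yi * math.log(h) - (1 - yi) * math.log(1 - h)
    return total / m

# With theta = 0 every prediction is 0.5, so each example costs log(2)
X = [[1.0, 0.0], [1.0, 1.0]]
y = [0, 1]
print(cost([0.0, 0.0], X, y))   # log(2) ≈ 0.6931
```

The -log shape is what makes the cost explode when the model is confidently wrong (h near 0 while y = 1, or h near 1 while y = 0).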
Next is the method of finding the minimum of the cost function: gradient descent,
or MATLAB's built-in minimization function (e.g. fminunc).
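One way the gradient-descent step could look, as a Python sketch rather than MATLAB (the learning rate, iteration count, and toy data are all assumptions for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradient_descent(theta, X, y, alpha, iters):
    """Batch gradient descent for logistic regression.

    Each step updates every theta_j simultaneously:
        theta_j := theta_j - alpha * (1/m) * sum_i (h(x_i) - y_i) * x_ij
    """
    m, n = len(y), len(theta)
    for _ in range(iters):
        h = [sigmoid(sum(t * v for t, v in zip(theta, xi))) for xi in X]
        grad = [sum((h[i] - y[i]) * X[i][j] for i in range(m)) / m
                for j in range(n)]
        theta = [theta[j] - alpha * grad[j] for j in range(n)]
    return theta

# Toy 1-D data (first column of 1s is the intercept); labels flip around x = 1.5
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [0, 0, 1, 1]
theta = gradient_descent([0.0, 0.0], X, y, alpha=0.5, iters=3000)
print(theta)  # the boundary theta[0] + theta[1]*x = 0 should land near x = 1.5
```

fminunc plays the same role but picks the step size itself; only J(θ) and its gradient need to be supplied.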