When deriving the principle of the logistic regression algorithm, we gave the definition of its loss function (here we restate the notation conventions):
For a single sample, denote the desired output (the label) as y and the model's actual output as y_hat. The loss function of logistic regression can then be expressed as:

L(y_hat, y) = -[y * log(y_hat) + (1 - y) * log(1 - y_hat)]
The cost function over the whole sample set can be expressed as:

J(w, b) = (1/m) * sum_{i=1}^{m} L(y_hat^(i), y^(i))

Unlike the loss function, which is defined on a single sample, the cost function describes the relationship between the model parameters w and b and the optimization target over the entire training set: as the two formulas show, the cost function is the average of the per-sample losses.
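The two definitions above can be sketched directly in code. This is a minimal illustration, not a production implementation; the function names `loss` and `cost` are chosen here for clarity and are not from the original text.

```python
import math

def loss(y_hat, y):
    # Per-sample cross-entropy loss:
    # L(y_hat, y) = -(y*log(y_hat) + (1-y)*log(1-y_hat))
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

def cost(y_hats, ys):
    # Cost J(w, b): the average of the per-sample losses
    # over the whole training set
    return sum(loss(yh, y) for yh, y in zip(y_hats, ys)) / len(ys)

print(loss(0.9, 1))                      # small: prediction close to label
print(cost([0.9, 0.2, 0.8], [1, 0, 1]))  # average loss over three samples
```

Note that the loss is small when y_hat agrees with y and grows without bound as y_hat approaches the wrong extreme.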
Let us first look at why this works as a loss function:
If the desired output is y = 1, the optimization target is min L(y_hat, y) = min[-log(y_hat)]; clearly, the larger y_hat is, the smaller the loss, so the target reaches its minimum as y_hat approaches 1.
If the desired output is y = 0, the optimization target is min L(y_hat, y) = min[-log(1 - y_hat)]; clearly, the smaller y_hat is, the smaller the loss, so the target reaches its minimum as y_hat approaches 0.
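The monotonic behavior described in these two cases can be checked numerically. A small sketch (the helper name `loss` is assumed, as above):

```python
import math

def loss(y_hat, y):
    # L(y_hat, y) = -(y*log(y_hat) + (1-y)*log(1-y_hat))
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

# y = 1: the loss strictly decreases as y_hat grows toward 1
assert loss(0.9, 1) < loss(0.5, 1) < loss(0.1, 1)
# y = 0: the loss strictly decreases as y_hat shrinks toward 0
assert loss(0.1, 0) < loss(0.5, 0) < loss(0.9, 0)
print("loss is monotonic in the expected direction for both labels")
```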
Here is how this loss function is derived.
The logistic regression model is as follows:

y_hat = sigma(w^T x + b), where sigma(z) = 1 / (1 + e^(-z))
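The model above can be written as a short sketch for a single sample; the 1-D `predict` helper here is an illustrative assumption, not part of the original text.

```python
import math

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z)); maps any real z into (0, 1),
    # so the output can be read as a probability
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    # y_hat = sigma(w . x + b): the logistic regression model
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

print(sigmoid(0.0))                        # exactly 0.5 at the decision boundary
print(predict([1.0, -2.0], 0.5, [2.0, 1.0]))
```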
So, for a given x, y_hat is the probability that y = 1:

P(y = 1 | x) = y_hat

Then there is also:

P(y = 0 | x) = 1 - y_hat

Because this is a binary classification problem, y takes only the values 1 or 0, so the two cases can be combined into a single formula:

P(y | x) = y_hat^y * (1 - y_hat)^(1 - y)
Meanwhile, because the log function is strictly monotonically increasing, maximizing P(y | x) is equivalent to maximizing log P(y | x). In machine learning we usually do not care what the base of the logarithm is, and often write simply "log" with the base omitted (strictly speaking this is imprecise in mathematics, but it is harmless here). To simplify the subsequent derivation, we take the logarithm:

log P(y | x) = y * log(y_hat) + (1 - y) * log(1 - y_hat)

Maximizing this log-probability is equivalent to minimizing its negative, which is exactly the loss function L(y_hat, y) defined above.
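The equivalence between the combined probability and the loss can be verified numerically: negating the log of P(y | x) recovers L(y_hat, y) exactly. A small check (helper names are illustrative):

```python
import math

def prob(y_hat, y):
    # P(y | x) = y_hat^y * (1 - y_hat)^(1 - y); valid because y is 0 or 1
    return y_hat ** y * (1 - y_hat) ** (1 - y)

def loss(y_hat, y):
    # L(y_hat, y) = -(y*log(y_hat) + (1-y)*log(1-y_hat))
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

# -log P(y | x) equals the loss for every label and prediction
for y in (0, 1):
    for y_hat in (0.2, 0.5, 0.8):
        assert abs(-math.log(prob(y_hat, y)) - loss(y_hat, y)) < 1e-12
print("-log P(y | x) == L(y_hat, y) verified")
```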
For the cost function, which optimizes w and b over the entire training set, assuming the m training samples are independent and identically distributed, maximizing the likelihood of the whole set is equivalent to minimizing the sum (or average) of the per-sample losses, which yields the formula that appeared above:

J(w, b) = (1/m) * sum_{i=1}^{m} L(y_hat^(i), y^(i))
In principle, the parameters could be found by maximum likelihood estimation; in practice, however, we usually run gradient descent on the cost function directly.
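A minimal gradient-descent sketch for 1-D inputs, under the assumption that the gradients dJ/dw = mean((y_hat - y) * x) and dJ/db = mean(y_hat - y) (the standard derivatives of the cross-entropy cost) are used; the toy data, learning rate, and step count are illustrative choices, not from the original text.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, lr=0.5, steps=1000):
    # Plain gradient descent on J(w, b) for 1-D inputs, using
    # dJ/dw = mean((y_hat - y) * x) and dJ/db = mean(y_hat - y)
    w, b = 0.0, 0.0
    m = len(xs)
    for _ in range(steps):
        y_hats = [sigmoid(w * x + b) for x in xs]
        dw = sum((yh - y) * x for yh, y, x in zip(y_hats, ys, xs)) / m
        db = sum(yh - y for yh, y in zip(y_hats, ys)) / m
        w -= lr * dw
        b -= lr * db
    return w, b

# Toy separable data: negative x -> class 0, positive x -> class 1
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]
w, b = train(xs, ys)
print(sigmoid(w * 2.0 + b))    # should be close to 1
print(sigmoid(w * -2.0 + b))   # should be close to 0
```

After training, the model assigns high probability to the positive class for positive x and low probability for negative x, which is exactly what minimizing the cross-entropy cost drives it toward.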