What would it be like to converse with someone's ideas in mid-air, as if you were talking with the person himself? I will not dwell on that thought here. Why can't life be simple and clear? Because being too straightforward would be boring; keeping some uncertainty is both confusing and fascinating. When we learned linear regression, the loss function and the weight-update formula could be understood without much strain, which was a genuine convenience. With the more abstract and obscure logistic regression, however, reaching the loss function and the weight-update formula takes a chain of strange but reasonable transformations: from the probability p (range 0 to 1), to the odds p/(1-p) (range 0 to +∞), to z = log(p/(1-p)) (range -∞ to +∞), and back to p = 1/(1 + e^(-z)); then from the maximum likelihood function, to the log-likelihood, to the derivative of the log-likelihood with respect to the weights. The first impression is that the tricks run so deep and are so hard to predict that they seem impossible to master. Look at them a few more times, though, and give them a few more chances, and you will find these formulas are not so ugly after all. Sometimes the difficulty grows mostly in the imagination.
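To make that chain of transformations concrete, here is my own restatement of the steps in formula form (my notation, not the article's original formulas):

```latex
% From probability p to the sigmoid, and from likelihood to the gradient.
\begin{aligned}
\text{odds:}\quad & \frac{p}{1-p} \in (0, +\infty) \\
\text{logit:}\quad & z = \log\frac{p}{1-p} \in (-\infty, +\infty) \\
\text{inverting the logit:}\quad & p = \phi(z) = \frac{1}{1 + e^{-z}} \\
\text{likelihood:}\quad & L(w) = \prod_{i=1}^{n} \phi(z^{(i)})^{y^{(i)}}\bigl(1 - \phi(z^{(i)})\bigr)^{1 - y^{(i)}} \\
\text{log-likelihood:}\quad & \ell(w) = \sum_{i=1}^{n} \Bigl[ y^{(i)} \log \phi(z^{(i)}) + (1 - y^{(i)}) \log\bigl(1 - \phi(z^{(i)})\bigr) \Bigr] \\
\text{gradient ascent step:}\quad & w := w + \eta \,\nabla_w \ell(w), \qquad \nabla_w \ell(w) = \sum_{i=1}^{n} \bigl(y^{(i)} - \phi(z^{(i)})\bigr)\, x^{(i)}
\end{aligned}
```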
Logistic regression is a powerful classification algorithm, widely used in fields such as bank lending and precision advertising. Background material can be found here:
1. Regression XY | Data Lakes and Rivers: the second of the five types of regression (logistic regression)
https://mp.weixin.qq.com/s/MRlH5hdPRYNBem53xQBJNQ
2. Mastering maximum likelihood estimation from zero
https://mp.weixin.qq.com/s/Zur3PgwtYvVs9ZTOKwTbYg
3. Principle and implementation of logistic regression (including derivation of the maximum likelihood function and the weight-update formula)
http://www.cnblogs.com/sxron/p/5489214.html
This brick-laying cannot be skipped; otherwise, one day the seemingly towering building will collapse, and then no amount of success stories or chicken soup will save you...
To help build intuition, let us first plot the sigmoid function, φ(z) = 1/(1 + e^(-z)), in code:
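A minimal plotting sketch (my own code under the definition above, not necessarily the article's original snippet):

```python
# Plot the sigmoid function phi(z) = 1 / (1 + e^(-z)) over a range of z.
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.arange(-7, 7, 0.1)
phi_z = sigmoid(z)

plt.plot(z, phi_z)
plt.axvline(0.0, color='k')                 # vertical line at z = 0
plt.axhline(y=0.5, ls='dotted', color='k')  # phi(0) = 0.5
plt.yticks([0.0, 0.5, 1.0])
plt.ylim(-0.1, 1.1)
plt.xlabel('z')
plt.ylabel(r'$\phi(z)$')
plt.show()
```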
The loss function of logistic regression can be viewed as the log-likelihood function with its sign flipped, i.e. the negative log-likelihood:
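Written out explicitly (my restatement, following the log-likelihood above), the cost over n training samples is:

```latex
% Cost = negative log-likelihood of the training data.
J(w) = -\ell(w) = \sum_{i=1}^{n} \Bigl[ -y^{(i)} \log \phi(z^{(i)}) - (1 - y^{(i)}) \log\bigl(1 - \phi(z^{(i)})\bigr) \Bigr]
```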
The code to plot this loss as a function of φ(z) is as follows:
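A minimal sketch (assumed, not necessarily the article's original code): it plots the per-sample cost -log(φ(z)) for y = 1 and -log(1 - φ(z)) for y = 0 against φ(z).

```python
# Plot the single-sample logistic regression cost as a function of phi(z).
import numpy as np
import matplotlib.pyplot as plt

def cost_1(phi):
    return -np.log(phi)        # cost when the true label is 1

def cost_0(phi):
    return -np.log(1 - phi)    # cost when the true label is 0

phi = np.linspace(0.01, 0.99, 200)   # stay away from 0 and 1 to avoid log(0)
plt.plot(phi, cost_1(phi), label='J(w) if y = 1')
plt.plot(phi, cost_0(phi), linestyle='--', label='J(w) if y = 0')
plt.xlabel(r'$\phi(z)$')
plt.ylabel('J(w)')
plt.legend(loc='upper center')
plt.show()
```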
The following is the implementation of the logistic regression algorithm in Python:
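A minimal sketch of logistic regression trained by batch gradient ascent on the log-likelihood; the class name and the parameters eta and n_iter are my own assumptions, not necessarily the author's original code.

```python
import numpy as np

class LogisticRegressionGD:
    def __init__(self, eta=0.05, n_iter=100, random_state=1):
        self.eta = eta                  # learning rate
        self.n_iter = n_iter            # number of passes over the data
        self.random_state = random_state

    def fit(self, X, y):
        rgen = np.random.RandomState(self.random_state)
        # w_[0] is the bias, w_[1:] are the feature weights
        self.w_ = rgen.normal(loc=0.0, scale=0.01, size=1 + X.shape[1])
        self.cost_ = []
        for _ in range(self.n_iter):
            output = self.activation(self.net_input(X))
            errors = y - output
            # gradient of the log-likelihood: X^T (y - phi(z))
            self.w_[1:] += self.eta * X.T.dot(errors)
            self.w_[0] += self.eta * errors.sum()
            # negative log-likelihood, recorded to monitor convergence
            cost = -y.dot(np.log(output)) - (1 - y).dot(np.log(1 - output))
            self.cost_.append(cost)
        return self

    def net_input(self, X):
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def activation(self, z):
        # clip z to keep exp() from overflowing
        return 1.0 / (1.0 + np.exp(-np.clip(z, -250, 250)))

    def predict(self, X):
        return np.where(self.net_input(X) >= 0.0, 1, 0)

# usage example on a tiny, hypothetical two-feature dataset
if __name__ == '__main__':
    X = np.array([[1.0, 2.0], [2.0, 1.0], [2.5, 3.0], [4.0, 5.0], [5.0, 4.5]])
    y = np.array([0, 0, 0, 1, 1])
    clf = LogisticRegressionGD(eta=0.1, n_iter=200).fit(X, y)
    print(clf.predict(X))
```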
I hope we can keep a peaceful mind and find lasting meaning in the emptiness of life. Next week we will learn how to use scikit-learn to complete classification tasks. Stay tuned :)