As a machine learning enthusiast, I have recently been studying Andrew Ng's machine learning lecture notes. In the first handout, Ng first explains what supervised learning is, then covers the linear model solved by least squares and logistic regression, whose response function is the sigmoid function. Building on these two models, he introduces the widely used exponential family of distributions; with the exponential family as the model assumption, the GLM framework is then used to derive a model for the multinomial distribution (softmax regression). Along the way he also touches on maximum likelihood estimation, the Gaussian and Bernoulli distributions, the probabilistic interpretation behind the choice of cost function, another algorithm for fitting linear models (locally weighted regression), and two methods for maximizing a function: gradient descent and Newton's method.
But all of this feels too disorganized in my head, so I hope to sort out what I have learned here and exchange ideas with others who also enjoy machine learning.
The contents are organized as follows:
Because editing formulas in the CSDN editor is too cumbersome, I wrote them directly in Word and then converted them to images before uploading.