Optimization Methods in Machine Learning: the Gradient Method (Steepest Descent Method)
0. Introduction to Optimization Problems in Machine Learning
A machine learning model generally defines an objective function, and the model's parameters are obtained by minimizing that objective.
This is where "optimization methods" come in, mostly "multivariate optimization": the many variables correspond to the model's many parameters.
Of course, a simple model may admit a closed-form (analytical) solution, such as linear regression, which minimizes the residual sum of squares; most models, however, have no closed-form solution.
Example: the parameter-estimation problem of linear regression, where the maximum likelihood method leads to the least squares method.
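As an illustration of the closed-form case, here is a minimal sketch of least squares via the normal equations, using NumPy on synthetic data (the coefficients 2, -3 and intercept 1 are my own illustrative choices):

```python
import numpy as np

# Synthetic data: y = 2*x1 - 3*x2 + 1 + small noise (illustrative values)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 2 * X[:, 0] - 3 * X[:, 1] + 1 + 0.01 * rng.normal(size=100)

# Append an intercept column, then solve the normal equations
# theta = (X^T X)^{-1} X^T y, the minimizer of ||X theta - y||^2.
Xb = np.hstack([X, np.ones((100, 1))])
theta = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)
print(theta)  # approximately [2, -3, 1]
```

No iterative search is needed here; the minimizer is computed in one linear solve, which is exactly what "closed-form solution" means.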
Most machine learning optimization problems are "unconstrained optimization problems": there are no restrictions on the ranges of the independent variables or on relations among them, and each independent variable (parameter) ranges over the whole real line R.
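In symbols, with parameter vector theta and objective f, the unconstrained problem is simply:

```latex
\min_{\theta \in \mathbb{R}^n} f(\theta)
```

There are no constraint sets and no side conditions; the whole difficulty lies in minimizing f itself.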
Typical solution methods for unconstrained optimization problems:
1. Descent iteration (iterative descent algorithms)
2. One-dimensional search (line search)
Here we only introduce one approach to this step (the Newton method). In machine learning, this step amounts to determining the learning rate, which is usually treated as a constant.
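The two steps above combine into the descent iteration x_{k+1} = x_k + a_k * d_k: pick a descent direction d_k, then a step size a_k. A minimal sketch with the negative gradient as the direction and, as is common in machine learning, a constant learning rate in place of a line search (the toy objective f(x) = x1^2 + 4*x2^2 is my own choice):

```python
import numpy as np

def grad_descent(grad, x0, lr=0.1, steps=100):
    """Descent iteration x_{k+1} = x_k - lr * grad(x_k) with a fixed
    learning rate lr, i.e. no one-dimensional search at each step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Toy objective (illustrative): f(x) = x1^2 + 4*x2^2, minimum at (0, 0)
grad_f = lambda x: np.array([2 * x[0], 8 * x[1]])
print(grad_descent(grad_f, [3.0, 2.0]))  # close to [0, 0]
```

With a fixed learning rate the choice of lr matters: too large and the iterates diverge, too small and convergence is slow, which is why classical optimization instead determines the step size by a one-dimensional search.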
3. Basic descent methods for the extrema of multivariate functions
3.1 Steepest descent method (gradient method)
3.2 Newton method
3.3 Damped Newton method
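To give a feel for 3.2 and 3.3: the pure Newton method takes the full step x_{k+1} = x_k - H^{-1} g, while the damped variant scales that step by a factor found via line search. A minimal sketch using backtracking (Armijo) line search; the test function f(x) = (x1 - 1)^4 + x2^2 and all constants are my own illustrative choices:

```python
import numpy as np

def newton_damped(grad, hess, x0, f, steps=50, beta=0.5, c=1e-4):
    """Damped Newton method: direction d = -H^{-1} g, with the step
    length t chosen by backtracking line search instead of the pure
    Newton step t = 1."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        t = 1.0
        # Backtracking (Armijo) line search on the step length t
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Toy objective (illustrative): f(x) = (x1 - 1)^4 + x2^2, minimum at (1, 0)
f = lambda x: (x[0] - 1) ** 4 + x[1] ** 2
grad = lambda x: np.array([4 * (x[0] - 1) ** 3, 2 * x[1]])
hess = lambda x: np.array([[12 * (x[0] - 1) ** 2, 0.0], [0.0, 2.0]])
print(newton_damped(grad, hess, [3.0, 2.0], f))  # near [1, 0]
```

Setting the step length permanently to 1 recovers the pure Newton method of 3.2; the damping in 3.3 guards against steps that would increase the objective far from the minimum.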
References:
1. http://book.douban.com/subject/1164411/