Hessian Matrix and the Newton Iterative Method


1. Solving equations

Not all equations have a closed-form root formula, and even when one exists it may be too complex to be practical. In such cases, Newton's method can be used to solve the equation iteratively.

The principle is to use Taylor's formula: expand f at a point x0 to first order, i.e. f(x) ≈ f(x0) + (x - x0) f'(x0).

To solve the equation f(x) = 0, set f(x0) + (x - x0) f'(x0) = 0 and solve for x, which gives x1 = x0 - f(x0)/f'(x0). Because the first-order Taylor expansion is only an approximation, f(x) = f(x0) + (x - x0) f'(x0) does not hold exactly, so this x1 generally does not satisfy f(x1) = 0; it is merely closer to a root than x0 was. This naturally suggests an iterative solution:

    x(n+1) = x(n) - f(x(n)) / f'(x(n))

Under suitable conditions (for example, a good starting point and f'(x) not vanishing near the root), the iteration converges to a point x* with f(x*) = 0. The whole process is:

1. Choose an initial guess x0.
2. Compute x(n+1) = x(n) - f(x(n)) / f'(x(n)).
3. Repeat until |f(x(n))| (or the step size |x(n+1) - x(n)|) falls below a tolerance.
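The iteration described above can be sketched in Python as follows; the function names, tolerance, and worked example are illustrative, not from the original article:

```python
def newton_root(f, df, x0, tol=1e-10, max_iter=100):
    """Find a root of f via Newton's iteration x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # |f(x)| small enough: accept x as the root
            return x
        x = x - fx / df(x)         # Newton update
    return x

# Example: solve x^2 - 2 = 0, whose positive root is sqrt(2).
root = newton_root(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)
```

Note that the division by df(x) fails if the derivative vanishes at an iterate; a production implementation would guard against that.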

2. Newton's method for optimization

In optimization, linear problems can be solved with, for example, the simplex method, but for nonlinear problems Newton's method offers a solution. Suppose the task is to optimize an objective function f. Finding a minimum (or maximum) of f can be transformed into solving f'(x) = 0 for the derivative of f, so the optimization problem can be viewed as an equation-solving problem (f' = 0). The rest is analogous to the Newton root-finding described in Part 1.

This time, in order to solve f'(x) = 0, expand f by Taylor's formula to second order:

    f(x + Δx) ≈ f(x) + f'(x) Δx + (1/2) f''(x) Δx²

This approximation holds only as Δx approaches 0. Minimizing the quadratic model is then equivalent to setting its derivative with respect to Δx to zero:

    f'(x) + f''(x) Δx = 0

Solving:

    Δx = -f'(x) / f''(x)

This yields the iterative formula:

    x(n+1) = x(n) - f'(x(n)) / f''(x(n))
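As a sketch, the iterative formula above amounts to running Newton root-finding on f'. The function and starting point below are illustrative examples, not from the original article:

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """Find a stationary point of a 1-D function by applying Newton's
    iteration x_{n+1} = x_n - f'(x_n)/f''(x_n) to its derivative."""
    x = x0
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < tol:           # derivative ~ 0: stationary point reached
            return x
        x = x - g / d2f(x)
    return x

# Example: f(x) = x^4 - 3x^3 + 2, with f'(x) = 4x^3 - 9x^2 and
# f''(x) = 12x^2 - 18x. Starting from x0 = 3, the iteration converges
# to the local minimum at x = 9/4.
x_min = newton_minimize(lambda x: 4 * x**3 - 9 * x**2,
                        lambda x: 12 * x**2 - 18 * x,
                        x0=3.0)
print(x_min)
```

Since the update only seeks f' = 0, it can also land on a maximum or saddle point; checking the sign of f'' at the result distinguishes these cases.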

It is generally held that because Newton's method uses the curvature information of the function itself, it converges more readily than gradient descent (in fewer iterations). In the original article's figure, which shows the minimization of an example objective, the red curve is the path taken by Newton's method and the green curve is the path taken by gradient descent.

The discussion above treats the one-dimensional case; for the high-dimensional case, the Newton iterative formula is:

    x(n+1) = x(n) - H(x(n))⁻¹ ∇f(x(n))

where H is the Hessian matrix of f, defined entrywise as:

    H(i,j) = ∂²f / (∂x_i ∂x_j)

The high-dimensional case can still be solved by Newton iteration, but the cost of computing (and inverting) the Hessian matrix makes each Newton step much more expensive. One way around this is the quasi-Newton method: instead of computing the Hessian directly, it uses the gradient vectors from successive iterations to update an approximation of the Hessian (or its inverse) at every step. I have not fully understood the details of the quasi-Newton method; that will have to wait for another time...
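The high-dimensional iteration can be sketched with NumPy as below. Rather than forming H⁻¹ explicitly, it solves the linear system H d = ∇f for the step d, which is the standard, cheaper approach; the example objective is illustrative, not from the original article:

```python
import numpy as np

def newton_nd(grad, hess, x0, tol=1e-10, max_iter=100):
    """High-dimensional Newton iteration: x_{n+1} = x_n - H(x_n)^{-1} grad(x_n).
    Solves H d = grad instead of explicitly inverting the Hessian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # gradient ~ 0: stationary point
            return x
        x = x - np.linalg.solve(hess(x), g)
    return x

# Example: f(x, y) = x^2 + 2y^2 + x*y, a convex quadratic minimized at the
# origin. Its Hessian is constant, so Newton's method converges in one step.
grad = lambda v: np.array([2 * v[0] + v[1], 4 * v[1] + v[0]])
hess = lambda v: np.array([[2.0, 1.0], [1.0, 4.0]])
x_star = newton_nd(grad, hess, [5.0, -3.0])
print(x_star)
```

For a quadratic objective the quadratic Taylor model is exact, which is why a single Newton step lands on the minimizer; quasi-Newton methods such as BFGS trade this fast convergence for cheaper per-step updates built from gradients alone.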

Source: http://blog.sina.com.cn/s/blog_5364f9f20101dkyr.html

