[CS229-Lecture4] Newton's Method


Previously, when studying logistic regression, we maximized the likelihood function with the gradient ascent algorithm, iterating continuously until convergence. This section introduces Newton's method, which serves the same purpose as gradient ascent; the difference is that Newton's method typically requires fewer iterations and converges faster.

(Figure omitted: the red curve shows the iterates of Newton's method; the green curve shows those of gradient descent.)

Newton's method (from Wikipedia)

Newton's method, also called the Newton-Raphson method, is an iterative method for approximately solving equations over the real and complex fields. The method uses the first few terms of the Taylor series of a function to find a root of the equation.

First, choose an initial point $x_0$ close to a zero of the function $f$, and compute the value $f(x_0)$ and the tangent slope $f'(x_0)$ (the derivative of $f$ at that point). Then compute the $x$-coordinate where the line through $(x_0, f(x_0))$ with slope $f'(x_0)$ crosses the $x$-axis, that is, the solution of the following equation:

$$f(x_0) + f'(x_0)\,(x - x_0) = 0$$

We call the $x$-coordinate of this new intersection point $x_1$; it is usually closer to the root of $f(x) = 0$ than $x_0$ was, so we can use it to start the next iteration. The iteration formula simplifies to:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
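As a quick worked example (not from the original post), apply this formula to $f(x) = x^2 - 2$, whose positive root is $\sqrt{2} \approx 1.414214$, starting from $x_0 = 1$:

$$x_1 = 1 - \frac{1^2 - 2}{2 \cdot 1} = 1.5, \quad x_2 = 1.5 - \frac{1.5^2 - 2}{2 \cdot 1.5} \approx 1.416667, \quad x_3 \approx 1.414216$$

Each iterate roughly doubles the number of correct digits, previewing the quadratic convergence described below.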

It can be proved that if $f'$ is continuous and the zero to be found is isolated, then there is a neighborhood around that zero such that, for any initial value in the neighborhood, Newton's method is guaranteed to converge. Furthermore, if $f'$ is nonzero at the zero, Newton's method converges quadratically; roughly speaking, the number of correct digits doubles with each iteration.
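To make the iteration concrete, here is a minimal Python sketch of the scalar update above (the function names and stopping tolerance are illustrative choices, not part of the original post):

```python
def newton(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n) / f'(x_n) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # close enough to a zero of f
            return x
        x = x - fx / f_prime(x)    # x-intercept of the tangent line
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2) ~= 1.41421356
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0))
```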

 

From: http://blog.csdn.net/luoleicn/article/details/6527049

The discussion above covers the one-variable case. For the high-dimensional case, where we maximize an objective such as the log-likelihood $\ell(\theta)$ of logistic regression, the Newton iteration formula is:

$$\theta := \theta - H^{-1}\,\nabla_\theta \ell(\theta)$$

Here $H$ is the Hessian matrix, defined by:

$$H_{ij} = \frac{\partial^2 \ell(\theta)}{\partial \theta_i\, \partial \theta_j}$$

In high-dimensional situations Newton's method can still be used for the iterative solution, but the Hessian matrix introduces significant cost: it has one entry per pair of parameters, and each step requires solving a linear system with it, which becomes expensive as the dimension grows. Quasi-Newton methods address this problem: instead of computing the Hessian matrix directly, they use the gradient vectors from successive steps to update an approximation of the Hessian (or of its inverse) at each step.
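As a concrete illustration, here is a minimal NumPy sketch of the high-dimensional update applied to logistic regression; the synthetic data and all variable names are illustrative assumptions, not from the original post:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def newton_logistic(X, y, num_iters=10):
    """Maximize the logistic log-likelihood with Newton steps.

    X: (m, n) design matrix; y: (m,) labels in {0, 1}.
    """
    theta = np.zeros(X.shape[1])
    for _ in range(num_iters):
        h = sigmoid(X @ theta)        # predicted probabilities
        grad = X.T @ (y - h)          # gradient of the log-likelihood
        S = h * (1.0 - h)             # diagonal weights h_i (1 - h_i)
        H = -(X.T * S) @ X            # Hessian: -X^T diag(S) X
        theta = theta - np.linalg.solve(H, grad)  # theta := theta - H^{-1} grad
    return theta

# Tiny synthetic problem: an intercept column plus one feature.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = (x + 0.5 * rng.normal(size=100) > 0).astype(float)
X = np.column_stack([np.ones_like(x), x])
print(newton_logistic(X, y))
```

When the dimension is large, quasi-Newton routines such as `scipy.optimize.minimize` with `method='BFGS'` (applied to the negative log-likelihood) avoid forming and inverting $H$ explicitly.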
