Conjugate Gradient Method


The conjugate gradient method lies between the steepest descent method and Newton's method. It uses only first-derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, evaluate, and invert the Hessian matrix. The conjugate gradient method is not only one of the most useful methods for solving large systems of linear equations, but also one of the most effective algorithms for large-scale nonlinear optimization.

The first of these methods was the linear conjugate gradient method proposed by Hestenes and Stiefel (1952) for solving systems of linear equations with a positive definite coefficient matrix. Building on this, Fletcher and Reeves (1964) proposed the first nonlinear conjugate gradient method for unconstrained optimization problems. Because the conjugate gradient method requires no matrix storage and enjoys fast convergence and quadratic termination (on a quadratic objective it terminates in at most n steps), it has been widely used in practice.

The conjugate gradient method is a typical conjugate direction method: each of its search directions is conjugate to the others. Each search direction d is simply a linear combination of the negative gradient direction and the search direction of the previous iteration; therefore the storage requirement is small and the computation is cheap.
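This direction update can be sketched in a few lines. The following Python snippet (an illustrative translation; the function name and the use of NumPy are my assumptions, not part of the original article) shows the Fletcher–Reeves form, where the new direction combines the new negative gradient with the previous direction:

```python
import numpy as np

def fletcher_reeves_direction(g_new, g_old, d_old):
    """Fletcher-Reeves update: the new search direction is the negative
    gradient plus beta times the previous direction, so only a few
    vectors (no matrix) need to be stored."""
    beta = (g_new @ g_new) / (g_old @ g_old)  # FR coefficient
    return -g_new + beta * d_old

# Example: beta = |g_new|^2 / |g_old|^2 = 1/4 here
d = fletcher_reeves_direction(np.array([1.0, 0.0]),
                              np.array([2.0, 0.0]),
                              np.array([1.0, 1.0]))
```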

Matlab toolbox

The conjugate gradient method solves a system of linear equations, Ax = b, where A is symmetric positive definite, without computing the inverse of A. It requires only a very small amount of memory, and hence is especially suitable for large-scale systems.
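As a sketch of what such a routine does, here is a minimal linear conjugate gradient solver in Python with NumPy (my own illustrative version; the MATLAB conjgrad used below is a separate File Exchange function whose internals are not shown here):

```python
import numpy as np

def conjgrad(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A by conjugate gradients."""
    n = len(b)
    if max_iter is None:
        max_iter = n           # exact arithmetic converges in at most n steps
    x = np.zeros(n)
    r = b - A @ x              # residual (negative gradient of the quadratic)
    d = r.copy()               # first search direction: steepest descent
    rs_old = r @ r
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)         # exact line search along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d     # new direction: residual + beta * old
        rs_old = rs_new
    return x
```

Note that the only operation involving A is the matrix-vector product A @ d, which is why the method scales to very large (and sparse) systems.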

It is faster than other approaches such as Gaussian elimination when A is well-conditioned. For example,

n = 1000;
[U, S, V] = svd(randn(n));
s = diag(S);
A = U * diag(s + max(s)) * U';  % make A symmetric positive definite, well-conditioned
b = randn(n, 1);
tic, x = conjgrad(A, b); toc
tic, x1 = A \ b; toc
norm(x - x1)
norm(A*x - b)

In this example, the conjugate gradient method is about two to three times faster than A \ b, which uses Gaussian elimination.

 

http://www.mathworks.com/matlabcentral/fileexchange/22494-conjugate-gradient-method
