Matrix Methods in Machine Learning 03: QR Decomposition


1. Form of QR decomposition

QR decomposition factors a matrix into the product of an orthogonal matrix and an upper triangular matrix. It is often used to solve the linear least squares problem, and it is also the basis of a particular eigenvalue algorithm, the QR algorithm. For an m x n matrix A the decomposition can be written as

A = QR

where Q is an m x m orthogonal matrix (its columns are orthonormal) and R is an m x n upper triangular matrix.
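As a quick numerical check (a minimal sketch using NumPy's numpy.linalg.qr; the example matrix is hypothetical and not from the original post):

    import numpy as np

    # A hypothetical 5x4 full-column-rank matrix.
    A = np.random.default_rng(0).standard_normal((5, 4))

    # mode="complete" returns the full m x m Q and m x n R.
    Q, R = np.linalg.qr(A, mode="complete")

    print(np.allclose(Q.T @ Q, np.eye(5)))   # Q is orthogonal
    print(np.allclose(np.triu(R), R))        # R is upper triangular
    print(np.allclose(Q @ R, A))             # A = Q R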

2. Computing the QR decomposition

There are several ways to compute a QR decomposition in practice, such as Givens rotations, Householder transformations, and Gram-Schmidt orthogonalization, each with its own advantages and disadvantages. Earlier posts in this series introduced Givens rotations and Householder transformations, and the third method is standard material in linear algebra courses. The derivation below uses the Householder transformation method.

Suppose A is a 5x4 matrix, with X marking elements left unchanged by the current transformation and + marking elements that change. Left-multiplying A by a Householder matrix H is equivalent to a row operation on A: each Hk zeroes the entries below the diagonal in column k.

After four transformations, A is reduced to an upper triangular matrix, R = H4 H3 H2 H1 A. If the columns of A are linearly independent, the upper triangular block of R is nonsingular.
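The following is a minimal Householder QR sketch in Python/NumPy (my own illustration of the procedure, not the code from the earlier posts): it applies one Householder reflector per column, so a 5x4 matrix is reduced to upper triangular form after four reflections.

    import numpy as np

    def householder_qr(A):
        # Reduce A (m x n, m >= n) to upper triangular R, accumulating Q.
        m, n = A.shape
        R = A.astype(float).copy()
        Q = np.eye(m)
        for k in range(n):
            x = R[k:, k]
            v = x.copy()
            # Choose the sign that avoids cancellation when forming v.
            v[0] += np.copysign(np.linalg.norm(x), x[0])
            v /= np.linalg.norm(v)
            # H_k = I - 2 v v^T, applied as a row operation on the trailing block.
            R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
            Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)   # accumulate Q = H1 H2 ... Hn
        return Q, R

    A = np.random.default_rng(1).standard_normal((5, 4))
    Q, R = householder_qr(A)
    print(np.round(R, 6))          # zeros below the diagonal after 4 reflections
    print(np.allclose(Q @ R, A))   # reconstruction check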

Write Q^T = H4 H3 H2 H1, so that R = Q^T A and A = QR. Since H1, H2, H3, and H4 are all orthogonal matrices, Q^T (and hence Q) is also orthogonal.

By the rules of block matrix multiplication, since the bottom rows of R are all zero, Q can be partitioned into two blocks Q1 and Q2, and the part multiplied by the zero block can be dropped. This is the thin (reduced) QR decomposition:

A = QR = [Q1 Q2] [R1; 0] = Q1 R1

where Q1 contains the first columns of Q and R1 is the square upper triangular block of R.

The columns of R1 can be viewed as the coordinates of the columns of A in the orthonormal basis formed by the columns of Q1 (that is, R1 = Q1^T A).
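Numerically (again a NumPy sketch, not part of the original post), the thin factorization is what numpy.linalg.qr returns by default, and the coordinate interpretation can be checked directly:

    import numpy as np

    A = np.random.default_rng(2).standard_normal((5, 4))

    # mode="reduced" (the default) returns the thin factors: Q1 is 5x4, R1 is 4x4.
    Q1, R1 = np.linalg.qr(A, mode="reduced")

    print(Q1.shape, R1.shape)        # (5, 4) (4, 4)
    print(np.allclose(Q1 @ R1, A))   # A = Q1 R1
    # Each column of R1 holds the coordinates of the corresponding column of A
    # in the orthonormal basis formed by the columns of Q1, i.e. R1 = Q1^T A.
    print(np.allclose(Q1.T @ A, R1))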

3. Using QR to solve least squares

The first blog in this series solved least squares with the normal equations, specifically by using the "pseudo-inverse", but that approach has drawbacks, such as the large amount of computation needed to form the pseudo-inverse explicitly and roundoff error in floating-point arithmetic. Now let's try the QR method instead. For an overdetermined matrix A, the least squares problem is to minimize ||Ax - b||^2 over x.

Substituting the QR decomposition of A, the objective function (the squared residual) becomes

||Ax - b||^2 = ||Q^T (Ax - b)||^2 = ||R1 x - Q1^T b||^2 + ||Q2^T b||^2

where the first equality holds because multiplying by the orthogonal matrix Q^T does not change the norm, and Q^T A = [R1; 0], Q^T b = [Q1^T b; Q2^T b].

The second term does not depend on x, so it cannot be reduced; setting the first term to zero gives the least squares solution:

R1 x = Q1^T b, i.e. x = R1^(-1) Q1^T b
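A minimal sketch of this solution in Python (using SciPy's solve_triangular for the back substitution; the example data is hypothetical):

    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(3)
    A = rng.standard_normal((100, 4))   # overdetermined system
    b = rng.standard_normal(100)

    # Thin QR, then back substitution: R1 x = Q1^T b.
    Q1, R1 = np.linalg.qr(A)
    x = solve_triangular(R1, Q1.T @ b, lower=False)

    print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # matches lstsq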

A further benefit: with the Householder transformations P1, P2, P3, ... there is no need to form Q = P1 P2 P3 ... explicitly. Instead, the vector b is appended as an extra column to the right of A, and the same Householder transformations applied to the augmented matrix [A b] yield R1 and Q1^T b directly.
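To illustrate the augmented matrix trick (a sketch in which numpy.linalg.qr stands in for the sequence of Householder transformations; names and data are my own):

    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(4)
    A = rng.standard_normal((100, 4))
    b = rng.standard_normal(100)
    n = A.shape[1]

    # The R factor of [A b] has the form [[R1, Q1^T b], [0, +-residual_norm]].
    R_aug = np.linalg.qr(np.column_stack([A, b]), mode="r")
    x = solve_triangular(R_aug[:n, :n], R_aug[:n, n], lower=False)

    print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
    print(abs(R_aug[n, n]))   # norm of the least squares residual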

Another benefit is that the QR method suffers less from rounding error than the normal equations, so the solutions of some (ill-conditioned) problems are more accurate, because the Householder transformation and the plane (Givens) rotation have good properties with respect to the rounding errors of floating-point arithmetic.
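A rough illustration of this point (a sketch with a deliberately ill-conditioned design matrix; the exact error magnitudes will vary):

    import numpy as np

    rng = np.random.default_rng(5)
    t = np.linspace(0, 1, 200)
    A = np.vander(t, 10, increasing=True)   # ill-conditioned Vandermonde matrix
    x_true = rng.standard_normal(10)
    b = A @ x_true

    # Normal equations: forming A^T A squares the condition number.
    x_ne = np.linalg.solve(A.T @ A, A.T @ b)

    # QR: R1 x = Q1^T b.
    Q1, R1 = np.linalg.qr(A)
    x_qr = np.linalg.solve(R1, Q1.T @ b)

    print("normal equations error:", np.linalg.norm(x_ne - x_true))
    print("QR error:              ", np.linalg.norm(x_qr - x_true))
    # The QR error is typically several orders of magnitude smaller here.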

4. Updating the least squares solution

In some applications the system must update the solution in real time while the sample data is generated gradually; that is, the matrix A and the vector b keep growing longer. How can the existing solution be updated?

The problem can be described as follows: suppose the existing data has already been QR-decomposed, and the newly generated sample is a row vector a^T with observation beta, so the enlarged problem is to minimize || [A; a^T] x - [b; beta] ||^2.

Multiplying the enlarged system on the left by the orthogonal matrix diag(Q^T, 1) reduces it to the smaller least squares problem with matrix [R1; a^T] and right-hand side [Q1^T b; beta], plus the block Q2^T b. Since Q2^T b has no effect on the minimization, we throw it away. With X for unchanged elements and + for changed elements, the main idea is to use Givens rotations (introduced in an earlier post; in fact both the Householder transformation and the Givens rotation act by left multiplication with an orthogonal matrix, so either works here): rotate row 1 against row n+1, then row 2 against row n+1, and so on, step by step obtaining the updated R1 and Q1^T b:

Rotate row 1 and row n+1:

Rotate row 2 and row n+1:

Continue in this way until the entire new row has been zeroed out:

Back substitution with the updated R1 and Q1^T b then gives the updated least squares solution; a sketch of the whole procedure is shown below.
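A sketch of the update in Python (my own illustration of the idea above, assuming the thin factors R1 and d = Q1^T b from the previous fit are kept around; names are hypothetical):

    import numpy as np
    from scipy.linalg import solve_triangular

    def qr_row_update(R1, d, a_new, beta):
        # Incorporate one new row a_new^T (with observation beta) into an
        # existing thin QR least squares factorization (R1, d = Q1^T b).
        n = R1.shape[0]
        R = np.vstack([R1, a_new[None, :]]).astype(float)
        d = np.append(np.asarray(d, dtype=float), beta)
        # Zero the new bottom row with n Givens rotations:
        # row 1 vs row n+1, row 2 vs row n+1, and so on.
        for k in range(n):
            r, x = R[k, k], R[n, k]
            rho = np.hypot(r, x)
            if rho == 0.0:
                continue
            c, s = r / rho, x / rho
            G = np.array([[c, s], [-s, c]])
            R[[k, n], k:] = G @ R[[k, n], k:]   # apply rotation to rows k and n+1
            d[[k, n]] = G @ d[[k, n]]
        return R[:n, :], d[:n]   # updated R1 and Q1^T b; |d[n]| joins the residual

    # Usage: fit once, then absorb one new sample and re-solve by back substitution.
    rng = np.random.default_rng(6)
    A = rng.standard_normal((50, 4)); b = rng.standard_normal(50)
    Q1, R1 = np.linalg.qr(A)
    d = Q1.T @ b

    a_new, beta = rng.standard_normal(4), rng.standard_normal()
    R1, d = qr_row_update(R1, d, a_new, beta)
    x = solve_triangular(R1, d, lower=False)

    # Check against a fresh least squares solve on the enlarged data.
    A2 = np.vstack([A, a_new]); b2 = np.append(b, beta)
    print(np.allclose(x, np.linalg.lstsq(A2, b2, rcond=None)[0]))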
