Singular Value Decomposition (SVD)

Source: Internet
Author: User

Singular value decomposition (SVD) is an important matrix decomposition in linear algebra, with important applications in signal processing, statistics, and other fields. In some respects, the SVD is similar to the eigenvector-based decomposition of a symmetric or Hermitian matrix. Although the two decompositions are related, there is a significant difference: the eigenvector decomposition of symmetric matrices rests on spectral analysis, while the SVD is the generalization of spectral theory to arbitrary matrices.

  • 1. Theoretical description
    • 1.1 Intuitive explanation
  • 2. Singular values and singular vectors, and their relationship with the singular value decomposition
  • 3. Relationship with eigendecomposition
  • 4. Geometric meaning
  • 5. Simplified SVD
  • 6. Norms
  • 7. Applications
    • 7.1 Pseudo-inverse
    • 7.2 Parallel singular value model
    • 7.3 Range, null space, and rank
    • 7.4 Matrix approximation
  • 8. Computing the SVD
  • 9. History
  • 10. See also
  • 11. External links
  • 12. References
Theoretical description

Suppose M is an m × n matrix whose entries all come from the field K, which is either the field of real numbers or the field of complex numbers. Then there exists a factorization of the form

M = U Σ V*

where U is an m × m unitary matrix; Σ is an m × n diagonal matrix with non-negative real numbers on the diagonal; and V* (the conjugate transpose of V) is an n × n unitary matrix. Such a factorization is called a singular value decomposition of M. The diagonal entries Σ_{i,i} of Σ are the singular values of M.

A common convention is to arrange the singular values in descending order. In that case the diagonal matrix Σ is uniquely determined by M (although U and V are not).

Intuitive explanation

In the singular value decomposition M = UΣV*:

  • The columns of V form a set of orthonormal "input" (or "analysis") basis vectors for M. They are the eigenvectors of M*M.
  • The columns of U form a set of orthonormal "output" basis vectors for M. They are the eigenvectors of MM*.
  • The entries on the diagonal of Σ are the singular values, which can be thought of as scalar "gain controls" by which each corresponding input is multiplied to give the corresponding output. They are the square roots of the eigenvalues of M*M (equivalently, of MM*) and pair up with the corresponding columns of U and V.
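The relationships in the list above can be checked numerically. The following sketch assumes NumPy is available (an assumption, since the source only shows Matlab and OpenCV later) and uses a made-up random matrix:

```python
import numpy as np

# Hypothetical 4x3 real matrix, chosen only for illustration.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(M)   # M = U @ Sigma @ Vh, with Vh = V*
V = Vh.T                      # columns of V are the right singular vectors

# Columns of V are eigenvectors of M^T M, with eigenvalues sigma_i^2.
for i, sigma in enumerate(s):
    assert np.allclose(M.T @ M @ V[:, i], sigma**2 * V[:, i])

# Columns of U (for nonzero singular values) are eigenvectors of M M^T.
for i, sigma in enumerate(s):
    assert np.allclose(M @ M.T @ U[:, i], sigma**2 * U[:, i])
```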
Singular values and singular vectors, and their relationship with the singular value decomposition

Since U and V are unitary, their columns are unit vectors: the column vectors u1, ..., um of U form an orthonormal basis of K^m, and likewise the column vectors v1, ..., vn of V form an orthonormal basis of K^n (with respect to the standard inner product on these spaces).

Consider the linear transformation T: K^n → K^m that maps a vector x to Mx. With respect to these orthonormal bases, this transformation has a very simple description: T(v_i) = σ_i u_i for i = 1, ..., min(m, n), where σ_i is the i-th diagonal entry of Σ, and T(v_i) = 0 for i > min(m, n).

The geometric content of the SVD theorem can thus be summarized as follows: for every linear map T: K^n → K^m one can choose orthonormal bases of K^n and K^m such that T maps the i-th basis vector of K^n to a non-negative multiple of the i-th basis vector of K^m, and sends the remaining basis vectors to zero. With respect to these bases, the map T is therefore represented by a diagonal matrix with non-negative real entries.
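The relation T(v_i) = σ_i u_i can be verified directly from the factors returned by a numerical SVD. A minimal sketch, assuming NumPy and using a made-up 3 × 2 matrix:

```python
import numpy as np

# Hypothetical matrix used only to illustrate T(v_i) = sigma_i * u_i.
M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, s, Vh = np.linalg.svd(M)

# Applying T(x) = M x to the i-th right singular vector yields
# sigma_i times the i-th left singular vector.
for i in range(len(s)):
    v_i = Vh[i, :]   # i-th column of V, stored as a row of V*
    u_i = U[:, i]
    assert np.allclose(M @ v_i, s[i] * u_i)
```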

Norms

1. The concept of a matrix norm. Let A ∈ C^{m×n}, and define a real-valued function ‖A‖. It is called a matrix norm if it satisfies the following conditions:

(1) Non-negativity: ‖A‖ ≥ 0, and ‖A‖ = 0 if and only if A = 0;
(2) Homogeneity: ‖aA‖ = |a| ‖A‖ for all a ∈ C;
(3) Triangle inequality: ‖A + B‖ ≤ ‖A‖ + ‖B‖ for all A, B ∈ C^{m×n};
(4) Compatibility: ‖AB‖ ≤ ‖A‖ ‖B‖.

Then ‖A‖ is called a matrix norm of A. For example, for A = (a_ij) ∈ C^{n×n}, the following

are all matrix norms.

Theorem 2: The matrix norms induced by the vector 1-norm, 2-norm, and ∞-norm are

These are usually called the column-sum norm, the spectral norm, and the row-sum norm, respectively.
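These three induced norms can be computed and checked against their closed forms. A sketch assuming NumPy, with a small made-up matrix:

```python
import numpy as np

# Hypothetical 2x2 matrix; the induced norms below match their
# closed-form descriptions.
A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

one_norm = np.linalg.norm(A, 1)        # induced by the vector 1-norm
two_norm = np.linalg.norm(A, 2)        # induced by the vector 2-norm
inf_norm = np.linalg.norm(A, np.inf)   # induced by the vector inf-norm

# Column-sum norm: maximum absolute column sum.
assert np.isclose(one_norm, np.abs(A).sum(axis=0).max())
# Spectral norm: largest singular value of A.
assert np.isclose(two_norm, np.linalg.svd(A, compute_uv=False)[0])
# Row-sum norm: maximum absolute row sum.
assert np.isclose(inf_norm, np.abs(A).sum(axis=1).max())
```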

Theorem 3: Both the spectral norm and the F-norm are unitarily invariant norms; that is, for any unitary matrices P and Q, ‖PAQ‖ = ‖A‖.
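Theorem 3 can be spot-checked numerically: multiplying A on either side by random orthogonal (real unitary) matrices leaves both norms unchanged. A sketch assuming NumPy, with made-up random inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# Random orthogonal matrices obtained from QR factorizations.
P, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Spectral norm and Frobenius norm are unchanged by unitary factors.
assert np.isclose(np.linalg.norm(P @ A @ Q, 2), np.linalg.norm(A, 2))
assert np.isclose(np.linalg.norm(P @ A @ Q, 'fro'), np.linalg.norm(A, 'fro'))
```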

Applications

Pseudo-inverse

The singular value decomposition can be used to compute the pseudo-inverse of a matrix. If the matrix M is factored as M = UΣV*, then the pseudo-inverse of M is

M+ = V Σ+ U*

Here Σ+ is obtained from Σ by taking the reciprocal of each non-zero element on its main diagonal and then transposing. The pseudo-inverse is commonly used to solve linear least-squares problems.
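The recipe above translates directly into code. A minimal sketch assuming NumPy and a real (hence U* = U^T) made-up matrix; the result is compared against the library pseudo-inverse and used to solve a least-squares problem:

```python
import numpy as np

# Hypothetical tall matrix; its pseudo-inverse is built from the SVD.
M = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Sigma^+: reciprocal of each non-zero singular value (with a tolerance
# guard for numerically zero values), then transpose.
tol = 1e-12
s_pinv = np.array([1.0 / x if x > tol else 0.0 for x in s])
M_pinv = Vh.T @ np.diag(s_pinv) @ U.T   # M^+ = V Sigma^+ U^*

assert np.allclose(M_pinv, np.linalg.pinv(M))

# The least-squares solution of M x = b is x = M^+ b.
b = np.array([1.0, 2.0, 3.0])
x = M_pinv @ b
assert np.allclose(x, np.linalg.lstsq(M, b, rcond=None)[0])
```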

Parallel Singular Value Model

This model is used to decompose frequency-selective fading channels.

Range, null space, rank, and matrix approximation

The main application of the singular value decomposition in statistics is principal component analysis (PCA), a data-analysis method for finding hidden "patterns" in large amounts of data, with uses in pattern recognition, data compression, and related areas. PCA maps a data set into a low-dimensional space. The eigenvalues of the data set (represented by the singular values in the SVD) are ordered by importance; dimensionality reduction discards the unimportant eigenvectors, and the space spanned by the remaining eigenvectors is the reduced space.
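The PCA procedure described above can be sketched with an SVD of the centered data matrix. This assumes NumPy and uses made-up random data; the number of retained components `k` is an arbitrary choice for illustration:

```python
import numpy as np

# Hypothetical data set: 100 samples with 5 features.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5))

# Center the data, then take its SVD; the rows of Vh are the
# principal axes, ordered by the singular values.
Xc = X - X.mean(axis=0)
U, s, Vh = np.linalg.svd(Xc, full_matrices=False)

# Singular values arrive in descending order of importance.
assert np.all(s[:-1] >= s[1:])

# Dimensionality reduction: keep only the top-k principal axes.
k = 2
Z = Xc @ Vh[:k].T   # projected data, shape (100, 2)
```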

Computing the SVD

Matlab: [U, S, V] = svd(A). OpenCV: void cvSVD(CvArr* A, CvArr* W, CvArr* U = NULL, CvArr* V = NULL, int flags = 0).
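A comparable call is available in NumPy (shown here as an assumed alternative to the Matlab and OpenCV interfaces above), with a small made-up matrix and a round-trip reconstruction check:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Full SVD: U is 3x3, s holds the singular values, Vh is 2x2 (i.e. V*).
U, s, Vh = np.linalg.svd(A)

# Rebuild the m x n diagonal factor and verify A = U Sigma V*.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)
assert np.allclose(U @ Sigma @ Vh, A)
```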

For more information, see external links.
External links

  • LAPACK Users' Guide gives details of subroutines to calculate the SVD (see also [1]).
  • Applications of SVD on P. C. Hansen's web site.
  • Introduction to the Singular Value Decomposition by Todd Will of the University of Wisconsin–La Crosse.
  • Los Alamos group's book chapter has helpful gene data analysis examples.
  • MIT lecture series by Gilbert Strang; see Lecture #29 on the SVD.
  • Java SVD library routine.
  • Java applet demonstrating the SVD.
  • Java script demonstrating the SVD more extensively; paste your data from a spreadsheet.
  • Chapter from "Numerical Recipes in C" gives more information about implementation and applications of SVD.
  • Online matrix calculator that performs singular value decomposition of matrices.
References

  • Demmel, J. and Kahan, W. (1990). Computing small singular values of bidiagonal matrices with guaranteed high relative accuracy. SIAM J. Sci. Statist. Comput., 11(5), 873-912.
  • Golub, G. H. and Van Loan, C. F. (1996). Matrix Computations. 3rd ed., Johns Hopkins University Press, Baltimore. ISBN 0-8018-5414-8.
  • Halldor, Bjornsson and Venegas, Silvia (1997). A manual for EOF and SVD analyses of climate data. McGill University, CCGCR Report No. 97-1, Montréal, Québec, 52 pp.
  • Hansen, P. C. (1987). The truncated SVD as a method for regularization. BIT, 27, 534-553.
  • Horn, Roger A. and Johnson, Charles R. (1985). Matrix Analysis. Section 7.3. Cambridge University Press. ISBN 0-521-38632-2.
  • Horn, Roger A. and Johnson, Charles R. (1991). Topics in Matrix Analysis. Chapter 3. Cambridge University Press. ISBN 0-521-46713-6.
  • Strang, G. (1998). Introduction to Linear Algebra. Section 6.7. 3rd ed., Wellesley-Cambridge Press. ISBN 0-9614088-5-5.