Eigenvalues and eigenvectors of matrices in linear algebra

Mathematically, an eigenvector of a linear transformation is a non-zero vector whose direction is unchanged under that transformation. The factor by which the vector is scaled is called its eigenvalue. A linear transformation can usually be fully described by its eigenvalues and eigenvectors. An eigenspace is the collection of eigenvectors that share the same eigenvalue. The prefix "eigen" comes from German. Hilbert first used the word in this sense in 1904, and Helmholtz had earlier used it in a related sense. The term can be translated as "own", "specific to ...", "characteristic", or "individual", which shows how important eigenvalues are for defining a particular linear transformation.

An eigenvector of a linear transformation is a non-zero vector whose direction is unchanged by the transformation; it is simply multiplied by a scaling factor. The eigenvalue of an eigenvector is the scaling factor by which it is multiplied. An eigenspace is the space consisting of all eigenvectors with the same eigenvalue, together with the zero vector, though note that the zero vector itself is not an eigenvector. The principal eigenvector of a linear transformation is the eigenvector corresponding to the largest eigenvalue. The geometric multiplicity of an eigenvalue is the dimension of the corresponding eigenspace. The spectrum of a linear transformation on a finite-dimensional vector space is the set of all its eigenvalues. For example, an eigenvector of a rotation in three-dimensional space is a vector along the axis of rotation; the corresponding eigenvalue is 1, and the corresponding eigenspace contains all vectors parallel to the axis. This eigenspace is one-dimensional, so the geometric multiplicity of the eigenvalue 1 is 1. The eigenvalue 1 is the only real eigenvalue in the spectrum of the rotation.
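As a quick numerical check of the rotation example, here is a minimal numpy sketch; the rotation angle is chosen arbitrarily and the axis is taken to be the z-axis for illustration:

```python
import numpy as np

theta = np.pi / 3

# A hypothetical rotation about the z-axis in three-dimensional space.
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

eigenvalues, eigenvectors = np.linalg.eig(R)

# Exactly one eigenvalue is real, namely 1; its eigenvector
# lies along the rotation axis (0, 0, 1).
is_real = np.isclose(eigenvalues.imag, 0.0)
print(eigenvalues[is_real])            # [1.+0.j]
print(eigenvectors[:, is_real].real)   # the axis direction
```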

REF:

http://baike.baidu.com/link?url=Y2BO3icowW8sfhBYOT_C9ujaDu6r0mJWmbD8lNB4_r_ID7pMBq5PO1BfB8SJEzwtjX2TqGvT5SSRIRYM7_ZF7a

An intuitive description

Let's start with the intuition. The eigenvalue equation of a matrix is:

A * x = lambda * x

What does this equation tell you? As mentioned last time, a matrix can be regarded as a transformation: the left-hand side transforms the vector x to another position, while the right-hand side stretches the vector x, by the amount lambda. The meaning is then obvious: one way to characterize matrix A is that it can lengthen (or shorten) the vector x by a factor of lambda, and that's all.
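A minimal numpy sketch of this equation; the matrix here is an arbitrary example chosen only for illustration:

```python
import numpy as np

# A hypothetical 2x2 matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# (one eigenvector per column).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A * x = lambda * x for the first eigenpair.
x = eigenvectors[:, 0]
lam = eigenvalues[0]
print(np.allclose(A @ x, lam * x))  # True
```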

For a given matrix A, not every vector x can be elongated (shortened) in this way. Any vector that A merely elongates (or shortens) is called an eigenvector of A, and the amount of elongation (shortening) is the eigenvalue corresponding to that eigenvector.

Note that an eigenvector really stands for a whole class of vectors: any scalar multiple of an eigenvector clearly satisfies the equation above, so all such vectors can be regarded as the same eigenvector, and they all correspond to the same eigenvalue.

If an eigenvalue is negative, the matrix not only lengthens (or shortens) the vector, it also flips the vector to point in the opposite direction.

A matrix can lengthen (shorten) several vectors, so it may have several eigenvalues. Interestingly, if A is a real symmetric matrix, the eigenvectors corresponding to different eigenvalues are guaranteed to be mutually orthogonal: if A * x1 = lambda1 * x1 and A * x2 = lambda2 * x2 with A = A^T, then lambda1 * (x1 . x2) = (A * x1) . x2 = x1 . (A * x2) = lambda2 * (x1 . x2), so (lambda1 - lambda2) * (x1 . x2) = 0, and since lambda1 != lambda2 we must have x1 . x2 = 0.
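The orthogonality claim is easy to check numerically; a sketch with an arbitrarily chosen real symmetric matrix:

```python
import numpy as np

# A hypothetical real symmetric matrix.
S = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized for symmetric (Hermitian) matrices and
# returns an orthonormal set of eigenvectors as columns.
eigenvalues, Q = np.linalg.eigh(S)

# Q^T Q is the identity, i.e. the eigenvectors are mutually orthogonal.
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```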

We can also say that the eigenvectors of a transformation matrix (when the matrix is diagonalizable) form a set of basis vectors for the space it acts on. A basis can be understood as the axes of a coordinate system. We usually work in the Cartesian coordinate system, but in linear algebra it can be distorted, stretched, and rotated; this is called a change of basis. We may choose the basis to suit our needs, but the basis vectors must be linearly independent: no axis of the coordinate system may point in the same direction as another or be a combination of other axes, otherwise the basis cannot "prop up" the original space. In PCA (Principal Component Analysis), by placing basis vectors along the directions of greatest stretch and ignoring the small components, we can greatly compress the data while keeping the distortion small.

It is valuable to use the eigenvectors of a transformation matrix as the basis of the space, because in those directions the matrix only scales vectors, without distorting or rotating them, which makes computation much simpler. So eigenvalues are important, and our ultimate goal is the eigenvectors.
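As a sketch of the PCA idea just described; the data are synthetic and the stretch factors are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data, stretched far more along one axis than the other.
data = rng.normal(size=(200, 2)) * np.array([3.0, 0.3])

# PCA via the eigen-decomposition of the covariance matrix.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Keep only the basis vector in the direction of greatest stretch
# (largest eigenvalue) and ignore the small component.
principal = eigenvectors[:, np.argmax(eigenvalues)]
compressed = centered @ principal   # 200 numbers instead of 400
print(compressed.shape)             # (200,)
```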

A few important abstract concepts

Let us go back to the meaning of a matrix and introduce several abstract concepts:

Kernel: the set of vectors that the transformation matrix sends to the zero vector, usually written Ker(A). Suppose you are a vector and a matrix is about to transform you; if you unluckily fall into the kernel of that matrix, then regrettably, after the transformation you become the void of the zero vector. Strictly speaking, "kernel" is a concept belonging to a transformation (transform); the analogous concept for a matrix is the "null space". Some materials write T for the transformation; here, since the matrix A is identified with the transformation it performs, we treat the matrix itself as the "transformation". The space in which the kernel lives is called V, the space where all the original vectors live.

Range: the set of vectors obtained by applying the transformation matrix to every vector of the space, usually written R(A). Suppose you are a vector and a matrix transforms you; the range of that matrix contains every position you could end up in, and you cannot land outside it. The dimension of the range is also called the rank. The space in which the range lives is called W. The parts of W that are not part of the range will be discussed later.

Space: vectors, together with the operations of addition and scalar multiplication, make up a space. Vectors can move (and can only move) within their space. A coordinate system (a basis) is used to describe the vectors of a space.

Both the kernel and the range are closed: if you and your friends are trapped inside the kernel, you will never escape it, whether by addition or by scalar multiplication. The kernel therefore forms a subspace, and the same holds for the range.

Mathematicians have proved that the dimension of V must equal the dimension of the kernel of any transformation matrix on V plus the dimension of its range:

Dim (V) = Dim (Ker (A)) + Dim (R (A))

A rigorous proof can be found in textbooks; here is an intuitive argument:

The dimension of V is the number of basis vectors of V. Split this basis into two parts: one part lies in the kernel, and the other consists of preimages of the non-zero vectors of the range (such a split is always possible, because the kernel is a subspace of V). Write any vector of V in terms of this basis: one part of it lies in the kernel, and the other part lies among the preimages of the non-zero vectors of the range. Now transform this vector: the kernel part of course becomes zero, and the dimension of the other part is exactly the dimension of the range.
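The dimension theorem can also be verified numerically; a sketch using an arbitrary rank-deficient matrix (scipy's null_space gives an orthonormal basis of the kernel):

```python
import numpy as np
from scipy.linalg import null_space

# A hypothetical 3x4 matrix acting on a 4-dimensional space V
# (the second row is twice the first, so A is rank-deficient).
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [1.0, 0.0, 1.0, 0.0]])

rank = np.linalg.matrix_rank(A)   # Dim(R(A))
kernel = null_space(A)            # basis of Ker(A), one vector per column

# Dim(Ker(A)) + Dim(R(A)) == Dim(V)
print(kernel.shape[1] + rank == A.shape[1])  # True
```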

The relationship between the row space and the null space of a transformation matrix

In addition, by the nature of matrix multiplication, the number of columns of the transformation matrix equals the dimension of V, and the rank of the transformation matrix equals the dimension of the range R, so the theorem can also be written as:

the number of columns of A = Dim (null space of A) + the rank of A

Because the rank of A is also the dimension of the row space of A (note that for a matrix that is not of full rank, this number is strictly smaller than the number of columns):

the number of columns of A = Dim (null space of A) + Dim (row space of A)

Why write it in this form? Because from here we can see that the null space of A and the row space of A are orthogonal complements. They are orthogonal because the null space is the kernel: for any vector x in it, every row of A multiplied by x gives zero by definition. They are complements because their dimensions add up to span the whole space V.

This orthogonal complementarity leads to very nice properties: a basis of the null space of A and a basis of the row space of A can be combined into a basis of V.
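A minimal sketch of this complementarity, with an arbitrarily chosen rank-deficient matrix:

```python
import numpy as np
from scipy.linalg import null_space, orth

# A hypothetical matrix whose second row is twice the first.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])

row_basis = orth(A.T)       # orthonormal basis of the row space
null_basis = null_space(A)  # orthonormal basis of the null space

# Orthogonality: every row-space vector is perpendicular to
# every null-space vector.
print(np.allclose(row_basis.T @ null_basis, 0.0))  # True

# Complementarity: together the two bases span all of V.
combined = np.hstack([row_basis, null_basis])
print(np.linalg.matrix_rank(combined) == A.shape[1])  # True
```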

The relationship between the column space and the left null space of a transformation matrix

Transposing the equation above gives:

the number of rows of A = Dim (null space of A^T) + Dim (column space of A)

Since the practical effect of A^T is to swap the range and the domain, the null space of A^T is the space of all those vectors of W that are mapped to the zero vector of V (a bit of a mouthful!). Some people call it the "left null space" of A, because a vector y belongs to it exactly when y^T * A = 0. Thus:

the number of rows of A = Dim (left null space of A) + Dim (column space of A)

The left null space of A is likewise orthogonal to the column space of A, and together they span exactly the space W. Their bases combine to form a basis of W.
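The same check, applied on the W side via the transpose (same hypothetical matrix as the previous sketch):

```python
import numpy as np
from scipy.linalg import null_space, orth

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])

col_basis = orth(A)            # orthonormal basis of the column space
left_null = null_space(A.T)    # orthonormal basis of the left null space

print(np.allclose(col_basis.T @ left_null, 0.0))      # orthogonal
combined = np.hstack([col_basis, left_null])
print(np.linalg.matrix_rank(combined) == A.shape[0])  # spans all of W
```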

The relationship between the row space and the column space of a transformation matrix

Do not forget that the transformation matrix actually carries vectors from the row space into the column space.

The row space, column space, null space, and left null space of a matrix make up all the subspaces studied in linear algebra; being clear about the relationships among them is very important for understanding changes of basis.

The secret of the eigenvalue equation

We would like to construct a transformation matrix A that maps vectors into a range space whose basis vectors are orthogonal; moreover, every basis vector v of the range should have the form A * u = lambda * v, where u is a known basis vector of the original space. This would let us move complicated vector problems into an exceptionally simple space.

If the number of u's is not equal to the number of v's, then replacing A with A^T * A gives a symmetric, positive semidefinite matrix, and its eigenvectors are exactly the required basis v!
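A sketch of the A^T * A construction with an arbitrary random rectangular matrix; the resulting eigenvectors are, in numpy terms, the right singular vectors of A:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 3))   # a hypothetical rectangular matrix

# A^T A is symmetric and positive semidefinite ...
M = A.T @ A
print(np.allclose(M, M.T))              # symmetric
eigenvalues, V = np.linalg.eigh(M)
print(np.all(eigenvalues >= -1e-12))    # all eigenvalues >= 0

# ... and its eigenvectors form an orthonormal basis
# (the right singular vectors of A, up to order and sign).
print(np.allclose(V.T @ V, np.eye(3)))  # True
```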

To repeat: a matrix is not the same thing as a transformation; regarding a matrix as a transformation merely provides one way to understand the transformation matrix. Put differently, a matrix is just one representation of a transformation.

REF:

http://blog.csdn.net/wangxiaojun911/article/details/6737933

Matrix multiplication corresponds to a transformation: it turns an arbitrary vector into a new vector, usually of different direction and length. In this transformation, the original vector mainly undergoes rotation and scaling. If a matrix applies only a scaling transformation to some vector or vectors, producing no rotation effect on them, then those vectors are called the eigenvectors of the matrix, and the scale of the scaling is the eigenvalue.

In fact, the paragraph above describes both the geometric meaning of the eigenvalues and eigenvectors of a matrix transformation (a transformation of figures) and their physical meaning. The physical meaning is a picture of motion: an eigenvector is scaled under the action of the matrix, and the amplitude of the scaling is determined by the eigenvalue. If an eigenvalue is greater than 1, every eigenvector belonging to it is stretched; if the eigenvalue is greater than 0 but less than 1, the eigenvector shrinks; if the eigenvalue is less than 0, the eigenvector is pulled back through the origin to point toward the opposite side.

Note: textbooks often say that an eigenvector is a vector whose direction does not change under the matrix transformation. In fact, when the eigenvalue is less than zero, the matrix reverses the eigenvector completely, yet it is of course still an eigenvector. I agree with the statement that the eigenvector does not change direction in this sense: the eigenvector never changes its direction; only the eigenvalue records the reversal (a negative eigenvalue means the direction is inverted). This is a bit like saying that in winter the outdoor "temperature" in Shenzhen is 10℃ while in Harbin it is -30℃ (we still call it temperature, not "coldness"), or like saying that a drone cruises at an "altitude" of 100 meters while a nuclear submarine cruises at an "altitude" of -50 meters (we still call it altitude, not depth).
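A small sketch of a negative eigenvalue reversing an eigenvector; the matrix is an arbitrary reflect-and-stretch example:

```python
import numpy as np

# A hypothetical matrix with one negative eigenvalue: it doubles
# and flips the x-axis while halving the y-axis.
A = np.array([[-2.0, 0.0],
              [ 0.0, 0.5]])

eigenvalues, eigenvectors = np.linalg.eig(A)
x = eigenvectors[:, 0]       # eigenvector for lambda = -2

# The image is collinear with x but points the opposite way.
print(A @ x)                 # [-2.  0.]: flipped and doubled
print(eigenvalues[0] * x)    # the same vector
```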

For eigenvalues and eigenvectors, please note two highlights here. One is their meaning as linear invariants; the other is their spectral meaning in vibration.

Eigenvectors are linear invariants

One highlight of the concept of an eigenvector is invariance, which is called a linear invariant. We constantly speak of linear transformations, and a linear transformation turns one line (vector) into another line (vector); what usually changes is the direction and the length. A vector called an "eigenvector" is special: under the action of the matrix, its direction is invariant and only its length changes. The invariant direction is the linear invariant.

If the reader insists that negating a vector changes its direction, then look at the linear invariant this way: the invariance of an eigenvector is that it becomes a vector collinear with itself, and the line it lies on remains constant under the linear transformation. The eigenvector and its transformed image lie on the same line; the transformed vector is either stretched or shortened, or reversed and stretched, or reversed and shortened, or even reduced to the zero vector (when the eigenvalue is 0).

REF:

http://blog.163.com/[email protected]/blog/static/1624014002011711114526759/

An eigenvector of a transformation is a vector that, after this particular transformation, still points in the same direction and merely has its length stretched.

A matrix is a linear transformation, and its eigenvectors are the invariant vectors of this transformation.

REF:

http://www.cnblogs.com/isabelincoln/archive/2009/06/18/1504623.html

Wikipedia's article on eigenvectors

REF:

https://zh.wikipedia.org/wiki/%E7%89%B9%E5%BE%81%E5%90%91%E9%87%8F
