This article briefly records the notion of "linear dependence".
Linear dependence is defined for vectors in R^n. Consider the vector equation x1v1 + x2v2 + ... + xmvm = 0 (where the xi are scalars and the vi are vectors). If this equation has only the trivial solution, we say the set of m vectors {v1, v2, ..., vm} is linearly independent; otherwise, we say the set {v1, v2, ..., vm} is linearly dependent.
This definition may seem a bit abrupt at first. One way to understand "linear dependence" is this: if the equation has a nontrivial solution, we can pick a term whose coefficient xi is not 0 and move it to the other side of the equation. In that form, the vector vi is a linear combination of the other vectors.
That is, we can understand linear dependence as follows: given m vectors in R^n, if one of them can be expressed as a linear combination of the rest, then the m vectors are linearly dependent.
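As a tiny made-up example (not one from the book): in R^2, take v1 = (1, 2) and v2 = (2, 4). Then

```latex
2v_1 + (-1)v_2 = \mathbf{0},
\qquad\text{i.e.}\qquad
v_2 = 2v_1,
```

so the equation x1v1 + x2v2 = 0 has the nontrivial solution (x1, x2) = (2, -1), and {v1, v2} is linearly dependent, with v2 a linear combination of v1.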
That is in fact a theorem. (It of course has a more rigorous proof; the above is only a rough, intuitive introduction.)
So we now face the question: given m vectors in R^n, how do we determine whether they are linearly dependent?
Our definition of linear dependence already suggests the algorithm: we only need to examine the solutions of the vector equation x1v1 + x2v2 + x3v3 + ... + xmvm = 0. This brings us back to the previous section, where we used row reduction of the augmented matrix to solve matrix equations, vector equations, and linear systems.
An example is given below.
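As a sketch of that check in code (the vectors here are made up for illustration, not taken from the book): stack the vectors as the columns of a matrix A, so that the vector equation becomes the homogeneous system A x = 0, and test whether it has only the trivial solution, for example by comparing the rank of A to the number of vectors:

```python
import numpy as np

# Three made-up vectors in R^3 (not from the book).
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([4.0, 5.0, 6.0])
v3 = np.array([2.0, 1.0, 0.0])

# Stack the vectors as columns of A, so that
# x1*v1 + x2*v2 + x3*v3 = 0 becomes the homogeneous system A x = 0.
A = np.column_stack([v1, v2, v3])

# A x = 0 has only the trivial solution exactly when rank(A)
# equals the number of vectors (the number of columns of A).
rank = np.linalg.matrix_rank(A)
independent = rank == A.shape[1]
print(rank, independent)
```

Here the rank is 2 while there are 3 vectors, so the set is linearly dependent (indeed v3 = -2v1 + v2).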
Similarly, based on this discussion of linear dependence among vectors, we can also discuss the linear dependence of the columns of a matrix.
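A minimal sketch of the column version (again with a made-up matrix): the columns of A are linearly dependent exactly when A x = 0 has a nontrivial solution, and one such x can be read off, for example, from the singular value decomposition, whose right-singular vector for a zero singular value lies in the null space of A:

```python
import numpy as np

# Made-up matrix (not from the book); its columns are the vectors.
A = np.array([[1.0, 4.0, 2.0],
              [2.0, 5.0, 1.0],
              [3.0, 6.0, 0.0]])

# The columns of A are linearly dependent exactly when A x = 0 has a
# nontrivial solution.  The right-singular vector belonging to a zero
# singular value gives such an x.
_, s, vt = np.linalg.svd(A)
x = vt[-1]          # candidate nontrivial solution (a unit vector)
print(s[-1])        # smallest singular value, close to 0 here
print(A @ x)        # close to the zero vector
```

Since x is a unit vector (hence nonzero) and A x = 0, the columns of this A are linearly dependent.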
"Linear Algebra and Its Applications" - linear dependence