@, if a system of linear equations has no solution, the system is said to be inconsistent (incompatible).
@, if a system of linear equations has at least one solution, the system is said to be consistent (compatible).
@, equivalent systems:
@, definition: two systems of equations involving the same variables are said to be equivalent if they have the same solution set.
@, the following operations produce an equivalent system:
1. Interchange the order of any two equations.
2. Multiply both sides of any equation by a nonzero real number.
3. Add a multiple of one equation to another equation.
@, definition: a system is said to be in strict triangular form if, in the k-th equation, the coefficients of the first k-1 variables are all zero and the coefficient of x_k is nonzero (k = 1, ..., n).
@, a system in strict triangular form is solved by back substitution.
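Back substitution can be sketched in code as follows; the matrix U and vector b are made-up illustration data for a 3*3 strictly triangular system.

```python
# A minimal back-substitution sketch for a strictly triangular system:
# solve the last equation first, then substitute upward.
def back_substitute(U, b):
    """Solve Ux = b where U is upper triangular with nonzero diagonal."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):          # start from the last equation
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]         # solve for the lead variable
    return x

# made-up example: 2x1 + x2 + x3 = 10, 3x2 + x3 = 8, 4x3 = 8
U = [[2.0, 1.0, 1.0],
     [0.0, 3.0, 1.0],
     [0.0, 0.0, 4.0]]
b = [10.0, 8.0, 8.0]
print(back_substitute(U, b))    # [3.0, 2.0, 2.0]
```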
@, the coefficient matrix of a system.
@, the augmented matrix of a system: the coefficient matrix with the right-hand-side column appended.
@, elementary row operations:
1. Interchange two rows.
2. Multiply a row by a nonzero real number.
3. Replace a row by its sum with a multiple of another row.
@, pivotal row: the row used to eliminate entries in the other rows.
@, pivot: the first nonzero entry of the pivotal row.
@, lead variables: the variables corresponding to the first nonzero entry in each row of the reduced augmented matrix.
@, free variables: the variables corresponding to the columns skipped in the reduction process (i.e., the variables that are not lead variables).
@, definition: a matrix is said to be in row echelon form if
1. the first nonzero entry in each nonzero row is 1;
2. if row k is not entirely zero, the number of leading zero entries in row k+1 is greater than the number of leading zero entries in row k;
3. rows consisting entirely of zeros appear below all rows with nonzero entries.
@, Gaussian elimination: definition: the process of using row operations 1, 2, and 3 to reduce the augmented matrix of a linear system to row echelon form is called Gaussian elimination.
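The three row operations above can be sketched in code. This is a simplified illustration on made-up data: it assumes a nonzero pivot can always be found in the current column and omits partial pivoting.

```python
# A sketch of Gaussian elimination to row echelon form using the three
# elementary row operations: swap, scale, and add a multiple of a row.
def row_echelon(M):
    M = [row[:] for row in M]               # work on a copy
    rows, cols = len(M), len(M[0])
    for k in range(min(rows, cols - 1)):
        # operation 1: swap in a row with a nonzero entry in column k
        for r in range(k, rows):
            if M[r][k] != 0:
                M[k], M[r] = M[r], M[k]
                break
        else:
            continue                         # no pivot in this column
        # operation 2: scale the pivotal row so the lead entry is 1
        pivot = M[k][k]
        M[k] = [v / pivot for v in M[k]]
        # operation 3: subtract multiples of the pivotal row from rows below
        for r in range(k + 1, rows):
            factor = M[r][k]
            M[r] = [a - factor * p for a, p in zip(M[r], M[k])]
    return M

# augmented matrix of the made-up system: x + 2y = 5, 3x + 4y = 11
print(row_echelon([[1.0, 2.0, 5.0], [3.0, 4.0, 11.0]]))
```

Back substitution on the resulting echelon form gives y = 2, x = 1.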
@, overdetermined systems: a linear system is said to be overdetermined if it has more equations than unknowns. Overdetermined systems are usually (but not always) inconsistent.
@, underdetermined systems: a linear system of m equations in n unknowns is said to be underdetermined if m < n. An underdetermined system may be inconsistent, but it is usually consistent, and a consistent underdetermined system has infinitely many solutions: the row echelon form has r (r <= m) lead variables, so there are n - r (n - r >= n - m > 0) free variables, and each choice of values for the free variables gives a different solution.
@, definition: a matrix is said to be in reduced row echelon form if
1. the matrix is in row echelon form;
2. the first nonzero entry in each row is the only nonzero entry in its column.
@, Gauss-Jordan reduction: the process of using elementary row operations to transform a matrix into reduced row echelon form.
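Gauss-Jordan reduction differs from plain Gaussian elimination in that each pivot column is cleared above the pivot as well as below it, and free-variable columns are skipped. A minimal sketch on made-up data, assuming exact arithmetic:

```python
# A sketch of Gauss-Jordan reduction to reduced row echelon form:
# each pivot is scaled to 1 and then cleared both below AND above it.
def rref(M):
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row == rows:
            break
        # find a row at or below pivot_row with a nonzero entry in this column
        for r in range(pivot_row, rows):
            if M[r][col] != 0:
                M[pivot_row], M[r] = M[r], M[pivot_row]
                break
        else:
            continue                          # free-variable column: skip it
        p = M[pivot_row][col]
        M[pivot_row] = [v / p for v in M[pivot_row]]
        for r in range(rows):                 # clear the column everywhere else
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

# augmented matrix of the made-up system: x + 2y + 3z = 6, 2x + 4y + 7z = 13
print(rref([[1.0, 2.0, 3.0, 6.0], [2.0, 4.0, 7.0, 13.0]]))
```

In the result, x and z are lead variables and y is free: z = 1 and x = 3 - 2y.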
@, homogeneous systems: a linear system is said to be homogeneous if the right-hand sides of all its equations are zero. Homogeneous systems are always consistent, since they have at least the trivial solution.
@, trivial solution: (0, 0, ..., 0), i.e., all variables equal to zero.
@, nontrivial solution: any solution other than (0, 0, ..., 0).
@, theorem 1.2.1: an m*n homogeneous linear system has a nontrivial solution if n > m.
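Theorem 1.2.1 can be illustrated on a made-up 2*3 homogeneous system: with n = 3 unknowns and m = 2 equations there is a free variable, so setting it to a nonzero value yields a nontrivial solution.

```python
# Made-up homogeneous system: x1 + x2 + x3 = 0, x2 + 2*x3 = 0.
# Choosing the free variable x3 = 1 and back-solving gives x2 = -2, x1 = 1.
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])
x = np.array([1.0, -2.0, 1.0])    # a nontrivial solution
print(A @ x)                       # [0. 0.]
```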
@, the entries of a matrix are called scalars; they are usually real or complex numbers.
@, in general, if A denotes a matrix, then a_ij denotes the entry in row i and column j of A. The shorthand notation is A = (a_ij).
@, vectors: n-tuples of real numbers.
@, row vector: an n-tuple represented as a 1*n matrix.
@, column vector: an n-tuple represented as an n*1 matrix.
@, when working with matrix equations it is convenient to represent solutions as column vectors, so the set of all n*1 matrices of real numbers is called Euclidean n-space, usually denoted R^n.
@, definition: two m*n matrices A and B are said to be equal if a_ij = b_ij for every pair (i, j).
@, scalar multiplication: definition: if A is an m*n matrix and α is a scalar, then αA is the m*n matrix whose (i, j) entry is αa_ij.
@, matrix addition: definition: if A = (a_ij) and B = (b_ij) are both m*n matrices, then their sum A + B is the m*n matrix whose (i, j) entry is a_ij + b_ij for each ordered pair (i, j).
@, zero matrix: the matrix whose entries are all zero, denoted O.
@, if A and O are both m*n matrices and O is the zero matrix, then:
1. A + O = O + A = A;
2. A + (-1)A = O = (-1)A + A.
@, definition: if a_1, a_2, ..., a_n are vectors in R^m and c_1, c_2, ..., c_n are scalars, then a sum of the form c_1·a_1 + c_2·a_2 + ... + c_n·a_n is called a linear combination of a_1, a_2, ..., a_n.
@, theorem 1.3.1 (consistency theorem for linear systems): a linear system Ax = b is consistent if and only if b can be written as a linear combination of the column vectors of A.
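The consistency theorem can be checked numerically on made-up data: if x solves Ax = b, then b equals the combination of the columns of A weighted by the entries of x.

```python
# Theorem 1.3.1 in action: the components of a solution x are exactly the
# weights expressing b as a linear combination of the columns of A.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([2.0, 1.0])           # suppose x solves Ax = b
b = A @ x                           # then b = 2*(column 1) + 1*(column 2)
combo = 2.0 * A[:, 0] + 1.0 * A[:, 1]
print(b, combo)                     # both are [ 4. 10.]
```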
@, definition: if A = (a_ij) is an m*n matrix and B = (b_ij) is an n*r matrix, then the product AB = C = (c_ij) is the m*r matrix whose entries are defined by c_ij = Σ_k a_ik·b_kj; equivalently, c_ij is the product of the i-th row vector of A and the j-th column vector of B.
@, matrix multiplication is not commutative; in general AB ≠ BA.
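Both points can be checked on small made-up matrices: the entry formula c_ij = Σ_k a_ik·b_kj reproduces the library product, while swapping the factors changes the result.

```python
# Matrix product by the definition c_ij = sum_k a_ik * b_kj, checked against
# the built-in product, plus a demonstration that AB != BA in general.
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])
C = np.array([[sum(A[i, k] * B[k, j] for k in range(2))
               for j in range(2)] for i in range(2)])
print(np.array_equal(C, A @ B))      # True: matches the built-in product
print(np.array_equal(A @ B, B @ A))  # False: multiplication is not commutative
```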
@, definition: the transpose of an m*n matrix A is the n*m matrix B defined by b_ji = a_ij for j = 1, ..., n and i = 1, ..., m. The transpose of A is denoted A^T.
@, definition: an n*n matrix A is said to be symmetric if A^T = A.
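A quick check of both definitions on made-up matrices: transposing swaps the dimensions (b_ji = a_ij), and a symmetric matrix equals its own transpose.

```python
# Transpose by the definition b_ji = a_ij, and a symmetry check A^T == A.
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])               # 3x2, so A^T is 2x3
print(A.T)                            # [[1 3 5], [2 4 6]]

S = np.array([[2, 7],
              [7, 5]])
print(np.array_equal(S.T, S))         # True: S is symmetric
```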
@, the laws of matrix algebra: each of the following holds whenever the indicated sums and products are defined:
1. A + B = B + A;
2. (A + B) + C = A + (B + C);
3. (AB)C = A(BC);
4. A(B + C) = AB + AC;
5. (A + B)C = AC + BC;
6. (αβ)A = α(βA);
7. α(AB) = (αA)B = A(αB);
8. (α + β)A = αA + βA;
9. α(A + B) = αA + αB.
@, identity matrix: I = (δ_ij), where δ_ij = 1 when i = j and δ_ij = 0 when i ≠ j. BI = B and IC = C whenever the products are defined.
@, in terms of its column vectors, I = (e_1, e_2, ..., e_n), where e_j denotes the j-th column of I.
@, definition: an n*n matrix A is said to be nonsingular, or invertible, if there exists a matrix B such that AB = BA = I. The matrix B is called a multiplicative inverse of A.
@, if B and C are both multiplicative inverses of A, then B = BI = B(AC) = (BA)C = IC = C, so a matrix has at most one multiplicative inverse. The inverse of A is denoted A^-1.
@, definition: an n*n matrix with no multiplicative inverse is said to be singular.
@, only square matrices (matrices with as many rows as columns) can have multiplicative inverses; the terms singular and nonsingular should not be used for nonsquare matrices.
@, theorem 1.4.2: if A and B are nonsingular n*n matrices, then AB is also nonsingular, and (AB)^-1 = B^-1·A^-1.
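Theorem 1.4.2 can be verified numerically on made-up nonsingular matrices; note the reversed order of the factors on the right-hand side.

```python
# Checking (AB)^-1 == B^-1 A^-1 on small made-up nonsingular matrices.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))          # True
```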
@, the algebraic laws of the transpose:
1. (A^T)^T = A;
2. (αA)^T = αA^T;
3. (A + B)^T = A^T + B^T;
4. (AB)^T = B^T·A^T.
@, theorem 1.4.3: if A is the n*n adjacency matrix of a graph and a^(k)_ij denotes the (i, j) entry of A^k, then a^(k)_ij equals the number of walks of length k between vertices v_i and v_j.
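Theorem 1.4.3 on a small made-up graph: for the path graph v1 - v2 - v3, squaring the adjacency matrix counts the walks of length 2.

```python
# Entries of A^k count walks of length k in the graph.
import numpy as np

# Adjacency matrix of the path graph v1 - v2 - v3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
A2 = A @ A
print(A2[0, 2])    # 1: one walk of length 2 from v1 to v3 (via v2)
print(A2[1, 1])    # 2: v2 -> v1 -> v2 and v2 -> v3 -> v2
```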
Linear algebra--Linear Algebra with Applications