This section briefly reviews some basic linear algebra.
Matrix and Vector
Matrix
An array of $m \times n$ numbers $a_{ij}\ (i = 1, 2, \ldots, m;\ j = 1, 2, \ldots, n)$ arranged in $m$ rows and $n$ columns is called a matrix of $m$ rows and $n$ columns, or an $m \times n$ matrix for short. It is written as:
$\mathbf{A} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$
Note: $\mathbf{A} = \mathbf{A}_{m \times n} = (a_{ij})_{m \times n}$
$a_{ij}$ is the "$(i, j)$ entry": the entry in the $i$-th row and $j$-th column.
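A minimal sketch of this definition using NumPy (the library choice is an assumption; the article itself names none): a $2 \times 3$ matrix and how its entries are accessed in code.

```python
import numpy as np

# A 2 x 3 matrix: m = 2 rows, n = 3 columns.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape)   # (2, 3), i.e. (m, n)
print(A[0, 1])   # 2 -- the entry written a_{12} in the 1-indexed math notation
```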
Vector
An $n$-dimensional vector is simply an $n \times 1$ matrix:
$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$
There are two indexing conventions for vectors:
$\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} \qquad \mathbf{y} = \begin{bmatrix} y_0 \\ y_1 \\ \vdots \\ y_{n-1} \end{bmatrix}$
Mathematicians tend to prefer the first (1-indexed) form, while programmers tend to prefer the second (0-indexed) form.
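As a small illustration of the two conventions (again assuming NumPy), arrays in code follow the second, 0-indexed form:

```python
import numpy as np

# A 4-dimensional vector.
y = np.array([3.0, 1.0, 4.0, 1.0])

print(y[0])   # first component: y_1 in the math convention, y_0 in the code convention
print(y[3])   # last component:  y_n in the math convention, y_{n-1} in the code convention
```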
Basic Matrix Operations
Addition and Subtraction
$\mathbf{A} \pm \mathbf{B} = \begin{bmatrix} a_{11} \pm b_{11} & a_{12} \pm b_{12} & \cdots & a_{1n} \pm b_{1n} \\ a_{21} \pm b_{21} & a_{22} \pm b_{22} & \cdots & a_{2n} \pm b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} \pm b_{m1} & a_{m2} \pm b_{m2} & \cdots & a_{mn} \pm b_{mn} \end{bmatrix}$
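A quick sketch of elementwise addition and subtraction (NumPy assumed; both matrices must have the same shape):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A + B)   # [[ 6  8]  [10 12]]
print(A - B)   # [[-4 -4]  [-4 -4]]
```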
Multiplying a Matrix by a Scalar
$\lambda \times \mathbf{A} = \mathbf{A} \times \lambda = \begin{bmatrix} \lambda \times a_{11} & \lambda \times a_{12} & \cdots & \lambda \times a_{1n} \\ \lambda \times a_{21} & \lambda \times a_{22} & \cdots & \lambda \times a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \lambda \times a_{m1} & \lambda \times a_{m2} & \cdots & \lambda \times a_{mn} \end{bmatrix}$
Dividing by a scalar is simply multiplying by $\frac{1}{\lambda}$; everything else works the same way.
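A sketch of scalar multiplication and of the "division is multiplication by $\frac{1}{\lambda}$" remark, again assuming NumPy:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
lam = 2.0

print(lam * A)         # every entry scaled by lambda
print(A * (1 / lam))   # identical to A / lam
```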
Matrix vector multiplication
$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \times \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a_{11} \times x_1 + a_{12} \times x_2 + \cdots + a_{1n} \times x_n \\ a_{21} \times x_1 + a_{22} \times x_2 + \cdots + a_{2n} \times x_n \\ \vdots \\ a_{m1} \times x_1 + a_{m2} \times x_2 + \cdots + a_{mn} \times x_n \end{bmatrix}$
Note: $\mathbf{A} \times \mathbf{x} = \mathbf{y}$
In fact, on closer inspection, this is just another way of writing a system of linear equations:
$\begin{cases} a_{11} \times x_1 + a_{12} \times x_2 + \cdots + a_{1n} \times x_n = y_1 \\ a_{21} \times x_1 + a_{22} \times x_2 + \cdots + a_{2n} \times x_n = y_2 \\ \vdots \\ a_{m1} \times x_1 + a_{m2} \times x_2 + \cdots + a_{mn} \times x_n = y_m \end{cases}$
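The sketch below (NumPy assumed) computes $\mathbf{A} \times \mathbf{x} = \mathbf{y}$; each component of $\mathbf{y}$ is the dot product of one row of $\mathbf{A}$ with $\mathbf{x}$, exactly as in the equation system above:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
x = np.array([1, 0, -1])     # 3-dimensional vector

y = A @ x                    # the @ operator performs matrix multiplication
print(y)                     # [-2 -2]
```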
Matrix Multiplication
$\mathbf{A} \times \mathbf{B}$
$\mathbf{B}$ can be viewed as a row of its column vectors: $\begin{bmatrix} \begin{bmatrix} x_{11} \\ x_{21} \\ \vdots \\ x_{n1} \end{bmatrix} & \begin{bmatrix} x_{12} \\ x_{22} \\ \vdots \\ x_{n2} \end{bmatrix} & \cdots & \begin{bmatrix} x_{1k} \\ x_{2k} \\ \vdots \\ x_{nk} \end{bmatrix} \end{bmatrix} = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_k \end{bmatrix}$
$\mathbf{A} \times \mathbf{B} = \begin{bmatrix} \mathbf{A} \times \mathbf{x}_1 & \mathbf{A} \times \mathbf{x}_2 & \cdots & \mathbf{A} \times \mathbf{x}_k \end{bmatrix}$
$\mathbf{A}_{m \times n} \times \mathbf{B}_{n \times k} = \mathbf{C}_{m \times k}$
Pay attention to the subscripts: the two matrices cannot be multiplied unless the inner dimensions match.
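A sketch of the dimension rule and the column view (NumPy assumed): a $2 \times 3$ matrix times a $3 \times 2$ matrix gives a $2 \times 2$ matrix, and each column of the product is $\mathbf{A}$ times the corresponding column of $\mathbf{B}$:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])       # 3 x 2

C = A @ B                    # 2 x 2
print(C)                     # [[ 4  5]  [10 11]]

# Column view: the j-th column of C equals A times the j-th column of B.
print(np.array_equal(C[:, 0], A @ B[:, 0]))   # True
```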
Matrix multiplication rule
$(\mathbf{A} \times \mathbf{B}) \times \mathbf{C} = \mathbf{A} \times (\mathbf{B} \times \mathbf{C})$
$\mathbf{A} \times (\mathbf{B} + \mathbf{C}) = \mathbf{A} \times \mathbf{B} + \mathbf{A} \times \mathbf{C}$
$(\mathbf{B} + \mathbf{C}) \times \mathbf{A} = \mathbf{B} \times \mathbf{A} + \mathbf{C} \times \mathbf{A}$
In general, matrix multiplication is not commutative:
$\mathbf{A} \times \mathbf{B} \neq \mathbf{B} \times \mathbf{A}$
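These rules can be checked numerically on concrete matrices (a sketch, assuming NumPy; one counterexample is enough to show non-commutativity):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[2, 0], [0, 2]])

print(np.array_equal((A @ B) @ C, A @ (B @ C)))     # True: associativity
print(np.array_equal(A @ (B + C), A @ B + A @ C))   # True: distributivity
print(np.array_equal(A @ B, B @ A))                 # False: not commutative here
```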
Special Matrix
The identity matrix $\mathbf{I}$ has 1s on the diagonal and 0s everywhere else:
$\mathbf{I} = \mathbf{I}_{n \times n} = \begin{bmatrix} 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \\ 0 & 0 & \cdots & 0 & 1 \end{bmatrix}$
For any matrix $\mathbf{A}$:
$\mathbf{A} \times \mathbf{I} = \mathbf{I} \times \mathbf{A} = \mathbf{A}$
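A sketch of the identity property, assuming NumPy (`np.eye(n)` builds $\mathbf{I}_{n \times n}$); note that for a non-square $\mathbf{A}$ the two identity matrices have different sizes:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3

print(np.array_equal(np.eye(2) @ A, A))   # True: I (2x2) times A leaves A unchanged
print(np.array_equal(A @ np.eye(3), A))   # True: A times I (3x3) leaves A unchanged
```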
Inverse Matrix and Transposed Matrix
Inverse Matrix
$\mathbf{A} \times \mathbf{A}^{-1} = \mathbf{A}^{-1} \times \mathbf{A} = \mathbf{I}$
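A sketch of the inverse, assuming NumPy. Only square, non-singular matrices have an inverse; `np.linalg.inv` raises an error otherwise, and the product matches $\mathbf{I}$ only up to floating-point error:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

A_inv = np.linalg.inv(A)
print(A @ A_inv)    # approximately the 2 x 2 identity
print(A_inv @ A)    # same
```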
Transposed Matrix
$\mathbf{B} = \mathbf{A}^T$
$b_{ij} = a_{ji}$
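A sketch of the transpose relation $b_{ij} = a_{ji}$, assuming NumPy:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3

B = A.T                      # 3 x 2
print(B)
print(B[2, 0] == A[0, 2])    # True: B[i, j] equals A[j, i]
```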
References
This article mainly references the following materials:
- Andrew Ng, Machine Learning