0x00 Introduction
A matrix, as the name implies, is a rectangular array composed of numbers.
Like this:
$\begin{bmatrix}2&3&4\\0&7&13\\c&\alpha&\sqrt5\end{bmatrix}$
is a 3*3 matrix.
Matrices are used extensively in informatics and even in mathematics. This post introduces some basic knowledge about them and the places where they are commonly used, and it also touches on fast matrix multiplication.
0x01 What is the definition of a matrix
In fact, it is just what was described above.
An n-row, m-column matrix is defined as a rectangular array made up of n*m numbers, where the "numbers" can be imaginary, real, 0/1, and so on.
The definition of a matrix actually originates from linear mappings in linear algebra, but that definition is too complex, so in most settings (including high school mathematics) a matrix is simply defined as a rectangular array of numbers.
Basic operations of matrices
The basic operations on matrices include addition, scalar multiplication, and matrix multiplication.
Matrix addition
Matrix addition is defined between two matrices with the same number of rows and the same number of columns.
When two n*m matrices are added, the result is an n*m matrix in which the element at each position equals the sum of the elements of the two original matrices at the corresponding position.
Example:
$\begin{bmatrix}2&4\\5&3\\8&1\end{bmatrix}+\begin{bmatrix}5&3\\6&7\\9&2\end{bmatrix}=\begin{bmatrix}7&7\\11&10\\17&3\end{bmatrix}$
Matrix addition satisfies the commutative law and the associative law.
Matrix subtraction is analogous and will not be repeated here.
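Although the template later in this post omits addition, a minimal sketch of it might look like the following (the function name mat_add, the fixed bound SZ, and the 1-indexed array layout are my own illustration; the full template below uses the same conventions):

typedef long long ll;
const int SZ = 105;  // hypothetical maximum dimension

// c = a + b for two n*m matrices stored 1-indexed
void mat_add(ll a[SZ][SZ], ll b[SZ][SZ], ll c[SZ][SZ], int n, int m) {
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= m; j++)
            c[i][j] = a[i][j] + b[i][j];  // each entry is the sum of the corresponding entries
}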
Scalar multiplication of matrices
Scalar multiplication is defined between a number and a matrix.
When a number is multiplied by an n*m matrix, the result is an n*m matrix in which each element equals the corresponding element of the original matrix multiplied by that number.
Example:
$2\ast\begin{bmatrix}2&4\\5&3\\8&1\end{bmatrix}=\begin{bmatrix}4&8\\10&6\\16&2\end{bmatrix}$
Scalar multiplication satisfies the commutative law (swapping the scalars), the associative law (when only one matrix is involved), and the distributive law (a scalar distributed over the sum of two matrices, or the sum of two scalars times one matrix).
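Scalar multiplication is just as simple; a minimal sketch reusing the ll and SZ conventions above (mat_scale is a name of my own choosing):

// b = k * a for an n*m matrix a stored 1-indexed
void mat_scale(ll k, ll a[SZ][SZ], ll b[SZ][SZ], int n, int m) {
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= m; j++)
            b[i][j] = k * a[i][j];  // every entry is multiplied by the scalar k
}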
Matrix multiplication
Matrix multiplication is a big topic that deserves its own discussion.
Matrix multiplication was originally defined as the "product" (composition) of two linear mappings between linear spaces in linear algebra; since such a linear mapping can be represented by a matrix, the composition of two mappings (a maps to b, and b maps to c) can be represented by a product of matrices.
Of course, this blog will not drag in all the linear algebra; this post only gives the method for multiplying matrices, because it is very useful in OI.
Two matrices A and B can be multiplied (A*B) if and only if the number of columns of A equals the number of rows of B (which is also why matrix multiplication does not satisfy the commutative law).
Let A be an n*m matrix and B an m*k matrix, and set A*B=C.
Then:
C is an n*k matrix, and the value of each element of C is computed as follows:
$C\left[i\right]\left[j\right]=\sum_{p=1}^{m}A\left[i\right]\left[p\right]\ast B\left[p\right]\left[j\right]$
That is, element (i,j) of the new matrix equals the sum of the products of the elements in row i of A with the corresponding elements in column j of B, which is why matrix multiplication requires the number of columns of A to equal the number of rows of B.
Example:
$\begin{bmatrix}10&1\end{bmatrix}\ast\begin{bmatrix}a&0\\c&1\end{bmatrix}=\begin{bmatrix}10a+c&1\end{bmatrix}$
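To see where each entry comes from, expand the definition for this example:
$C\left[1\right]\left[1\right]=10\ast a+1\ast c=10a+c,\qquad C\left[1\right]\left[2\right]=10\ast0+1\ast1=1$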
Matrix multiplication does not satisfy the commutative law, but it does satisfy the associative law (for chained products such as A*B*C) and the distributive law (as long as the operations are valid).
The main application of matrix multiplication in OI
In OI, matrix multiplication is mainly used to solve linear recurrence problems.
A linear recurrence is a recurrence of the following kind:
$h_n=\sum_{i=1}^{n-1}a_i h_i +b$
where $h$ is the sequence being defined, $b$ is a constant term, $a_i$ are coefficients (which may be 0), and every $h_i$ on the right-hand side appears to the first power.
Then we can use matrix multiplication to simulate the transition.
For example, take the Fibonacci sequence $f_{i+2}=f_i+f_{i+1}$.
Its recurrence can be expressed as the product of the following two matrices:
State Matrix $\begin{bmatrix}f_i&f_{i+1}\end{bmatrix}$
Transfer matrix $\begin{bmatrix}0&1\\1&1\end{bmatrix}$
Obviously the matrix product of the above two matrices is $\begin{bmatrix}f_{i+1}&f_{i+2}\end{bmatrix}$
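Indeed, multiplying them out according to the definition confirms this:
$\begin{bmatrix}f_i&f_{i+1}\end{bmatrix}\ast\begin{bmatrix}0&1\\1&1\end{bmatrix}=\begin{bmatrix}f_i\ast0+f_{i+1}\ast1&f_i\ast1+f_{i+1}\ast1\end{bmatrix}=\begin{bmatrix}f_{i+1}&f_{i+2}\end{bmatrix}$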
However, there is a problem here: matrix multiplication costs $O\left(nmk\right)$, which is obviously slower than computing the recurrence directly, so why do it this way?
The answer is the matrix fast power
The template for the basic matrix operations is given here (the scalar multiplication and addition parts are not included, because they are not commonly used or tested in OI, and they are presumably very simple anyway):
#include <cstring>

typedef long long ll;
const int SZ = 105;         // assumed maximum dimension (the size constant in the original snippet was garbled)
const ll mod = 1000000007;  // assumed modulus; set it to whatever the problem requires

struct ma {
    ll a[SZ][SZ], n, m;
    ma() { memset(a, 0, sizeof(a)); n = m = 0; }
    void clear() { memset(a, 0, sizeof(a)); n = m = 0; }
    // matrix product: (n*m) * (m*k) -> (n*k), taken modulo mod
    ma operator*(const ma &b) const {
        ma re;
        re.n = n; re.m = b.m;
        for (ll i = 1; i <= n; i++)
            for (ll j = 1; j <= b.m; j++)
                for (ll k = 1; k <= m; k++) {
                    re.a[i][j] += a[i][k] * b.a[k][j];
                    re.a[i][j] %= mod;
                }
        return re;
    }
    void operator=(const ma &b) {
        n = b.n; m = b.m;
        for (ll i = 1; i <= n; i++)
            for (ll j = 1; j <= m; j++)
                a[i][j] = b.a[i][j];
    }
};
0x02 Matrix Fast Power
Everyone must be familiar with the fast power algorithm: it computes the n-th power of a number in $O\left(\log_2 n\right)$ time.
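For reference, here is a minimal sketch of ordinary (scalar) fast power, reusing the ll and mod definitions from the template above:

// computes x^t mod `mod` with O(log t) multiplications
ll qpow(ll x, ll t) {
    ll re = 1;
    while (t) {
        if (t & 1) re = re * x % mod;  // this bit is set, so fold this power of x into the answer
        x = x * x % mod;               // square the base
        t >>= 1;
    }
    return re;
}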
Matrix fast power works in exactly the same way.
Consider the Fibonacci recurrence again: if we set the initial matrix $\begin{bmatrix}f_1&f_2\end{bmatrix}$ as $A$ and the transfer matrix as $B$, then the $f_n$ we want is obtained as follows:
Let the matrix $C=A\ast B^{n-1}$; then the element in the first row and first column of $C$ is exactly $f_n$.
And because matrix multiplication satisfies the associative law, the idea of fast power still applies here, so we can compute $B^{n-1}$ very quickly.
What advantages does this algorithm have?
Of course it does: for example, suppose I ask you for the $10^{15}$-th term of the Fibonacci sequence.
In that case, matrix fast power obtains the answer in roughly $\log 10^{15}$ matrix multiplications, and since the multiplications in this problem involve only 1*2 and 2*2 matrices, each one is very cheap. (In general, with a k*k transfer matrix the total cost is $O\left(k^3\log n\right)$.)
The template for matrix fast power is as follows:
// ppow computes x * y^t (matrix fast power), using the ma struct above
ma ppow(ma x, ma y, ll t) {
    while (t) {
        if (t & 1) x = x * y;  // this bit of t is set, so multiply this power of y into the answer
        y = y * y;             // square the base matrix
        t >>= 1;
    }
    return x;
}
Doesn't it look a lot like ordinary fast power? In fact, they are essentially the same.
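As a usage sketch, this is how the Fibonacci example could be wired up with the ma struct and ppow above (the function fib and the n <= 2 shortcut are my own additions, assuming f_1 = f_2 = 1):

// returns f_n mod `mod`
ll fib(ll n) {
    if (n <= 2) return 1 % mod;
    ma A, B;
    A.n = 1; A.m = 2;
    A.a[1][1] = 1; A.a[1][2] = 1;  // state matrix [f_1, f_2]
    B.n = 2; B.m = 2;
    B.a[1][1] = 0; B.a[1][2] = 1;  // transfer matrix
    B.a[2][1] = 1; B.a[2][2] = 1;
    ma C = ppow(A, B, n - 1);      // C = A * B^(n-1) = [f_n, f_(n+1)]
    return C.a[1][1];
}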
When can matrix fast power be applied?
In the first part we said that matrix multiplication can be used to solve linear recurrence problems
So whenever we encounter a linear recurrence problem, we can readily apply matrix fast power.
The Fibonacci sequence above is one example; here are a few more:
luoguP2044
luoguP1707
luoguP1357
As you can see, matrix fast power can handle single recurrences, systems of several recurrences, systems of recurrences with additional constant terms, and can even be combined with DP as a DP optimization.
In general, we construct a state matrix with 1 row and k columns, and a transfer matrix with k rows and k columns.
Since, under this setup, each cell of the new state matrix is computed independently of the others, designing the transfer matrix is quite convenient.
With matrix fast power, once we determine the state matrix and the transfer matrix, the problem is generally solved.
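For instance, for a recurrence that depends on a fixed number of earlier terms plus a constant, say $h_n=a_1h_{n-1}+a_2h_{n-2}+b$, one possible state/transfer pair is the following (a sketch of the second-order case; higher orders extend the same pattern):
$\begin{bmatrix}h_{n-1}&h_{n-2}&b\end{bmatrix}\ast\begin{bmatrix}a_1&1&0\\a_2&0&0\\1&0&1\end{bmatrix}=\begin{bmatrix}h_n&h_{n-1}&b\end{bmatrix}$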
0x03 Fast Matrix multiplication
A pit to be filled later... within a week.