Matrix multiplication is one of the most common problems in linear algebra, and it is widely used in numerical computation.
For a long time, the algorithm for matrix multiplication was as straightforward as its definition.
> Let A be an m×n matrix and B an n×p matrix. Then C = AB, the product of A and B, is an m×p matrix,
> i.e. C[i][j] = ∑ A[i][k]·B[k][j] (k from 1 to n).

The specific algorithm is described as:
```cpp
for (int i = 1; i <= m; ++i)
    for (int j = 1; j <= p; ++j) {
        C[i][j] = 0;
        for (int k = 1; k <= n; ++k)
            C[i][j] += A[i][k] * B[k][j];
    }
```
The algorithm is simple and intuitive, and it directly mirrors the definition of matrix multiplication.
It is easy to see from the implementation that its time complexity is O(n^3) (assuming both matrices are n×n).
From linear algebra we also know another way to multiply matrices: block matrices.
Partition A, B, and C each into four equal-sized blocks; the product can then be written block by block as
C11 = A11B11 + A12B21
C12 = A11B12 + A12B22
C21 = A21B11 + A22B21
C22 = A21B12 + A22B22
The above is the block-matrix theory from linear algebra; when the algorithm is actually implemented, the essence of block matrices is partitioning, i.e. divide and conquer.
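To make the partition concrete, here is a minimal sketch of the recursive block multiplication (the recursion is analyzed in detail below). It assumes both matrices are n×n with n a power of two; the Matrix alias and the helper names block, addM and blockMultiply are my own illustration, not from the original post.

```cpp
#include <vector>

using Matrix = std::vector<std::vector<double>>;

// Copy out the (bi, bj) block (each block is n/2 x n/2) of M.
Matrix block(const Matrix& M, int bi, int bj) {
    int h = M.size() / 2;
    Matrix R(h, std::vector<double>(h));
    for (int i = 0; i < h; ++i)
        for (int j = 0; j < h; ++j)
            R[i][j] = M[bi * h + i][bj * h + j];
    return R;
}

// Entry-wise sum of two matrices of the same size.
Matrix addM(const Matrix& X, const Matrix& Y) {
    int n = X.size();
    Matrix R(n, std::vector<double>(n));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            R[i][j] = X[i][j] + Y[i][j];
    return R;
}

// C = A * B by recursive partition: 8 half-size products + 4 half-size sums.
Matrix blockMultiply(const Matrix& A, const Matrix& B) {
    int n = A.size();
    if (n == 1) return {{A[0][0] * B[0][0]}};   // base case: a plain scalar product

    Matrix A11 = block(A, 0, 0), A12 = block(A, 0, 1),
           A21 = block(A, 1, 0), A22 = block(A, 1, 1);
    Matrix B11 = block(B, 0, 0), B12 = block(B, 0, 1),
           B21 = block(B, 1, 0), B22 = block(B, 1, 1);

    // Exactly the four block formulas above.
    Matrix C11 = addM(blockMultiply(A11, B11), blockMultiply(A12, B21));
    Matrix C12 = addM(blockMultiply(A11, B12), blockMultiply(A12, B22));
    Matrix C21 = addM(blockMultiply(A21, B11), blockMultiply(A22, B21));
    Matrix C22 = addM(blockMultiply(A21, B12), blockMultiply(A22, B22));

    // Stitch the four half-size blocks back into one n x n result.
    int h = n / 2;
    Matrix C(n, std::vector<double>(n));
    for (int i = 0; i < h; ++i)
        for (int j = 0; j < h; ++j) {
            C[i][j]         = C11[i][j];
            C[i][j + h]     = C12[i][j];
            C[i + h][j]     = C21[i][j];
            C[i + h][j + h] = C22[i][j];
        }
    return C;
}
```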
This algorithm is of course not as simple as the three nested for loops, so what is its time complexity?
> Algorithm analysis (assuming all matrices are n×n square matrices):
The simplest case of the recursion is when A and B are both 2×2 matrices, so that each block A11, ..., B22 is just a single number; the product of two 2×2 matrices can then be computed directly, requiring 8 multiplications and 4 additions in total. When the order of the submatrices is greater than 2, we can keep partitioning them until their order is 2. This yields a recursive divide-and-conquer algorithm. According to this scheme, computing the product of two n×n matrices requires computing 8 products of n/2 × n/2 matrices plus 4 additions of n/2 × n/2 matrices. Adding two (n/2)×(n/2) matrices obviously takes O(n^2) time. Therefore the running time T(n) of this divide-and-conquer method satisfies T(n) = 8T(n/2) + O(n^2).
The solution of this recurrence is still T(n) = O(n^3).
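For completeness, this is a standard Master-theorem calculation (my own addition, not spelled out in the post):

```latex
% a = 8 subproblems of size n/b with b = 2, combine cost f(n) = O(n^2).
% Since \log_2 8 = 3 > 2, the recursion-tree leaves dominate:
T(n) = 8\,T\!\left(\frac{n}{2}\right) + O(n^2)
     \;\Longrightarrow\;
T(n) = \Theta\!\left(n^{\log_2 8}\right) = \Theta(n^3).
```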
Therefore, this method is no more efficient than computing directly from the definition.
The main reason is that it does not reduce the number of matrix multiplications, and matrix multiplication costs far more time than matrix addition. So the breakthrough for optimization is to reduce the number of multiplications.
Following the divide-and-conquer idea, it is clear that to reduce the number of multiplications, the key is to compute the product of two 2×2 (block) matrices using fewer than 8 multiplications.
Strassen proposed a new algorithm for computing the product of two 2×2 matrices that needs only 7 multiplications. The specific idea is:
M1 = A11(B12 - B22)
M2 = (A11 + A12)B22
M3 = (A21 + A22)B11
M4 = A22(B21 - B11)
M5 = (A11 + A22)(B11 + B22)
M6 = (A12 - A22)(B21 + B22)
M7 = (A11 - A21)(B11 + B12)
After these 7 multiplications, a few more additions and subtractions give:
C11 = M5 + M4 - M2 + M6
C12 = M1 + M2
C21 = M3 + M4
C22 = M5 + M1 - M3 - M7
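As a quick sanity check (my own illustration, not part of the original post), the seven products and the four combination formulas can be verified against the direct definition for a scalar 2×2 example:

```cpp
#include <cassert>

int main() {
    // Arbitrary test values; with A = [[1,2],[3,4]] and B = [[5,6],[7,8]],
    // the direct product is C = [[19,22],[43,50]].
    double A11 = 1, A12 = 2, A21 = 3, A22 = 4;
    double B11 = 5, B12 = 6, B21 = 7, B22 = 8;

    double M1 = A11 * (B12 - B22);
    double M2 = (A11 + A12) * B22;
    double M3 = (A21 + A22) * B11;
    double M4 = A22 * (B21 - B11);
    double M5 = (A11 + A22) * (B11 + B22);
    double M6 = (A12 - A22) * (B21 + B22);
    double M7 = (A11 - A21) * (B11 + B12);

    // Each combination must match the definition C[i][j] = sum_k A[i][k]*B[k][j].
    assert(M5 + M4 - M2 + M6 == A11 * B11 + A12 * B21);  // C11 = 19
    assert(M1 + M2           == A11 * B12 + A12 * B22);  // C12 = 22
    assert(M3 + M4           == A21 * B11 + A22 * B21);  // C21 = 43
    assert(M5 + M1 - M3 - M7 == A21 * B12 + A22 * B22);  // C22 = 50
    return 0;
}
```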
Intuitively, the algorithm has become more complicated: one multiplication is saved, but a number of additions and subtractions are added. So what happens to the time complexity?
The analysis is the same as before, except that the 8 in the recurrence is replaced by 7: T(n) = 7T(n/2) + O(n^2).
The solution of this recurrence is T(n) = O(n^(log₂7)) ≈ O(n^2.81).
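The same Master-theorem calculation as before, just with 7 subproblems (again my own addition):

```latex
% a = 7, b = 2, f(n) = O(n^2); since \log_2 7 \approx 2.807 > 2, the leaves dominate:
T(n) = 7\,T\!\left(\frac{n}{2}\right) + O(n^2)
     \;\Longrightarrow\;
T(n) = \Theta\!\left(n^{\log_2 7}\right) \approx O(n^{2.81}).
```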
Although the exponent only drops from 3 to 2.81, the change is in the exponent, so compared with the naive algorithm this is a big improvement.
The algorithm itself is implemented recursively; the boundary of the recursion is matrices of order 1, whose product is computed directly by ordinary multiplication.
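Here is a sketch of that recursion. It reuses the Matrix alias and the block/addM helpers from the earlier block-multiplication sketch, adds an analogous subM for entry-wise subtraction, and again assumes n is a power of two; all of these names are my own, not from the post.

```cpp
// Entry-wise difference, mirroring addM above.
Matrix subM(const Matrix& X, const Matrix& Y) {
    int n = X.size();
    Matrix R(n, std::vector<double>(n));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            R[i][j] = X[i][j] - Y[i][j];
    return R;
}

Matrix strassen(const Matrix& A, const Matrix& B) {
    int n = A.size();
    if (n == 1) return {{A[0][0] * B[0][0]}};   // recursion boundary: order-1 matrices

    Matrix A11 = block(A, 0, 0), A12 = block(A, 0, 1),
           A21 = block(A, 1, 0), A22 = block(A, 1, 1);
    Matrix B11 = block(B, 0, 0), B12 = block(B, 0, 1),
           B21 = block(B, 1, 0), B22 = block(B, 1, 1);

    // Only 7 recursive multiplications instead of 8.
    Matrix M1 = strassen(A11, subM(B12, B22));
    Matrix M2 = strassen(addM(A11, A12), B22);
    Matrix M3 = strassen(addM(A21, A22), B11);
    Matrix M4 = strassen(A22, subM(B21, B11));
    Matrix M5 = strassen(addM(A11, A22), addM(B11, B22));
    Matrix M6 = strassen(subM(A12, A22), addM(B21, B22));
    Matrix M7 = strassen(subM(A11, A21), addM(B11, B12));

    // Combine exactly as in the formulas above.
    Matrix C11 = addM(subM(addM(M5, M4), M2), M6);  // M5 + M4 - M2 + M6
    Matrix C12 = addM(M1, M2);                      // M1 + M2
    Matrix C21 = addM(M3, M4);                      // M3 + M4
    Matrix C22 = subM(subM(addM(M5, M1), M3), M7);  // M5 + M1 - M3 - M7

    // Assemble C from its four half-size blocks.
    int h = n / 2;
    Matrix C(n, std::vector<double>(n));
    for (int i = 0; i < h; ++i)
        for (int j = 0; j < h; ++j) {
            C[i][j]         = C11[i][j];
            C[i][j + h]     = C12[i][j];
            C[i + h][j]     = C21[i][j];
            C[i + h][j + h] = C22[i][j];
        }
    return C;
}
```

In practice one would stop the recursion at a larger block size and switch to the plain triple loop there, since the extra additions make Strassen slower than the naive method for small matrices.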
Since Strassen, many algorithms have further improved the time complexity of matrix multiplication; at present the best upper bound is O(n^2.376). The best known lower bound is still the trivial Ω(n^2); after all, even matrix addition and subtraction take O(n^2) time.
In any case, the true complexity of matrix multiplication is still not known, and there is much work left to be done on this problem.
╮(╯▽╰)╭ Honestly, when I first saw the drop from O(n^3) to O(n^2.81) it didn't seem like a big deal; reading through the derivation, I just felt that algorithm researchers are real Virgos, going to such painstaking lengths just to shave off one multiplication...
And then I suddenly remembered a piece of recruiting information I had seen; here is the screenshot:
Haha, Virgo ~~~