Subject of this lesson: Algorithm efficiency metrics and storage space requirements
Teaching purpose: To master the meaning and use of the asymptotic time complexity and space complexity of an algorithm
Teaching emphasis: the meaning and use of asymptotic time complexity and how to calculate it
Teaching difficulty: the meaning of asymptotic time complexity
Teaching Content:
First, the measurement of algorithm efficiency
The execution time of an algorithm is a function of the algorithm itself and of the problem size. To evaluate how good an algorithm is, we can compare how long different algorithms take on problems of the same size. There are usually two ways to determine the execution time of a program:
1. The post-execution statistical method: run the program and measure how long it takes (a minimal timing sketch follows this list).
Disadvantage: the results are not suitable for comparing algorithms on a wide scale, because they depend on the machine, the environment, and the moment of measurement.
2. The pre-execution analytical estimation method: estimate the running time by analysis before the program is run.
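As a rough illustration of the post-execution statistical method, the following C sketch measures elapsed processor time with the standard clock() function; the work() function and the loop bound here are made up purely for illustration:

#include <stdio.h>
#include <time.h>

/* A made-up piece of work to be timed (for illustration only). */
static double work(long n) {
    long i;
    double s = 0.0;
    for (i = 0; i < n; i++)
        s += (double)i;
    return s;
}

int main(void) {
    clock_t start = clock();                        /* time before */
    double result = work(10000000L);                /* code being measured */
    clock_t end = clock();                          /* time after */
    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    printf("result = %.0f, elapsed = %.3f s\n", result, seconds);
    return 0;
}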
Factors affecting the time required for a program to run on a computer:
- The strategy chosen by the algorithm itself
- The size of the problem (the larger the size, the more time is consumed)
- The language in which the program is written (the higher-level the language, the more time is consumed)
- The quality of the machine code generated by the compiler
- The speed at which the machine executes instructions
To sum up, in order to compare algorithms on their own merits, the other factors that affect efficiency should be excluded.
A basic operation (primitive operation) is selected from the algorithm, and the number of times this basic operation is repeatedly executed is taken as the time measure of the algorithm. (The basic operation must be the same in all algorithms that solve the problem.)
T(n) = O(f(n))
This notation means that as the problem size n increases, the growth rate of the algorithm's execution time is the same as the growth rate of f(n); this is called the asymptotic time complexity of the algorithm, or time complexity for short. For example, if counting the basic operation gives T(n) = 3*n*n + 2*n + 1, the n*n term dominates the growth rate for large n, so T(n) = O(n*n).
Example 1: find the sum of the elements of a 4*4 matrix. T(4) = O(f(4)), with f(n) = n*n.
int sum(int num[4][4]) {
    int i, j, r = 0;
    for (i = 0; i < 4; i++)
        for (j = 0; j < 4; j++)
            r += num[i][j];    /* basic operation */
    return r;
}
Example 2: check whether the matrix elements are 1, 2, ..., 16 in row-major order. Best case: T(4) = O(1); worst case: T(4) = O(n*n).
int Ispass(int num[4][4]) {
    int i, j;
    for (i = 0; i < 4; i++)
        for (j = 0; j < 4; j++)
            if (num[i][j] != i*4 + j + 1)    /* basic operation: comparison */
                return 0;
    return 1;
}
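The f(n) = n*n above comes from counting how many times the basic operation executes for a general n*n matrix. A minimal sketch of the same summation generalized to n*n (this generalization is supplied here as an illustration; the notes themselves only show the 4*4 case, and the array parameter uses a C99 variable-length array):

/* Sum of an n x n matrix: the basic operation r += num[i][j]
   is executed exactly n*n times, so T(n) = O(n*n). */
int sum_n(int n, int num[n][n]) {
    int i, j, r = 0;
    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++)
            r += num[i][j];    /* basic operation, executed n*n times */
    return r;
}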
The number of times the basic operation is executed is the same as the frequency of the statement that contains it. The frequency of a statement is the number of times that statement is repeatedly executed.
Statement | Frequency | Time complexity
{ ++x; s = 0; } | 1 | O(1)
for (i = 1; i <= n; ++i) { ++x; s += x; } | n | O(n)
for (j = 1; j <= n; ++j) for (k = 1; k <= n; ++k) { ++x; s += x; } | n*n | O(n*n)
(see the example below this table) | log n | O(log n)
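The last row of the table gives only the complexity; a typical statement with logarithmic frequency (a standard textbook example supplied here, since the statement itself is not preserved in the notes) is a loop whose counter doubles on every pass:

/* The loop body executes roughly log2(n) times, so its frequency is
   log n and its time complexity is O(log n). */
for (i = 1; i <= n; i *= 2) { ++x; s += x; }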
Time complexity when the number of executions of the basic operation is not fixed (see the sequential-search sketch after this list):
- Average time complexity: the expected number of executions of the basic operation, averaged using the probability of each possible case.
- Worst-case time complexity: the number of basic operations performed in the worst case.
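A common illustration of the two measures is sequential search (an example supplied here, not part of the original notes): if the key is equally likely to be at any of the n positions, the comparison executes about (n+1)/2 times on average, and n times in the worst case, so both the average and the worst-case time complexity are O(n).

/* Sequential search for key in a[0..n-1]; returns its index or -1.
   Basic operation: the comparison a[i] == key.
   Average case (key equally likely at each position): about (n+1)/2
   comparisons. Worst case (key absent or in the last position): n
   comparisons. Both give O(n). */
int seq_search(const int a[], int n, int key) {
    int i;
    for (i = 0; i < n; i++)
        if (a[i] == key)    /* basic operation */
            return i;
    return -1;
}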
Second, the algorithm's storage space requirements
Similar to time complexity, space complexity is used as a measure of the storage space required by an algorithm.
Recorded as:
S(n) = O(f(n))
If the extra space required is constant with respect to the amount of input data, the algorithm is said to work in place.
If the amount of space required depends on the particular input, the worst case is analyzed unless otherwise specified.
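As an illustration of these two points (an example supplied here, not from the original notes): reversing an array can be done with only a few extra variables, so the extra space is constant and the algorithm works in place with S(n) = O(1); building a reversed copy instead would need an extra array of n elements, giving S(n) = O(n).

/* In-place reversal: the extra space is just the scalars i and t,
   independent of n, so S(n) = O(1). */
void reverse_in_place(int a[], int n) {
    int i, t;
    for (i = 0; i < n / 2; i++) {
        t = a[i];
        a[i] = a[n - 1 - i];
        a[n - 1 - i] = t;
    }
}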
Third, summary
Asymptotic time complexity
Space complexity