Algorithm Analysis
The same problem can be solved by different algorithms, and the quality of the chosen algorithm affects the efficiency of the algorithm and of the program built on it. The purpose of algorithm analysis is to select a suitable algorithm and to improve existing ones. An algorithm is evaluated mainly in terms of its time complexity and space complexity.
1. Time Complexity
(1) Time Frequency
The time it takes to execute an algorithm cannot be calculated theoretically; it can only be known by running the algorithm on a computer and measuring it. However, it is neither possible nor necessary to test every algorithm on a machine; we only need to know which algorithm takes more time and which takes less. The time an algorithm spends is proportional to the number of statements it executes: an algorithm that executes more statements takes more time. The number of statement executions in an algorithm is called its statement frequency or time frequency, denoted T(n).
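For instance, here is a minimal Python sketch (the function name and the counting scheme are illustrative, not taken from the original text) that counts the statements executed by a simple summation loop; the count grows as T(n) = n + 1, i.e. proportionally to n:

def summation_with_count(n):
    # Count every executed statement so that T(n), the time frequency, is explicit.
    count = 0
    total = 0
    count += 1                    # the assignment above is one statement
    for i in range(1, n + 1):
        total += i
        count += 1                # the loop body runs n times
    return total, count

for n in (10, 100, 1000):
    _, t_n = summation_with_count(n)
    print(n, t_n)                 # prints T(n) = n + 1 for each n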
(2) Time Complexity
In the time frequency T(n) just defined, n is called the size of the problem. As n changes, T(n) changes as well, and we usually want to know the law by which it grows. For this purpose we introduce the concept of time complexity.
In general, the number of times the basic operation of an algorithm is repeated is a function of the problem size n, written T(n). If there exists an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a constant not equal to zero, then f(n) is a function of the same order as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm.
If the number of statement executions in an algorithm is a constant, the time complexity is O(1). Algorithms with different time frequencies can also share the same time complexity; for example, T(n) = n^2 + 3n + 4 and T(n) = 4n^2 + 2n + 1 have different frequencies, but the same time complexity: both are O(n^2).
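As a quick numerical check of the definition (a sketch added for illustration, not part of the original text), the Python snippet below evaluates T(n)/f(n) for both polynomials with the auxiliary function f(n) = n^2; the ratios approach the nonzero constants 1 and 4 as n grows, which is exactly the condition for T(n) = O(n^2):

def t1(n):
    return n**2 + 3*n + 4      # first time frequency

def t2(n):
    return 4*n**2 + 2*n + 1    # second time frequency

def f(n):
    return n**2                # candidate auxiliary function

for n in (10, 1000, 100000):
    print(n, t1(n) / f(n), t2(n) / f(n))
# The ratios tend to 1 and 4 respectively, so both frequencies are O(n^2).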
Sorted by increasing order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log₂ n), linear order O(n), linearithmic order O(n log₂ n), quadratic order O(n^2), cubic order O(n^3), ..., k-th power order O(n^k), and exponential order O(2^n). As the problem size n increases, the time complexity grows and the execution efficiency of the algorithm decreases. A few functions that exhibit these orders are sketched below.
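As an illustrative sketch (these routines were written for this explanation, not taken from the original text), each function's statement count grows according to one of the orders listed above:

def constant(lst):                  # O(1): a fixed number of statements
    return lst[0]

def binary_search(lst, target):     # O(log₂ n): the search range halves each step
    # Assumes lst is sorted in ascending order.
    lo, hi = 0, len(lst) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if lst[mid] == target:
            return mid
        if lst[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def linear_sum(lst):                # O(n): one pass over the input
    total = 0
    for x in lst:
        total += x
    return total

def pair_count(lst):                # O(n^2): nested loops over the input
    count = 0
    for a in lst:
        for b in lst:
            if a == b:
                count += 1
    return count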
2. Space Complexity
Similar to time complexity, space complexity is a measure of the storage space an algorithm requires while it executes on a computer, written as:
S(n) = O(f(n))
In general, apart from the memory an algorithm normally occupies (such as the space taken by the input itself), we discuss only the size of the auxiliary storage units it requires.
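For example (a sketch written for illustration, not part of the original text), both routines below produce a reversed sequence, but the first allocates a new list and therefore uses O(n) auxiliary space, while the second swaps elements in place and uses only O(1) auxiliary space:

def reversed_copy(lst):
    # Builds a new n-element list: S(n) = O(n) auxiliary space.
    return [lst[i] for i in range(len(lst) - 1, -1, -1)]

def reverse_in_place(lst):
    # Swaps inside the input list with a fixed number of extra variables:
    # S(n) = O(1) auxiliary space.
    i, j = 0, len(lst) - 1
    while i < j:
        lst[i], lst[j] = lst[j], lst[i]
        i += 1
        j -= 1
    return lst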