1. Time complexity (1) Time frequency. The time an algorithm takes cannot be computed purely theoretically; it can only be known by running the algorithm on a machine. But we cannot, and need not, test every algorithm this way: it is enough to know which algorithm takes more time and which takes less. The time an algorithm spends is proportional to the number of statement executions it performs; the algorithm that executes more statements takes more time. The number of statement executions in an algorithm is called its statement frequency or time frequency, denoted T(n).
(2) Time complexity. In the time frequency T(n), n is called the problem size; as n changes, T(n) changes with it. Often we want to know what law T(n) follows as n grows, and for this we introduce the concept of time complexity. In general, the number of times the basic operation of an algorithm is repeated is a function T(n) of the problem size n. If there is an auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n), written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or simply its time complexity. If the number of statement executions in an algorithm is a constant, the time complexity is O(1). Algorithms with different time frequencies may share the same time complexity: for example, T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 have different frequencies but the same time complexity, O(n²). In increasing order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log₂n), linear order O(n), linear-logarithmic order O(nlog₂n), square order O(n²), cubic order O(n³), ..., k-th order O(nᵏ), and exponential order O(2ⁿ). As the problem size n grows, a higher time complexity means lower execution efficiency. 2. Space complexity. Analogously to time complexity, space complexity measures the storage space an algorithm requires while executing on a computer, written S(n) = O(f(n)). It generally refers to the auxiliary storage used beyond the normal memory overhead. The analysis method parallels that of time complexity and is not repeated here.
(3) Evaluating time performance by order of growth. The time performance of an algorithm is evaluated mainly by the order of magnitude of its time complexity, that is, by its asymptotic time complexity. "Example 3.7" Two algorithms A1 and A2 solve the same problem, with time complexities T1(n) = 100n² and T2(n) = 5n³ respectively. (1) When the input size is n < 20, T1(n) > T2(n), so the latter takes less time. (2) As the problem size n grows, the ratio of the two algorithms' costs grows as 5n³/100n² = n/20. When the problem size is large, algorithm A1 is more efficient than algorithm A2. Their asymptotic time complexities, O(n²) and O(n³), evaluate the time quality of the two algorithms macroscopically. In algorithm analysis, the time complexity and the asymptotic time complexity are often not distinguished: the asymptotic time complexity T(n) = O(f(n)) is simply called the time complexity, where f(n) is generally the frequency of the most frequently executed statement in the algorithm.
"Example 3. 8 The time complexity of the algorithm matrixmultiply is generally t (n) =o (N3), and F (n) =n3 is the frequency of the statement (5) in the algorithm. The following example shows how to find the time complexity of the algorithm.
"Example 3. 9 "Exchange contents of I and J". Temp=i; I=j; J=temp; The frequency of the above three individual statements is 1, and the execution time of the program segment is a constant that is independent of the problem scale N. The time complexity of the algorithm is constant order, which is recorded as T (N) =o (1). If the execution time of the algorithm does not grow with the increase of the size n of the problem, even though there are thousands of statements in the algorithm, the execution time is only a larger constant. The time complexity of such algorithms is O (1).
"Example 3. 10 "One of the variable counts." (1) x=0;y=0; (2) for (k-1;k<=n;k++) (3) x + +; (4) for (i=1;i<=n;i++) (5) for (j=1;j<=n;j++) (6) y++; In general, the step Loop statement only needs to consider the execution times of the statement in the loop body, ignoring the component of step Increment 1, Final value discriminant and control transfer. Therefore, the maximum frequency in the above program section is (6), its frequency is f (n) =n2, so the time complexity of the program section is T (n) =o (n2). When there are several loop statements, the time complexity of the algorithm is determined by the frequency f (n) of the most inner statement in the loop statement with the most nesting layers.
"Example 3. 11 "Variable Count bis". (1) X=1; (2) for (i=1;i<=n;i++) (3) for (j=1;j<=i;j++) (4) for (k=1;k<=j;k++) (5) x + +; The most frequently-executed statement in the program section is (5), although the execution times of the inner loop are not directly related to the problem scale N. However, it is related to the variable value of the outer loop, and the outermost loop is directly related to N, so you can loop from the inner layer to the outer parse statement (5): The time complexity of the program segment is T (n) =o (n3/6+ Lower) =o (N3). (4) The time complexity of the algorithm depends not only on the scale of the problem, but also on the initial state of the input instance.
"Example 3. 12 "in numerical a[0..n-1] to find the given value K algorithm is as follows: (1) i=n-1; (2) while (i>=0&& (A[i]!=k)) (3) i--; (4) return i; The frequency of the statement (3) In this algorithm is not only related to the problem scale N, but also to the values of the elements of a in the input instance and the value of K: ① If A has no element equal to K, the frequency f (n) of the statement (3) =n;② if the last element of a is equal to K, the frequency f (n) of the statement (3) is constant 0. (5) The worst time complexity and the mean time complexity of the worst case of time complexity called the worst-case complexity degree. In general, it is not specifically stated that the time complexity of the discussion is the worst-case scenario. The reason for this is that the worst case time complexity is the upper bound of the algorithm's running time on any input instance, which ensures that the algorithm will not run longer than any other time.
"Example 3. 19 "Lookup Algorithm" Example 1 8 "in the worst-case scenario, the time complexity is T (n) =0 (n), which indicates that the algorithm cannot run longer than 0 (n) for any input instance." mean time complexity refers to the expected running time of the algorithm when all possible input instances are presented with equal probability. The common time complexity is increased by order of magnitude: constant 0 (1), logarithmic order 0 (LOG2N), linear order 0 (N), linear logarithmic order 0 (NLOG2N), square Order 0 (N2) Cubic 0 (n3) 、...、 K-order 0 (NK), exponential order 0 (2n). Obviously, the time complexity of the exponential order 0 (2n) algorithm is extremely inefficient, when the n value is slightly larger can not be applied. is similar to the discussion of time complexity, the spatial complexity of an algorithm (space complexity) S (n) is defined as the storage space consumed by the algorithm, and it is also a function of problem scale N. Asymptotic space complexity is often referred to as spatial complexity. The time complexity and space complexity of the algorithm are called the complexity of the algorithm. |