2.8 Asymptotic growth of functions
Let's now judge which of two algorithms, A and B, is better. Suppose both take input of size n. Algorithm A performs 2n + 3 operations: you can think of it as one loop of n iterations, then a second loop of n iterations, and finally 3 assignments or other operations, for 2n + 3 in total. Algorithm B performs 3n + 1 operations. Which do you think is faster?
Strictly speaking, the answer is: not necessarily (as shown in table 2-8-1).
When n = 1, algorithm A is less efficient than algorithm B (it performs more operations). When n = 2, the two are equally efficient; when n > 2, algorithm A begins to outperform algorithm B, and as n increases, A pulls further and further ahead (it performs fewer operations than B). So we can conclude that, on the whole, algorithm A is better than algorithm B.
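This comparison can be sketched in Python. The function names below are invented for illustration; the counts mirror table 2-8-1:

```python
def ops_a(n):
    """Operation count of algorithm A: 2n + 3."""
    return 2 * n + 3

def ops_b(n):
    """Operation count of algorithm B: 3n + 1."""
    return 3 * n + 1

# Tabulate a few input sizes; A overtakes B once n > 2.
for n in (1, 2, 3, 10, 100):
    print(n, ops_a(n), ops_b(n))
```

Running it shows the crossover: at n = 1 algorithm A does more work, at n = 2 they tie, and from n = 3 onward A always does less.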
At this point we introduce a definition for this idea of one function eventually and permanently overtaking another.
Asymptotic growth of functions: given two functions f(n) and g(n), if there exists an integer N such that for all n > N, f(n) is always greater than g(n), then we say that f(n) grows asymptotically faster than g(n).
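A minimal sketch of this definition: a brute-force search for the threshold N, assuming we only check n up to a finite limit (so it is numerical evidence, not a proof; the function name `crossover_N` is invented here):

```python
def crossover_N(f, g, limit=10_000):
    """Return the smallest N such that f(n) > g(n) for every checked n
    with N < n <= limit (a brute-force sketch of the definition)."""
    last_non_dominant = 0
    for n in range(1, limit + 1):
        if f(n) <= g(n):
            last_non_dominant = n
    return last_non_dominant

# 3n + 1 grows asymptotically faster than 2n + 3; the threshold is N = 2.
print(crossover_N(lambda n: 3 * n + 1, lambda n: 2 * n + 3))  # prints 2
```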
From this we find that, as n increases, the trailing +3 or +1 does not actually affect the outcome of the comparison; the simplified algorithms A' (2n) and B' (3n) compare the same way. So we can ignore these additive constants. In the next example, the insignificance of such constants will be even more pronounced.
Let's look at a second example. Algorithm C performs 4n + 8 operations, and algorithm D performs 2n² + 1 (as shown in table 2-8-2).
When n <= 3, algorithm C does worse than algorithm D (it performs more operations), but when n > 3, algorithm C becomes better and better than algorithm D, and eventually far better. If we remove the trailing constants (giving C' = 4n and D' = 2n²), we find that the outcome of the comparison does not change. Going a step further, even if we also remove the constants multiplied by n (giving C'' = n and D'' = n²), the result still does not change: as n increases, C'' remains far smaller than D''. In other words, the constant multiplying the highest-order term is not important either.
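This can be checked numerically. The sketch below compares the full counts, then the versions with the additive constants dropped, then with the multiplied constants dropped as well; the three variants may disagree at small n, but all agree once n is large:

```python
# C = 4n + 8 vs D = 2n^2 + 1, then C' = 4n vs D' = 2n^2,
# then C'' = n vs D'' = n^2. Each column is True when the
# C-side count is strictly smaller than the D-side count.
for n in (1, 2, 3, 4, 10, 100):
    full   = 4 * n + 8 < 2 * n**2 + 1
    no_add = 4 * n < 2 * n**2
    no_mul = n < n**2
    print(n, full, no_add, no_mul)
```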
Let's look at a third example. Algorithm E performs 2n² + 3n + 1 operations, and algorithm F performs 2n³ + 3n + 1 (as shown in table 2-8-3).
When n = 1, algorithm E and algorithm F perform the same number of operations, but when n > 1, algorithm E begins to outperform algorithm F, and as n increases the gap becomes very obvious. Observing this, we find that it is the exponent of the highest-order term that matters: the function whose highest-order term has the larger exponent grows particularly fast as n increases.
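A quick numerical check of E against F (a sketch mirroring table 2-8-3):

```python
# E(n) = 2n^2 + 3n + 1 versus F(n) = 2n^3 + 3n + 1: equal at n = 1,
# then the cubic term makes F fall further and further behind E.
for n in (1, 2, 10, 100):
    e = 2 * n**2 + 3 * n + 1
    f = 2 * n**3 + 3 * n + 1
    print(n, e, f, f / e)
```

The last column, the ratio F/E, keeps growing: the n³ term alone decides the comparison for large n.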
Let's look at one last example. Algorithm G performs 2n² operations, algorithm H performs 3n + 1, and algorithm I performs 2n² + 3n + 1 (as shown in table 2-8-4).
This set of data should make things very clear. As n gets larger, you will find that 3n + 1 can no longer keep up with 2n², and in the end it is almost negligible. In other words, as n becomes very large, algorithm G approaches algorithm I. So we can conclude: when judging the efficiency of an algorithm, the constants and lower-order terms in the function can often be ignored; what deserves more attention is the order of the dominant term (the highest-order term).
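The ratio of G to I makes this concrete (a sketch):

```python
# The ratio 2n^2 / (2n^2 + 3n + 1) tends to 1: the lower-order
# terms of algorithm I become negligible as n grows.
for n in (10, 100, 1000, 100000):
    g = 2 * n**2
    i = 2 * n**2 + 3 * n + 1
    print(n, g / i)
```

At n = 100000 the ratio is already above 0.9999: G and I are effectively indistinguishable.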
To judge whether an algorithm is good, we cannot rely on only a small amount of data for an accurate verdict. From the examples above, we find that if we can compare the asymptotic growth of the key execution-count functions of these algorithms, we can basically determine that one algorithm, as n increases, will become better and better than another, or worse and worse. This is, in fact, the theoretical basis of a priori analysis: the time efficiency of an algorithm is estimated by its time complexity.
Source: http://www.cnblogs.com/cj723/archive/2011/03/05/1964884.html