Reprinted from: http://www.cnblogs.com/xiu619544553/tag/%E6%95%B0%E6%8D%AE%E7%BB%93%E6%9E%84%E5%92%8C%E7%AE%97%E6%B3%95/
The time it takes for a program written in a high-level language to run on a computer depends on the following factors:
1. The strategy and approach adopted by the algorithm;
2. The quality of the code produced by the compiler;
3. The input size of the problem (the amount of input);
4. The speed at which the machine executes instructions.
The study of algorithm complexity focuses on an abstraction: how the algorithm's running time grows as the size of the input increases.
First, the concept of algorithm time complexity
In general, the number of times the basic operations of an algorithm are executed is a function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that the limit of T(n)/f(n), as n approaches infinity, is a constant not equal to zero, then f(n) is a function of the same order of magnitude as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short. This notation, which uses capital O() to express an algorithm's time complexity, is called Big O notation.
Analysis: as the problem size n grows, the growth rate of the algorithm's execution time is proportional to the growth rate of f(n). Therefore, the smaller f(n) is, the lower the algorithm's complexity and the higher its efficiency.
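For example (an illustration added here, not from the original post): if T(n) = 2n + 3 and we choose f(n) = n, then T(n)/f(n) = 2 + 3/n, which approaches the nonzero constant 2 as n approaches infinity, so T(n) = O(n).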
1. Deriving the Big O order (how to analyze the time complexity of an algorithm):
1) Replace all additive constants in the running time with the constant 1.
2) In the modified running time, keep only the highest-order term (e.g., for 2 + 2n + n^2 + n^3, keep n^3).
3) If the highest-order term exists and its coefficient is not 1, remove the coefficient (e.g., 3n^3 becomes n^3).
4) The final result is the Big O order; a worked example is below.
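For instance (a worked example added for illustration): suppose an algorithm's running time is T(n) = 3n^2 + 2n + 8. Rule 1 replaces the additive constant 8 with 1, giving 3n^2 + 2n + 1; rule 2 keeps only the highest-order term, 3n^2; rule 3 removes the coefficient 3, leaving n^2. The result is O(n^2).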
int sum = 0, n = 100;
printf("I'll move on.\n");
printf("I'll move on.\n");
printf("I'll move on.\n");
printf("I'll move on.\n");
printf("I'll move on.\n");
printf("I'll move on.\n");
The Big O order of the code above is not O(8): according to the concept that T(n) is a function of the problem size n, the number of operations here is a constant that does not grow with n, so O(1) is the correct answer (constant order).
In general, a single non-nested loop gives linear order: the running time grows linearly with the problem size n.
int i, sum = 0, n = 100;

for (i = 0; i < n; i++) {
    sum += i;
}
In this code, the time complexity of the loop is O(n), because the code in the loop body needs to be executed n times.
int i, j, n = 100;
for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++) {
        printf("I'll move on.\n");
    }
}
Here n equals 100: each time the outer loop executes once, the inner loop executes 100 times, so escaping both loops takes 100 * 100 = n^2 executions in total. The complexity of this code is therefore O(n^2).
With three nested loops, it would be n^3. So we can conclude: the time complexity of a loop equals the complexity of the loop body multiplied by the number of times the loop runs. A sketch of the three-loop case follows.
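As a minimal sketch of the three-loop case (added here for illustration; the counting variable is an assumption, not part of the original post):

#include <stdio.h>

int main(void) {
    int i, j, k, n = 100, count = 0;
    for (i = 0; i < n; i++) {             // runs n times
        for (j = 0; j < n; j++) {         // n times for each i -> n^2 in total
            for (k = 0; k < n; k++) {     // n times for each (i, j) -> n^3 in total
                count++;                  // the O(1) loop body
            }
        }
    }
    printf("%d\n", count);                // prints 1000000, i.e., 100^3
    return 0;
}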
int i, j, n = 100;
for (i = 0; i < n; i++) {
    for (j = i; j < n; j++) {   // note: here j starts from i
        printf("I'll move on.\n");
    }
}
Analyzing this code: because the inner loop starts at j = i, it runs n - i times for each i, so the total number of executions is n + (n - 1) + ... + 1 = n * (n + 1)/2 = n^2/2 + n/2.
Applying the Big O derivation rules: rule 1 does not apply, since there is no additive constant; rule 2 keeps only the highest-order term, n^2/2; rule 3 removes the coefficient 1/2; the final result is O(n^2).
int i = 1, n = 100;
while (i < n) {
    i = i * 2;
}
printf("%d", i);
Suppose that after x doublings by 2, i becomes greater than or equal to n and the loop exits.
Then 2^x = n, so x = log2(n), and the time complexity of this loop is O(log n).
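To see this concretely, here is the same loop with a step counter added (the counter is an addition for illustration):

#include <stdio.h>

int main(void) {
    int i = 1, n = 100, steps = 0;
    while (i < n) {
        i = i * 2;    // i takes the values 2, 4, 8, 16, 32, 64, 128
        steps++;
    }
    // 7 doublings, and log2(100) is about 6.6: the loop runs
    // roughly log2(n) times, hence O(log n).
    printf("i = %d after %d steps\n", i, steps);   // i = 128 after 7 steps
    return 0;
}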
Time complexity analysis for function calls:
1)
// Big O order: O(1)
void function(int count) {
    printf("%d", count);
}

int main(int argc, const char * argv[]) {
    // Big O order: O(n)
    int i, n = 100;
    for (i = 0; i < n; i++) {
        function(i);
    }
    return 0;
}
2)
// Big O order: O(n)
void function(int count) {
    // Big O order: O(n)
    int j, n = 100;
    for (j = 0; j < n; j++) {
        printf("%d", j);
    }
}

int main(int argc, const char * argv[]) {
    // Big O order: O(n^2)
    int i, n = 100;
    for (i = 0; i < n; i++) {
        function(i);
    }
    return 0;
}
Common time complexities:
The commonly used time complexities, ordered from smallest to largest, are:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n).
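To make this ordering tangible, here is a small illustrative program (added here, not from the original post; n! and n^n are omitted because they overflow almost immediately):

#include <stdio.h>
#include <math.h>

int main(void) {
    // Print the common growth functions at a few small sizes.
    // (Compile with -lm on some systems.)
    for (int n = 2; n <= 16; n *= 2) {
        printf("n=%2d  log2(n)=%4.1f  n*log2(n)=%5.1f  n^2=%3d  n^3=%5d  2^n=%6.0f\n",
               n, log2(n), n * log2(n), n * n, n * n * n, pow(2, n));
    }
    return 0;
}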
Worst case and average case:
The average running time is the expected running time.
The worst-case running time is a guarantee. In applications this is the most important requirement: unless otherwise specified, the running time we talk about is the worst-case running time.
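As a concrete illustration of worst case versus average case (a sketch added here, not from the original post; the array values are arbitrary):

#include <stdio.h>

// Linear search: returns the index of key in a[0..n-1], or -1 if absent.
// Best case: key is at index 0 (1 comparison).
// Worst case: key is last or absent (n comparisons), the O(n) guarantee.
// Average case: if key is equally likely to be anywhere, about n/2
// comparisons are expected, which is still O(n).
int linear_search(const int *a, int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key) {
            return i;
        }
    }
    return -1;
}

int main(void) {
    int a[] = {4, 8, 15, 16, 23, 42};
    printf("%d\n", linear_search(a, 6, 23));   // prints 4
    return 0;
}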
Second, the space complexity of the algorithm
Space complexity is a measure of how much storage an algorithm temporarily occupies while it runs, recorded as S(n) = O(f(n)). For example, the time complexity of straight insertion sort is O(n^2), while its space complexity is O(1). A typical recursive algorithm has O(n) space complexity, because each recursive call must store its return information. The quality of an algorithm is mainly measured by two things: its execution time and the storage space it requires.
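To make the recursion point concrete (an illustrative sketch, not from the original post):

#include <stdio.h>

// Recursive sum 1 + 2 + ... + n: each call pushes a stack frame that is
// kept until the call returns, so the auxiliary space is O(n).
long sum_recursive(int n) {
    if (n <= 0) {
        return 0;
    }
    return n + sum_recursive(n - 1);
}

// Iterative version: a fixed number of variables regardless of n,
// so the auxiliary space is O(1).
long sum_iterative(int n) {
    long s = 0;
    for (int i = 1; i <= n; i++) {
        s += i;
    }
    return s;
}

int main(void) {
    printf("%ld %ld\n", sum_recursive(100), sum_iterative(100));   // 5050 5050
    return 0;
}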
When people speak of complexity without qualification, they usually mean time complexity!