In general, the number of times an algorithm's basic operation is repeated is a function f(n) of the problem size n. Writing T(n) = O(f(n)) states that the algorithm's execution time grows at the same rate as f(n). This is called the asymptotic time complexity, or time complexity for short.
Common time complexities, ordered from small to large:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n)
The higher the time complexity, the more CPU time the algorithm consumes and the slower it runs.
Examples of time complexity analysis:

int sum = 1, n = 100;

O(1):
sum = (1 + n) * n / 2;
O(log n):
while (sum < n) {
    sum = sum * 2;
}
O(n):
for (int i = 0; i < n; i++) {
    sum += n;
}
O(n^2):
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        sum += 1;
    }
}

for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        sum += 1;
    }
}
// The second nested loop above does not execute n^2 times: the inner statement runs
// n + (n-1) + (n-2) + ... + 1 = n(n+1)/2 = n^2/2 + n/2 times.
Since constant factors and lower-order terms are dropped, the time complexity is still O(n^2).
An algorithm has a best-case, worst-case, and average-case time complexity. For example, when searching for a value in an unsorted data set, the best case is O(1) (the value is found at the first position), the worst case is O(n) (every element is examined), and the average case is about n/2 comparisons, which is still O(n). Unless stated otherwise, the time complexity of an algorithm refers to its worst case.