The definition of time complexity
In general, the number of times the basic operations in an algorithm are executed is a function of the problem size n, written T(n). If there is an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a constant not equal to zero, then f(n) is said to be a function of the same order of magnitude as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm (O is the order-of-magnitude symbol), or simply the time complexity.
From this definition, the basic calculation steps can be summarized as follows:
1. Count the number of times the basic operations execute, T(n)
A basic operation is each statement in the algorithm (delimited by ";"); the number of times a statement executes is also called the frequency of the statement. In algorithm analysis, the worst case is generally considered by default.
2. Calculate the order of magnitude of T(n)
Ignore the constants, the lower-order terms, and the coefficient of the highest-order term, keeping only the order of magnitude, so that f(n) equals T(n) up to order of magnitude.
3. Use big O to denote the time complexity
As n approaches infinity, if the value of lim(T(n)/f(n)) is a constant not equal to 0, then f(n) is a function of the same order of magnitude as T(n). Then T(n) = O(f(n)), which is the time complexity.
An example:
int num1, num2;
for (int i = 0; i < n; i++) {
    num1 += 1;
    for (int j = 1; j <= n; j *= 2) {
        num2 += num1;
    }
}
Analysis:
1.
The statement int num1, num2; has frequency 1;
The statement i = 0 has frequency 1;
The statements i < n, i++, num1 += 1, and j = 1 each have frequency n;
The statements j <= n, j *= 2, and num2 += num1 each have frequency n*log2n;
T(n) = 2 + 4n + 3n*log2n
2.
Ignore the constants, the lower-order terms, and the coefficient of the highest-order term in T(n):
f(n) = n*log2n
3.
lim(T(n)/f(n)) = (2 + 4n + 3n*log2n) / (n*log2n)
               = 2*(1/n)*(1/log2n) + 4*(1/log2n) + 3
When n tends to infinity, 1/n tends to 0 and 1/log2n tends to 0,
so the limit equals 3.
Therefore
T(n) = O(n*log2n)
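As a sanity check, the frequency counted above can be verified empirically by instrumenting the example loop. This is only an illustrative sketch: the countInnermost wrapper and the count variable are additions for the check, not part of the original algorithm.

```java
public class FreqCount {
    // Returns how many times the innermost statement num2 += num1 executes.
    static long countInnermost(int n) {
        long count = 0;
        int num1 = 0, num2 = 0;
        for (int i = 0; i < n; i++) {
            num1 += 1;
            for (int j = 1; j <= n; j *= 2) {
                num2 += num1;
                count++; // instrumentation: one tick per execution
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // For n = 8 the inner loop runs for j = 1, 2, 4, 8, i.e. floor(log2(8)) + 1 = 4 times,
        // so the innermost statement executes 8 * 4 = 32 times: on the order of n*log2n.
        System.out.println(countInnermost(8)); // 32
    }
}
```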
Simplified Calculation Steps
Analyzing again, you can see that the complexity of the algorithm is determined by the statement that executes the most times, here num2 += num1, which is generally the statement in the innermost loop.
Also, solving the limit for the constant is usually omitted.
Thus, the above steps can be simplified to:
1. Find the most frequently executed statements
2. Calculate the order of magnitude of the statement execution times
3. Use big O to indicate the result
Continuing with the algorithm above as an example, the analysis becomes:
1.
The most frequently executed statement is num2 += num1
2.
T(n) = n*log2n
f(n) = n*log2n
3.
lim(T(n)/f(n)) = 1
T(n) = O(n*log2n)
--------------------------------------------------------------------------------
Some additional notes
Worst-case time complexity
The time complexity of an algorithm depends not only on statement frequency but also on the problem size and the values of the elements in the input instance. In general, unless specifically stated otherwise, the time complexity discussed is that of the worst case. This guarantees that the algorithm's running time will never be longer than it.
Order of magnitude
The order of magnitude (log with a default base of 10) is simply the exponent of 10 when a number is written in standard scientific notation. For example, 5000 = 5x10^3, so its order of magnitude is 3. In addition, the order of magnitude of an unknown quantity is the order of magnitude closest to it, that is, its maximum possible order of magnitude.
The trick for finding the limit
Make good use of 1/n: when n tends to infinity, 1/n tends to 0.
--------------------------------------------------------------------------------
Some rules (citation: Time complexity calculation)
1) Addition rule
T(n, m) = T1(n) + T2(m) = O(max(f(n), g(m)))
2) Multiplication rule
T(n, m) = T1(n) * T2(m) = O(f(n) * g(m))
3) A special case (the time complexity when the problem size is a constant)
There is a special case in big O notation: if T1(n) = O(c), where c is an arbitrary constant independent of n, and T2(n) = O(f(n)), then
T(n) = T1(n) * T2(n) = O(c*f(n)) = O(f(n))
In other words, in big O notation, any nonzero positive constant belongs to the same order of magnitude and is written as O(1).
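The addition and multiplication rules can be illustrated by counting statement executions directly. A minimal sketch; the sequential and nested method names are made up for this illustration:

```java
public class BigORules {
    // Addition rule: two loops in sequence; counts add, and O(n + m) = O(max(n, m)).
    static long sequential(int n, int m) {
        long count = 0;
        for (int i = 0; i < n; i++) count++; // first loop: n executions
        for (int j = 0; j < m; j++) count++; // second loop: m executions
        return count; // n + m
    }

    // Multiplication rule: nested loops; counts multiply, giving O(n * m).
    static long nested(int n, int m) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < m; j++)
                count++; // innermost statement: n * m executions
        return count;
    }

    public static void main(String[] args) {
        System.out.println(sequential(1000, 10)); // 1010
        System.out.println(nested(1000, 10));     // 10000
    }
}
```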
4) An empirical rule
The relationship between complexity and time efficiency:
c < log2n < n < n*log2n < n^2 < n^3 < 2^n < 3^n < n!   (c is a constant)
|-------- better --------|------ fair ------|------ poor ------|
where c is a constant. If the complexity of an algorithm is c, log2n, n, or n*log2n, its time efficiency is high; if it is 2^n, 3^n, or n!, then even a slightly larger n will make the algorithm unable to run; the ones in the middle are passable.
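The ordering in this rule of thumb can be spot-checked for a concrete n. A sketch; the log2 helper (floor of log base 2) is an addition for this illustration:

```java
public class GrowthRates {
    // Floor of log base 2, valid for n >= 1 (position of the highest set bit).
    static int log2(int n) {
        return 31 - Integer.numberOfLeadingZeros(n);
    }

    public static void main(String[] args) {
        int n = 16;
        long log2n = log2(n);          // 4
        long nlogn = (long) n * log2n; // 64
        long n2 = (long) n * n;        // 256
        long pow2n = 1L << n;          // 65536
        // log2n < n < n*log2n < n^2 < 2^n holds for this n:
        System.out.println(log2n < n && n < nlogn && nlogn < n2 && n2 < pow2n); // true
    }
}
```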
--------------------------------------------------------------------------------------------------
Analysis of more complex cases
All of the above analyzes a single nested loop, but in practice other cases may arise, as illustrated below.
1. Complexity analysis of loops in sequence
Add the time complexity of each loop (or nest of loops).
For example:
for (i = 1; i <= n; i++)
    x++;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        x++;
Solution:
The first for loop:
T(n) = n
f(n) = n
Its time complexity is O(n).
The second for loop:
T(n) = n^2
f(n) = n^2
Its time complexity is O(n^2).
The time complexity of the whole algorithm is O(n + n^2) = O(n^2).
2. Complexity analysis of function calls
For example:
public void printSum(int count) {
    int sum = 0;
    for (int i = 1; i <= count; i++) {
        sum += i;
    }
    System.out.print(sum);
}
Analysis:
Remember that only statements that execute repeatedly increase the time complexity, so apart from the loop, the other statements in the method above all have complexity O(1).
So the time complexity of printSum = O(n) + O(1) = O(n) (the constant is ignored).
* In fact, the formula sum = n(n+1)/2 can be used here to optimize the algorithm, replacing the method with:
public void printSum(int count) {
    int sum = count * (count + 1) / 2;
    System.out.print(sum);
}
The time complexity of the algorithm is reduced from the original O(n) to O(1), which greatly improves its performance.
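The two versions can be checked against each other. A sketch, with the methods adapted to return the sum rather than print it; the names sumLoop and sumFormula are made up for this comparison:

```java
public class SumCompare {
    // O(n) version: sums 1..count with a loop, as in the loop-based printSum.
    static int sumLoop(int count) {
        int sum = 0;
        for (int i = 1; i <= count; i++) sum += i;
        return sum;
    }

    // O(1) version: closed-form formula count*(count+1)/2.
    static int sumFormula(int count) {
        return count * (count + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumLoop(100));    // 5050
        System.out.println(sumFormula(100)); // 5050
    }
}
```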
3. Complexity analysis of mixed cases (multiple method calls and loops)
For example:
public void suixiangMethod(int n) {
    printSum(n);                        // 1.1
    for (int i = 0; i < n; i++) {
        printSum(n);                    // 1.2
    }
    for (int i = 0; i < n; i++) {
        for (int k = 0; k < n; k++) {
            System.out.print(i + "," + k);  // 1.3
        }
    }
}
The time complexity of the suixiangMethod method requires calculating the complexity of each part of the method body (here printSum is the optimized O(1) version).
That is, 1.1 + 1.2 + 1.3 = O(1) + O(n) + O(n^2), and ignoring constants and non-dominant terms gives O(n^2).
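The three contributions can be tallied by counting the dominant statement executions of each part. A sketch that assumes the O(1) version of printSum (one unit of work per call); the totalWork wrapper is an addition for the check:

```java
public class MixedCount {
    // Counts one unit of work per O(1) operation in suixiangMethod's body.
    static long totalWork(int n) {
        long count = 0;
        count++;                                  // 1.1: one O(1) call
        for (int i = 0; i < n; i++) count++;      // 1.2: n O(1) calls -> O(n)
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++)
                count++;                          // 1.3: n*n prints -> O(n^2), dominant
        return count;                             // 1 + n + n^2
    }

    public static void main(String[] args) {
        System.out.println(totalWork(10)); // 111 = 1 + 10 + 100
    }
}
```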
--------------------------------------------------------------------------------------------------
More examples
O(1)
Exchange the contents of i and j:
temp = i;
i = j;
j = temp;
The frequency of each of the three statements above is 1. The execution time of this fragment is a constant independent of the problem size n, so the time complexity of the algorithm is of constant order, written T(n) = O(1). If the execution time of an algorithm does not grow as the problem size n increases, then even if the algorithm contains thousands of statements, its execution time is merely a large constant; the time complexity of such algorithms is O(1).
O(n^2)
sum = 0;                  /* executes 1 time */
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum++;            /* executes n^2 times */
Solution: T(n) = 1 + n^2 = O(n^2)
for (i = 1; i < n; i++)
{
    y = y + 1;                  // ①
    for (j = 0; j <= 2 * n; j++)
        x++;                    // ②
}
Solution:
The frequency of statement ① is n - 1
The frequency of statement ② is (n - 1) * (2n + 1) = 2n^2 - n - 1
T(n) = 2n^2 - n - 1 + (n - 1) = 2n^2 - 2
f(n) = n^2
lim(T(n)/f(n)) = 2 - 2*(1/n^2) = 2
T(n) = O(n^2)
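The frequency (n - 1)(2n + 1) derived above can be confirmed by instrumentation. A sketch; the innerFreq wrapper and count variable are additions for the check:

```java
public class FreqCheck {
    // Counts executions of statement ② (x++) in the loop above.
    static long innerFreq(int n) {
        long count = 0;
        int x = 0, y = 0;
        for (int i = 1; i < n; i++) {
            y = y + 1;
            for (int j = 0; j <= 2 * n; j++) {
                x++;
                count++; // instrumentation
            }
        }
        return count; // (n - 1) * (2n + 1)
    }

    public static void main(String[] args) {
        System.out.println(innerFreq(5)); // (5-1)*(2*5+1) = 44
    }
}
```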
O(n)
a = 0;
b = 1;                      // ①
for (i = 1; i <= n; i++)    // ②
{
    s = a + b;              // ③
    b = a;                  // ④
    c = 1;                  // ⑤
}
Solution:
The frequency of statement ① is 2,
The frequency of statement ② is n,
The frequency of statement ③ is n,
The frequency of statement ④ is n,
The frequency of statement ⑤ is n,
T(n) = 2 + 4n
f(n) = n
lim(T(n)/f(n)) = 2*(1/n) + 4 = 4
T(n) = O(n)
O(log2n)
i = 1;          // ①
while (i <= n)
    i = i * 2;  // ②
Solution:
The frequency of statement ① is 1.
Let the frequency of statement ② be t; after t executions i = 2^t, so 2^t <= n, i.e. t <= log2n.
Considering the worst case and taking the maximum value t = log2n:
T(n) = 1 + log2n
f(n) = log2n
lim(T(n)/f(n)) = 1/log2n + 1 = 1
T(n) = O(log2n)
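The frequency of the doubling statement can likewise be counted directly. A sketch; the doublings wrapper is an addition for the check, and the exact count is floor(log2 n) + 1, which is log2n up to a constant:

```java
public class LogCount {
    // Counts executions of i = i * 2 in the while loop above.
    static int doublings(int n) {
        int count = 0;
        int i = 1;
        while (i <= n) {
            i = i * 2;
            count++; // instrumentation
        }
        return count; // floor(log2(n)) + 1
    }

    public static void main(String[] args) {
        System.out.println(doublings(8));    // 4
        System.out.println(doublings(1000)); // 10
    }
}
```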
O(n^3)
for (i = 0; i < n; i++)
{
    for (j = 0; j < i; j++)
    {
        for (k = 0; k < j; k++)
            x = x + 2;
    }
}
Solution: when i = m and j = k, the innermost statement executes k times. For i = m, j can take 0, 1, ..., m-1, so the two inner loops execute 0 + 1 + ... + (m-1) = m(m-1)/2 times for that i. Summing as i runs from 0 to n-1, the statement executes a total of 0 + 0 + 1 + 3 + ... + (n-1)(n-2)/2 times.
T(n) = n(n-1)(n-2)/6
f(n) = n^3
So the time complexity is O(n^3).
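The count of the triple loop can be verified by brute force. A sketch counting executions of x = x + 2; the innermost wrapper and count variable are additions, and the result equals the binomial coefficient C(n, 3) = n(n-1)(n-2)/6:

```java
public class TripleCount {
    // Counts executions of x = x + 2 in the triple loop above.
    static long innermost(int n) {
        long count = 0;
        int x = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++)
                for (int k = 0; k < j; k++) {
                    x = x + 2;
                    count++; // instrumentation
                }
        return count; // n(n-1)(n-2)/6
    }

    public static void main(String[] args) {
        System.out.println(innermost(5));  // 5*4*3/6 = 10
        System.out.println(innermost(10)); // 10*9*8/6 = 120
    }
}
```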
(Reprinted from http://univasity.iteye.com/blog/1164707)