Calculating the Time Complexity of an Algorithm [Notes]
Blog Category:
- Algorithmic Learning
Time Complexity of an Algorithm
Basic Calculation Steps
Definition of time complexity
In general, the number of times an algorithm's basic operations are repeated is a function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that the limit of T(n)/f(n), as n approaches infinity, is a constant not equal to zero, then f(n) is said to be a function of the same order of magnitude as T(n), written T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm (O stands for order of magnitude), or simply its time complexity.
From this definition, the basic calculation steps can be summarized as follows.
1. Count the number of executions of the basic operations, T(n)
A basic operation is each statement in the algorithm (delimited by semicolons); the number of times a statement executes is also called the frequency of the statement. When doing algorithm analysis, the worst case is generally assumed by default.
2. Determine the order of magnitude of T(n)
To get the order of magnitude of T(n), simply do the following to T(n):
ignore the constants, the lower-order terms, and the coefficient of the highest-order term.
f(n) = the order of magnitude of T(n).
3. Express the time complexity in big-O notation
When n approaches infinity, if the value of lim(T(n)/f(n)) is a constant not equal to 0, then f(n) is said to be of the same order of magnitude as T(n), written T(n) = O(f(n)).
An example:
int num1, num2;
for (int i = 0; i < n; i++) {
    num1 += 1;
    for (int j = 1; j <= n; j *= 2) {
        num2 += num1;
    }
}
Analysis:
1.
The statement int num1, num2; has frequency 1;
the statement i=0 has frequency 1;
the statements i<n, i++, num1+=1, and j=1 each have frequency n (4 statements);
the statements j<=n, j*=2, and num2+=num1 each have frequency n*log2n (3 statements).
T(n) = 2 + 4n + 3n*log2n
2.
Ignore the constants, lower-order terms, and the coefficient of the highest-order term in T(n):
f(n) = n*log2n
3.
lim(T(n)/f(n)) = (2 + 4n + 3n*log2n)/(n*log2n)
= 2/(n*log2n) + 4/log2n + 3
As n tends to infinity, both 2/(n*log2n) and 4/log2n tend to 0,
so the limit equals 3.
T(n) = O(n*log2n)
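The count above can be checked empirically. Below is a minimal Java sketch (the class name FreqCount and the helper innerCount are mine, not from the original) that instruments the loop and counts how often the innermost statement runs; the exact count is n*(floor(log2 n) + 1), which is of order n*log2n as the analysis predicts.

```java
public class FreqCount {
    // Counts how many times the innermost statement (num2 += num1) runs
    // for a given problem size n. Exact count: n * (floor(log2(n)) + 1).
    static long innerCount(int n) {
        long count = 0;
        int num1 = 0, num2 = 0;
        for (int i = 0; i < n; i++) {
            num1 += 1;
            for (int j = 1; j <= n; j *= 2) {
                num2 += num1;
                count++; // one execution of the most frequent statement
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // For n = 16 the inner loop runs log2(16) + 1 = 5 times per outer
        // iteration, so the total is 16 * 5 = 80.
        System.out.println(innerCount(16)); // prints 80
    }
}
```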
Simplified calculation steps
Looking back at the analysis, the statement that determines the algorithm's complexity is the one executed most often, here num2 += num1, which is generally the statement in the innermost loop.
Also, solving for the limit is usually omitted, since it merely yields a constant.
So the above steps can be simplified to:
1. Find the most frequently executed statements
2. Calculate the number of executions of that statement, T(n)
3. Express the result in big-O notation
Continuing with the algorithm above as an example:
1.
The most frequently executed statement is num2 += num1
2.
T(n) = n*log2n
f(n) = n*log2n
3.
lim(T(n)/f(n)) = 1
T(n) = O(n*log2n)
--------------------------------------------------------------------------------
Some additional notes
Worst-case time complexity
The time complexity of an algorithm depends not only on the frequency of its statements, but also on the problem size and on the values of the elements in the input instance. In general, unless stated otherwise, the time complexity discussed is the worst-case time complexity. This guarantees that the algorithm's running time is never longer than in any other case.
Order of magnitude
That is, the value of the logarithm (default base 10); informally, it is the exponent of 10 when a number is written in standard scientific notation. For example, 5000 = 5x10^3, so its order of magnitude is 3. In addition, the order of magnitude of an unknown number is taken to be the closest, i.e. the maximum possible, order of magnitude.
The trick for finding the limit
Make good use of 1/n: as n tends to infinity, 1/n tends to 0.
--------------------------------------------------------------------------------
Some rules (source: Time complexity calculation)
1) Addition rule
T(n, m) = T1(n) + T2(m) = O(max(f(n), g(m)))
2) Multiplication rule
T(n, m) = T1(n) * T2(m) = O(f(n) * g(m))
3) A special case (the problem size is a constant)
There is a special case in big-O notation: if T1(n) = O(c), where c is an arbitrary constant independent of n, and T2(n) = O(f(n)), then
T(n) = T1(n) * T2(n) = O(c*f(n)) = O(f(n))
In other words, in big-O notation, any nonzero positive constant is of the same order of magnitude, denoted O(1).
4) A rule of thumb
The relationship between complexity and time efficiency:
c < log2n < n < n*log2n < n^2 < n^3 < 2^n < 3^n < n! (c is a constant)
|--------------------------|--------------------------|-------------|
          Good                       Fair                  Poor
where c is a constant. If an algorithm's complexity is c, log2n, n, or n*log2n, its time efficiency is high; if it is 2^n, 3^n, or n!, then even a slightly larger n makes the algorithm infeasible; the ones in between are passable.
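The ordering can be seen concretely by evaluating each function at a moderate n. A hedged sketch (the class name GrowthOrder and its helpers are mine, not from the original):

```java
public class GrowthOrder {
    static double log2(double n) { return Math.log(n) / Math.log(2); }

    static double factorial(int n) {
        double f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    // True if, at this n, each function in the chain
    // log2n < n < n*log2n < n^2 < n^3 < 2^n < 3^n < n!
    // strictly dominates the one before it.
    static boolean chainHoldsAt(int n) {
        double[] values = {
            log2(n), n, n * log2(n), Math.pow(n, 2), Math.pow(n, 3),
            Math.pow(2, n), Math.pow(3, n), factorial(n)
        };
        for (int i = 1; i < values.length; i++) {
            if (values[i] <= values[i - 1]) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(chainHoldsAt(20)); // prints true
    }
}
```

The ordering is an asymptotic statement, so it is checked at a large enough n; for very small n some terms can swap places (e.g. n^2 < 2^n fails at n = 2).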
--------------------------------------------------------------------------------------------------
Analysis of more complex cases
All of the above analyzes a single nested loop, but other cases occur, as illustrated below.
1. Complexity analysis of sequential (side-by-side) loops
Add up the time complexities of the individual loops.
For example:
for (i = 1; i <= n; i++)
    x++;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        x++;
Solution:
First for loop:
T(n) = n
f(n) = n
Time complexity O(n)
Second for loop:
T(n) = n^2
f(n) = n^2
Time complexity O(n^2)
The time complexity of the whole algorithm is O(n + n^2) = O(n^2).
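The addition rule can be observed directly by counting operations. A small sketch (the class name SequentialLoops and the helper countOps are mine, not from the original):

```java
public class SequentialLoops {
    // Returns the total number of x++ executions across both loops.
    static long countOps(int n) {
        long x = 0;
        for (int i = 1; i <= n; i++)
            x++;                      // first loop: n operations
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= n; j++)
                x++;                  // second loop: n^2 operations
        return x;                     // n + n^2 in total, i.e. O(n^2)
    }

    public static void main(String[] args) {
        System.out.println(countOps(10)); // prints 110 (10 + 100)
    }
}
```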
2. Complexity analysis of a function call
For example:
public void Printsum(int count) {
    int sum = 0;
    for (int i = 1; i <= count; i++) {
        sum += i;
    }
    System.out.print(sum);
}
Analysis:
Remember that only executable statements add to the running time; apart from the loop, the remaining statements in the method above are all O(1).
So the time complexity of Printsum = O(n) for the loop + O(1) = O(n) (ignoring constants).
* Here the closed-form formula sum = n(n+1)/2 can actually be used to optimize the algorithm, replacing it with:
public void Printsum(int count) {
    int sum = count * (count + 1) / 2;
    System.out.print(sum);
}
This reduces the time complexity from O(n) to O(1), greatly improving the algorithm's performance.
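The two versions can be checked against each other. A sketch (the class name SumCompare is mine; it assumes Printsum is intended to compute 1 + 2 + ... + count):

```java
public class SumCompare {
    // O(n) version: sums 1..count with a loop.
    static int sumLoop(int count) {
        int sum = 0;
        for (int i = 1; i <= count; i++) sum += i;
        return sum;
    }

    // O(1) version: closed form count*(count+1)/2.
    static int sumFormula(int count) {
        return count * (count + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumLoop(100));    // prints 5050
        System.out.println(sumFormula(100)); // prints 5050
    }
}
```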
3. Complexity analysis of mixed cases (multiple method calls and loops)
For example:
public void Suixiangmethod(int n) {
    Printsum(n);                        // 1.1
    for (int i = 0; i < n; i++) {
        Printsum(n);                    // 1.2
    }
    for (int i = 0; i < n; i++) {
        for (int k = 0; k < n; k++) {
            System.out.print(i + "," + k);  // 1.3
        }
    }
}
The time complexity of the Suixiangmethod method requires summing the complexity of each part of the method body (assuming the optimized O(1) Printsum),
i.e. 1.1 + 1.2 + 1.3 = O(1) + O(n) + O(n^2); ignoring constants and non-dominant terms gives O(n^2).
--------------------------------------------------------------------------------------------------
More examples
O(1)
Exchange the contents of i and j:
temp = i;
i = j;
j = temp;
Each of the three statements above has frequency 1; the running time of this fragment is a constant independent of the problem size n. The algorithm's time complexity is of constant order, written T(n) = O(1). If an algorithm's running time does not grow with the problem size n, then even if it contains thousands of statements its running time is merely a large constant, and its time complexity is O(1).
O(n^2)
sum = 0;           /* executes 1 time */
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum++;     /* executes n^2 times */
Solution: T(n) = 1 + n^2 = O(n^2)
for (i = 1; i < n; i++)
{
    y = y + 1;                    // ①
    for (j = 0; j <= 2 * n; j++)
        x++;                      // ②
}
Solution: The frequency of statement ① is n-1.
The frequency of statement ② is (n-1)*(2n+1) = 2n^2 - n - 1.
T(n) = 2n^2 - n - 1 + (n-1) = 2n^2 - 2
f(n) = n^2
lim(T(n)/f(n)) = 2 - 2/n^2 = 2
T(n) = O(n^2).
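The count T(n) = 2n^2 - 2 can be verified by instrumenting the fragment (the class name OpCount and the helper countOps are mine, not from the original):

```java
public class OpCount {
    // Counts executions of statements ① and ② for the fragment above.
    static long countOps(int n) {
        long ops = 0;
        int x = 0, y = 0;
        for (int i = 1; i < n; i++) {
            y = y + 1; ops++;                 // statement ①: n-1 times
            for (int j = 0; j <= 2 * n; j++) {
                x++; ops++;                   // statement ②: (n-1)(2n+1) times
            }
        }
        return ops;                           // (n-1) + (n-1)(2n+1) = 2n^2 - 2
    }

    public static void main(String[] args) {
        System.out.println(countOps(5)); // prints 48, and 2*5^2 - 2 = 48
    }
}
```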
O(n)
a = 0;
b = 1;                      // ①
for (i = 1; i <= n; i++)    // ②
{
    s = a + b;              // ③
    b = a;                  // ④
    a = s;                  // ⑤
}
Solution: Frequency of statement ①: 2
Frequency of statement ②: n
Frequency of statement ③: n
Frequency of statement ④: n
Frequency of statement ⑤: n
T(n) = 2 + 4n
f(n) = n
lim(T(n)/f(n)) = 2/n + 4 = 4
T(n) = O(n).
O(log2n)
i = 1;            // ①
while (i <= n)
    i = i * 2;    // ②
Solution: The frequency of statement ① is 1.
Let the frequency of statement ② be t; then 2^t <= n, i.e. t <= log2n.
Considering the worst case, take the maximum value t = log2n, so
T(n) = 1 + log2n
f(n) = log2n
lim(T(n)/f(n)) = 1/log2n + 1 = 1
T(n) = O(log2n)
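Instrumenting the loop confirms the logarithmic count (the class name LogLoop and the helper doublings are mine, not from the original); the exact count is floor(log2 n) + 1, which is still O(log2n):

```java
public class LogLoop {
    // Counts how many times i = i * 2 executes, starting from i = 1,
    // until i exceeds n. Exact count: floor(log2(n)) + 1.
    static int doublings(int n) {
        int i = 1, t = 0;
        while (i <= n) {
            i = i * 2;
            t++;          // statement ② executes once more
        }
        return t;
    }

    public static void main(String[] args) {
        System.out.println(doublings(16)); // prints 5: i becomes 2, 4, 8, 16, 32
    }
}
```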
O(n^3)
for (i = 0; i < n; i++)
{
    for (j = 0; j < i; j++)
    {
        for (k = 0; k < j; k++)
            x = x + 2;
    }
}
Solution: For a given i = m, the innermost loop runs j times for each value of j, and j takes 0, 1, ..., m-1, so the innermost statement runs 0 + 1 + ... + (m-1) = m(m-1)/2 times. Summing over i from 0 to n-1, the statement runs a total of n(n-1)(n-2)/6 times.
T(n) = n(n-1)(n-2)/6 = (n^3 - 3n^2 + 2n)/6
f(n) = n^3
So the time complexity is O(n^3).
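The count can be confirmed by running the loop (the class name TripleLoop and the helper countOps are mine, not from the original); the total matches n(n-1)(n-2)/6, i.e. the binomial coefficient C(n,3), which is of order n^3:

```java
public class TripleLoop {
    // Counts executions of x = x + 2 in the triple loop above.
    static long countOps(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++)
                for (int k = 0; k < j; k++)
                    count++;      // one execution of the innermost statement
        return count;             // equals n(n-1)(n-2)/6 = C(n,3)
    }

    public static void main(String[] args) {
        System.out.println(countOps(10)); // prints 120 (10*9*8/6)
    }
}
```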
Source: http://univasity.iteye.com/blog/1164707
From "Data Structures and Algorithms": Computing Time Complexity