In the previous three articles, we discussed asymptotic analysis and the worst, average, and best cases of algorithms.
The main idea of asymptotic analysis is to measure the efficiency of an algorithm in a way that does not depend on machine-specific constants and does not require implementing the algorithms and running programs to compare their times.
Asymptotic notations are mathematical tools used to represent the time complexity of algorithms for asymptotic analysis.
The following three asymptotic notations are mostly used to represent the time complexity of algorithms.
1) Θ notation: The Θ notation bounds a function from above and below, so it defines the exact asymptotic behavior.
A simple way to get the Θ notation of an expression is to drop the low-order terms and ignore the leading constants.
For example, consider the following expression.
3n^3 + 6n^2 + 6000 = Θ(n^3)
Dropping the lower-order terms is always safe because, irrespective of the constants involved, there will always be an n0 beyond which Θ(n^3) has higher values than Θ(n^2).
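To see numerically why the lower-order terms become irrelevant, the following sketch (sample values of n chosen arbitrarily for illustration) shows the ratio of the full expression to n^3 settling toward the leading constant 3:

```python
def f(n):
    """The example expression 3n^3 + 6n^2 + 6000."""
    return 3 * n**3 + 6 * n**2 + 6000

# As n grows, the lower-order terms contribute less and less,
# so f(n) / n^3 approaches the leading constant 3.
for n in (10, 100, 1000, 10000):
    print(n, f(n) / n**3)
```

For n = 10 the ratio is dominated by the constant 6000, but by n = 10000 it is already within 0.001 of 3.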
For a given function g(n), we denote Θ(g(n)) as the following set of functions:

Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0}

The above definition means that if f(n) is Θ(g(n)), then the value of f(n) is always between c1*g(n) and c2*g(n) for large values of n (n >= n0). The definition of Θ also requires that f(n) must be non-negative for values of n greater than n0.
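The sandwich in the definition can be checked numerically for the earlier example, 3n^3 + 6n^2 + 6000 = Θ(n^3). A minimal Python sketch follows; the witnesses c1 = 3, c2 = 4 and n0 = 25 are one valid choice I picked for illustration, not the only one:

```python
def f(n):
    """The example expression 3n^3 + 6n^2 + 6000."""
    return 3 * n**3 + 6 * n**2 + 6000

def g(n):
    """The claimed asymptotic order, n^3."""
    return n**3

# One valid choice of constants witnessing f(n) = Theta(g(n)).
c1, c2, n0 = 3, 4, 25

# Verify 0 <= c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("f(n) is sandwiched between c1*n^3 and c2*n^3 for n >= n0")
```

A finite check like this is of course not a proof, but it makes the roles of c1, c2 and n0 in the definition concrete.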
2) Big O notation: The Big O notation defines an upper bound of an algorithm; it bounds a function only from above.
For example, consider the case of insertion sort.
It takes linear time in the best case and quadratic time in the worst case. We can safely say that the time complexity of insertion sort is O(n^2).
Note that O(n^2) also covers linear time.
If we use Θ notation to represent the time complexity of insertion sort, we have to use two statements for the best and worst cases:
1. The worst-case time complexity of insertion sort is Θ(n^2).
2. The best-case time complexity of insertion sort is Θ(n).
The Big O notation is useful when we only have an upper bound on the time complexity of an algorithm.
Many times we can easily find an upper bound simply by looking at the algorithm.

O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0}
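The insertion sort claims above can be made concrete by counting key comparisons. The instrumented implementation below is my own sketch, not code from the article; it shows the comparison count staying below c*n^2 (with c = 1) on reversed input and below n on already-sorted input:

```python
def insertion_sort(arr):
    """Sort arr in place and return the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1            # one comparison of key vs arr[j]
            if arr[j] > key:
                arr[j + 1] = arr[j]     # shift the larger element right
                j -= 1
            else:
                break
        arr[j + 1] = key
    return comparisons

n = 100
worst = insertion_sort(list(range(n, 0, -1)))  # reversed input: worst case
best = insertion_sort(list(range(n)))          # sorted input: best case

assert worst <= n * n    # the O(n^2) upper bound holds (here with c = 1)
assert best <= n         # the best case is linear
print("worst-case comparisons:", worst, "best-case comparisons:", best)
```

On reversed input the count is n(n-1)/2, which O(n^2) bounds from above; on sorted input it is n - 1, which O(n^2) also covers, illustrating why a single Big O statement suffices for insertion sort.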
3) Ω notation: Just as the Big O notation provides an asymptotic upper bound on a function, the Ω notation provides an asymptotic lower bound.
The Ω notation can be useful when we have a lower bound on the time complexity of an algorithm.
As discussed in the previous article, the best case of an algorithm is generally not very useful, so the Ω notation is the least used of the three notations.
For a given function g(n), we denote by Ω(g(n)) the set of functions:

Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0}
Let us consider the same insertion sort example here. The time complexity of insertion sort can be written as Ω(n), but this is not very useful information about insertion sort,
because we are generally interested in the worst case and sometimes in the average case.
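The Ω(n) claim can be checked against the definition directly. In the best case (already sorted input) insertion sort makes n - 1 key comparisons, so take f(n) = n - 1 and g(n) = n; the witnesses c = 0.5 and n0 = 2 below are one possible pair I chose for illustration:

```python
def f(n):
    """Best-case comparison count of insertion sort: n - 1."""
    return n - 1

def g(n):
    """The claimed asymptotic lower bound, n."""
    return n

# One valid choice of constants witnessing f(n) = Omega(g(n)).
c, n0 = 0.5, 2

# Verify 0 <= c*g(n) <= f(n) for a range of n >= n0.
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("insertion sort's comparison count is Omega(n)")
```

Since even the best case is bounded below by c*n, every case of insertion sort is Ω(n), which is exactly what the lower-bound notation expresses.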
Exercise:
Which of the following statements is/are valid?
1. The time complexity of QuickSort is Θ(n^2).
2. The time complexity of QuickSort is O(n^2).
3. For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
4. The time complexity of all computer algorithms can be written as Ω(1).
Analysis of Algorithms | Set 3 (Asymptotic Notations)