Algorithm Complexity and the Master Theorem


Source: http://www.gocalf.com/blog/algorithm-complexity-and-master-theorem.html

When designing or reading an algorithm, its complexity (both time complexity and space complexity) is almost always discussed. We say, for example, that the average time complexity of binary search is O(log n), or that quicksort is O(n log n). What does the O here mean, and is this notation precise?
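As a concrete reference point, here is a minimal iterative binary search (a standard sketch, not code from the original post). Each iteration halves the search interval, which is exactly where the O(log n) bound comes from:

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent.

    Every iteration halves the interval [lo, hi], so at most about
    log2(n) iterations run: O(log n) time, O(1) extra space."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```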

Let's review the basics of algorithmic complexity: the asymptotic order of functions, the notations O, Ω, Θ, and o, and the master theorem.

First of all, note that in algorithm complexity analysis, log usually denotes the base-2 logarithm.

Algorithmic complexity measures the amount of computing resources (time, space) an algorithm needs to run. We usually describe it in terms of its asymptotic behavior.

Let n denote the problem size and T(n) the complexity of a given algorithm. The asymptotic behavior is the fastest-growing part of T(n) as n → ∞. The strict definition: if there exists a function T̃(n) such that

    (T(n) − T̃(n)) / T(n) → 0  as  n → ∞,

then we say T̃(n) is the asymptotic behavior of T(n) as n → ∞.

For example, if T(n) = 2n^2 + n log n + 3, then its asymptotic behavior is clearly 2n^2, because as n → ∞ the latter two terms grow much more slowly and can be ignored. Asymptotic behavior is introduced to simplify complexity expressions, keeping only the dominant factor. When comparing the complexity of two algorithms whose asymptotic orders differ, it suffices to compare the orders (ignoring constant coefficients).
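The dominance of the 2n^2 term can be checked numerically: the ratio T(n) / (2n^2) tends to 1 as n grows. A small illustrative script (the function name T simply encodes the example above):

```python
import math

def T(n):
    # The example cost function T(n) = 2*n^2 + n*log2(n) + 3.
    return 2 * n**2 + n * math.log2(n) + 3

# The ratio T(n) / (2*n^2) tends to 1 as n grows:
# the lower-order terms n*log2(n) and 3 become negligible.
ratios = [T(10**k) / (2 * (10**k) ** 2) for k in (1, 3, 6)]
```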

In short, when analyzing the complexity of an algorithm, we do not need to compute a concrete formula; it is enough to determine the order of the complexity in the asymptotic sense, when the problem size is sufficiently large. The notations O, Ω, Θ, and o help us compare the asymptotic order of functions.

Suppose f(n) and g(n) are two positive functions defined on the positive integers. The four notations mean:

  • f(n) = O(g(n)): ∃c > 0, n0 ∈ N, ∀n ≥ n0, f(n) ≤ c·g(n); the order of f is not higher than that of g.
  • f(n) = Ω(g(n)): ∃c > 0, n0 ∈ N, ∀n ≥ n0, f(n) ≥ c·g(n); the order of f is not lower than that of g.
  • f(n) = Θ(g(n)) ⟺ f(n) = O(g(n)) and f(n) = Ω(g(n)); f and g have the same order.
  • f(n) = o(g(n)): ∀ε > 0, ∃n0 ∈ N, ∀n ≥ n0, f(n)/g(n) < ε; the order of f is strictly lower than that of g.

So the notation O gives an upper bound (not necessarily the tightest) on f(n) in the asymptotic sense, while Ω gives a lower bound (not necessarily the tightest). If the upper and lower bounds coincide, then f(n) and g(n) have the same order asymptotically (Θ), that is, the same complexity.
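The constants c and n0 in these definitions can be illustrated empirically. The sketch below (an illustration only — a finite check does not prove a bound; the example functions and the witnesses c = 4, n0 = 5 are assumptions of mine) verifies a big-O witness over a finite range:

```python
def is_upper_bounded(f, g, c, n0, n_max=10_000):
    """Empirically check a big-O witness: f(n) <= c*g(n) for every
    n0 <= n <= n_max.  A finite check like this illustrates the roles
    of c and n0 but does not prove f = O(g)."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example: f(n) = 3n + 5 is O(n), witnessed by c = 4 and n0 = 5,
# since 3n + 5 <= 4n holds exactly when n >= 5.
f = lambda n: 3 * n + 5
g = lambda n: n
```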

Some common asymptotic-order relationships:

  log(n!) = Θ(n log n)
  log n^2 = Θ(log n)
  log n^2 = o(√n)
  n = ω(log^2 n)
  log^2 n = ω(log n)
  2^n = ω(n^2)
  2^n = o(3^n)
  n! = o(n^n)
  2^n = o(n!)
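The first relation, log(n!) = Θ(n log n), can be sanity-checked numerically: by Stirling's approximation, log(n!) ≈ n log n − n, so the ratio log(n!) / (n log n) approaches 1 from below. A quick illustrative check (the helper name is mine):

```python
import math

def log_factorial(n):
    # log(n!) = log(2) + log(3) + ... + log(n), computed directly.
    return sum(math.log(k) for k in range(2, n + 1))

# Stirling: log(n!) ~ n*log(n) - n, so this ratio slowly approaches 1
# from below as n grows, consistent with log(n!) = Theta(n log n).
ratio = log_factorial(10**5) / (10**5 * math.log(10**5))
```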

These notations are sometimes confused with the worst-case, best-case, and average complexity of an algorithm; they are distinct concepts, though related.

Even for the same problem size, an algorithm's processing time may vary depending on the properties of the input data itself. Hence the distinction between worst-case, best-case, and average complexity. They reflect the algorithm's efficiency from different angles, and each has its own uses and limitations.

Sometimes the worst-case and best-case complexities can be used to roughly bound an algorithm's performance. For example, if an algorithm's worst-case time complexity is Θ(n^2) and its best case is Θ(n), then the algorithm is O(n^2) and Ω(n). That is, n^2 is an upper bound on its complexity and n a lower bound.
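As a concrete illustration of best versus worst case (a hypothetical example of mine, not one from the original post), insertion sort performs about n comparisons on already-sorted input but about n^2/2 on reverse-sorted input:

```python
def insertion_sort_comparisons(arr):
    """Sort a copy of arr with insertion sort; return the number of
    key comparisons made.  Best case (already sorted): n-1 comparisons,
    Theta(n).  Worst case (reverse sorted): n(n-1)/2, Theta(n^2)."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

best = insertion_sort_comparisons(range(100))          # sorted input
worst = insertion_sort_comparisons(range(99, -1, -1))  # reversed input
```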

Next, let's look at the master theorem.

Some algorithms, when handling a large problem, split it into several subproblems, solve one or more of them recursively, and do some preprocessing or combining work before or after the split. This yields a recurrence for the algorithm's complexity, and solving the recurrence gives the complexity itself. The most common recurrence has the following form:

Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the non-negative integers by T(n) = a·T(n/b) + f(n). Then:

  1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
  2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
  3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
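When the driving function is a plain polynomial, f(n) = Θ(n^k), the three cases reduce to comparing k with log_b a, and case 3's regularity condition holds automatically (with c = a/b^k < 1). A small solver sketch under that assumption (the function name and the output strings are mine, not part of the theorem):

```python
import math

def master_theorem(a, b, k):
    """Solve T(n) = a*T(n/b) + Theta(n^k) by the master theorem,
    assuming the driving function is a plain polynomial n^k.
    Returns a string describing the Theta class of T(n)."""
    e = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(k, e):
        return f"Theta(n^{e:g} log n)"   # case 2
    elif k < e:
        return f"Theta(n^{e:g})"         # case 1
    else:
        return f"Theta(n^{k:g})"         # case 3
```

For example, `master_theorem(9, 3, 1)` reproduces the T(n) = 9T(n/3) + n example below.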

For example, for the common binary search algorithm the recurrence is T(n) = T(n/2) + Θ(1). Here n^(log_b a) = n^(log_2 1) = n^0 = Θ(1), which matches f(n), so the second case of the master theorem applies and T(n) = Θ(log n).

Another example: T(n) = 9·T(n/3) + n. Here n^(log_b a) = n^(log_3 9) = n^2. Taking ε = 1 gives f(n) = n = O(n^(2−ε)), so the first case of the master theorem applies and T(n) = Θ(n^2).
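These recurrences can also be evaluated exactly for n a power of b and compared against the theorem's prediction (an illustrative sketch; the base case T(1) = 1 is an assumption of mine):

```python
def solve(a, b, f, n):
    """Exactly evaluate T(n) = a*T(n/b) + f(n) for n a power of b,
    with base case T(1) = 1."""
    if n == 1:
        return 1
    return a * solve(a, b, f, n // b) + f(n)

# Binary search: T(n) = T(n/2) + 1 grows like log2(n).
t_bsearch = solve(1, 2, lambda n: 1, 2**20)  # = 21, i.e. log2(n) + 1

# T(n) = 9*T(n/3) + n grows like n^2 (first case):
t = solve(9, 3, lambda n: n, 3**8)
ratio = t / (3**8) ** 2  # bounded: approaches 3/2 as n grows
```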

A slightly more complex example: T(n) = 3·T(n/4) + n log n. Here n^(log_b a) = n^(log_4 3) = O(n^0.793). Take ε ≈ 0.2; then f(n) = n log n = Ω(n^(log_4 3 + ε)), and with c = 3/4, for sufficiently large n we have a·f(n/b) = 3·(n/4)·log(n/4) ≤ (3/4)·n·log n = c·f(n). This satisfies the third case of the master theorem, giving T(n) = Θ(n log n).

When applying the master theorem, pay special attention to the fact that the ε in the first and third cases must be strictly greater than 0. If no such ε can be found, those two cases cannot be used.

For instance, take T(n) = 2·T(n/2) + n log n. Here n^(log_b a) = n^(log_2 2) = n, while f(n) = n log n, so the second case does not apply. Neither do the first and third: there is no ε > 0 such that n log n = O(n^(1−ε)) or n log n = Ω(n^(1+ε)), because log n grows more slowly than any positive power of n. So the master theorem cannot solve this recurrence, and another method is needed. A recursion tree, for example, shows that T(n) = Θ(n log^2 n). Briefly, the calculation goes as follows:

Building the recursion tree mimics the recursion of the algorithm. The root corresponds to the input of size n; besides recursing into subproblems, it needs n log n processing time of its own. Child nodes are then attached according to the recurrence, one per subproblem: here the root gets two children, each handling a subproblem of size n/2 at cost (n/2)·log(n/2), so the second level costs n·(log n − 1) in total. Splitting each second-level node in turn gives four third-level nodes, each costing (n/4)·log(n/4), for a level total of n·(log n − 2). In general, level k (with the root at k = 0) has 2^k nodes and a total cost of n·(log n − k). The tree has log n levels in all (each node on the last level handles a subproblem of size 1 and recurses no further). Adding up the levels:

    Σ_{k=0}^{log n} n·(log n − k) = (1/2)·n·log n·(log n + 1) = Θ(n log^2 n)
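The level-by-level sum can be verified directly (a small check of the arithmetic above; the function name is mine):

```python
import math

def level_cost_sum(n):
    """Total cost of the recursion tree for T(n) = 2*T(n/2) + n*log2(n):
    level k (root = level 0) has 2**k nodes costing n*(log2(n) - k)
    in total, for k = 0 .. log2(n)."""
    logn = int(math.log2(n))
    return sum(n * (logn - k) for k in range(logn + 1))

# Matches the closed form (1/2)*n*log2(n)*(log2(n) + 1), i.e. Theta(n log^2 n).
n = 2**10
total = level_cost_sum(n)
```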

