Such complex, very wow – a guide to algorithmic complexity


Prerequisites:

    • Exponentiation
    • Basic algebra
    • Functions and asymptotes
    • Prior knowledge of insertion sort and other basic sorting algorithms

This post is roughly divided into three parts, so feel free to jump to whichever part suits your level:

    • Introduction
    • Apprenticeship
    • Mastery

Algorithmic complexity and growth of functions form the very foundation of algorithms. They give us an idea of how fast or slow an algorithm is, so we can't go anywhere near designing and analyzing algorithms without having these as tools.

Algosaurus wants to know what 'growth of functions' means. So he decides to get bunnies. And the bunnies start having kids, hypothetically four per couple. Lots of them, because well, erm ...

If we examine the growth of the bunnies, we find it grows at the rate of 2^n, where n is the number of generations. This isn't very good news for Algosaurus, because despite bunnies being cute and cuddly, there is such a thing as having too many bunnies.

Roughly speaking, the bunny breeding algorithm has a complexity of O(2^n).

This is because the number of bunnies in the next generation (the output) grows exponentially with respect to the number of bunnies we started with (the input).
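If you want to play with the numbers yourself, here's a tiny sketch (the function name is my own) assuming every couple of two bunnies produces four kids, so the head-count doubles each generation:

```python
def bunny_population(start, generations):
    # Every couple of 2 bunnies has 4 kids, so each generation
    # the population doubles: after n generations, start * 2**n.
    count = start
    for _ in range(generations):
        count *= 2
    return count

print(bunny_population(2, 10))  # 2 doubled 10 times -> 2048
```

Doubling ten times already turns two bunnies into a couple of thousand, which is the whole problem with O(2^n).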

Let's dissect this.

"The order of growth of the running time of an algorithm gives a simple characterization of the algorithm's efficiency and also allows us to compare the relative performance of alternative algorithms."

– the holy gospel of CLRS

Damn, that's a mouthful.

Imagine Algosaurus is descending a very odd staircase with n stairs. To descend each stair, he has to walk n tiles horizontally, and only then can he step down.

So for each stair, he has to walk n tiles. Therefore, for n stairs, he must walk n^2 tiles.

def weirdstairs(nofstairs):
    steps = 0
    for stair in range(0, nofstairs):
        steps = steps + nofstairs  # walk n tiles across the stair
        steps = steps + 1          # then take 1 step down
    print(steps)

Total time taken to walk the entire flight of stairs? Proportional to n^2 tiles (plus the n step-downs, which hardly matter).

Similarly, let's take the example of insertion sort.

def insertionsort(alist):
    for index in range(1, len(alist)):
        currentvalue = alist[index]
        position = index
        while position > 0 and alist[position - 1] > currentvalue:
            alist[position] = alist[position - 1]
            position = position - 1
        alist[position] = currentvalue

Suppose the list is in descending order and every number has to be "inserted" into the right place, i.e. each line of code in the loops is executed for every element.

As you go through the code, you'll notice that for each element present in the list, it has to iterate through the rest of the elements in the list.

If n is the length of the list, the performance isn't linearly proportional to n; it's proportional to the square of n.

So, the worst-case running time of insertion sort comes out to be O(n^2), the 'Big-O' being part of the notation and 'n' being the total number of elements in the list.

This is the worst-case performance of insertion sort, and usually the only kind of performance we care about.
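To see the quadratic behaviour concretely, here's an instrumented sketch of the same sort (the shift counter is my own addition) that counts how often the inner loop runs:

```python
def insertion_sort_steps(alist):
    # Insertion sort that also counts inner-loop shifts.
    alist = list(alist)  # work on a copy
    shifts = 0
    for index in range(1, len(alist)):
        currentvalue = alist[index]
        position = index
        while position > 0 and alist[position - 1] > currentvalue:
            alist[position] = alist[position - 1]
            position = position - 1
            shifts += 1
        alist[position] = currentvalue
    return alist, shifts

# Worst case: a descending list of n elements needs n*(n-1)/2 shifts.
print(insertion_sort_steps([5, 4, 3, 2, 1]))  # ([1, 2, 3, 4, 5], 10)
```

For a reversed list of length n the count comes out to n*(n-1)/2, which is exactly the kind of expression that O(n^2) summarizes.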

We have only talked about small lists, with maybe ten elements. What about those with a million elements? Something like the total number of Facebook users?

Oh boy, that's when algorithmic complexity really begins to play a role.

When the number of elements is large, only the highest power, i.e. the degree, of the polynomial matters, and nothing more. The rest of the terms are rendered insignificant.
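A quick numeric sanity check (the polynomial here is one I made up for illustration): for a million elements, the lower-order terms of f(n) = n^2 + 50n + 3 contribute almost nothing.

```python
n = 1_000_000
f = n ** 2 + 50 * n + 3   # full polynomial
leading = n ** 2          # just the highest-degree term
print(leading / f)        # about 0.99995: n^2 is essentially all of f
```

That ratio is why we can throw away everything except the dominant term when n gets big.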

The priority order for growth of functions is as follows:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)

Also, roughly speaking, we can estimate the complexity of an algorithm by analyzing its loop structure.

Traversing a list is an O(n) operation, and so is finding the length of one. This means the running time is directly proportional to the number of elements.

def traversal(alist):
    for i in range(len(alist)):
        print(alist[i])

Algosaurus now knows enough about the growth of functions to figure out whether one algorithm is faster than another. Yaaaaay.

For a more formal treatment of complexity analysis, read on. Or maybe come back to it after a cup of tea, once all this has sunk in. *subtle plug for bookmarking this page*

Opening CLRS, Algosaurus sees ...

Let's first talk about 'asymptotic notation'. As you probably know, when the input of certain functions tends to infinity, the output sometimes approaches a line but doesn't quite touch it. This is called an asymptote.

We should care because when designing a particular algorithm, or implementing one, it's important to know how it'll perform with huge input sizes.

You already know what Big-O notation means: it tells you the worst-case performance of an algorithm. Let's switch to Big-omega. I'll use the last example again. Suppose our input list was sorted to begin with, i.e. the best-case performance.

def insertionsort(alist):
    for index in range(1, len(alist)):
        currentvalue = alist[index]
        position = index
        while position > 0 and alist[position - 1] > currentvalue:
            alist[position] = alist[position - 1]
            position = position - 1
        alist[position] = currentvalue

We'll still have to traverse every element in the list to determine whether it's sorted or not, even if we never go into the nested while loop. The outer loop is still executed 'n' times in the best-case scenario.

The best-case performance turns out to be Ω(n).
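You can verify this by counting the two loops separately (again an instrumented variant of my own, not from CLRS): on an already-sorted list the inner while loop never fires, so only the linear outer loop does any work.

```python
def insertion_sort_counts(alist):
    # Counts outer-loop iterations and inner-loop shifts separately.
    alist = list(alist)  # work on a copy
    outer, inner = 0, 0
    for index in range(1, len(alist)):
        outer += 1
        currentvalue = alist[index]
        position = index
        while position > 0 and alist[position - 1] > currentvalue:
            alist[position] = alist[position - 1]
            position = position - 1
            inner += 1
        alist[position] = currentvalue
    return outer, inner

print(insertion_sort_counts(list(range(100))))  # (99, 0): linear work, i.e. Omega(n)
```

Compare that (99, 0) with what a reversed list gives you, and the gap between Ω(n) and O(n^2) becomes very tangible.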

To digest what the rest of these godforsaken symbols mean, we need some math, namely ...

Functions! And graphs!

Formally speaking, we can define Θ, O, and Ω as follows.
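Paraphrasing the CLRS definitions from memory (check the book for the exact wording), each symbol names a set of functions:

```latex
O(g(n))      = \{\, f(n) : \exists\, c,\ n_0 > 0 \text{ s.t. } 0 \le f(n) \le c\,g(n) \ \forall\, n \ge n_0 \,\}
\Omega(g(n)) = \{\, f(n) : \exists\, c,\ n_0 > 0 \text{ s.t. } 0 \le c\,g(n) \le f(n) \ \forall\, n \ge n_0 \,\}
\Theta(g(n)) = \{\, f(n) : \exists\, c_1, c_2,\ n_0 > 0 \text{ s.t. } 0 \le c_1\,g(n) \le f(n) \le c_2\,g(n) \ \forall\, n \ge n_0 \,\}
```

So "f(n) is O(g(n))" really means f(n) belongs to the set O(g(n)), even though everyone sloppily writes it with an equals sign.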

This doesn't really mean much, to be honest. Let's bring out the big guns: graphs. (Plucked straight from CLRS.)

A few finer points. These functions are defined only if f(n) monotonically adheres to the required constraints after n_0.

For example, for Big-omega, Ω(g(n)) is formally defined only if c*g(n) stays at or below the runtime of the function for all values greater than n_0.

a) This is for Big-theta. f(n) is sandwiched between two functions, showing that Θ(g(n)) is defined only if f(n) is bounded from both the upper and the lower sides. You can't just say it, you gotta prove it. Hard work.

b) This is for Big-O. The runtime function is always less than c*g(n), showing that Big-O is the *upper bound* for the runtime. Actual performance won't be worse than this.

c) This is for Big-omega. The runtime function is always greater than c*g(n), showing that Big-omega is the *lower bound* for the runtime. You can't do much better than this.

Patience, Algosaurus. Here are the practical implications:

1. Big-omega is useful if you need a *lower bound* for the time, to show that an algorithm will always be slower than Ω(g(n)). Best-case performance.
Meh, sometimes used.

2. Big-O is useful if you need an *upper bound* for the time, to show that an algorithm will always be faster than O(g(n)). Worst-case performance.
Used all the time.

3. Finding Big-theta is usually harder than finding Big-O, so Big-O is used more frequently, despite Big-theta being more mathematically accurate. Math folks must find it blasphemous, but no can do.

Phew, that was a lot of theory.

I hope you found this informal guide to algorithmic complexity analysis useful. If you liked it, want to insult me, or want to talk about dinosaurs or whatever, shoot me a mail at [email protected]

Acknowledgements:
Introduction to Algorithms – Cormen, Leiserson, Rivest, and Stein (pages 43–64)
Algorithms – Dasgupta, Papadimitriou, and Vazirani (for the order)
Numbers: The Key to the Universe – Kjartan Poskitt (bunny example inspired by this book)
