The ten most popular algorithms in the world today

Source: Internet
Author: User

In today's world, countless classic algorithms have been discovered or created. If you had to vote for the ten algorithms you value most, which would you choose?

Recently, someone on Stack Exchange posted a question asking users to nominate the world's most classic algorithms. A whole batch of candidates was shortlisted and voted on, and the result was a top-10 list.

Ten algorithms from The Book:

The original poster's description: "Proofs from THE BOOK" collects dozens of concise and elegant mathematical proofs and quickly won over a large number of math enthusiasts. If there were a Book of algorithms, which algorithms would be in it? Now, friends, here are dozens of candidate algorithms; if you think one is among the most classic algorithms in the world today, please vote for it...

The result is the following list of the ten classic algorithms with the highest number of votes:

Tenth place: Huffman Coding

Huffman coding is an entropy coding algorithm used for lossless data compression. It was invented in 1952 by David A. Huffman, then a Ph.D. student at MIT, and published in the paper "A Method for the Construction of Minimum-Redundancy Codes."
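As an illustration, here is a minimal sketch of building a Huffman code table in Python using the standard heapq module; the function name build_huffman_code and the sample text are my own choices, not from the original article.

    import heapq
    from collections import Counter

    def build_huffman_code(text):
        """Return a dict mapping each symbol of text to its Huffman codeword."""
        # One weighted leaf per symbol: (frequency, tie-breaker, subtree)
        heap = [(freq, i, sym) for i, (sym, freq) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        if len(heap) == 1:                           # degenerate case: only one distinct symbol
            return {heap[0][2]: "0"}
        counter = len(heap)
        while len(heap) > 1:                         # repeatedly merge the two lightest subtrees
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, counter, (left, right)))
            counter += 1
        codes = {}
        def walk(node, prefix):                      # assign 0/1 labels along the tree paths
            if isinstance(node, tuple):
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:
                codes[node] = prefix
        walk(heap[0][2], "")
        return codes

    print(build_huffman_code("abracadabra"))         # e.g. {'a': '0', 'c': '100', 'd': '101', 'b': '110', 'r': '111'}

More frequent symbols receive shorter codewords, which is exactly what makes the code minimum-redundancy.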

Ninth place: Binary Search

To find an element in a sorted collection, you can use the binary search algorithm. Binary search compares the key with the element at the middle of the collection; there are three cases (assuming the collection is sorted in ascending order), as in the sketch after this list:

1. The key is less than the middle element: the matching element, if any, must be in the left half, so apply binary search to the left half.

2. The key equals the middle element: the element has been found.

3. The key is greater than the middle element: the matching element, if any, must be in the right half, so apply binary search to the right half.

Additionally, when the collection (or remaining sub-range) is empty, the key is not present.
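A minimal iterative sketch in Python, covering the three cases above (the function name and example list are mine, not from the article):

    def binary_search(arr, key):
        """Return the index of key in the sorted list arr, or -1 if absent."""
        lo, hi = 0, len(arr) - 1
        while lo <= hi:                      # the search range [lo, hi] is non-empty
            mid = (lo + hi) // 2
            if key < arr[mid]:
                hi = mid - 1                 # case 1: continue in the left half
            elif key > arr[mid]:
                lo = mid + 1                 # case 3: continue in the right half
            else:
                return mid                   # case 2: found
        return -1                            # empty range: key is not present

    print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
    print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1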

Eighth place: Miller-Rabin Primality Test

The idea is to use properties of prime numbers (such as Fermat's little theorem) to probabilistically search for a witness that the number is composite. If no witness is found after enough random trials, the number is declared (probably) prime.
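A hedged sketch of the test in Python follows; this is one standard formulation with randomly chosen bases, and the function name and parameters are my own, not from the article.

    import random

    def is_probable_prime(n, rounds=20):
        """Miller-Rabin: return False if n is certainly composite, True if probably prime."""
        if n < 2:
            return False
        for p in [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]:
            if n % p == 0:
                return n == p
        # Write n - 1 as d * 2**r with d odd
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)        # random base (potential witness)
            x = pow(a, d, n)
            if x == 1 or x == n - 1:
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False                      # a witnesses that n is composite
        return True                               # no witness found: n is probably prime

    print(is_probable_prime(1000000007))          # True
    print(is_probable_prime(1000000007 * 13))     # False

Each random round that finds no witness roughly quarters the probability of mistakenly declaring a composite number prime.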

Seventh place: Depth-First Search and Breadth-First Search

They are the basis of many other graph algorithms. For a detailed description of depth-first and breadth-first search, see the article "Teach You to Thoroughly Understand the BFS and DFS Search Algorithms."
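For illustration, a minimal sketch of both traversals on an adjacency-list graph in Python; the sample graph and function names are mine, not from the referenced article.

    from collections import deque

    def dfs(graph, start):
        """Iterative depth-first traversal; returns vertices in visit order."""
        visited, order, stack = set(), [], [start]
        while stack:
            v = stack.pop()                      # LIFO: go deep before going wide
            if v not in visited:
                visited.add(v)
                order.append(v)
                stack.extend(reversed(graph[v])) # push neighbors; reversed keeps a natural order
        return order

    def bfs(graph, start):
        """Breadth-first traversal; returns vertices in visit order."""
        visited, order, queue = {start}, [], deque([start])
        while queue:
            v = queue.popleft()                  # FIFO: explore the graph level by level
            order.append(v)
            for w in graph[v]:
                if w not in visited:
                    visited.add(w)
                    queue.append(w)
        return order

    g = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
    print(dfs(g, 'A'))   # ['A', 'B', 'D', 'C']
    print(bfs(g, 'A'))   # ['A', 'B', 'C', 'D']

The only structural difference is the container: a stack gives depth-first order, a queue gives breadth-first order.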

Sixth place: Gentry's Fully Homomorphic Encryption Scheme

This scheme is beautiful: it allows a third party to perform arbitrary computations on encrypted data without the private key (admittedly, it is not easy to understand).
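Gentry's actual construction is lattice-based and far beyond a short snippet, but the flavor of "computing on ciphertexts" can be shown with a toy somewhat-homomorphic scheme over the integers, in the spirit of the later DGHV variant rather than Gentry's scheme itself; every function name and parameter below is illustrative only, and this is nowhere near secure.

    import random

    def keygen(bits=256):
        # Secret key: a large odd integer p (toy scheme, not secure)
        return random.getrandbits(bits) | 1 | (1 << (bits - 1))

    def encrypt(p, bit, noise_bits=16, mult_bits=512):
        # Ciphertext c = p*q + 2*r + bit, where r is small random noise
        q = random.getrandbits(mult_bits)
        r = random.getrandbits(noise_bits)
        return p * q + 2 * r + bit

    def decrypt(p, c):
        # Correct as long as the accumulated noise term stays smaller than p
        return (c % p) % 2

    p = keygen()
    c1, c2 = encrypt(p, 1), encrypt(p, 0)
    # Adding/multiplying ciphertexts computes XOR/AND of the hidden bits
    print(decrypt(p, c1 + c2))   # 1  (1 XOR 0)
    print(decrypt(p, c1 * c2))   # 0  (1 AND 0)

Each homomorphic operation grows the noise; Gentry's key contribution, bootstrapping, is what turns such a "somewhat" homomorphic scheme into a fully homomorphic one.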

Fifth place: Floyd-Warshall All-Pairs Shortest Path Algorithm

For an introduction to this algorithm, see the article "A Comparison of Several Shortest Path Algorithms."

d: a two-dimensional array; d[i][j] is the minimum cost from vertex i to vertex j, initialized with the weight of the direct edge between them.

    for k from 1 to n:
        for i from 1 to n:
            for j from 1 to n:
                d[i][j] = min(d[i][j], d[i][k] + d[k][j])
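A runnable version of the same triple loop, as a hedged Python sketch assuming the graph is given as an n x n weight matrix with float('inf') for missing edges; the example matrix is mine, not from the article.

    INF = float('inf')

    def floyd_warshall(d):
        """In-place all-pairs shortest paths on an n x n weight matrix d."""
        n = len(d)
        for k in range(n):                     # allow vertex k as an intermediate stop
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[i][k] + d[k][j]
        return d

    d = [[0,   3,   INF, 7],
         [8,   0,   2,   INF],
         [5,   INF, 0,   1],
         [2,   INF, INF, 0]]
    print(floyd_warshall(d))                   # d[0][2] becomes 5 via the path 0 -> 1 -> 2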

Fourth place: Quicksort

Quicksort appears on virtually every list of classic algorithms. It was also voted one of the ten greatest algorithms of the 20th century (see "The Great 10 Algorithms of the 20th Century"). For a detailed description of quicksort, see the article "An In-Depth Analysis of the Quicksort Algorithm (Continued)."
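A short sketch of quicksort in Python with in-place Lomuto partitioning and a random pivot; this is a generic textbook version, not code from the referenced article.

    import random

    def quicksort(a, lo=0, hi=None):
        """Sort the list a in place; Lomuto partition with a random pivot."""
        if hi is None:
            hi = len(a) - 1
        if lo >= hi:
            return a
        p = random.randint(lo, hi)          # random pivot guards against already-sorted input
        a[p], a[hi] = a[hi], a[p]
        pivot, s = a[hi], lo
        for i in range(lo, hi):             # move everything smaller than the pivot to the front
            if a[i] < pivot:
                a[i], a[s] = a[s], a[i]
                s += 1
        a[s], a[hi] = a[hi], a[s]           # place the pivot at its final position s
        quicksort(a, lo, s - 1)
        quicksort(a, s + 1, hi)
        return a

    print(quicksort([5, 2, 9, 1, 5, 6]))    # [1, 2, 5, 5, 6, 9]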

Third place: BFPRT Algorithm (Median of Medians)

In 1973, Blum, Floyd, Pratt, Rivest, and Tarjan jointly published a paper titled "Time Bounds for Selection", which gives an algorithm for selecting the k-th smallest element of an array, commonly known as the median-of-medians algorithm. Thanks to a carefully designed pivot-selection method, the algorithm guarantees linear time complexity even in the worst case, beating the traditional approach that is linear only on average and O(n^2) in the worst case. This group of masters handled the complexity analysis of a recursive algorithm with complete ease and produced an algorithm thoroughly worthy of The Book.

Here is a brief description of an algorithm that selects the k-th smallest element of an array in expected O(n) time (randomized selection):

It works like the partition step of quicksort:

Each partition returns the position s of the pivot in the array; then s is compared with k.

If k is greater than s, recursively partition array[s+1..n];

if k is less than s, recursively partition array[left..s-1] (s is the position of the pivot element).

Otherwise (k equals s), return array[s], the value returned by the partition; this s is exactly what we were looking for.

After finding the required s, traverse and output the elements smaller than array[s] on its left side if you also need the k smallest elements themselves.

See also Introduction to Algorithms, Chapter 9, the section on selection in expected linear time, where it is proved that finding the k-th smallest element of an array takes O(n) time on average: the expected running time of the above procedure is analyzed and shown to be O(n), under the assumption that the elements are distinct. A minimal code sketch of the procedure follows.
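This is the randomized quickselect just described, written as a hedged Python sketch with my own function names; it is not the worst-case-linear BFPRT pivot rule itself.

    import random

    def partition(a, left, right):
        """Lomuto partition; returns the final index of a randomly chosen pivot."""
        p = random.randint(left, right)
        a[p], a[right] = a[right], a[p]
        pivot, s = a[right], left
        for i in range(left, right):
            if a[i] < pivot:
                a[i], a[s] = a[s], a[i]
                s += 1
        a[s], a[right] = a[right], a[s]
        return s

    def quickselect(a, k, left=0, right=None):
        """Return the k-th smallest element of a (1-based k), expected O(n) time."""
        if right is None:
            right = len(a) - 1
        s = partition(a, left, right)
        if k - 1 < s:
            return quickselect(a, k, left, s - 1)      # answer lies left of the pivot
        if k - 1 > s:
            return quickselect(a, k, s + 1, right)     # answer lies right of the pivot
        return a[s]                                    # the pivot itself is the answer

    print(quickselect([7, 2, 9, 4, 1, 5], 3))          # 4 (the third smallest)

Replacing the random pivot with the median-of-medians pivot is exactly what upgrades the expected O(n) bound to a worst-case O(n) bound.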

Second place: Knuth-Morris-Pratt String Matching Algorithm

For an introduction to this algorithm, see the article "Teach You to Thoroughly Understand the KMP Algorithm from Beginning to End." KMP was once left off the list of the ten greatest algorithms of the 20th century, but people clearly could not accept that such a beautiful, efficient algorithm would go unrecognized. So in the final vote, KMP ranked second.
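A compact sketch of KMP in Python, building the failure (prefix) table and then scanning the text; the function names are mine, not from the referenced article.

    def build_failure(pattern):
        """fail[i] = length of the longest proper prefix of pattern[:i+1] that is also a suffix."""
        fail = [0] * len(pattern)
        k = 0
        for i in range(1, len(pattern)):
            while k > 0 and pattern[i] != pattern[k]:
                k = fail[k - 1]              # fall back to the next shorter border
            if pattern[i] == pattern[k]:
                k += 1
            fail[i] = k
        return fail

    def kmp_search(text, pattern):
        """Return the index of the first occurrence of pattern in text, or -1."""
        fail, k = build_failure(pattern), 0
        for i, ch in enumerate(text):
            while k > 0 and ch != pattern[k]:
                k = fail[k - 1]              # reuse the prefix already matched
            if ch == pattern[k]:
                k += 1
            if k == len(pattern):
                return i - k + 1             # full match ends at position i
        return -1

    print(kmp_search("ABC ABCDAB ABCDABCDABDE", "ABCDABD"))   # 15

Because the text pointer never moves backwards, the whole search runs in O(n + m) time.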

First place: Union-Find

Strictly speaking, union-find is a data structure, designed specifically to handle merge and membership-query operations on disjoint sets. By cleverly borrowing a tree structure, it reduces programming complexity to an unbelievable degree; with a bit of recursion, almost every operation can be written in two lines of code. Path compression is a wonderful idea and the finishing touch of the whole structure. Union-find is also extremely efficient: a single operation can almost be regarded as constant time, although because the structure's actual behavior is hard to predict, a precise complexity analysis requires quite advanced techniques. In the end, union-find takes first place on the list.
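As the article says, with path compression (plus union by rank here) the core operations really are only a couple of lines each; a minimal Python sketch, with class and method names of my own choosing:

    class UnionFind:
        def __init__(self, n):
            self.parent = list(range(n))     # each element starts as its own root
            self.rank = [0] * n

        def find(self, x):
            if self.parent[x] != x:
                self.parent[x] = self.find(self.parent[x])   # path compression
            return self.parent[x]

        def union(self, x, y):
            rx, ry = self.find(x), self.find(y)
            if rx == ry:
                return False                 # already in the same set
            if self.rank[rx] < self.rank[ry]:
                rx, ry = ry, rx
            self.parent[ry] = rx             # attach the shallower tree under the deeper one
            if self.rank[rx] == self.rank[ry]:
                self.rank[rx] += 1
            return True

    uf = UnionFind(5)
    uf.union(0, 1)
    uf.union(3, 4)
    print(uf.find(0) == uf.find(1))   # True
    print(uf.find(1) == uf.find(3))   # False

With both optimizations, a sequence of operations runs in amortized time proportional to the inverse Ackermann function, which is effectively constant in practice.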

A supplementary note: the vote counts among the top three differ by only a few votes (4 and 8), so the rankings may well keep changing. But whatever the final result, these top 10 algorithms are largely settled.

How familiar are you with the algorithms above? If you were given a vote right now, which algorithm would you choose? Go ahead and vote: please write your opinion and your choice in the comments below this article.
