# The ten classical algorithms most valued in the world today

Source: Internet
Author: User

In today's world, countless classical algorithms have been discovered or invented. If you had to vote for the ten algorithms you value most, which would you choose?

Recently, someone on Stack Exchange posted a question asking users to nominate the world's ten most classic algorithms. People voted among a pool of shortlisted candidates, and the ten with the most support emerged.

Top ten algorithms from THE BOOK:

The author's description: "Proofs from THE BOOK" collected dozens of concise and elegant mathematical proofs and quickly won over a large number of mathematics enthusiasts. If there were an "Algorithms from THE BOOK", which algorithms would it include? Friends, below are dozens of candidate algorithms; if you think one of them is among the most classic algorithms in the world today, please vote for it ...

The ten classic algorithms that ultimately received the most votes are as follows:

Tenth place: Huffman Coding

Huffman coding is an entropy encoding method used for lossless data compression. It was invented in 1952 by David A. Huffman, then a PhD student at MIT, and published in the paper "A Method for the Construction of Minimum-Redundancy Codes".
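As a rough sketch of the idea (my own illustration, not from Huffman's paper), the code tree can be built with a min-heap by repeatedly merging the two lowest-frequency nodes:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table {char: bitstring} for the characters of text."""
    freq = Counter(text)
    # Each heap entry: (frequency, unique tiebreaker, {char: code_so_far}).
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate case: only one distinct character
        (_, _, codes), = heap
        return {ch: "0" for ch in codes}
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prefix '0' to codes in the first subtree, '1' in the second.
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        count += 1
        heapq.heappush(heap, (f1 + f2, count, merged))
    return heap[0][2]

codes = huffman_codes("abracadabra")
# More frequent characters receive shorter codes; 'a' gets the shortest one.
```

By construction the result is a prefix code: no character's bitstring is a prefix of another's, so an encoded stream can be decoded unambiguously.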

Ninth place: Binary Search

To find an element in a sorted collection, you can use the binary search algorithm, also called half-interval search. Binary search compares the key with the element at the middle of the collection, distinguishing three cases (assuming the collection is sorted in ascending order):

1. The key is less than the middle element: any matching element must lie in the left half (if it exists), so binary search is applied to the left half.

2. The key equals the middle element: the element has been found.

3. The key is greater than the middle element: any matching element must lie in the right half (if it exists), so binary search is applied to the right half.

In addition, when the collection is empty, the key cannot be found.
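The three cases above can be sketched iteratively (a minimal illustration of my own, not taken from the original post):

```python
def binary_search(arr, key):
    """Return the index of key in the sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:              # an empty range means "not found"
        mid = (lo + hi) // 2
        if key < arr[mid]:       # case 1: continue in the left half
            hi = mid - 1
        elif key > arr[mid]:     # case 3: continue in the right half
            lo = mid + 1
        else:                    # case 2: found
            return mid
    return -1
```

Each iteration halves the search range, giving O(log n) comparisons.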

Eighth place: Miller-Rabin Primality Test

The idea is to exploit properties that primes satisfy with high probability (for example, via Fermat's little theorem) to search for a witness that a number is composite. If no witness is found after sufficiently many random trials, the number is declared (probably) prime.
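A compact sketch of the test (my own illustration; the small-prime shortcut and the round count of 20 are arbitrary choices, not part of the original description):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin: return False if n is certainly composite,
    True if n survives `rounds` random witness tests."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):   # quick check against small primes
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)             # modular exponentiation, base a
        if x in (1, n - 1):
            continue                 # a is not a witness this round
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False             # a witnesses that n is composite
    return True
```

Each round catches a composite with probability at least 3/4, so the error probability shrinks exponentially with the number of rounds.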

Seventh place: Depth-First Search and Breadth-First Search

They are the basis for many other algorithms. For a detailed introduction to depth-first and breadth-first search, please refer to this article: teaching you to thoroughly understand the BFS and DFS search algorithms.
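The two traversals can be sketched side by side (my own minimal illustration over an adjacency-list dict; the referenced article covers them in depth):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal; returns vertices in visiting order."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()          # FIFO queue: explore level by level
        order.append(v)
        for w in graph.get(v, []):
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return order

def dfs(graph, start, seen=None):
    """Recursive depth-first traversal; returns vertices in visiting order."""
    if seen is None:
        seen = set()
    seen.add(start)
    order = [start]
    for w in graph.get(start, []):   # dive into each unvisited neighbor
        if w not in seen:
            order.extend(dfs(graph, w, seen))
    return order
```

The only structural difference is the frontier: a FIFO queue for BFS versus the call stack for DFS.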

Sixth place: Gentry's Fully Homomorphic Encryption Scheme

This algorithm is beautiful: it allows a third party to perform arbitrary operations on encrypted data without holding the private key (though it is not very well known).

Fifth place: Floyd-Warshall All-Pairs Shortest Path Algorithm

D is a two-dimensional array, where D[i,j] holds the minimum cost of a path from vertex i to vertex j; it is initialized with the weights of the adjacent edges.

    for k from 1 to n:
        for i from 1 to n:
            for j from 1 to n:
                d[i,j] = min(d[i,j], d[i,k] + d[k,j])
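In Python the recurrence might look like this (a small illustration with an adjacency matrix; using `INF` to mark missing edges is my own convention here):

```python
INF = float("inf")

def floyd_warshall(d):
    """All-pairs shortest paths; d is an n x n cost matrix, updated in place."""
    n = len(d)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Either keep the current path i -> j,
                # or route it through the intermediate vertex k.
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

After the k-th outer iteration, d[i][j] is the shortest distance using only intermediate vertices 0..k, which is why the k loop must be outermost.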

Fourth place: Quicksort

Quicksort appears on virtually every list of classic algorithms. It was selected as one of the ten greatest algorithms of the 20th century (see: counting the ten greatest algorithms of the 20th century). For a detailed introduction to quicksort, please refer to the article I wrote: a continued in-depth analysis of the quicksort algorithm.
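A small in-place sketch (my own illustration using the Lomuto partition scheme, not taken from the referenced article):

```python
def quicksort(arr, lo=0, hi=None):
    """Sort arr in place with quicksort (Lomuto partition) and return it."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        pivot = arr[hi]                 # last element as the pivot
        s = lo                          # s: final slot for the pivot
        for i in range(lo, hi):
            if arr[i] < pivot:          # move smaller elements left of s
                arr[i], arr[s] = arr[s], arr[i]
                s += 1
        arr[s], arr[hi] = arr[hi], arr[s]
        quicksort(arr, lo, s - 1)       # sort both halves recursively
        quicksort(arr, s + 1, hi)
    return arr
```

Average running time is O(n log n); the worst case is O(n^2), which is usually avoided in practice by randomizing the pivot choice.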

Third place: BFPRT Algorithm (Median of Medians)

In 1973, Blum, Floyd, Pratt, Rivest, and Tarjan jointly published a paper titled "Time Bounds for Selection" and gave an algorithm for selecting the k-th largest element of an array, commonly known as the "median of medians" algorithm. Thanks to a carefully designed pivot-selection method, the algorithm guarantees linear time complexity even in the worst case, beating the traditional approach that is linear only on average and O(n^2) in the worst case. These masters handled the complexity analysis of recursive algorithms with complete ease, constructing a well-deserved algorithm from THE BOOK.

Here I briefly introduce an algorithm that selects the k-th smallest element of an array in expected O(n) time:

Its partition step is similar to quicksort's:

Each partition returns the final position s of the pivot in the array, and then s is compared with k:

- If k is greater than s, recurse on the right part, array[s+1..n];
- If k is less than s, recurse on the left part, array[left..s-1] (s is the position of the pivot element);
- Otherwise s equals k: return array[s], the value placed there by partition; this is exactly the position we were looking for.

After finding the position s that meets the requirement, you can then iterate over the elements on its left side, all of which are smaller than array[s].

For reference, see "Introduction to Algorithms", Chapter 9, the section on selection in expected linear time, where the expected running time of the above procedure is proved to be O(n), assuming the elements are distinct.
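The expected-linear-time selection described above (quickselect with a random pivot; note this is not the full worst-case median-of-medians pivot rule of BFPRT) might be sketched as follows, with k counted from zero:

```python
import random

def quickselect(arr, k):
    """Return the k-th smallest element (0-based) of arr in expected O(n) time.
    Uses a random pivot rather than BFPRT's median-of-medians pivot."""
    arr = list(arr)                  # work on a copy
    left, right = 0, len(arr) - 1
    while True:
        # Partition around a random pivot; s is its final position.
        p = random.randint(left, right)
        arr[p], arr[right] = arr[right], arr[p]
        pivot, s = arr[right], left
        for i in range(left, right):
            if arr[i] < pivot:
                arr[i], arr[s] = arr[s], arr[i]
                s += 1
        arr[s], arr[right] = arr[right], arr[s]
        if k < s:
            right = s - 1            # continue in the left part
        elif k > s:
            left = s + 1             # continue in the right part
        else:
            return arr[s]            # s == k: found it
```

Unlike quicksort, only one side of each partition is pursued, which is what brings the expected cost down from O(n log n) to O(n).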

Second place: Knuth-Morris-Pratt String Matching Algorithm

For an introduction to this algorithm, please refer to this article: VI, teaching you to thoroughly understand the KMP algorithm from beginning to end. The KMP algorithm did not make the list of the ten greatest algorithms of the 20th century, but people clearly could not accept that such a beautiful and efficient algorithm would be defeated. So in the final vote, KMP ranked second.
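A compact sketch of KMP (my own illustration; the referenced article gives the full derivation of the failure function):

```python
def kmp_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1."""
    if not pattern:
        return 0
    # fail[i]: length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it (the "failure function").
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]          # fall back to a shorter border
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, never moving backwards in it.
    k = 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1
```

Because the text index never retreats, the total work is O(n + m) for a text of length n and a pattern of length m.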

First place: Union-Find (Disjoint Set)

Strictly speaking, union-find is a data structure rather than an algorithm: it is specially designed to handle merge and membership-query operations on disjoint sets. It cleverly borrows the tree structure, reducing programming complexity to an unbelievable degree; with a few recursion tricks, almost every operation can be completed in two lines of code. Path compression, a wonderful idea, is the finishing touch of the whole structure. Union-find is extremely efficient: a single operation effectively takes constant time, but because the actual behavior of the data structure is hard to predict, a precise time-complexity analysis requires many advanced techniques. Union-find ultimately occupies first place on this list.
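The "two lines of code" spirit can be sketched with path compression (my own minimal illustration; union by rank is omitted for brevity):

```python
def make_union_find(n):
    """Disjoint-set forest over elements 0..n-1 with path compression."""
    parent = list(range(n))

    def find(x):
        # Path halving: point x at its grandparent while climbing to the root.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):
        # Merge the two sets by attaching one root under the other.
        parent[find(x)] = find(y)

    return find, union

find, union = make_union_find(10)
union(1, 2)
union(2, 3)
# 1, 2, and 3 now share a root; 4 is still in its own set.
```

With path compression and union by rank combined, a sequence of m operations costs O(m α(n)), where α is the extremely slow-growing inverse Ackermann function.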

Note: the vote margins among the top three were only 4 and 8 votes, so the rankings may continue to change. But whatever the final result, the top ten algorithms are essentially settled.

What do you think of the algorithms above? Are you familiar with them? If you were given a vote right now, which algorithm would you choose? OK, let's hold a vote: please write your opinion and decision in the comments below this article.
