Stability and time-space complexity analysis of common sorting algorithms

Source: Internet
Author: User

The following is a comparison of the speeds of common sorting algorithms (from slow to fast):
1. Bubble sort: O(n^2)
2. Simple selection sort: O(n^2)
3. Direct insertion sort: O(n^2)
4. Binary insertion sort: O(n^2)
5. Shell sort: approximately O(n^1.25) (not conclusively settled, but known to be a polynomial time complexity between n and n^2)
6. Heap sort: O(n log n)
7. Merge sort: O(n log n)
8. Quick sort: O(n log n)
In general, quick sort is the fastest. But there are exceptions!
Heap sort and merge sort (6 and 7) have stable performance: the bounds above are their worst-case complexities. For quick sort, O(n log n) is only the average time complexity; its worst-case complexity is O(n^2), and the worst case occurs when the array is already completely sorted (so no rearranging is actually needed), for example when the first element is always chosen as the pivot.
In that worst case, therefore, merge sort is the fastest and heap sort comes second (merge sort and heap sort have the same asymptotic complexity; they differ only in the constant factor hidden by the big-O notation). Bubble sort is itself a very slow algorithm, so in this case bubble sort and quick sort perform about the same and are hard to compare precisely; in short, they are the two slowest.

(P.S.: The time complexity of direct insertion sort is O(n^2), but when the data is already ordered it performs best, with time complexity O(n). So for the ascending-order array discussed above, direct insertion sort is actually the fastest!)

1. Selection sort: unstable, time complexity O(n^2)

The basic idea of selection sort is to process the sequence of n records in n-1 passes: the i-th pass swaps the smallest element of L[i..n] into position L[i]. Thus, after i passes, the first i records are already in their correct positions.
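A minimal Python sketch of this idea, using 0-based indexing in place of the 1-based L[1..n] notation above:

```python
def selection_sort(a):
    """In-place selection sort; not stable, O(n^2) comparisons."""
    n = len(a)
    for i in range(n - 1):
        # Find the index of the smallest element in a[i..n-1].
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        # Swap it into position i; this long-distance swap is what breaks stability.
        a[i], a[min_idx] = a[min_idx], a[i]
    return a
```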

2. Insertion sort: stable, time complexity O(n^2)

The basic idea of insertion sort is that after i-1 passes, L[1..i-1] is already in order. The i-th pass only inserts L[i] into its proper position within L[1..i-1], so that L[1..i] becomes an ordered sequence. To achieve this, we can use sequential comparison: first compare L[i] with L[i-1]; if L[i-1] <= L[i], then L[1..i] is already ordered and the pass is finished. Otherwise, swap L[i] and L[i-1] and continue by comparing L[i-1] with L[i-2], and so on, until some position j (1 <= j <= i-1) is found with L[j] <= L[j+1]. For example, sorting a sequence of 4 elements this way requires three insertion passes.
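A minimal Python sketch, again 0-based; the while loop performs the sequential comparison described above, and the sort runs in O(n) when the input is already ordered:

```python
def insertion_sort(a):
    """In-place insertion sort; stable, O(n^2) worst case, O(n) on sorted input."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot to the right until key's position is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```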

3. Bubble sort: stable, time complexity O(n^2)

Bubble sort is the simplest sorting method. Its basic idea is to picture the elements to be sorted as vertically arranged "bubbles", where smaller elements are lighter and therefore rise upward. The algorithm processes this "bubble" sequence several times. Each pass scans the sequence from bottom to top, checking whether each pair of adjacent elements is in the correct order. If two adjacent elements are out of order, that is, the "lighter" element is below, their positions are swapped. Clearly, after the first pass the "lightest" element has floated to the topmost position, and after the second pass the second-lightest element has floated to the second-highest position. In the second pass there is no need to check the topmost position, because the element there is already the lightest. In general, the i-th pass does not need to check the top i-1 positions, because the previous i-1 passes have already placed those elements correctly.
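A minimal Python sketch that scans from the bottom (the end of the array) upward, floating the smallest remaining element to the front on each pass, as described above:

```python
def bubble_sort(a):
    """In-place bubble sort; stable, O(n^2)."""
    n = len(a)
    for i in range(n - 1):
        # After pass i, positions 0..i-1 already hold the i smallest elements.
        for j in range(n - 1, i, -1):
            # If the "lighter" (smaller) element is below, swap it upward.
            if a[j] < a[j - 1]:
                a[j], a[j - 1] = a[j - 1], a[j]
    return a
```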

4. Heap sort: unstable, time complexity O(n log n)

Heap sort is a tree-based selection sort. It treats the array A[1..n] as the sequential storage of a complete binary tree and uses the inherent relationship between parent and child nodes in that tree to repeatedly select the smallest element.
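A minimal Python sketch, assuming the common max-heap convention (0-based array, children of node i at 2i+1 and 2i+2) so that the array ends up in ascending order; a min-heap variant would instead select the smallest element first, as described above:

```python
def heap_sort(a):
    """In-place heap sort using a max-heap; not stable, O(n log n)."""
    n = len(a)

    def sift_down(start, end):
        # Restore the heap property for the subtree rooted at `start`,
        # treating a[start..end] as part of a complete binary tree stored in the array.
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            # Pick the larger of the two children.
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    # Build the heap, then repeatedly move the current maximum to the end.
    for start in range(n // 2 - 1, -1, -1):
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a
```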

5. Merge sort: stable, time complexity O(n log n)

Given two ordered (ascending) sequences stored in adjacent positions of the same array, say A[l..m] and A[m+1..h], merge them into a single ordered sequence stored in A[l..h].
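A minimal Python sketch of this merge step, plus the recursive merge sort built on top of it (0-based, inclusive indices l, m, h, matching the A[l..m], A[m+1..h] notation above):

```python
def merge(a, l, m, h):
    """Merge the ordered runs a[l..m] and a[m+1..h] into a[l..h]; uses O(n) extra space."""
    left, right = a[l:m + 1], a[m + 1:h + 1]
    i = j = 0
    k = l
    while i < len(left) and j < len(right):
        # `<=` keeps equal elements in their original order, which is what makes merge sort stable.
        if left[i] <= right[j]:
            a[k] = left[i]
            i += 1
        else:
            a[k] = right[j]
            j += 1
        k += 1
    # Copy whatever remains of either run.
    a[k:h + 1] = left[i:] + right[j:]
    return a


def merge_sort(a, l=0, h=None):
    """Recursive merge sort of a[l..h]; O(n log n) in all cases."""
    if h is None:
        h = len(a) - 1
    if l < h:
        m = (l + h) // 2
        merge_sort(a, l, m)
        merge_sort(a, m + 1, h)
        merge(a, l, m, h)
    return a
```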

6. Quick sort: unstable, time complexity O(n log n) in the ideal (average) case, O(n^2) in the worst case

Quick sort is essentially an improvement on bubble sort. Its basic idea is that a single scan can drastically shorten the subsequences that still need to be sorted. In bubble sort, one scan only guarantees that the maximum element moves to its correct position, so the length of the sequence still to be sorted may shrink by only 1. In quick sort, one scan (the partition step) ensures that every number to the left of a chosen element (the pivot) is smaller than it and every number to its right is larger. The same method is then applied recursively to the parts on the left and right of the pivot, until each part contains at most one element.
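A minimal Python sketch of this scheme, assuming the common convention of taking the first element as the pivot (the choice that makes an already-sorted array the worst case, as noted in the introduction):

```python
def quick_sort(a, low=0, high=None):
    """In-place quick sort; not stable, O(n log n) on average, O(n^2) worst case."""
    if high is None:
        high = len(a) - 1
    if low >= high:
        return a
    pivot = a[low]          # first element as the pivot
    i, j = low, high
    while i < j:
        # Scan from the right for an element smaller than the pivot ...
        while i < j and a[j] >= pivot:
            j -= 1
        a[i] = a[j]
        # ... then from the left for an element larger than the pivot.
        while i < j and a[i] <= pivot:
            i += 1
        a[j] = a[i]
    a[i] = pivot            # the pivot lands in its final position
    # Recurse on the parts to the left and right of the pivot.
    quick_sort(a, low, i - 1)
    quick_sort(a, i + 1, high)
    return a
```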

7. Shell sort: unstable, time complexity roughly O(n log n) on average, O(n^s) with 1 < s < 2 in the worst case

In the direct insertion sort algorithm, inserting one number at a time extends the ordered sequence by only one node and provides no help for inserting the next number. If we instead compare numbers that are far apart (the distance is called the increment), an element can move across several positions at once, so one comparison may eliminate several element exchanges. D. L. Shell realized this idea in 1959 in the sorting algorithm named after him. The algorithm first divides the numbers to be sorted into groups according to an increment d, where records in the same group have subscripts differing by d, and sorts the elements within each group. It then repeats the process with a smaller increment. When the increment is reduced to 1, all the numbers to be sorted fall into a single group and the sort is completed.
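A minimal Python sketch, assuming the simple gap sequence n/2, n/4, ..., 1 (Shell's original choice; other increment sequences give better worst-case bounds):

```python
def shell_sort(a):
    """In-place Shell sort with the gap sequence n//2, n//4, ..., 1; not stable."""
    n = len(a)
    gap = n // 2
    while gap > 0:
        # Gapped insertion sort: elements whose indices differ by `gap` form one group.
        for i in range(gap, n):
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2
    return a
```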

Algorithm        Time complexity   Space complexity   Stable
Insertion sort   O(n^2)            O(1)               Yes
Shell sort       O(n^2)            O(1)               No
Bubble sort      O(n^2)            O(1)               Yes
Selection sort   O(n^2)            O(1)               No
Quick sort       O(n log n)        O(log n)           No
Heap sort        O(n log n)        O(1)               No
Merge sort       O(n log n)        O(n)               Yes

