8. Graphic explanations of sorting algorithms


Sorting algorithms can be divided into internal sorting and external sorting. In internal sorting, all records are sorted in memory; in external sorting, the data set is too large to fit in memory at once, so external storage must be accessed during the sort.

Common internal sorting algorithms include insertion sort, Shell sort, selection sort, bubble sort, merge sort, quick sort, heap sort, and radix sort.

This article will introduce the above eight sorting algorithms in turn.

Algorithm 1: Insertion sort

 

[Figure: insertion sort]

Insertion sort is one of the simplest and most intuitive sorting algorithms. It works by building an ordered sequence: for each unsorted element, it scans the already sorted part from back to front, finds the correct position, and inserts the element there.

Algorithm steps:

1) Treat the first element of the sequence as a sorted subsequence, and treat the remaining elements (from the second to the last) as the unsorted part.

2) Scan the unsorted part from start to end and insert each element into its proper position in the sorted part. (If the element to be inserted is equal to an element already in the sorted part, insert it after that element.)
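The steps above can be sketched in Python roughly as follows (a minimal illustration, not an optimized implementation):

def insertion_sort(a):
    """Sort the list a in place using insertion sort."""
    for i in range(1, len(a)):            # a[0..i-1] is the already sorted part
        key = a[i]                        # next element to insert
        j = i - 1
        while j >= 0 and a[j] > key:      # scan the sorted part from back to front
            a[j + 1] = a[j]               # shift larger elements one position to the right
            j -= 1
        a[j + 1] = key                    # insert after the last element <= key (keeps equal elements in order)
    return a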

Algorithm 2: Shell sort

[Figure: Shell sort]

 

Shell sort, also known as the diminishing increment sort, is a more efficient improvement of insertion sort. However, Shell sort is not a stable sorting algorithm.

Shell sort improves on insertion sort by exploiting the following two properties of insertion sort:

  • Insertion sort is highly efficient on data that is already almost sorted; in that case it approaches linear time.

  • In general, however, insertion sort is inefficient, because it moves the data only one position at a time.

The basic idea of Shell sort is: first divide the whole sequence of records into several subsequences and apply direct insertion sort to each of them; once the records in the whole sequence are "basically ordered", perform a final direct insertion sort over all records.

Algorithm steps:

1) Choose an increment sequence t1, t2, ..., tk, where ti > tj for i < j and tk = 1;

2) Sort the sequence in k passes, one pass per increment in the sequence;

3) In each pass, using the corresponding increment ti, split the sequence into subsequences of elements that lie ti positions apart, and apply direct insertion sort to each subsequence. Only when the increment is 1 is the whole sequence treated as a single table, whose length is the length of the entire sequence.
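As a rough Python sketch of these steps, the gap sequence n/2, n/4, ..., 1 is assumed here as the increment sequence t1, ..., tk (other choices are possible):

def shell_sort(a):
    """Sort the list a in place using Shell sort with gaps n//2, n//4, ..., 1."""
    n = len(a)
    gap = n // 2                           # first increment t1
    while gap > 0:
        for i in range(gap, n):            # gapped insertion sort over the subsequences
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]          # shift elements that are 'gap' positions apart
                j -= gap
            a[j] = key
        gap //= 2                          # last pass uses gap == 1, i.e. plain insertion sort
    return a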

 

Algorithm 3: Selection sort

[Figure: selection sort]

Selection sort is also a simple and intuitive sorting algorithm.

Algorithm steps:

1) Find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence.

2) Find the smallest (or largest) element among the remaining unsorted elements and append it to the end of the sorted sequence.

3) Repeat Step 2 until all elements are sorted.
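A minimal Python sketch of these steps, sorting in ascending order (smallest element first):

def selection_sort(a):
    """Sort the list a in place using selection sort."""
    n = len(a)
    for i in range(n - 1):                   # a[0..i-1] is the sorted prefix
        min_idx = i
        for j in range(i + 1, n):            # find the smallest element in the unsorted part
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]  # append it to the end of the sorted prefix
    return a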

 

Algorithm 4: Bubble sort

[Figure: bubble sort]

Bubble sort is also a simple and intuitive sorting algorithm. It repeatedly walks through the sequence to be sorted, compares adjacent elements, and swaps them if they are in the wrong order. The pass over the sequence is repeated until no more swaps are needed, which means the sequence is sorted. The algorithm gets its name from the way smaller elements slowly "bubble" to the top of the sequence through the swaps.

Algorithm steps:

1) Compare adjacent elements. If the first is larger than the second, swap them.

2) Do the same for every adjacent pair, from the first pair to the last. After this pass, the last element is the largest.

3) Repeat the preceding steps for all elements except the last one.

4) Keep repeating these steps over fewer and fewer elements until no pair needs to be compared.
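A short Python sketch of these steps; the early-exit flag reflects the remark that the passes stop once no swap is needed:

def bubble_sort(a):
    """Sort the list a in place using bubble sort."""
    n = len(a)
    for i in range(n - 1):                 # after pass i, the last i+1 elements are in place
        swapped = False
        for j in range(n - 1 - i):         # compare adjacent pairs in the unsorted part
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                    # a full pass with no swaps: already sorted
            break
    return a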

 

Algorithm 5: Merge sort

[Figure: merge sort]

Merge sort is an efficient sorting algorithm based on the merge operation. It is a classic application of the divide-and-conquer strategy.

Algorithm steps:

1. Allocate a buffer whose size is the sum of the lengths of the two sorted sequences; it will hold the merged result.

2. Set two pointers, initially pointing at the start of each of the two sorted sequences.

3. Compare the elements the two pointers point to, copy the smaller one into the buffer, and advance that pointer.

4. Repeat Step 3 until a pointer reaches the end of the sequence.

5. Copy all remaining elements of the other sequence directly to the end of the merged sequence.
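The merge step above, together with a top-down recursive driver, can be sketched in Python as follows (this allocates new lists rather than reusing one buffer, a simplification for clarity):

def merge(left, right):
    """Merge two already sorted lists into one sorted list."""
    result = []                            # holds the merged sequence
    i = j = 0                              # one pointer per sorted input sequence
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:            # take the smaller element; <= keeps the sort stable
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])                # copy whatever remains of either sequence
    result.extend(right[j:])
    return result

def merge_sort(a):
    """Return a new sorted list using top-down merge sort."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))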

Details: merge sort

Algorithm 6: Quick sort

[Figure: quick sort]

Quick sort is a sorting algorithm developed by Tony Hoare. On average, sorting n items takes O(n log n) comparisons; in the worst case it takes O(n²) comparisons, but this case is uncommon. In practice, quick sort is usually noticeably faster than other O(n log n) algorithms, because its inner loop can be implemented efficiently on most architectures.

Quick sort uses the divide-and-conquer strategy to split a list into two sub-lists.

Algorithm steps:

1. Pick an element from the sequence; this element is called the pivot.

2. Reorder the sequence so that all elements smaller than the pivot come before it and all elements larger than the pivot come after it (elements equal to the pivot can go to either side). When this step finishes, the pivot is in its final position. This is called the partition operation.

3. Recursively sort the sub-list of elements smaller than the pivot and the sub-list of elements larger than the pivot.

The base case of the recursion is a sub-list of size zero or one, which is already sorted. Although the algorithm keeps recursing, it always terminates, because each iteration places at least one element (the pivot) in its final position.
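A compact Python sketch of these steps; it takes the first element as the pivot and builds new sub-lists instead of partitioning in place, which keeps the illustration short but is not the most memory-efficient form:

def quick_sort(a):
    """Return a new sorted list using quick sort."""
    if len(a) <= 1:                              # base case: zero or one element is already sorted
        return a
    pivot = a[0]                                 # chosen pivot (the "benchmark")
    smaller = [x for x in a[1:] if x < pivot]    # partition: elements that go before the pivot
    larger = [x for x in a[1:] if x >= pivot]    # elements that go after the pivot (equals may go here)
    return quick_sort(smaller) + [pivot] + quick_sort(larger)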

Details: quick sort

Algorithm 7: Heap sort

[Figure: heap sort]

Heap sort (heapsort) is a sorting algorithm built on the heap data structure. A heap is a nearly complete binary tree that satisfies the heap property: the key of a child node is always smaller than (or always greater than) the key of its parent.

The average time complexity of heap sort is O(n log n).

Algorithm steps:

1) Build a heap H[0..n-1] from the input data.

2) Swap the first element of the heap (the maximum) with the last element.

3) Reduce the heap size by 1 and call shift_down(0) to sift the new top element down to its correct position.

4) Repeat steps 2 and 3 until the heap size is 1.
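The steps above can be sketched in Python with a max-heap stored in the array itself; shift_down here plays the role of the shift_down(0) call mentioned in step 3:

def shift_down(a, start, end):
    """Sift a[start] down until the max-heap property holds for a[start..end]."""
    root = start
    while 2 * root + 1 <= end:
        child = 2 * root + 1                   # left child
        if child + 1 <= end and a[child] < a[child + 1]:
            child += 1                         # pick the larger of the two children
        if a[root] < a[child]:
            a[root], a[child] = a[child], a[root]
            root = child                       # keep sifting down
        else:
            return

def heap_sort(a):
    """Sort the list a in place using heap sort."""
    n = len(a)
    for start in range(n // 2 - 1, -1, -1):    # step 1: build the heap H[0..n-1]
        shift_down(a, start, n - 1)
    for end in range(n - 1, 0, -1):            # steps 2-4: repeatedly extract the maximum
        a[0], a[end] = a[end], a[0]            # swap the heap top (maximum) with the last heap element
        shift_down(a, 0, end - 1)              # shrink the heap by 1 and restore the heap property
    return a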

Details: heap sort

Algorithm 8: Radix sort

Radix sort is a non-comparative integer sorting algorithm. Its principle is to split each integer into digits and sort the numbers digit by digit. Because integers can also encode strings (such as names or dates) and floating-point numbers in specific formats, radix sort is not limited to integers.

Before discussing radix sort further, we briefly introduce bucket sort:

Algorithm idea: divide the array into a finite number of buckets and sort each bucket individually (another sorting algorithm may be used, or bucket sort may be applied recursively). Bucket sort is a generalization of pigeonhole sort. When the values in the array are evenly distributed, bucket sort runs in linear time, O(n). Because bucket sort is not a comparison sort, it is not bound by the O(n log n) lower bound.
Simply put, data is grouped into buckets and sorted in each bucket.

For example, suppose you want to sort n integers A[1..n] whose values lie in the range [1..1000].

First, set the bucket size to 10. Specifically, let bucket B[1] store the integers in [1..10], bucket B[2] store the integers in (10..20], ..., and bucket B[i] store the integers in ((i-1)*10, i*10], for i = 1..100. There are 100 buckets in total.

Then scan A[1..n] from start to end and put each A[i] into its corresponding bucket B[j]. Sort the numbers inside each of the 100 buckets; bubble sort, selection sort, or even quick sort can be used here, and in general any sorting method will do.

Finally, output the buckets in order, and output the numbers inside each bucket from small to large. This yields a fully sorted sequence of all the numbers.
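A small Python sketch of this example; the bucket count (100) and value range [1..1000] are the ones assumed in the text, and Python's built-in sorted() stands in for "any sorting method" inside a bucket:

def bucket_sort(a, num_buckets=100, max_value=1000):
    """Sort integers in the range [1..max_value] using num_buckets buckets."""
    width = max_value // num_buckets                  # each bucket covers 'width' consecutive values
    buckets = [[] for _ in range(num_buckets)]        # B[1..100] in the text, 0-indexed here
    for x in a:
        idx = min((x - 1) // width, num_buckets - 1)  # value x belongs to bucket idx
        buckets[idx].append(x)
    result = []
    for b in buckets:
        result.extend(sorted(b))                      # sort each bucket, then output buckets in order
    return result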

Suppose there are n numbers and m buckets. If the numbers are evenly distributed, each bucket holds n/m numbers on average. If the numbers inside each bucket are sorted with quick sort, the complexity of the whole algorithm is

O(n + m * (n/m) * log(n/m)) = O(n + n log n - n log m)

From this formula, when m is close to n, the complexity of bucket sort approaches O(n).

Of course, the above calculation assumes that the n numbers are evenly distributed. This assumption is strong, and results in practice are often not this good; if all the numbers fall into the same bucket, bucket sort degenerates into whatever sort is used inside the buckets.

As mentioned above, most sorting algorithms have O(n²) time complexity, and some achieve O(n log n). Bucket sort, however, can achieve O(n) time complexity. Its drawbacks are:

1) High space complexity: extra overhead is required for two arrays, one to hold the array being sorted and one for the so-called buckets. For example, if the values to be sorted range from 0 to m-1, then m buckets are needed, and the bucket array alone requires at least m units of space.

2) The elements to be sorted must be within a certain range.
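Returning to radix sort itself, a minimal least-significant-digit (LSD) sketch in Python for non-negative integers, using ten digit buckets per pass, might look like this:

def radix_sort(a, base=10):
    """Return a new sorted list of non-negative integers using LSD radix sort."""
    if not a:
        return a
    exp = 1                                       # weight of the current digit: 1, 10, 100, ...
    while max(a) // exp > 0:                      # loop until the highest digit has been processed
        buckets = [[] for _ in range(base)]       # one bucket per possible digit value
        for x in a:
            buckets[(x // exp) % base].append(x)  # distribute by the current digit
        a = [x for b in buckets for x in b]       # collect the buckets in order (a stable pass)
        exp *= base
    return a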

 

Summary

The time complexity, space complexity, and stability of the various sorting algorithms are summarized as follows:

 

Time Complexity:

(1) Quadratic order, O(n²):
the simple sorts: direct insertion sort, direct selection sort, and bubble sort;

(2) Linearithmic order, O(n log₂ n):
quick sort, heap sort, and merge sort;

(3) O(n^(1+ε)) order, where ε is a constant between 0 and 1:
Shell sort;

(4) Linear order, O(n):
radix sort, as well as bucket sort and pigeonhole (bin) sort.

 

Stability:

Stable sorting algorithms: bubble sort, insertion sort, merge sort, and radix sort.

Unstable sorting algorithms: selection sort, quick sort, Shell sort, and heap sort.

