Linear sorting algorithm

Preface

For comparison-based sorting algorithms such as insertion sort, quicksort, merge sort, and heap sort, the worst-case lower bound is Ω(n log n); that is, Ω(n log n) comparisons are required in the worst case. Suppose there is an array of n elements (assume all elements are distinct). There are n! possible permutations, and every one of these n! permutations must appear on a leaf node of the decision tree (Figure 1). In Figure 1, n = 3, so all 3! = 6 permutations appear on the leaf nodes of the decision tree. A binary tree of height h has at most 2^h leaf nodes (exactly 2^h for a perfect binary tree). Therefore n! <= 2^h, and hence h >= log(n!) = Ω(n log n). The proof that log(n!) = Ω(n log n) is sketched below:
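
A worked form of this counting argument (a minimal sketch; the middle inequality uses the fact that the largest n/2 factors of n! are each at least n/2):

    % h = height of the decision tree = worst-case number of comparisons
    \[
      2^{h} \ge n!
      \;\Longrightarrow\;
      h \ge \log_2(n!)
        \ge \log_2\!\Bigl(\bigl(\tfrac{n}{2}\bigr)^{n/2}\Bigr)
        = \tfrac{n}{2}\,\log_2\tfrac{n}{2}
        = \Omega(n \log n).
    \]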

Counting sort

Assume that there is a set of n numbers, each lying in the range 0 <= x < k, where k = O(n).

Running time: Θ(n + k)

As shown in Figure 2.1, an auxiliary array B (which stores the final sorted result) and an array C (which stores element counts) are required. Based on the preceding assumptions, the size of array C is k, and C[i] records how many elements of array A are equal to i (0 <= i < k), as shown in Figure 2.2. To guarantee the stability of counting sort, array C is then transformed as shown in Figure 2.3, so that C[i] records how many elements are less than or equal to i. The code is as follows:

   1: /*
   2: Input: array A to be sorted, array B that receives the sorted result, len = size of A, k = size of the count array
   3: Function: counting sort
   4: */
   5: void CountingSort(int A[], int B[], int len, int k)
   6: {
   7:     int *CountArr = new int[k];
   8:     int i;
   9:     for (i = 0; i < k; i++)
  10:     {
  11:         CountArr[i] = 0;
  12:     }
  13: 
  14:     for (i = 0; i < len; i++)
  15:     {
  16:         CountArr[A[i]]++;
  17:     }
  18: 
  19:     for (i = 1; i < k; i++)
  20:     {
  21:         CountArr[i] += CountArr[i-1];
  22:     }
  23: 
  24:     // Filling B from right to left ensures the stability of the algorithm.
  25:     for (i = len-1; i >= 0; i--)
  26:     {
  27:         B[CountArr[A[i]]-1] = A[i];
  28:         CountArr[A[i]]--;
  29:     }
  30:     delete [] CountArr;
  31: }

The runtime of rows 9-12 and 19-22 is Θ(k), and that of rows 14-17 and 25-29 is Θ(n); therefore, the total running time is Θ(2(n + k)) = Θ(n + k).
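
A minimal driver for CountingSort (the sample values are illustrative; the caller supplies B, and k must be larger than every value in A):

    #include <stdio.h>

    /* CountingSort as defined above. */
    void CountingSort(int A[], int B[], int len, int k);

    int main()
    {
        int A[] = {2, 5, 3, 0, 2, 3, 0, 3};   /* all values lie in [0, 6) */
        int B[8];                             /* receives the sorted output */

        CountingSort(A, B, 8, 6);             /* len = 8, k = 6 */
        for (int i = 0; i < 8; i++)
            printf("%d ", B[i]);              /* prints: 0 0 2 2 3 3 3 5 */
        printf("\n");
        return 0;
    }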

Radix sort

Radix sort: all values to be compared (positive integers) are padded to the same number of digits, with shorter numbers padded by leading zeros. The data is then sorted digit by digit; once the pass on the highest digit is finished, the sequence is in sorted order.

Radix sort comes in two variants: LSD and MSD.

LSD (Least Significant Digit): the lowest digit takes priority, that is, sorting starts from the rightmost digit and moves left.

MSD (Most Significant Digit): the highest digit takes priority, that is, sorting starts from the leftmost digit and moves right.

The following is the pseudocode of LSD radix sort.

   1: RadixSort(A, d)
   2:     for i <- 1 to d
   3:         use a stable sorting algorithm to sort array A on digit i

Sort first on the ones digit, then on the tens digit, then on the hundreds digit, and so on; the sort used on each digit must be a stable algorithm.

The running time is Θ(d(n + k)). In radix sort, the algorithm used to sort on each digit is counting sort, which runs in Θ(n + k); d is the maximum number of digits among the elements of the array.
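
Below is a concrete LSD radix-sort sketch in the same style as the counting-sort code above; the helper name CountingSortByDigit, the base-10 digit extraction, and the non-negative-integer assumption are implementation choices made here, not part of the original pseudocode.

    // Stable counting sort of A on the decimal digit selected by exp (1, 10, 100, ...).
    static void CountingSortByDigit(int A[], int len, int exp)
    {
        int *B = new int[len];              // temporary output buffer
        int count[10] = {0};

        for (int i = 0; i < len; i++)       // count occurrences of each digit value
            count[(A[i] / exp) % 10]++;

        for (int d = 1; d < 10; d++)        // prefix sums: count[d] = #elements with digit <= d
            count[d] += count[d - 1];

        for (int i = len - 1; i >= 0; i--)  // right-to-left pass keeps the sort stable
        {
            int d = (A[i] / exp) % 10;
            B[count[d] - 1] = A[i];
            count[d]--;
        }

        for (int i = 0; i < len; i++)       // copy the result back into A
            A[i] = B[i];
        delete [] B;
    }

    // LSD radix sort for non-negative integers: sort on the ones digit, then tens, then hundreds, ...
    void RadixSort(int A[], int len)
    {
        if (len <= 0)
            return;

        int maxVal = A[0];                  // the largest value determines the number of digit passes
        for (int i = 1; i < len; i++)
            if (A[i] > maxVal)
                maxVal = A[i];

        for (int exp = 1; maxVal / exp > 0; exp *= 10)
            CountingSortByDigit(A, len, exp);
    }

For example, calling RadixSort on {170, 45, 75, 90, 802, 24, 2, 66} with len = 8 would produce {2, 24, 45, 66, 75, 90, 170, 802}.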

Bucket sort

Bucket sort: distribute the elements of the array into a limited number of buckets and then sort the contents of each bucket; under the right assumptions the whole procedure runs in Θ(n) expected time. Bucket sort relies on the assumption that the input is produced by a uniform random process; otherwise, in the worst case all the data falls into a single bucket. If the input does not satisfy this assumption, the data can only be sorted with a comparison-based sorting algorithm, and the running time degrades to Ω(n log n).
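
A minimal bucket-sort sketch, assuming the input values are uniformly distributed in [0, 1); the choice of one bucket per element and of std::sort inside each bucket are implementation details assumed here.

    #include <algorithm>
    #include <vector>

    // Bucket sort for values assumed to be uniformly distributed in [0, 1).
    void BucketSort(std::vector<double> &a)
    {
        int n = (int)a.size();
        if (n <= 1)
            return;

        std::vector< std::vector<double> > buckets(n);   // on average one element per bucket
        for (int i = 0; i < n; i++)
        {
            int idx = (int)(a[i] * n);                   // map [0, 1) uniformly onto bucket indices [0, n)
            buckets[idx].push_back(a[i]);
        }

        int pos = 0;
        for (int b = 0; b < n; b++)
        {
            std::sort(buckets[b].begin(), buckets[b].end());  // each bucket holds O(1) elements on average
            for (size_t j = 0; j < buckets[b].size(); j++)
                a[pos++] = buckets[b][j];
        }
    }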

Sorting Algorithm Stability

Sorting algorithm stability: suppose two elements in the sequence to be sorted are equal. A sort is stable if the relative order of such equal elements is preserved, that is, if a = b and a appears before b, then a still appears before b after sorting. Whether a sort is stable depends on the concrete implementation, but in their usual implementations selection sort, quicksort, Shell sort, and heap sort are not stable, while radix sort, counting sort, merge sort, insertion sort, and bubble sort are stable.

Quicksort: A = {2a, 2b, 1} (2a and 2b denote two equal keys); after sorting, A can become {1, 2b, 2a}, for example with Hoare partitioning around the first element.

Shell sort: A = {4a, 4b, 1, 5, 2, 7}; after sorting with gap k = 2 followed by a gap of 1, A = {1, 2, 4b, 4a, 5, 7}.

Heap sort: A = {2a, 2b, 1}; after sorting, A = {1, 2b, 2a}.

Selection sort: A = {4a, 4b, 2, 5}; after sorting, A = {2, 4b, 4a, 5}.

In each of the above examples the equal elements end up in the opposite relative order, so the stability requirement is violated.
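
A small sketch that makes the distinction concrete, using the standard library's std::stable_sort (guaranteed stable) against std::sort (not guaranteed stable) on records that compare equal by key; the Item type and the sample data are illustrative only.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // A record with a sort key and a tag that marks its original position.
    struct Item
    {
        int key;
        char tag;
    };

    static bool byKey(const Item &x, const Item &y) { return x.key < y.key; }

    int main()
    {
        std::vector<Item> v;
        v.push_back({2, 'a'});
        v.push_back({2, 'b'});
        v.push_back({1, 'c'});

        std::vector<Item> s = v;
        std::stable_sort(s.begin(), s.end(), byKey);   // equal keys keep their order: prints 1c 2a 2b
        for (size_t i = 0; i < s.size(); i++)
            printf("%d%c ", s[i].key, s[i].tag);
        printf("\n");

        std::sort(v.begin(), v.end(), byKey);          // equal keys may come out in either order
        for (size_t i = 0; i < v.size(); i++)
            printf("%d%c ", v[i].key, v[i].tag);
        printf("\n");
        return 0;
    }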
