Summary of Nine Sorting Algorithms

Source: Internet
Author: User
Tags: comparison sorts

Sorting algorithms can be divided into internal and external sorting. Internal sorting keeps all the records in memory while sorting; external sorting is used when the data set is too large to fit in memory at once, so records must be fetched from external storage during the sort.

Common internal sorting algorithms include: insertion sort, Shell sort, selection sort, bubble sort, merge sort, quick sort, heap sort, radix sort, and so on.

Algorithm One: Insertion Sort

Insertion sort is the simplest and most intuitive sorting algorithm. It works by building an ordered sequence: for each unsorted element, scan the already-sorted sequence from back to front, find the appropriate position, and insert the element there.

Algorithm Steps

1) Treat the first element of the sequence as an ordered sequence, and the remaining elements (from the second to the last) as the unordered sequence.

2) Scan the unordered sequence from beginning to end, inserting each scanned element into its proper position in the ordered sequence. (If the element to be inserted is equal to an element already in the ordered sequence, insert it after the equal element.)

Implementation Code

void insert_sort(int *num, int len)
{
    for (int i = 1; i < len; i++)
    {
        int temp = num[i];
        int j = i;
        // shift larger elements one position to the right
        while (j > 0 && num[j - 1] > temp)
        {
            num[j] = num[j - 1];
            j--;
        }
        num[j] = temp;
    }
}
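
A minimal usage sketch (the sample values below are arbitrary, chosen only for illustration):

#include <cstdio>

int main()
{
    int data[] = {5, 2, 9, 1, 7, 3};
    int n = sizeof(data) / sizeof(data[0]);

    insert_sort(data, n);      // sorts data in ascending order

    for (int i = 0; i < n; i++)
    {
        printf("%d ", data[i]); // prints: 1 2 3 5 7 9
    }
    printf("\n");
    return 0;
}
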
Algorithm Two: Shell Sort

Shell sort, also called the diminishing increment sort, is a more efficient improvement of insertion sort. However, Shell sort is not a stable sorting algorithm.

Shell sort is based on the following two properties of insertion sort: insertion sort is efficient on data that is already almost sorted, where it approaches linear time; but insertion sort is generally inefficient, because it can only move data one position at a time.

The basic idea of Shell sort is: first divide the whole sequence of records into several subsequences and perform a direct insertion sort on each of them; then, once the records are "basically ordered", perform a direct insertion sort on the whole sequence.

Algorithm Steps

1) Choose an increment sequence t1, t2, ..., tk, where ti > tj for i < j, and tk = 1;

2) Sort the sequence in k passes, one pass for each increment in the sequence;

3) In each pass, using the corresponding increment ti, split the sequence into several subsequences of length m and perform a direct insertion sort on each subsequence. Only when the increment factor is 1 is the entire sequence treated as a single table, whose length is the length of the whole sequence.

Implementation Code 1

void shell_sort(int num[], int len)
{
    int i, j, k, group, temp;
    for (group = len / 2; group > 0; group /= 2)
    {
        // insertion sort within each group
        for (i = 0; i < group; i++)
        {
            for (j = i + group; j < len; j += group)
            {
                if (num[j - group] > num[j])
                {
                    temp = num[j];
                    k = j - group;
                    while (k >= 0 && num[k] > temp)
                    {
                        num[k + group] = num[k];
                        k -= group;
                    }
                    num[k + group] = temp;
                }
            }
        }
    }
}
Implementation Code 2
void shell_sort(int *num, int len)
{
    for (int d = len / 2; d > 0; d /= 2)
    {
        // insert each element among the elements of its own group
        for (int i = d; i < len; i++)
        {
            int temp = num[i];
            int j = i;
            while (j >= d && num[j - d] > temp)
            {
                num[j] = num[j - d];
                j -= d;
            }
            num[j] = temp;
        }
    }
}

The difference between the two implementations is how the groups are processed. The first implementation handles the groups one at a time: it finishes the first group, then processes the second group, and so on. The second implementation sorts one element against the elements of its own group, then does the same for an element of another group, and so on.

Algorithm Three: Selection Sort

Selection sort is another simple and intuitive sorting algorithm.

Algorithm Steps

1) First find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence.

2) Then find the smallest (or largest) element among the remaining unsorted elements and place it at the end of the sorted sequence.

3) Repeat the second step until all the elements are sorted.

Implementation Code

void selection_sort(int num[], int len)
{
    for (int i = 0; i < len - 1; i++)
    {
        int j = i;
        // find the index of the smallest element in num[i..len-1]
        for (int k = i + 1; k < len; k++)
        {
            if (num[k] < num[j])
            {
                j = k;
            }
        }

        if (j != i)
        {
            int temp = num[j];
            num[j] = num[i];
            num[i] = temp;
        }
    }
}
Algorithm Four: Bubble Sort

Bubble sort is also a simple and intuitive sorting algorithm. It repeatedly walks through the sequence to be sorted, comparing two adjacent elements at a time and swapping them if they are in the wrong order. The passes over the sequence are repeated until no more swaps are needed, at which point the sequence is sorted. The algorithm gets its name because smaller elements slowly "float" to the top of the sequence through the swaps.

Algorithm Steps

1) Compare adjacent elements. If the first one is bigger than the second one, swap them.

2) Do the same for each pair of adjacent elements, from the first pair at the beginning to the last pair at the end. After this pass, the last element is the largest.

3) Repeat the above steps for all elements except the last one.

4) Keep repeating the above steps over fewer and fewer elements, until there is no pair of numbers left to compare.

Implementation Code

void bubble_sort(int num[], int len)
{
    bool exchange;
    for (int i = 0; i < len - 1; i++)
    {
        exchange = false;
        for (int j = 0; j < len - i - 1; j++)
        {
            if (num[j] > num[j + 1])
            {
                int temp = num[j];
                num[j] = num[j + 1];
                num[j + 1] = temp;
                exchange = true;
            }
        }
        // if no swap happened in this pass, the sequence is already sorted
        if (!exchange)
        {
            return;
        }
    }
}
Algorithm Five: Merge Sort

Merge sort is an efficient sorting algorithm based on the merge operation. The algorithm is a very typical application of divide and conquer.

Algorithm Steps

1) Allocate space equal to the combined size of the two sorted sequences, to hold the merged sequence.

2) Set two pointers, initially at the starting positions of the two sorted sequences.

3) Compare the elements the two pointers point to, put the smaller one into the merge space, and advance that pointer to the next position.

4) Repeat step 3 until one pointer reaches the end of its sequence.

5) Copy all remaining elements of the other sequence directly to the end of the merged sequence.

Implementation Code

// first and end are the indexes of the first and the last element of num
void merge(int num[], int first, int mid, int end);

void merge_sort(int num[], int first, int end)
{
    if (first < end)
    {
        int mid = (first + end) / 2;
        merge_sort(num, first, mid);
        merge_sort(num, mid + 1, end);
        merge(num, first, mid, end);
    }
}

// merge the sorted sequences [first, mid] and [mid+1, end]
void merge(int num[], int first, int mid, int end)
{
    int n1 = mid - first + 1;
    int n2 = end - mid;
    int *L = new int[n1];
    int *R = new int[n2];
    for (int i = 0; i < n1; i++)
        L[i] = num[first + i];
    for (int j = 0; j < n2; j++)
        R[j] = num[mid + j + 1];

    int i = 0;
    int j = 0;
    int k = first;
    while (i < n1 && j < n2)
    {
        if (L[i] < R[j])
            num[k++] = L[i++];
        else
            num[k++] = R[j++];
    }
    while (i < n1)
        num[k++] = L[i++];
    while (j < n2)
        num[k++] = R[j++];

    delete [] L;
    delete [] R;
}
Algorithm Six: Quick Sort

Quick sort is a sorting algorithm developed by Tony Hoare. On average, sorting n items takes O(n log n) comparisons. In the worst case O(n^2) comparisons are needed, but this situation is uncommon. In practice, quick sort is usually noticeably faster than other O(n log n) algorithms, because its inner loop can be implemented very efficiently on most architectures.

Quick sort uses a divide-and-conquer strategy to split a list into two sub-lists.

Algorithm Steps

1) Pick an element from the sequence, called the "pivot".

2) Reorder the sequence so that all elements smaller than the pivot come before it and all elements larger than the pivot come after it (equal elements can go on either side). After this partition finishes, the pivot is in its final position. This is called the partition operation.

3) Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot.

The base case of the recursion is a sequence of size zero or one, which is always sorted. Although the algorithm keeps recursing, it always terminates, because each iteration puts at least one element into its final position.

Implementation Code 1

#include <algorithm>   // for std::swap
using std::swap;

int partition(int num[], int left, int right)
{
    int x = num[right];   // take the rightmost element as the pivot
    int i = left;
    int j = left - 1;
    for (; i < right; i++)
    {
        if (num[i] < x)
        {
            j++;
            if (j != i)
            {
                swap(num[j], num[i]);
            }
        }
    }
    swap(num[j + 1], num[right]);

    return j + 1;  // return the split point
}

void quick_sort(int num[], int left, int right)
{
    if (left < right)
    {
        int index = partition(num, left, right);
        quick_sort(num, left, index - 1);
        quick_sort(num, index + 1, right);
    }
}
Implementation Code 2
void quick_sort(int num[], int left, int right)
{
    if (left < right)
    {
        int i = left;
        int j = right;
        int x = num[i];   // take the leftmost element as the pivot
        while (i < j)
        {
            // from the right, find an element smaller than the pivot
            while (i < j && num[j] >= x)
            {
                j--;
            }
            if (i < j)
            {
                num[i++] = num[j];
            }

            // from the left, find an element not smaller than the pivot
            while (i < j && num[i] < x)
            {
                i++;
            }
            if (i < j)
            {
                num[j--] = num[i];
            }
        }
        num[i] = x;

        quick_sort(num, left, i - 1);
        quick_sort(num, i + 1, right);
    }
}

Implementation code 1 uses the rightmost element as the pivot, while implementation code 2 uses the leftmost element as the pivot.

Algorithm Seven: Heap Sort

Heap sort (heapsort) is a sorting algorithm designed around the heap data structure. A heap is a structure that approximates a complete binary tree while satisfying the heap property: the key of each child node is always less than (or greater than) that of its parent node.

The average time complexity of heap sort is O(n log n).

Algorithm Steps

1) Build a heap H[0..n-1].

2) Swap the head of the heap (the maximum) with the last element of the heap.

3) Reduce the size of the heap by 1 and call shift_down(0) to sift the new top element down to its proper position.

4) Repeat step 2 until the size of the heap is 1.

Implementation Code

#include <algorithm>   // for std::swap
using std::swap;

// sift the element at root down within a heap of size len
void heap_build(int num[], int root, int len)
{
    int lchild = root * 2 + 1;
    if (lchild < len)
    {
        int largest = lchild;
        int rchild = lchild + 1;
        if (rchild < len)
        {
            if (num[rchild] > num[largest])
            {
                largest = rchild;
            }
        }
        if (num[root] < num[largest])
        {
            swap(num[root], num[largest]);
            heap_build(num, largest, len);
        }
    }
}

void heap_sort(int num[], int len)
{
    // build the initial max heap (nodes without children are skipped)
    for (int i = len / 2; i >= 0; i--)
    {
        heap_build(num, i, len);
    }

    // repeatedly move the maximum to the end and sift down over the smaller heap
    for (int j = len - 1; j >= 1; j--)
    {
        swap(num[0], num[j]);
        heap_build(num, 0, j);
    }
}
Algorithm Eight: Radix Sort

Radix sort is a non-comparative integer sorting algorithm. Its idea is to split the numbers into digits and compare them digit by digit. Because integers can also represent strings (such as names or dates) and floating-point numbers in particular formats, radix sort is not limited to integers.

Before discussing radix sort, let us briefly describe bucket sort:

Algorithm idea: divide the array into a finite number of buckets, then sort each bucket individually (possibly using a different sorting algorithm, or by applying bucket sort recursively). Bucket sort is a generalization of pigeonhole sort. When the values in the array to be sorted are evenly distributed, bucket sort runs in linear time (Θ(n)). Bucket sort is not a comparison sort, so it is not subject to the O(n log n) lower bound.
Simply put, the data is split into groups, placed into buckets, and then the contents of each bucket are sorted.

For example, suppose we want to sort n integers A[1..n] whose values lie in the range [1..1000].

First, set up buckets each covering a range of size 10. Specifically, let bucket B[1] hold the integers in [1..10], bucket B[2] hold the integers in (10..20], ..., and bucket B[i] hold the integers in ((i-1)*10, i*10], for i = 1, 2, ..., 100. There are 100 buckets in total.

Then scan A[1..n] from beginning to end and put each A[i] into its corresponding bucket B[j]. Next, sort the numbers inside each of the 100 buckets; bubble sort, selection sort, or even quick sort can be used here, and in general any sorting method works.

Finally, output the numbers in each bucket in order, from small to large within each bucket and bucket by bucket, which yields an ordered sequence of all the numbers.
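
A minimal sketch of this bucket procedure, assuming values in [1..1000], 100 buckets of width 10, and std::sort inside each bucket (any sorting method would do):

#include <vector>
#include <algorithm>

// bucket sort sketch: values are assumed to lie in [1..1000],
// each bucket covers a range of width 10 (100 buckets in total)
void bucket_sort(int num[], int len)
{
    const int bucket_count = 100;
    std::vector< std::vector<int> > buckets(bucket_count);

    // distribute: the value v belongs to bucket (v - 1) / 10
    for (int i = 0; i < len; i++)
    {
        buckets[(num[i] - 1) / 10].push_back(num[i]);
    }

    // sort each bucket individually, then collect the buckets in order
    int k = 0;
    for (int b = 0; b < bucket_count; b++)
    {
        std::sort(buckets[b].begin(), buckets[b].end());
        for (size_t j = 0; j < buckets[b].size(); j++)
        {
            num[k++] = buckets[b][j];
        }
    }
}
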

Suppose there are n numbers and m buckets. If the numbers are evenly distributed, each bucket holds n/m numbers on average. If the numbers in each bucket are sorted with quick sort, the complexity of the whole algorithm is

O(n + m * (n/m) * log(n/m)) = O(n + n log n - n log m)

From the formula above, when m approaches n, the complexity of bucket sort approaches O(n).

Of course, this complexity analysis rests on the assumption that the n input numbers are evenly distributed. That assumption is quite strong, and in practice the results are not always that good. If all the numbers fall into the same bucket, bucket sort degenerates into whatever general sort is used inside the bucket.

Most of the sorting algorithms above have time complexity O(n^2), and some have time complexity O(n log n), whereas bucket sort can achieve O(n) time complexity. The downsides of bucket sort are:

1) First, the space complexity is higher and extra overhead is required. Sorting needs the space of two arrays: one for the array to be sorted, and one for the buckets themselves; for example, if the values to be sorted range from 0 to m-1, then m buckets are needed, and the bucket array takes at least m additional space.

2) Second, the elements to be sorted must all fall within a certain known range.

Implementation Code

// get the maximum number of digits
int max_bit(int num[], int len)
{
    int bit = 1;
    int radix = 10;
    for (int i = 0; i < len; i++)
    {
        while (num[i] >= radix)
        {
            radix *= 10;
            bit++;
        }
    }
    return bit;
}

void radix_sort(int num[], int len)
{
    int bitcount = max_bit(num, len);
    int *tmp = new int[len];
    int *count = new int[10];   // counter for the 10 digit buckets
    int radix = 1;
    int i, j, k;
    for (i = 0; i < bitcount; i++)          // one pass per digit
    {
        for (j = 0; j < 10; j++)
        {
            count[j] = 0;
        }
        for (j = 0; j < len; j++)           // count the number of records in each bucket
        {
            k = (num[j] / radix) % 10;
            count[k]++;
        }
        for (j = 1; j < 10; j++)            // assign each bucket its range of positions in tmp
        {
            count[j] = count[j] + count[j - 1];
        }
        for (j = len - 1; j >= 0; j--)      // collect the records from the buckets into tmp
        {
            k = (num[j] / radix) % 10;
            tmp[count[k] - 1] = num[j];
            count[k]--;
        }
        for (j = 0; j < len; j++)           // copy the contents of the temporary array back to num
        {
            num[j] = tmp[j];
        }
        radix *= 10;
    }
    delete [] tmp;
    delete [] count;
}
Algorithm Nine: Counting Sort

Personally, I think counting sort is fairly limited in applicability, but it is summarized here as well.

Algorithm Idea

Assuming the input consists of integers from a small range (such as ages), an extra array is used to record the position where each element should go; the idea is fairly simple.
Characteristics: under these constraints the time complexity is O(n), the extra space is O(n) (two auxiliary arrays are needed), and the sort is stable.

Implementation Code

// elements are assumed to lie in [0, k], so k+1 counters are allocated
void counting_sort(int num[], int len, int k)
{
    int *count = new int[k + 1];
    int *tmp = new int[len];
    for (int i = 0; i <= k; i++)
    {
        count[i] = 0;
    }
    // count the occurrences of each value
    for (int i = 0; i < len; i++)
    {
        count[num[i]]++;
    }
    // turn the counts into ending positions
    for (int i = 1; i <= k; i++)
    {
        count[i] += count[i - 1];
    }

    // walk backwards so that equal elements keep their relative order (stable)
    for (int i = len - 1; i >= 0; i--)
    {
        tmp[count[num[i]] - 1] = num[i];
        count[num[i]]--;
    }

    for (int i = 0; i < len; i++)
    {
        num[i] = tmp[i];
    }
    delete [] tmp;
    delete [] count;
}
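
A minimal usage sketch (the ages and the choice of k = 120 below are arbitrary illustrations of how k is picked):

#include <cstdio>

int main()
{
    // ages assumed to lie in [0, 120], so k = 120
    int ages[] = {34, 7, 120, 7, 56, 0, 99};
    int n = sizeof(ages) / sizeof(ages[0]);

    counting_sort(ages, n, 120);

    for (int i = 0; i < n; i++)
    {
        printf("%d ", ages[i]);   // prints: 0 7 7 34 56 99 120
    }
    printf("\n");
    return 0;
}
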
Summary

The stability, time complexity, and space complexity of the various sorts are summarized as follows:
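
Algorithm         Average time       Worst time      Space       Stable
Insertion sort    O(n^2)             O(n^2)          O(1)        Yes
Shell sort        about O(n^1.3)     O(n^2)          O(1)        No
Selection sort    O(n^2)             O(n^2)          O(1)        No
Bubble sort       O(n^2)             O(n^2)          O(1)        Yes
Merge sort        O(n log n)         O(n log n)      O(n)        Yes
Quick sort        O(n log n)         O(n^2)          O(log n)    No
Heap sort         O(n log n)         O(n log n)      O(1)        No
Radix sort        O(d(n + r))        O(d(n + r))     O(n + r)    Yes
Counting sort     O(n + k)           O(n + k)        O(n + k)    Yes

(Here d is the number of digits, r is the radix, and k is the range of the values; the Shell sort average depends on the chosen gap sequence.)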

Reprint: please cite the source:
http://blog.csdn.net/foreverling/article/details/43798223
http://www.cricode.com/3212.html
