A detailed summary of the ten classic sorting algorithms (with Java code implementation)


Original source: HTTP://MP.WEIXIN.QQ.COM/S/FEQDJBY4UYGRLBYUJQ7LPG

0. Sorting Algorithm Description

0.1 Definition of Sorting

Sorting arranges a sequence of objects according to a key.

0.2 Terminology

    • Stable: if A originally appears before B and A = B, A still appears before B after sorting;
    • Unstable: if A originally appears before B and A = B, A may appear after B after sorting;
    • Internal sorting: all sorting operations are done in memory;
    • External sorting: the data is too large to fit in memory, so it is kept on disk and sorting proceeds through data transfers between disk and memory;
    • Time complexity: the amount of time an algorithm takes to execute;
    • Space complexity: the amount of memory required to run the program.
0.3 Algorithm Summary

Explanation of the terms used in the summary table image:

    • n: data size
    • k: the number of "buckets"
    • In-place: uses only constant extra memory
    • Out-place: uses additional memory
0.5 Algorithm Classification

0.6 Comparison Sorts vs. Non-comparison Sorts

Quick sort, merge sort, heap sort, bubble sort, and the like are comparison sorts. In the final sorted result, the order of the elements depends on comparisons between them: each number must be compared with other numbers to determine its position.
In a sort such as bubble sort, the problem size is n and each element needs roughly n comparisons, so the average time complexity is O(n²). In merge sort and quick sort, divide and conquer reduces the problem to about log n levels, so the average time complexity is O(n log n).
The advantage of comparison sorting is that it works for data of any size and does not care how the data is distributed. It can be said that comparison sorting applies to every situation that requires sorting.

Counting sort, radix sort, and bucket sort are non-comparison sorts. A non-comparison sort works by determining, for each element, how many elements should come before it. For an array arr, once the number of elements that precede arr[i] is known, the position of arr[i] in the sorted array is uniquely determined.
A non-comparison sort therefore only needs to determine, for each element, the number of elements that exist before it, which can be done in O(n) time.
Non-comparison sorts have low time complexity, but because they need extra space to determine each unique position, they place certain requirements on the size and distribution of the data.
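As a minimal illustration of this counting idea (an assumed example, not taken from the original article), the sketch below determines every element's final position from a value histogram alone, without comparing elements pairwise; section 8 develops this idea into a full counting sort:

// Minimal sketch of non-comparison placement (assumed example; values are
// non-negative integers no larger than maxValue).
public static int[] placeByCounting(int[] arr, int maxValue) {
    int[] count = new int[maxValue + 1];
    for (int v : arr)
        count[v]++;                               // histogram of values
    int[] before = new int[maxValue + 1];
    for (int v = 1; v <= maxValue; v++)
        before[v] = before[v - 1] + count[v - 1]; // number of elements that must precede value v
    int[] result = new int[arr.length];
    for (int v : arr)
        result[before[v]++] = v;                  // the position is fully determined, no comparisons
    return result;
}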

1. Bubble sort (Bubble sort)

Bubble sort is a simple sorting algorithm. It repeatedly passes over the sequence to be sorted, compares two adjacent elements at a time, and swaps them if they are in the wrong order. The passes are repeated until no more swaps are needed, which means the sequence is sorted. The algorithm gets its name because smaller elements slowly "float" to the top of the sequence through these swaps.

1.1 Algorithm Description
    • Compare adjacent elements. If the first is larger than the second, swap the two;
    • Do the same for every pair of adjacent elements, from the first pair to the last pair, so that the last element ends up being the largest;
    • Repeat the above steps for all elements except the last one;
    • Keep repeating until the sort is complete.

1.2 Animated Demonstration

1.3 Code implementation
/**
 * Bubble sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] bubbleSort(int[] array) {
    if (array.length == 0)
        return array;
    for (int i = 0; i < array.length; i++)
        for (int j = 0; j < array.length - 1 - i; j++)
            if (array[j + 1] < array[j]) {
                int temp = array[j + 1];
                array[j + 1] = array[j];
                array[j] = temp;
            }
    return array;
}
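A minimal driver sketch (an assumed example, not part of the original article) showing how the static sort methods in this article can be invoked; the bubbleSort body is copied from the snippet above so the example compiles on its own:

// Hypothetical driver class; the other sort methods in this article are called the same way.
import java.util.Arrays;

public class SortDemo {
    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        System.out.println(Arrays.toString(bubbleSort(data)));  // prints [1, 2, 4, 5, 8]
    }

    // Copied from the snippet above so this example is self-contained.
    public static int[] bubbleSort(int[] array) {
        for (int i = 0; i < array.length; i++)
            for (int j = 0; j < array.length - 1 - i; j++)
                if (array[j + 1] < array[j]) {
                    int temp = array[j + 1];
                    array[j + 1] = array[j];
                    array[j] = temp;
                }
        return array;
    }
}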
1.4 Algorithm Analysis

Best case: T(n) = O(n); Worst case: T(n) = O(n²); Average case: T(n) = O(n²)
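The O(n) best case assumes an early-exit check that the snippet above omits; a hedged sketch of that common variant (an assumed addition, not the article's code) looks like this:

// Bubble sort with an early-exit flag (assumed variant): if a full pass makes no swap,
// the array is already sorted and the loop stops, giving O(n) on already-sorted input.
public static int[] bubbleSortWithFlag(int[] array) {
    for (int i = 0; i < array.length; i++) {
        boolean swapped = false;
        for (int j = 0; j < array.length - 1 - i; j++)
            if (array[j + 1] < array[j]) {
                int temp = array[j + 1];
                array[j + 1] = array[j];
                array[j] = temp;
                swapped = true;
            }
        if (!swapped)
            break;   // no swaps in this pass: already sorted
    }
    return array;
}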

2. Selection sort (Selection sort)

Selection sort is one of the most consistent sorting algorithms in terms of performance: whatever data goes in, the time complexity is O(n²), so the smaller the data size, the better when it is used. Its only advantage may be that it does not take up extra memory. In theory, selection sort is probably also the sorting method that most people think of first.

Selection sort (selection-sort) is a simple and intuitive sorting algorithm. It works by first finding the smallest (or largest) element in the unsorted sequence and placing it at the start of the sorted sequence, then repeatedly finding the smallest (or largest) element among the remaining unsorted elements and appending it to the end of the sorted sequence, and so on, until all elements are sorted.

2.1 Algorithm Description

Direct selection sort of n records produces an ordered result after n-1 selection passes. The algorithm is described as follows:

    • Initial state: the unordered region is R[1..n], and the ordered region is empty;
    • At the start of the i-th pass (i = 1, 2, ..., n-1), the current ordered and unordered regions are R[1..i-1] and R[i..n] respectively. This pass selects the record R[k] with the smallest key from the current unordered region and swaps it with R[i], the first record of the unordered region, so that R[1..i] and R[i+1..n] become the new ordered region (with one more record) and the new unordered region (with one fewer record);
    • After n-1 passes, the array is ordered.
2.2 Animated Demonstration

2.3 Code Implementation
/**
 * Selection sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] selectionSort(int[] array) {
    if (array.length == 0)
        return array;
    for (int i = 0; i < array.length; i++) {
        int minIndex = i;
        for (int j = i; j < array.length; j++) {
            if (array[j] < array[minIndex]) // find the smallest element
                minIndex = j;               // remember the index of the smallest element
        }
        int temp = array[minIndex];
        array[minIndex] = array[i];
        array[i] = temp;
    }
    return array;
}
2.4 Algorithm Analysis

Best case: T(n) = O(n²); Worst case: T(n) = O(n²); Average case: T(n) = O(n²)

3. Insertion sort (Insertion sort)

Insertion sort (insertion-sort) is a simple and intuitive sorting algorithm. It works by building an ordered sequence: for each unsorted element, it scans the already-sorted sequence from back to front, finds the appropriate position, and inserts the element there. Insertion sort is usually implemented in-place (that is, using only O(1) extra space); during the back-to-front scan, the sorted elements are shifted backward one step at a time to make room for the new element.

3.1 Algorithm Description

In general, insertion sort is implemented in-place on the array. The algorithm is described as follows:

    • Start with the first element, which can be considered already sorted;
    • Take the next element and scan the already-sorted sequence from back to front;
    • If the (sorted) element is greater than the new element, move it one position back;
    • Repeat step 3 until a sorted element is found that is less than or equal to the new element;
    • Insert the new element after that position;
    • Repeat steps 2-5.
3.2 Animated Demonstration

3.3 Code Implementation
/**
 * Insertion sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] insertionSort(int[] array) {
    if (array.length == 0)
        return array;
    int current;
    for (int i = 0; i < array.length - 1; i++) {
        current = array[i + 1];
        int preIndex = i;
        while (preIndex >= 0 && current < array[preIndex]) {
            array[preIndex + 1] = array[preIndex];
            preIndex--;
        }
        array[preIndex + 1] = current;
    }
    return array;
}
3.4 Algorithm Analysis

Best case: T(n) = O(n); Worst case: T(n) = O(n²); Average case: T(n) = O(n²)

4. Shell sort (Shell Sort)

Shell sort is a sorting algorithm proposed by Donald Shell in 1959. It is a variant of insertion sort: a more efficient, improved version of simple insertion sort, also known as diminishing increment sort, and it was one of the first algorithms to break through O(n²). It differs from insertion sort in that it compares elements that are far apart first.

Shell sort groups the records by a certain increment of the index and sorts each group with direct insertion sort. As the increment gradually decreases, each group contains more and more elements; when the increment reaches 1, the whole sequence becomes a single group and the algorithm terminates.

4.1 Algorithm Description

Let us look at the basic steps of Shell sort. Here we choose the increment gap = length/2 and keep shrinking it by gap = gap/2, which gives the increment sequence {n/2, (n/2)/2, ..., 1}. Choosing and proving a good increment sequence for Shell sort is a mathematical puzzle; the sequence we use is the commonly used one originally proposed by Shell, called the Shell increment, although it is in fact not optimal. The examples here use the Shell increment.
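A minimal sketch (an assumed example, not from the original article) that prints the Shell increment sequence for a hypothetical array length:

// Prints the Shell increment sequence {n/2, n/4, ..., 1} for a given length.
public static void printShellIncrements(int length) {
    for (int gap = length / 2; gap > 0; gap /= 2) {
        System.out.print(gap + " ");
    }
    System.out.println();
}
// printShellIncrements(10) prints: 5 2 1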

First, the whole sequence of records to be sorted is divided into several sub-sequences, each of which is sorted with direct insertion sort. The algorithm is described as follows:

    • Choose an increment sequence t1, t2, ..., tk, where ti > tj for i < j, and tk = 1;
    • Sort the sequence in k passes, one pass per increment;
    • In each pass, according to the corresponding increment ti, split the sequence to be sorted into several sub-sequences of length m and apply direct insertion sort to each sub-sequence. Only when the increment factor is 1 is the whole sequence treated as a single table, whose length is the length of the entire sequence.
4.2 Process Demonstration

4.3 Code implementation
/**
 * Shell sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] shellSort(int[] array) {
    int len = array.length;
    int temp, gap = len / 2;
    while (gap > 0) {
        for (int i = gap; i < len; i++) {
            temp = array[i];
            int preIndex = i - gap;
            while (preIndex >= 0 && array[preIndex] > temp) {
                array[preIndex + gap] = array[preIndex];
                preIndex -= gap;
            }
            array[preIndex + gap] = temp;
        }
        gap /= 2;
    }
    return array;
}
4.4 Algorithm Analysis

Best case: T(n) = O(n log² n); Worst case: T(n) = O(n log² n); Average case: T(n) = O(n log² n)

5. Merge sort (Merge sort)

Like selection sort, merge sort's performance is not affected by the input data, but it behaves much better, since its time complexity is always O(n log n). The cost is that it requires additional memory.

Merge sort is an efficient sorting algorithm built on the merge operation. It is a very typical application of divide and conquer, and it is a stable sorting method. Already-ordered sub-sequences are merged to obtain a completely ordered sequence; that is, first make each sub-sequence ordered, then make the whole sequence of sub-sequences ordered. Merging two ordered lists into one ordered list is called a 2-way merge.

5.1 Algorithm Description
    • Divide the input sequence of length n into two sub-sequences of length n/2;
    • Sort the two sub-sequences recursively with merge sort;
    • Merge the two sorted sub-sequences into the final sorted sequence.
5.2 Animated Demonstration

5.3 Code Implementation
// Requires: import java.util.Arrays;

/**
 * Merge sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] mergeSort(int[] array) {
    if (array.length < 2)
        return array;
    int mid = array.length / 2;
    int[] left = Arrays.copyOfRange(array, 0, mid);
    int[] right = Arrays.copyOfRange(array, mid, array.length);
    return merge(mergeSort(left), mergeSort(right));
}

/**
 * Merge step: combines two sorted arrays into one sorted array.
 *
 * @param left  the first sorted array
 * @param right the second sorted array
 * @return the merged sorted array
 */
public static int[] merge(int[] left, int[] right) {
    int[] result = new int[left.length + right.length];
    for (int index = 0, i = 0, j = 0; index < result.length; index++) {
        if (i >= left.length)
            result[index] = right[j++];
        else if (j >= right.length)
            result[index] = left[i++];
        else if (left[i] > right[j])
            result[index] = right[j++];
        else
            result[index] = left[i++];
    }
    return result;
}
5.4 Algorithm Analysis

Best case: T(n) = O(n log n); Worst case: T(n) = O(n log n); Average case: T(n) = O(n log n)

6. Quick sort (Quick sort)

The basic idea of quick sort: partition the records to be sorted into two independent parts in one pass, such that the keys of one part are all smaller than the keys of the other part; then sort the two parts separately in the same way, so that the whole sequence becomes ordered.

6.1 Algorithm Description

Quick sort uses divide and conquer to split a list into two sub-lists. The algorithm is described as follows:

    • Pick an element from the sequence, called the "pivot";
    • Reorder the sequence so that all elements smaller than the pivot are placed before it and all elements larger than the pivot are placed after it (equal elements can go to either side). After this partition, the pivot is in its final position. This is called the partition operation;
    • Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot.
6.2 Animated Demonstration

6.3 Code Implementation
/**
 * Quick sort
 *
 * @param array the array to sort
 * @param start the first index of the range to sort
 * @param end   the last index of the range to sort
 * @return the sorted array
 */
public static int[] quickSort(int[] array, int start, int end) {
    if (array.length < 1 || start < 0 || end >= array.length || start > end)
        return null;
    int smallIndex = partition(array, start, end);
    if (smallIndex > start)
        quickSort(array, start, smallIndex - 1);
    if (smallIndex < end)
        quickSort(array, smallIndex + 1, end);
    return array;
}

/**
 * Quick sort: partition step.
 *
 * @param array the array being sorted
 * @param start the first index of the range
 * @param end   the last index of the range
 * @return the final index of the pivot
 */
public static int partition(int[] array, int start, int end) {
    // choose a random pivot index and move the pivot to the end of the range
    int pivot = (int) (start + Math.random() * (end - start + 1));
    int smallIndex = start - 1;
    swap(array, pivot, end);
    for (int i = start; i <= end; i++)
        if (array[i] <= array[end]) {
            smallIndex++;
            if (i > smallIndex)
                swap(array, i, smallIndex);
        }
    return smallIndex;
}

/**
 * Swaps two elements of an array.
 *
 * @param array the array
 * @param i     the first index
 * @param j     the second index
 */
public static void swap(int[] array, int i, int j) {
    int temp = array[i];
    array[i] = array[j];
    array[j] = temp;
}
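A short usage sketch (an assumed example): the method sorts the inclusive range [start, end], so a full-array sort is invoked with 0 and array.length - 1:

// Sort the whole array (assumed example).
int[] data = {3, 7, 1, 9, 4};
quickSort(data, 0, data.length - 1);   // data is now {1, 3, 4, 7, 9}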
6.4 Algorithm Analysis

Best case: T(n) = O(n log n); Worst case: T(n) = O(n²); Average case: T(n) = O(n log n)

7. Heap sort (heap sort)

Heap sort (heapsort) is a sorting algorithm designed around the heap data structure. A heap is a structure that approximates a complete binary tree while satisfying the heap property: the key value or index of a child node is always less than (or always greater than) that of its parent node.

7.1 Algorithm Description
    • Build the initial key sequence to be sorted (R1, R2, ..., Rn) into a max-heap; this is the initial unordered region;
    • Swap the heap top element R[1] with the last element R[n]; this yields a new unordered region (R1, R2, ..., Rn-1) and a new ordered region (Rn), with R[1..n-1] <= R[n];
    • Since the new heap top R[1] may violate the heap property, adjust the current unordered region (R1, R2, ..., Rn-1) back into a heap, then again swap R[1] with the last element of the unordered region, yielding a new unordered region (R1, R2, ..., Rn-2) and a new ordered region (Rn-1, Rn). Repeat this process until the ordered region contains n-1 elements, at which point the whole sort is complete.
7.2 Animated Demonstration

7.3 Code Implementation

Note: this relies on properties of the complete binary tree; see "Data Structure binary tree Knowledge Point Summary" for details.

Declare a global variable to record the length of the array:

static int len;

/**
 * Heap sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] heapSort(int[] array) {
    len = array.length;
    if (len < 1)
        return array;
    // 1. build a max-heap
    buildMaxHeap(array);
    // 2. repeatedly swap the heap top (maximum) with the last element of the heap,
    //    shrink the heap, and re-adjust the max-heap
    while (len > 0) {
        swap(array, 0, len - 1);   // uses the swap() helper from the quick sort section
        len--;
        adjustHeap(array, 0);
    }
    return array;
}

/**
 * Builds the max-heap.
 *
 * @param array the array to heapify
 */
public static void buildMaxHeap(int[] array) {
    // start from the last non-leaf node and adjust each subtree downward
    for (int i = (len - 1) / 2; i >= 0; i--) {
        adjustHeap(array, i);
    }
}

/**
 * Adjusts the subtree rooted at i so that it satisfies the max-heap property.
 * (This snippet indexes the children of node i as 2*i and 2*i + 1.)
 *
 * @param array the array being heapified
 * @param i     the index of the subtree root
 */
public static void adjustHeap(int[] array, int i) {
    int maxIndex = i;
    // if the left child exists and is greater than the parent, point the max index at it
    if (i * 2 < len && array[i * 2] > array[maxIndex])
        maxIndex = i * 2;
    // if the right child exists and is greater than the current maximum, point the max index at it
    if (i * 2 + 1 < len && array[i * 2 + 1] > array[maxIndex])
        maxIndex = i * 2 + 1;
    // if the parent is not the maximum, swap it with the maximum and recursively adjust that subtree
    if (maxIndex != i) {
        swap(array, maxIndex, i);
        adjustHeap(array, maxIndex);
    }
}
7.4 Algorithm Analysis

Best case: T(n) = O(n log n); Worst case: T(n) = O(n log n); Average case: T(n) = O(n log n)

8. Counting sort (Counting sort)

The core of counting sort is to convert input values into keys stored in additional array space. As a linear-time sorting algorithm, counting sort requires the input data to be integers within a definite range.

Counting sort is a stable sorting algorithm. It uses an extra array C in which the i-th element is the number of elements in the array A to be sorted whose value equals i. The elements of A are then placed into their correct positions according to the array C. It can only sort integers.

8.1 Algorithm Description
    • Find the largest and the smallest element in the array to be sorted;
    • Count the number of occurrences of each element with value i and store it in the i-th entry of array C;
    • Accumulate the counts (starting from the first element of C, add each entry to the previous one);
    • Fill the target array in reverse: place each element i at position C(i) of the new array, subtracting 1 from C(i) for each element placed (a sketch of this stable variant follows this list).
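The code in 8.3 uses a simpler scheme that rewrites the array bucket by bucket; a hedged sketch of the stable prefix-sum variant that the steps above describe (an assumed implementation, not the article's code) is:

// Stable counting sort using prefix sums and a reverse fill (assumed sketch).
public static int[] countingSortStable(int[] array) {
    if (array.length == 0)
        return array;
    int min = array[0], max = array[0];
    for (int v : array) {                        // find the value range
        if (v > max) max = v;
        if (v < min) min = v;
    }
    int[] count = new int[max - min + 1];
    for (int v : array)
        count[v - min]++;                        // step 2: count occurrences
    for (int i = 1; i < count.length; i++)
        count[i] += count[i - 1];                // step 3: prefix sums give end positions
    int[] output = new int[array.length];
    for (int i = array.length - 1; i >= 0; i--)  // step 4: reverse fill keeps equal keys stable
        output[--count[array[i] - min]] = array[i];
    return output;
}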
8.2 Animated Demonstration

8.3 Code implementation
// Requires: import java.util.Arrays;

/**
 * Counting sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] countingSort(int[] array) {
    if (array.length == 0)
        return array;
    int bias, min = array[0], max = array[0];
    for (int i = 1; i < array.length; i++) {
        if (array[i] > max)
            max = array[i];
        if (array[i] < min)
            min = array[i];
    }
    bias = 0 - min;
    int[] bucket = new int[max - min + 1];
    Arrays.fill(bucket, 0);
    for (int i = 0; i < array.length; i++) {
        bucket[array[i] + bias]++;
    }
    int index = 0, i = 0;
    while (index < array.length) {
        if (bucket[i] != 0) {
            array[index] = i - bias;
            bucket[i]--;
            index++;
        } else
            i++;
    }
    return array;
}
8.4 Algorithm Analysis

When the input consists of n integers between 0 and k, the running time is O(n + k). Counting sort is not a comparison sort, so it can be faster than any comparison-based sorting algorithm. Because the length of the counting array C depends on the range of the data to be sorted (the difference between the maximum and minimum values plus 1), counting sort requires a great deal of time and memory for arrays with a large data range.

Best case: T(n) = O(n + k); Worst case: T(n) = O(n + k); Average case: T(n) = O(n + k)

9. Bucket sort (Bucket sort)

Bucket sort is an upgraded version of counting sort. It makes use of a mapping function, and the key to its efficiency is how that mapping function is chosen.

How bucket sort works: assuming the input data is uniformly distributed, the data is divided into a limited number of buckets, and each bucket is sorted separately (possibly with a different sorting algorithm, or by continuing to apply bucket sort recursively).

9.1 Algorithm Description
    • Manually set a bucketSize as the number of different values each bucket can hold (for example, when bucketSize == 5, a bucket can hold the values {1, 2, 3, 4, 5}, but its capacity is unlimited, so it can store one hundred 3s);
    • Traverse the input data and put each value into its corresponding bucket;
    • Sort each non-empty bucket, either with another sorting method or by using bucket sort recursively;
    • Concatenate the sorted data from the non-empty buckets.

Note that if bucket sort is used recursively to sort the buckets and the number of buckets is 1, bucketSize must be reduced manually so that the next round produces more buckets; otherwise the recursion never terminates and eventually causes a memory overflow.

9.2 Image Demonstration

9.3 Code Implementation
// Requires: import java.util.ArrayList;

/**
 * Bucket sort
 *
 * @param array      the list to sort
 * @param bucketSize the number of distinct values each bucket covers
 * @return the sorted list
 */
public static ArrayList<Integer> bucketSort(ArrayList<Integer> array, int bucketSize) {
    if (array == null || array.size() < 2)
        return array;
    int max = array.get(0), min = array.get(0);
    // find the maximum and minimum values
    for (int i = 0; i < array.size(); i++) {
        if (array.get(i) > max)
            max = array.get(i);
        if (array.get(i) < min)
            min = array.get(i);
    }
    int bucketCount = (max - min) / bucketSize + 1;
    ArrayList<ArrayList<Integer>> bucketArr = new ArrayList<>(bucketCount);
    ArrayList<Integer> resultArr = new ArrayList<>();
    for (int i = 0; i < bucketCount; i++) {
        bucketArr.add(new ArrayList<Integer>());
    }
    for (int i = 0; i < array.size(); i++) {
        bucketArr.get((array.get(i) - min) / bucketSize).add(array.get(i));
    }
    for (int i = 0; i < bucketCount; i++) {
        if (bucketCount == 1)
            bucketSize--;
        ArrayList<Integer> temp = bucketSort(bucketArr.get(i), bucketSize);
        for (int j = 0; j < temp.size(); j++)
            resultArr.add(temp.get(j));
    }
    return resultArr;
}
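A short usage sketch (an assumed example; the initial bucketSize of 5 is an arbitrary choice, and java.util.ArrayList and java.util.Arrays are assumed to be imported):

// Build the input list and sort it (assumed example).
ArrayList<Integer> list = new ArrayList<>(Arrays.asList(29, 25, 3, 49, 9, 37, 21, 43));
ArrayList<Integer> sorted = bucketSort(list, 5);
System.out.println(sorted);   // [3, 9, 21, 25, 29, 37, 43, 49]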
9.4 Algorithm Analysis

Bucket sort achieves linear time O(n) in the best case. Its time complexity depends on the complexity of sorting the data inside each bucket, since the other parts contribute O(n). Clearly, the smaller the buckets, the less data per bucket and the less time spent sorting it, but the corresponding space consumption increases.

Best case: T(n) = O(n + k); Worst case: T(n) = O(n²); Average case: T(n) = O(n + k)

10. Radix sort (Radix sort)

Radix sort is also a non-comparison sorting algorithm. It sorts on each digit, starting from the lowest digit, with complexity O(kn), where n is the array length and k is the maximum number of digits among the elements.

Radix sort first sorts by the lowest digit and collects the results, then sorts by the next higher digit and collects again, and so on up to the highest digit. Sometimes the attributes to sort by have priorities: sort by the low-priority attribute first, then by the high-priority one. The final order puts higher values of the high-priority attribute first; among equal high-priority values, higher values of the low-priority attribute come first. Because radix sort sorts and collects digit by digit, it is stable.
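As a small illustration (an assumed helper, not from the original article) of how a single decimal digit is extracted with the (value % mod) / div expression used by the code in 10.3:

// Extract the digit of 'value' selected by mod and div (assumed helper):
// mod = 10, div = 1 gives the ones digit; mod = 100, div = 10 the tens digit, and so on.
public static int digitOf(int value, int mod, int div) {
    return (value % mod) / div;
}
// digitOf(357, 10, 1) == 7, digitOf(357, 100, 10) == 5, digitOf(357, 1000, 100) == 3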

10.1 Algorithm Description
    • Find the maximum number in the array and its number of digits;
    • Starting from the lowest digit, take the corresponding digit of each element of arr to form the radix for that pass;
    • Counting-sort the elements by that radix (counting sort fits here because each digit covers a small range).
10.2 Animated Demonstration

10.3 Code Implementation
// Requires: import java.util.ArrayList;

/**
 * Radix sort
 *
 * @param array the array to sort
 * @return the sorted array
 */
public static int[] radixSort(int[] array) {
    if (array == null || array.length < 2)
        return array;
    // 1. find the number of digits of the maximum value
    int max = array[0];
    for (int i = 1; i < array.length; i++) {
        max = Math.max(max, array[i]);
    }
    int maxDigit = 0;
    while (max != 0) {
        max /= 10;
        maxDigit++;
    }
    int mod = 10, div = 1;
    ArrayList<ArrayList<Integer>> bucketList = new ArrayList<ArrayList<Integer>>();
    for (int i = 0; i < 10; i++)
        bucketList.add(new ArrayList<Integer>());
    for (int i = 0; i < maxDigit; i++, mod *= 10, div *= 10) {
        // distribute elements into buckets by the current digit
        for (int j = 0; j < array.length; j++) {
            int num = (array[j] % mod) / div;
            bucketList.get(num).add(array[j]);
        }
        // collect elements back from the buckets in order
        int index = 0;
        for (int j = 0; j < bucketList.size(); j++) {
            for (int k = 0; k < bucketList.get(j).size(); k++)
                array[index++] = bucketList.get(j).get(k);
            bucketList.get(j).clear();
        }
    }
    return array;
}
10.4 Algorithm Analysis

Best case: T(n) = O(n * k); Worst case: T(n) = O(n * k); Average case: T(n) = O(n * k)

Radix sort can proceed in two ways:

    • MSD: sorting starts from the most significant (highest) digit;
    • LSD: sorting starts from the least significant (lowest) digit.

Radix sort vs. counting sort vs. bucket sort

These three sorting algorithms all use the concept of buckets, but they differ significantly in how the buckets are used:

    • Radix sort: allocates buckets based on each digit of the key value;
    • Counting sort: each bucket stores only a single key value;
    • Bucket sort: each bucket stores a range of values.
