Data Structures: The Eight Sorting Algorithms and Their Stability


Sorting algorithms can be classified as stable or unstable. Put simply, if A[i] == A[j] and A[i] appears before A[j] in the input, then a stable sort keeps A[i] before A[j] in the output. Stability matters when sorting by one key and then by another: the order produced by the first sort is preserved among elements that tie on the second key. Radix sort relies on exactly this property: it sorts by the low digit first and then by the high digit, and elements that share the same high digit keep the relative order established by the low-digit pass. In addition, for comparison-based algorithms, a stable sort may perform fewer element exchanges (the author's personal impression, not confirmed).
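As a small sketch (not part of the original article), the definition above can be observed in Java by tagging each key with its original position before sorting. `java.util.Arrays.sort` on object arrays is documented to be stable, so equal keys keep their input order:

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    // Tag each key with its original index so we can observe, after sorting,
    // whether equal keys kept their input order.
    static int[][] sortStable(int[] keys) {
        int[][] tagged = new int[keys.length][2];
        for (int i = 0; i < keys.length; i++) {
            tagged[i][0] = keys[i]; // the key
            tagged[i][1] = i;       // its original position
        }
        // Arrays.sort on objects is stable, so ties keep their original order.
        Arrays.sort(tagged, Comparator.comparingInt(p -> p[0]));
        return tagged;
    }

    public static void main(String[] args) {
        // The two 5s come out with original indices 0 then 2: order preserved.
        System.out.println(Arrays.deepToString(sortStable(new int[]{5, 8, 5, 2, 9})));
    }
}
```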

Returning to the topic, let us analyze the stability of the common sorting algorithms, giving a brief reason for each.

(1) Bubble sort

Bubble sort moves small elements forward and large elements backward. Comparisons are made between adjacent elements, and exchanges also happen only between adjacent elements. If two adjacent elements are equal, there is no reason to swap them; and if two equal elements are not adjacent, they can only become adjacent through swaps with other elements, which never changes their relative order. So bubble sort is a stable sorting algorithm.

(2) Selection sort

Selection sort picks the smallest remaining element for each position: the smallest element for the first position, the second smallest among the remaining elements for the second position, and so on up to the (n-1)-th position; the n-th needs no selection because only the largest element is left. Stability breaks when the minimum found is swapped over an element equal to the one currently in place. For example, in the sequence 5 8 5 2 9, the first pass swaps the leading 5 with 2, carrying it past the other 5, so the relative order of the two 5s in the original sequence is destroyed. Selection sort is therefore not a stable sorting algorithm.
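The 5 8 5 2 9 example can be reproduced with a minimal sketch (not from the original article): a plain selection sort over (key, original position) pairs, so the final positions of the two 5s are visible.

```java
import java.util.Arrays;

public class SelectSortStability {
    // Plain selection sort on (key, original position) pairs, comparing keys only.
    static int[][] selectSort(int[] keys) {
        int[][] a = new int[keys.length][2];
        for (int i = 0; i < keys.length; i++) { a[i][0] = keys[i]; a[i][1] = i; }
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j][0] < a[min][0]) min = j;
            // This long-distance swap is what breaks stability.
            int[] t = a[i]; a[i] = a[min]; a[min] = t;
        }
        return a;
    }

    public static void main(String[] args) {
        // The first pass swaps the leading 5 (index 0) with 2, carrying it past
        // the other 5 (index 2); the two equal keys end up in reversed order.
        System.out.println(Arrays.deepToString(selectSort(new int[]{5, 8, 5, 2, 9})));
    }
}
```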

(3) Insertion sort
Insertion sort inserts one element at a time into an already ordered prefix; at the start, that prefix contains only the first element. The comparison starts from the end of the ordered prefix: if the element to be inserted is no smaller than the last ordered element, it is placed right after it; otherwise the scan continues until the insertion position is found. When an element equal to the one being inserted is encountered, the new element is placed after it, so the order of equal elements, which is their order in the original input, is preserved. Insertion sort is stable.

(4) Quick sort
Quick sort scans from both ends. The left index i moves right while a[i] <= a[center_index], where center_index is the index of the pivot (usually the 0th element); the right index j moves left while a[j] > a[center_index]. When neither index can move and i <= j, a[i] and a[j] are exchanged, and the process repeats until i > j; finally a[j] is exchanged with a[center_index] to complete one partition. That final exchange of the pivot with a[j] can easily disrupt the order of earlier equal elements. For example, in the sequence 5 3 3 4 3 8 9 10 11, exchanging the pivot 5 with the third 3 (the 5th element, counting from 1) carries that 3 past the other two, so quick sort is an unstable sorting algorithm; the instability arises exactly at the exchange of the pivot with a[j].
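The effect can be demonstrated with a sketch (not from the original article) that runs the pit-filling partition described later in this post over (key, original position) pairs, so the movement of the equal 3s is observable:

```java
import java.util.Arrays;

public class QuickSortStability {
    // Quick sort over (key, original position) pairs, comparing keys only,
    // using the "dig the pit, fill the pit" partition from the article.
    static int[][] quickSort(int[] keys) {
        int[][] a = new int[keys.length][2];
        for (int i = 0; i < keys.length; i++) { a[i][0] = keys[i]; a[i][1] = i; }
        sort(a, 0, a.length - 1);
        return a;
    }

    private static void sort(int[][] a, int start, int end) {
        if (start >= end) return;
        int i = start, j = end;
        int[] x = a[i];                       // the pivot leaves a "pit" at a[start]
        while (i < j) {
            while (i < j && x[0] < a[j][0]) j--;
            if (i < j) a[i++] = a[j];         // fill the pit from the right
            while (i < j && x[0] > a[i][0]) i++;
            if (i < j) a[j--] = a[i];         // fill the new pit from the left
        }
        a[i] = x;                             // this long jump breaks stability
        sort(a, start, i - 1);
        sort(a, i + 1, end);
    }

    public static void main(String[] args) {
        // The three 3s do not come out in their original 1, 2, 4 index order.
        System.out.println(Arrays.deepToString(
                quickSort(new int[]{5, 3, 3, 4, 3, 8, 9, 10, 11})));
    }
}
```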

(5) Merge sort
Merge sort recursively splits a sequence into short sequences; the recursion bottoms out when a short sequence has only one element (already sorted) or two elements (one comparison and possibly one exchange). The ordered segments are then merged into longer ordered segments until the whole original sequence is sorted. With one or two elements, equal elements are never exchanged, so stability is not damaged there. Is stability damaged while merging two short ordered runs? No: during the merge, whenever the two current elements are equal, we take the element from the earlier run first, which preserves the original order of equal elements. Merge sort is therefore a stable sorting algorithm.
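A minimal sketch of that merge step (not from the original article) makes the tie-breaking rule concrete: the `<=` comparison is exactly what takes the element from the earlier run first.

```java
import java.util.Arrays;

public class StableMerge {
    // Merge two adjacent ordered runs a[left..mid] and a[mid+1..right].
    // On ties, the element from the left (earlier) run is taken first,
    // which is the choice that makes merge sort stable.
    static void merge(int[] a, int left, int mid, int right) {
        int[] buf = new int[right - left + 1];
        int i = left, j = mid + 1, k = 0;
        while (i <= mid && j <= right)
            buf[k++] = (a[i] <= a[j]) ? a[i++] : a[j++]; // <= preserves stability
        while (i <= mid) buf[k++] = a[i++];   // drain whatever run remains
        while (j <= right) buf[k++] = a[j++];
        System.arraycopy(buf, 0, a, left, buf.length);
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 9, 1, 5, 8};   // two ordered runs: [2,5,9] and [1,5,8]
        merge(a, 0, 2, 5);
        System.out.println(Arrays.toString(a)); // [1, 2, 5, 5, 8, 9]
    }
}
```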

(6) Radix sort
Radix sort distributes the elements by the lowest digit and collects them, then distributes by the next digit and collects, and so on up to the highest digit. Sometimes the keys have attributes with a priority order; in that case, sort by the low-priority attribute first and then by the high-priority one. The final order ranks elements by the high-priority attribute, with ties broken by the low-priority one. Because each distribute-and-collect pass is itself stable, radix sort as a whole is a stable sorting algorithm.
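A compact sketch of this idea (not from the original article, and different from the bucket-based implementation shown later) is LSD radix sort built from one stable counting pass per decimal digit:

```java
import java.util.Arrays;

public class LsdRadixSort {
    // LSD radix sort for non-negative ints: one stable counting pass per
    // decimal digit, least significant digit first.
    static void radixSort(int[] a, int maxDigits) {
        int[] out = new int[a.length];
        for (int d = 0, div = 1; d < maxDigits; d++, div *= 10) {
            int[] count = new int[10];
            for (int v : a) count[(v / div) % 10]++;
            for (int i = 1; i < 10; i++) count[i] += count[i - 1]; // prefix sums
            for (int i = a.length - 1; i >= 0; i--)  // backward pass keeps it stable
                out[--count[(a[i] / div) % 10]] = a[i];
            System.arraycopy(out, 0, a, 0, a.length);
        }
    }

    public static void main(String[] args) {
        int[] a = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(a, 3); // keys have at most three digits
        System.out.println(Arrays.toString(a)); // [2, 24, 45, 66, 75, 90, 170, 802]
    }
}
```

Because each counting pass preserves the order of elements with equal digits, elements that tie on the higher digits keep the order established by the lower-digit passes.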

(7) Shell sort
Shell sort runs insertion sorts over the elements at a sequence of decreasing gaps (step sizes). At the beginning, when the elements are unordered, the gap is largest, so each insertion sort touches very few elements and runs quickly; once the elements are roughly ordered, the gap is small, and insertion sort is very efficient on nearly sorted sequences. The overall time complexity of Shell sort is therefore better than O(n^2). Although a single insertion sort is stable and does not change the relative order of equal elements, across the different gap passes equal elements may move within their own gap subsequences, and their relative order can end up disrupted. Shell sort is therefore unstable.

(8) Heap sort
In a binary heap, node i has children 2*i and 2*i+1 (with 1-based indexing). A max-heap requires each parent to be greater than or equal to its two children; a min-heap requires it to be less than or equal to them. On a sequence of length n, heap construction starts at node n/2, choosing the maximum (max-heap) or minimum (min-heap) among a parent and its children; this choice among three elements does not by itself break stability. But as the same selection continues for parents n/2-1, n/2-2, ..., 1, stability can be broken: the n/2-th parent may swap an element downward while the (n/2-1)-th parent leaves an equal element in place, changing the relative order of the two identical elements. Heap sort is therefore not a stable sorting algorithm.

To sum up: selection sort, quick sort, Shell sort, and heap sort are not stable sorting algorithms, while bubble sort, insertion sort, merge sort, and radix sort are stable.

In fact, I think the stability of any of these algorithms ultimately depends on how the code is written.

My Java implementations of the eight sorting algorithms are as follows:

public class Sort {

    /*
     * First, recall the stability of a sorting algorithm: if A[i] == A[j]
     * and A[i] comes before A[j] in the input, a stable sort keeps A[i]
     * before A[j] in the output.
     */
    public static void main(String[] args) {
        int[] array = {4, 12, 15, 17, 8, 12, 10, 41, 22, 19, 69, 35, 68, 1};
        // int[] array = {5, 1, 7, 8, 4, 2, 3};
        // bubbleSort(array);
        // selectSort(array);
        // insertSort(array);
        // quickSort(array);
        // binarySort(array);
        // heapSort(array);
        // radixSort(array);
        shellSort(array);
        print(array);
    }

    /*
     * Bubble sort.
     * Principle: compare adjacent pairs, bubbling the larger element backward.
     * Stable; time complexity O(n^2).
     */
    public static void bubbleSort(int[] array) {
        for (int i = 0; i < array.length; i++)
            for (int j = 0; j < array.length - i - 1; j++)
                if (array[j] > array[j + 1]) {
                    int temp = array[j + 1];
                    array[j + 1] = array[j];
                    array[j] = temp;
                }
    }

    /*
     * Selection sort.
     * Principle: select the smallest remaining element for each position
     * (this variant swaps whenever a smaller element is found).
     * Unstable; time complexity O(n^2).
     */
    public static void selectSort(int[] array) {
        for (int i = 0; i < array.length; i++)
            for (int j = i; j < array.length - 1; j++) {
                if (array[i] > array[j + 1]) {
                    int temp = array[i];
                    array[i] = array[j + 1];
                    array[j + 1] = temp;
                }
            }
    }

    /*
     * Insertion sort.
     * Principle: the prefix already processed is kept ordered; each new
     * element is inserted into its place in that ordered prefix.
     * Stable; time complexity O(n^2).
     */
    public static void insertSort(int[] array) {
        for (int i = 1; i < array.length; i++)
            for (int j = i - 1; j > -1; j--) {
                if (array[j] > array[j + 1]) {
                    int temp = array[j];
                    array[j] = array[j + 1];
                    array[j + 1] = temp;
                }
            }
    }

    /*
     * Quick sort ("dig the pit, fill the pit" partitioning).
     * The pivot's slot is treated as a pit; a suitable element from the
     * other side fills it, leaving a new pit.
     * Unstable; time complexity O(n log n).
     */
    public static void quickSort(int[] array) {
        quickSort(array, 0, array.length - 1);
    }

    private static void quickSort(int[] array, int start, int end) {
        if (start < end) {
            // Elements larger than the pivot end up on the right,
            // smaller ones on the left.
            int i = start, j = end;
            int x = array[i]; // the pivot, leaving a pit at array[start]
            while (i < j) {
                while (i < j && x < array[j]) j--;
                if (i < j) array[i++] = array[j]; // fill the pit from the right
                while (i < j && x > array[i]) i++;
                if (i < j) array[j--] = array[i]; // fill the new pit from the left
            }
            array[i] = x;
            quickSort(array, start, i - 1);
            quickSort(array, i + 1, end);
        }
    }

    /*
     * Merge sort.
     * Principle: split the array in two, sort each half, then merge.
     * Stable; O(n log n) splitting, although the in-place merge used here
     * is itself O(n^2) in the worst case.
     */
    public static void binarySort(int[] array) {
        binarySort(array, 0, array.length - 1);
    }

    private static void binarySort(int[] array, int left, int right) {
        if (left >= right)             // a single element is already sorted
            return;
        else if (left + 1 == right) {  // two elements: sort directly
            if (array[left] > array[right]) {
                int temp = array[left];
                array[left] = array[right];
                array[right] = temp;
            }
        } else {
            int mid = (left + right) / 2;
            binarySort(array, left, mid);
            binarySort(array, mid + 1, right);
            merge(array, left, right, mid);
        }
    }

    // Merge two adjacent ordered runs in place by inserting each element of
    // the right run into the left run.
    private static void merge(int[] array, int left, int right, int mid) {
        for (int i = mid + 1; i <= right; i++)
            for (int j = i - 1; j >= left; j--) {
                if (array[j + 1] < array[j]) {
                    int temp = array[j + 1];
                    array[j + 1] = array[j];
                    array[j] = temp;
                }
            }
    }

    /*
     * Heap sort.
     * Principle: build a max-heap (or min-heap), then repeatedly remove the
     * root. Unstable; time complexity O(n log n). Somewhat fiddly to code,
     * but efficient.
     */
    public static void heapSort(int[] array) {
        // Build a max-heap by sifting each new element up: while it is
        // larger than its parent, the parent moves down and it moves up.
        for (int i = 1; i < array.length; i++) {
            int k = i;
            while (k > 0 && array[(k - 1) / 2] < array[k]) {
                int temp = array[k];
                array[k] = array[(k - 1) / 2];
                array[(k - 1) / 2] = temp;
                k = (k - 1) / 2;
            }
        }
        // print(array);
        // Repeatedly remove the root; to save space, the removed root is
        // stored in the slot freed at the end of the unsorted region.
        for (int i = 0; i < array.length; i++) {
            int max = array[0];
            int k = 0;
            // Walk the hole down, letting the larger child fill it.
            while (2 * k + 1 < array.length - i - 1) {
                if (array[2 * k + 1] > array[2 * k + 2]) {
                    array[k] = array[2 * k + 1];
                    k = 2 * k + 1;
                } else {
                    array[k] = array[2 * k + 2];
                    k = 2 * k + 2;
                }
            }
            // Move the last unsorted element into the hole and put the old
            // root at the end of the unsorted region.
            array[k] = array[array.length - i - 1];
            array[array.length - i - 1] = max;
            // The element moved into the hole may violate the heap property,
            // so sift it up to restore the heap.
            while (k > 0 && array[(k - 1) / 2] < array[k]) {
                int temp = array[k];
                array[k] = array[(k - 1) / 2];
                array[(k - 1) / 2] = temp;
                k = (k - 1) / 2;
            }
        }
    }

    /*
     * Radix sort, which comes in MSD and LSD variants.
     * MSD: sort from the highest digit. LSD: sort from the lowest digit.
     * Principle: distribute and collect by each digit position (units,
     * tens, ...).
     * Stable; time complexity O(n) per pass.
     * Advantage: low time complexity. Drawback: extra space is needed; the
     * more evenly the digits are distributed, the less space is wasted.
     */
    public static void radixSort(int[] array) {
        // 10 buckets of array.length slots each, since the digit
        // distribution is unknown; in the extreme case all numbers share
        // one digit and a single bucket needs array.length slots.
        int[] temp = new int[10 * array.length];
        for (int i = 0; i < temp.length; i++) // initialize
            temp[i] = 0;
        // Assumes the numbers to sort have at most two digits, and uses 0
        // as the "empty slot" marker (so the input must not contain zeros).
        // Distribute and collect by the units digit.
        for (int i = 0; i < array.length; i++) {
            int d1 = getDigit(array[i], 1);
            int offset = 0;
            while (temp[d1 * array.length + offset] != 0) {
                offset++;
            }
            temp[d1 * array.length + offset] = array[i];
        }
        int pos = 0;
        for (int i = 0; i < temp.length; i++) {
            if (temp[i] != 0) {
                array[pos] = temp[i];
                pos++;
            }
            temp[i] = 0;
        }
        // Distribute and collect by the tens digit.
        for (int i = 0; i < array.length; i++) {
            int d2 = getDigit(array[i], 2);
            int offset = 0;
            while (temp[d2 * array.length + offset] != 0) {
                offset++;
            }
            temp[d2 * array.length + offset] = array[i];
        }
        pos = 0;
        for (int i = 0; i < temp.length; i++) {
            if (temp[i] != 0) {
                array[pos] = temp[i];
                pos++;
            }
            temp[i] = 0;
        }
    }

    /**
     * Returns the digit of number at decimal position d
     * (d = 1 is the units digit).
     */
    private static int getDigit(int number, int d) {
        while (d > 1) {
            d--;
            number /= 10;
        }
        return number % 10;
    }

    /*
     * Shell sort.
     * Principle: insertion sort within gap-d subsequences, with the gap d
     * shrinking each pass. Unstable.
     */
    public static void shellSort(int[] array) {
        int d = array.length; // the gap
        while (d > 1) {       // was `while (d > 0)`, which loops forever once d hits 0
            d = d / 2;
            for (int j = d; j < array.length; j++) { // was `j += d`, which skipped most chains
                if (array[j] < array[j - d]) {
                    int temp = array[j];
                    int k = j - d;
                    while (k >= 0 && temp < array[k]) {
                        array[k + d] = array[k];
                        k -= d;
                    }
                    array[k + d] = temp;
                }
            }
        }
    }

    public static void print(int[] array) {
        for (int e : array) {
            System.out.print(e + " ");
        }
        System.out.println();
    }
}
