Sorting Algorithms in Data Structures (Eight Sorts) - Part Eight: Stability

Sorting algorithms can be divided into stable sorts and unstable sorts. Put simply, a sort is stable if, whenever a[i] = a[j] and a[i] is in front of a[j] before sorting, a[i] is still in front of a[j] after sorting. If a sorting algorithm is stable, you can sort by one key first and then by another, and the order produced by the first sort is preserved among elements that are equal under the second key. Radix sort works exactly this way: it sorts by the lowest-order digit first and then by successively higher digits, and elements whose higher digits are equal keep the relative order established by the lower digits. In addition, if a comparison-based sorting algorithm is stable, the number of element exchanges may be smaller (a personal impression, not verified).
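
To make the two-key idea concrete, here is a minimal sketch (not part of the original article) that sorts by a secondary key first and a primary key second, relying on the fact that Java's object sort (Arrays.sort with a Comparator) is a stable merge sort; the Employee class and its fields are invented purely for illustration.

import java.util.Arrays;
import java.util.Comparator;

public class TwoKeySortDemo {
    static class Employee {
        String dept; int salary;
        Employee(String dept, int salary) { this.dept = dept; this.salary = salary; }
        public String toString() { return dept + ":" + salary; }
    }

    public static void main(String[] args) {
        Employee[] staff = {
            new Employee("B", 3000), new Employee("A", 2000),
            new Employee("A", 1000), new Employee("B", 1000)
        };
        // Arrays.sort on objects is stable, so the salary order survives the dept sort.
        Arrays.sort(staff, Comparator.comparingInt((Employee e) -> e.salary)); // secondary key first
        Arrays.sort(staff, Comparator.comparing((Employee e) -> e.dept));      // primary key second
        // Stability keeps salaries ascending within each dept:
        // prints [A:1000, A:2000, B:1000, B:3000]
        System.out.println(Arrays.toString(staff));
    }
}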

Back to the topic: let us now analyze the stability of the common sorting algorithms, giving a brief reason for each.

(1) Bubble sort

Bubble sort moves small elements toward the front (or large elements toward the back). Comparisons are made between adjacent pairs of elements, and exchanges also happen only between those two elements. So if two adjacent elements are equal, there is no reason to swap them; and if two equal elements are not adjacent, then even after pairwise exchanges eventually bring them next to each other, they still will not be swapped. The relative order of equal elements therefore never changes, and bubble sort is a stable sorting algorithm.
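
As a minimal sketch of this point (a condensed version of the bubbleSort method given at the end of the article), stability comes from using a strict > comparison, so equal neighbours are never swapped:

// Bubble sort sketch: the strict ">" keeps equal elements in their original order.
static void bubbleSort(int[] a) {
    for (int i = 0; i < a.length; i++)
        for (int j = 0; j < a.length - i - 1; j++)
            if (a[j] > a[j + 1]) {            // ">=" here would destroy stability
                int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
            }
}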

(2) Selection sort

Selection sort chooses the smallest remaining element for each position: it selects the smallest element for the first position, then the second smallest among the remaining elements for the second position, and so on up to the (n-1)th position; the nth element needs no selection, because only the largest element is left. In a given pass, if the current element is swapped with a smaller element that lies behind another element equal to the current one, stability is destroyed. That sounds awkward, so take an example: in the sequence 5 8 5 2 9, the first pass selects 2 and swaps it with the first element 5, which moves that 5 behind the other 5; the relative order of the two 5s is destroyed, so selection sort is not a stable sorting algorithm.
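
A minimal sketch of the classic find-the-minimum-then-swap variant (slightly different from the selectSort method at the end of the article, which swaps repeatedly) shows where the long-distance swap breaks stability:

// Selection sort sketch: the long-distance swap is what breaks stability.
static void selectionSort(int[] a) {
    for (int i = 0; i < a.length; i++) {
        int min = i;
        for (int j = i + 1; j < a.length; j++)
            if (a[j] < a[min]) min = j;
        // Swapping a[i] with a distant smaller element can jump a[i] past an
        // equal element, e.g. 5 8 5 2 9 -> 2 8 5 5 9: the first 5 now sits
        // behind the second 5.
        int t = a[i]; a[i] = a[min]; a[min] = t;
    }
}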

(3) Insertion sort
Insertion sort inserts one element at a time into an already sorted small sequence. At the start this sorted sequence contains only one element, the first one. Comparison starts from the end of the sorted sequence: the element to be inserted is compared with the largest already-sorted element; if it is larger, it is placed directly behind it, otherwise the search continues forward until the insertion position is found. If an element equal to the one being inserted is encountered, the element being inserted is placed behind that equal element. The order of equal elements is therefore unchanged: the order they had in the original unsorted sequence is the order they keep in the sorted sequence, so insertion sort is stable.
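
A minimal sketch of the shifting variant of insertion sort (the insertSort method at the end of the article swaps adjacent elements instead, but the stability argument is the same): because the shift condition is strict, the element being inserted stops behind any equal element already in the sorted prefix:

// Insertion sort sketch: shifting only while a[k] > key means the new element
// ends up *behind* any equal element already in the sorted prefix.
static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int key = a[i];
        int k = i - 1;
        while (k >= 0 && a[k] > key) {   // "a[k] >= key" would destroy stability
            a[k + 1] = a[k];
            k--;
        }
        a[k + 1] = key;
    }
}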

(4) Quick sort  
Quick sort scans in two directions. A left index i moves right while a[i] <= a[center_index], where center_index is the subscript of the pivot element, usually taken to be element 0 of the array; a right index j moves left while a[j] > a[center_index]. When both i and j have stopped and i <= j, a[i] and a[j] are exchanged, and the process repeats until i > j. Finally a[j] is exchanged with a[center_index], which completes one partitioning pass. When the pivot and a[j] are exchanged, the relative order of earlier elements can easily be disturbed. For example, in the sequence 5 3 3 4 3 8 9 10 11, exchanging the pivot 5 with the 3 at position 5 (counting subscripts from 1) disturbs the relative order of the 3s, so quick sort is an unstable sorting algorithm; the instability arises at the moment the pivot is exchanged with a[j].
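
Below is a minimal sketch of the swap-based partition this paragraph describes, with the pivot at index 0; note that the full implementation at the end of the article uses the dig-the-pit variant instead, which is unstable for the same reason:

// Partition sketch matching the description above: pivot = a[lo].
// The final swap of the pivot with a[j] is where stability can be lost.
static int partition(int[] a, int lo, int hi) {
    int pivot = a[lo];
    int i = lo + 1, j = hi;
    while (i <= j) {
        while (i <= j && a[i] <= pivot) i++;   // left index moves right
        while (i <= j && a[j] >  pivot) j--;   // right index moves left
        if (i < j) { int t = a[i]; a[i] = a[j]; a[j] = t; }
    }
    // a[j] is the last element <= pivot; swapping it with the pivot can jump it
    // past its duplicates, e.g. 5 3 3 4 3 8 9 10 11 -> 3 3 3 4 5 8 9 10 11,
    // where the last 3 has moved in front of the other two.
    int t = a[lo]; a[lo] = a[j]; a[j] = t;
    return j;
}

static void quickSort(int[] a, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(a, lo, hi);
    quickSort(a, lo, p - 1);
    quickSort(a, p + 1, hi);
}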

(5) Merge sort
Merge sort recursively splits the sequence into shorter sequences; the recursion bottoms out when a subsequence has only 1 element (trivially sorted) or 2 elements (one comparison and, if needed, one exchange). The sorted subsequences are then merged into longer sorted sequences, over and over, until the whole original sequence is sorted. In the base cases, a single element is never exchanged, and two equal elements are not exchanged either, so stability is not broken there. Is stability broken while merging two sorted subsequences? No: when the current elements of the two subsequences are equal, we can always take the element from the earlier (left) subsequence first and place it in front in the result, which guarantees stability. Therefore merge sort is also a stable sorting algorithm.
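
A minimal sketch of a merge step that uses an auxiliary buffer (the merge method at the end of the article merges in place by insertion instead); taking from the left run on ties is exactly what preserves stability:

// Merge step sketch: "<=" takes the element of the left run first when values
// are equal, so equal elements keep their original relative order.
static void merge(int[] a, int left, int mid, int right) {
    int[] buf = new int[right - left + 1];
    int i = left, j = mid + 1, k = 0;
    while (i <= mid && j <= right)
        buf[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid)   buf[k++] = a[i++];
    while (j <= right) buf[k++] = a[j++];
    System.arraycopy(buf, 0, a, left, buf.length);
}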

(6) Radix sort
Radix sort sorts by the lowest-order digit first and collects the elements, then sorts by the next higher digit and collects again, and so on up to the highest digit. Sometimes an element has several prioritized attributes: sort by the low-priority attribute first, then by the high-priority one; the final order puts higher priority first, and elements with equal high priority are ordered by the low-priority attribute. Radix sort is built on separate per-digit sorting and collecting passes, each of which is stable, so it is a stable sorting algorithm.
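
A minimal sketch of an LSD radix sort built on a stable counting pass per digit (the radixSort method at the end of the article distributes into fixed-size buckets instead); it assumes non-negative integers:

// LSD radix sort sketch: one stable counting-sort pass per decimal digit.
static void radixSort(int[] a) {
    int max = 0;
    for (int v : a) max = Math.max(max, v);
    for (int exp = 1; max / exp > 0; exp *= 10) {
        int[] count = new int[10];
        int[] out = new int[a.length];
        for (int v : a) count[(v / exp) % 10]++;
        for (int d = 1; d < 10; d++) count[d] += count[d - 1];
        for (int i = a.length - 1; i >= 0; i--)      // right-to-left keeps the pass stable
            out[--count[(a[i] / exp) % 10]] = a[i];
        System.arraycopy(out, 0, a, 0, a.length);
    }
}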

(7) Shell sort
Shell sort performs insertion sorts on elements taken at different gap sizes (steps). At the beginning, when the elements are very disordered, the gap is largest, so each insertion sort handles very few elements and is fast; once the elements are roughly in order, the gap is small, and insertion sort is very efficient on nearly sorted sequences. So the time complexity of Shell sort is somewhat better than O(n^2). As for stability: a single insertion sort is stable and does not change the relative order of equal elements, but across the different passes with different gaps, equal elements may each move within their own gapped subsequence, and their relative order can end up disturbed, so Shell sort is unstable.
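
A minimal sketch of Shell sort with a simple halving gap sequence (the gap sequence here is an illustrative choice, not a recommendation); equal elements can land in different gapped subsequences and leapfrog each other, which is the source of the instability described above:

// Shell sort sketch: gapped insertion sort with a shrinking gap.
static void shellSort(int[] a) {
    for (int gap = a.length / 2; gap > 0; gap /= 2)
        for (int i = gap; i < a.length; i++) {
            int key = a[i];
            int k = i - gap;
            while (k >= 0 && a[k] > key) {
                a[k + gap] = a[k];
                k -= gap;
            }
            a[k + gap] = key;
        }
}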

(8) Heap sort
We know that in a heap the children of node i are nodes 2i and 2i+1 (numbering nodes from 1), that a max-heap requires every parent to be greater than or equal to its two children, and that a min-heap requires every parent to be less than or equal to them. For a sequence of length n, building the heap starts at node n/2 and selects the largest (max-heap) or smallest (min-heap) among that node and its children; choosing among these three elements does not, by itself, break stability. But when the same selection is made for the parents n/2 - 1, n/2 - 2, ..., 1, stability can be broken: the exchange at node n/2 may pull a later copy of an element downward, while the exchange at node n/2 - 1 leaves an earlier identical element where it is, so the relative order of the two identical elements is destroyed. Therefore heap sort is not a stable sorting algorithm.
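
A minimal sketch of heap sort using 0-based indexing, where the children of node i are 2*i+1 and 2*i+2 (the paragraph above uses 1-based numbering); the long-distance moves in the sift-down and in the root/last-element swap are what can reorder equal elements:

// Heap sort sketch: build a max-heap, then repeatedly move the root to the end.
static void heapSort(int[] a) {
    for (int i = a.length / 2 - 1; i >= 0; i--) siftDown(a, i, a.length);
    for (int end = a.length - 1; end > 0; end--) {
        int t = a[0]; a[0] = a[end]; a[end] = t;   // move current max to the end
        siftDown(a, 0, end);
    }
}

static void siftDown(int[] a, int i, int size) {
    while (2 * i + 1 < size) {
        int child = 2 * i + 1;
        if (child + 1 < size && a[child + 1] > a[child]) child++;
        if (a[i] >= a[child]) break;
        int t = a[i]; a[i] = a[child]; a[child] = t;
        i = child;
    }
}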

In conclusion: selection sort, quick sort, Shell sort, and heap sort are not stable sorting algorithms, while bubble sort, insertion sort, merge sort, and radix sort are stable sorting algorithms.

In fact, I think whether an algorithm is stable ultimately comes down to the code you write: the same algorithm can be implemented in either a stable or an unstable way.

Here are the eight sorting algorithms that I implemented in Java:

<span style= "FONT-SIZE:12PX;" >public class sort{/* * First of all, the stability of the sorting algorithm should be known, popularly speaking is to ensure that the first 2 equal number of the sequence before and after the order of the sequences and their two before and after the order of the same position. * In a simple formalization, if a[i] = A[j],a[i] Originally before the position, after sorting a[i] or in the a[j] position before this can be called stable sorting. */public static void Main (string[] args) {int[] array ={4, 12, 15, 17, 8, 12, 10, 41, 22, 19, 69, 35, 68, 1};//{5,1,7,8,4 , 2,3};//Bubblesort (array);//Selectsort (array);//Insertsort (array);//quicksort (array);//binarysort (array);// Heapsort (array);//radixsort (array); Shellsort (array);p rint (array);} /* * Bubble sort * Principle: 22 comparison, big back bubbling * stable sort: Time complexity O (n^2) */public static void Bubblesort (int[] array) {for (int i = 0; i < ARRA Y.length; i++) for (int j = 0; J < array.length-i-1; j + +) if (Array[j] > array[j + 1]) {int temp = array[j + 1];array[j + 1] = Array[j];array[j] = temp;}}  /* Select sort * Principle: always select a minimum or maximum value as the element to be sorted * Unstable ordering: Time complexity O (n^2) */public static void Selectsort (int[] array) {for (int i = 0; i < Array.Length; i++) for (int j = i; J < Array.length-1; J + +) {if (Array[i] > array[j + 1]) {int temp = Array[i];array[i] = array[j + 1];array[j + 1] = temp;}}} /* Insert Sort * Principle: sorted array is an ordered sequence, inserting the element to be sorted into the ordered sequence * stable ordering: Time complexity O (n^2) */public static void Insertsort (int[] array) {for (int i = 1; i < Array.Length; i++) for (int j = i-1; j >-1; j--) {if (Array[j] > array[j + 1]) {int temp = Array[j];array[j] = array[j + 1];array[ J + 1] = temp;}}} /* * Quick sort * Principle: dig pit filling + divide and conquer method. 
The position of the number of centers is treated as a pit, and the appropriate number is filled into the pit, forming a new pit * unstable sort: Time complexity O (NLGN) */public static void QuickSort (int[] array) {quickSort (array, 0, ARRAY.LENGTH-1);} private static void QuickSort (int[] array, int start, int end) {if (Start < end) {//select hub element, placed on the right side, larger than central element, anyway left int i = Star T, j = end;int x = Array[i];while (i < J) {//Start comparison at end (I < J && x < array[j]) J--;if (i < j) array [i++] = Array[j];while (i < j && x > Array[i]) i++;if (i < j) array[j--] = Array[i];} Array[i] = x;quicksort (array, start, i-1) QuickSort (array, i + 1, end);}} /* Merge Sort * principle: binary sort, first array binary, then merge sort * Stable sort: time complexity (NLGN) */public STAtic void Binarysort (int []array) {binarysort (array,0,array.length-1);} private static void Binarysort (int []array,int left,int right) {if (left==right)//Only one element Return;else if (left+1==right)// Two elements, direct sort {if (Array[left]>array[right]) {int temp=array[left];array[left]=array[right];array[right]=temp;}} Else{int mid= (left+right)/2;binarysort (Array,left,mid); Binarysort (array,mid+1,right); Merge (array, left, Right,mid );}} Merge two ordered queues into private static void merge (int []array,int left,int right,int mid) {for (int i=mid+1;i<=right;i++) for (int j=  i-1;j>=left;j--) {while (array[j+1]<array[j]) {int temp=array[j+1];  ARRAY[J+1]=ARRAY[J];  Array[j]=temp;  Break }}}/* * Heap Sort * Principle: constructs the largest or smallest heap, then deletes the root element one by one * unstable sorting, the time complexity is the worst: Time complexity (NLGN) * is actually very troublesome, but high efficiency. The first thing to do is to build the heap before removing the root node sort */public static void heapsort (int []array) {//Build a maximum heap//build heap principle: insert from behind. 
Build an empty root that is directly inserted than the root and larger than the root, then move the root down, insert the node filter for (int i=1;i<array.length;i++) {int k=i;while (k>0&&array[(k-1)/2 ]<array[k]) {int temp=array[k];array[k]=array[(k-1)/2];array[(k-1)/2]=temp;k= (k-1)/2;}} print (array);//To sort the heap//delete the root element, in order to save space, you can put the root element in the last position for (int i=0;i<array.length;i++) {int max=array[0];int k=0; while (2*k+1<array.length-i-1) {if (array[2*k+1]>array[2*k+2]) {array[k]=array[2*k+1];k=2*k+1;} else {array[k]=array[2*k+2];k=2*k+2;}} array[k]=array[array.length-i-1];array[array.length-i-1]=max;//There may be an unexpected situation where you need to adjust the position of K//if (K&LT;ARRAY.LENGTH-I-1) {while (k>0&&array[(k-1)/2]<array[k]) {int temp=array[k];array[k]=array[(k-1)/2];array[(k-1)/2]= temp;k= (k-1)/2;}}} /* Base sort, divided into MSD and LST two types * MSD: Starting from the highest bit * LSD: Starting from the lowest bit * principle: Sort the numbers on each position (1000) * Stable sort: Time complexity O (n) * Advantages: Low Time complexity * Disadvantage: waste a lot of space, to the digital allocation requirements Relatively high, the more evenly distributed, the lower the space consumption */public static void Radixsort (int []array) {///10 times times the space, because do not know whether the average distribution//Of course, a position on the number is the same, then is the extreme situation, Requires 10 times times the space int []temp=new int[10*array.length];for (int i=0;i<temp.length;i++)//Initialize temp[i]=0;//assuming that the number of digits to be sorted is the maximum of two digits/ For single digit ordering for (int i=0;i<array.length;i++) {int d1=getdigit (array[i], 1);//int d2=getdigit (Array[i], 2); int offset=0; while (temp[d1*array.length+offset]!=0) {Offset++;} Temp[d1*array.length+offset]=array[i];} int pos=0;for (int i=0;i<temp.length;i++) {if (temp[i]!=0) {array[pos]=temp[i];p os++;} temp[i]=0;} On the 10-bit number of sorts for (int i=0;i<array.length;i++) {int d2=getdigit (array[i], 2); int Offset=0;while (temp[d2*array.length+ offset]!=0) {offset++;} Temp[d2*array.length+offset]=array[i];} pos=0;for (int i=0;i<temp.length;i++) {if (temp[i]!=0) {array[pos]=temp[i];p os++;} temp[i]=0;}} /** * Gets the number on the D-bit of digit * @param No. * @param d * @return */private static int getdigit (int number,int d) {while (d>1) { d--;number/=10;} return number%10;} /* * Hill Sort * Principle: Grouping insert Sort, select increment, change increment * unstable sort. */public static void Shellsort (int []array) {int d=array.length;//increment while (d>0) {d=d/2;for (int j=d;j<array.length ; j+=d) {if (array[j]<array[j-d]) {int temp=array[j];int k=j-d;while (k>=0&&temp<array[k]) {array[k+d ]=array[k];k-=d;} array[k+d]=temp;}}}} public static void print (int. []array) {for (int e:array) {System.out.print (E + "");} System.out.println ();}} </span>
