Review of Data Structures: Sorting


Sorting: arranging a sequence of records according to a key;


Stable: if A is originally in front of B and A equals B, then A is still in front of B after sorting;

Unstable: if A is originally in front of B and A equals B, then A may end up behind B after sorting;

Internal sorting: all sorting operations are done in memory;

External sorting: the data is too large to fit in memory, so it is kept on disk and sorted by transferring data between disk and memory;

The time-consuming operations in sorting are comparisons and moves;

Sorts by category:

(1) Exchange sorts: bubble sort, quick sort; these work by repeatedly comparing and exchanging elements;

(2) Insertion sorts: simple insertion sort, Shell sort; these work by inserting each element into an already sorted part;

(3) Selection sorts: simple selection sort, heap sort; these first select the right element and then move it into place;

(4) Merge sorts: merge sort; these first split the sequence and then merge the pieces;

Historical note: the earliest sorting algorithms were all O(n^2); the appearance of Shell sort was the first to break through that barrier;

The following videos were created by Sapientia University; they show the sorting steps in the form of folk dances and can serve as review material:

Bubble sort video: http://v.youku.com/v_show/id_XMzMyOTAyMzQ0.html

Selection sort video: http://v.youku.com/v_show/id_XMzMyODk5MDI0.html

Insertion sort video: http://v.youku.com/v_show/id_XMzMyODk3NjI4.html

Shell sort video: http://v.youku.com/v_show/id_XMzMyODk5MzI4.html

Merge sort video: http://v.youku.com/v_show/id_XMzMyODk5Njg4.html

Quick sort video: http://v.youku.com/v_show/id_XMzMyODk4NTQ4.html


The sorting algorithms described above are all comparison-based; there is also a class of sorting algorithms not based on comparisons, namely counting sort and radix sort;


Preparation: The simplest sort


This is the simplest possible sort implementation;

Its drawback is that each pass simply drags the minimum into place by brute-force exchanges and does nothing to pave the way for the next pass;

The algorithm is as follows:

public static int[] simple_sort(int[] arr) {
	for (int i = 0; i < arr.length; i++) {
		for (int j = i + 1; j < arr.length; j++) {
			if (arr[i] > arr[j]) {
				swap(arr, i, j);
			}
		}
	}
	return arr;
}
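
The snippets throughout this post call a swap helper that the original post never shows; assuming it simply exchanges two array elements in place, a minimal sketch would be:

private static void swap(int[] arr, int i, int j) {
	// Assumed helper (not shown in the original post): exchange arr[i] and arr[j] in place
	int tmp = arr[i];
	arr[i] = arr[j];
	arr[j] = tmp;
}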


One, Bubble Sort


Bubble sort improves on the simplest sort above: every exchange also helps later passes, since large values keep moving toward one end and small values toward the other;

The idea of bubble sort: compare adjacent elements pairwise, and exchange them if the former is greater than the latter;

Bubble sort therefore belongs to the exchange sorts, the same category as the most commonly used sorting method: quick sort;


1. Standard Bubble Sort


This is the most common bubble sort implementation; the idea is to compare adjacent elements pairwise and exchange them when they are out of order;

The algorithm is implemented as follows:

public static int[] bubble_sort2(int[] arr) {
	for (int i = 0; i < arr.length; i++) {
		for (int j = arr.length - 1; j > i; j--) {
			if (arr[j] < arr[j - 1]) {
				swap(arr, j, j - 1);
			}
		}
	}
	return arr;
}


2. Improved Bubble Sort


The improvement addresses sequences that are already basically ordered: a standard bubble sort still performs all of its comparison passes on them.

Improved method: use a boolean flag isChanged; if a whole pass exchanges no elements, the sequence is already sorted and we can stop;

The algorithm is implemented as follows:

// Best: n-1 comparisons, no moves, so the time complexity is O(n); no auxiliary space
// Worst: n(n-1)/2 comparisons and moves, so O(n^2); only the temporary variable in swap (size 1) is used
public static int[] bubble_sort3(int[] arr) {
	boolean isChanged = true;
	for (int i = 0; i < arr.length && isChanged; i++) {
		isChanged = false;
		for (int j = arr.length - 1; j > i; j--) {
			if (arr[j] < arr[j - 1]) {
				swap(arr, j, j - 1);
				isChanged = true;
			}
		}
	}
	return arr;
}


Two, Simple Selection Sort

Characteristics of simple selection sort: each pass finds the minimum of the remaining elements and exchanges it into place, so there are at most n-1 exchanges;

Compared with the simplest sort, it removes many unnecessary exchanges: each pass only keeps track of the index of the minimum while comparing, and performs at most one exchange (possibly none, when the minimum is already in the correct position);

The algorithm is as follows:

// Worst: n(n-1)/2 comparisons and n-1 swaps, so the time complexity is O(n^2)
// Best: n(n-1)/2 comparisons and no swaps, still O(n^2)
// In practice better than bubble sort
public static int[] selection_sort(int[] arr) {
	for (int i = 0; i < arr.length - 1; i++) {
		int min = i;
		for (int j = i + 1; j < arr.length; j++) {
			if (arr[min] > arr[j]) {
				min = j;
			}
		}
		if (min != i)
			swap(arr, min, i);
	}
	return arr;
}

Three, Simple Insertion Sort


Idea: imagine a dividing line in the sequence; everything to its left is considered sorted and everything to its right is not. Each step takes the leftmost unsorted element and inserts it into its correct position on the left. By default the element at index 0 counts as sorted, and each loop iteration inserts the element just to the right of the dividing line into the ordered part and moves the dividing line one position to the right;

The algorithm is as follows:

	// Best: n-1 comparisons, 0 moves, time complexity O(n)
	// Worst: (n+2)(n-1)/2 comparisons, (n+4)(n-1)/2 moves, time complexity O(n^2)
	public static int[] insertion_sort(int[] arr) {
		int j;
		for (int i = 1; i < arr.length; i++) {
			if (arr[i] < arr[i - 1]) {
				int tmp = arr[i];
				for (j = i - 1; j >= 0 && arr[j] > tmp; j--) {
					arr[j + 1] = arr[j];
				}
				arr[j + 1] = tmp;
			}
		}
		return arr;
	}


Simple insertion sort generally performs better than selection sort and bubble sort.

Four, Shell Sort


Invented by Shell in 1959;

It was the first sorting algorithm to break through O(n^2), and it is an improved version of simple insertion sort;

Idea: simple insertion sort is very efficient when there are few records or the records are basically ordered, so we split the sequence into groups to make each group small, insertion-sort each group, and finish with a plain insertion sort once the data is roughly in order;

The grouping is a jump (gap) grouping: for example, positions 1, 4, 7 form one group, positions 2, 5, 8 another, and positions 3, 6, 9 a third;


Index: 1 2 3 4 5 6 7 8 9


With increment = 3, indices that are equal modulo 3 belong to the same group, e.g. index 1, 1+3, 1+3*2;

A commonly used increment formula is: increment = increment / 3 + 1;
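
As a quick check of what this formula produces (a throwaway helper, not part of the original post):

// Hypothetical helper: print the increment sequence for a given array length.
// For length = 9 it prints 4, 2, 1 (9/3+1 = 4, 4/3+1 = 2, 2/3+1 = 1);
// the final pass with increment 1 is a plain insertion sort.
public static void printIncrements(int length) {
	int increment = length;
	do {
		increment = increment / 3 + 1;
		System.out.println(increment);
	} while (increment > 1);
}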

The algorithm is implemented as follows:

// Time complexity is about O(n^(3/2))
// Unstable sorting algorithm
public static int[] shell_sort(int[] arr) {
	int j;
	int increment = arr.length;
	do {
		increment = increment / 3 + 1;
		// i starts at increment because the first record of each group counts as already sorted
		for (int i = increment; i < arr.length; i++) {
			if (arr[i] < arr[i - increment]) {
				int tmp = arr[i];
				for (j = i - increment; j >= 0 && arr[j] > tmp; j -= increment) {
					arr[j + increment] = arr[j];
				}
				arr[j + increment] = tmp;
			}
		}
	} while (increment > 1);
	return arr;
}

Five, Heap Sort

Invented by Floyd and Williams in 1964;

Big-root heap (max-heap): every parent node is larger than its children;

Small-root heap (min-heap): every parent node is smaller than its children;


Heap sort is an unstable sorting algorithm and an improved version of simple selection sort;

Idea: treat the array as a complete binary tree and first build a big-root heap; then repeatedly remove the root (the current maximum) by swapping it with the last node, shrink the array length by one, and rebuild the big-root heap, and so on;

Note: this method is not well suited to small sequences, because building the initial heap takes time;

The algorithm is implemented as follows:

	// Time complexity O(nlogn)
	// Unstable sorting algorithm
	// Auxiliary space is O(1), apart from the 1-indexed working copy used here
	// Not suitable for small sequences
	public static int[] heap_sort(int[] arr) {
		// Copy into a 1-indexed array so that node i has children 2i and 2i+1
		int[] tmp = new int[arr.length + 1];
		tmp[0] = -1;
		for (int i = 0; i < arr.length; i++) {
			tmp[i + 1] = arr[i];
		}
		// Build the big-root heap: O(n)
		for (int i = arr.length / 2; i >= 1; i--) {
			makeMaxRootHeap(tmp, i, arr.length);
		}
		// Repeatedly swap the root (current maximum) to the end and rebuild: O(nlogn)
		for (int i = arr.length; i > 1; i--) {
			swap(tmp, 1, i);
			makeMaxRootHeap(tmp, 1, i - 1);
		}
		for (int i = 1; i < tmp.length; i++) {
			arr[i - 1] = tmp[i];
		}
		return arr;
	}

	// Sift arr[low] down so that arr[low..high] satisfies the big-root heap property
	private static void makeMaxRootHeap(int[] arr, int low, int high) {
		int tmp = arr[low];
		int j;
		for (j = 2 * low; j <= high; j *= 2) {
			if (j < high && arr[j] < arr[j + 1]) {
				j++;
			}
			if (tmp >= arr[j]) {
				break;
			}
			arr[low] = arr[j];
			low = j;
		}
		arr[low] = tmp;
	}

Six, Merge Sort


Merge sort is a stable sorting algorithm;

Idea: use recursion to divide and then merge; keep dividing until each piece has length 1, and since the two pieces being merged are always already ordered, the merged result is ordered as well;

The implementation code is as follows:

	// Stable sorting algorithm
	// Time complexity O(nlogn)
	// Space complexity O(n + logn)
	public static int[] merge_sort(int[] arr) {
		msort(arr, arr, 0, arr.length - 1);
		return arr;
	}

	// Sort sr[s..t] into tr[s..t]
	private static void msort(int[] sr, int[] tr, int s, int t) {
		int[] tr2 = new int[sr.length];
		if (s == t) {
			tr[s] = sr[s];
		} else {
			int m = (s + t) / 2;         // split point
			msort(sr, tr2, s, m);        // sort the left half into tr2
			msort(sr, tr2, m + 1, t);    // sort the right half into tr2
			merge(tr2, tr, s, m, t);     // merge the two ordered halves into tr
		}
	}

	// Merge the ordered runs tr2[i..m] and tr2[m+1..t] into tr[i..t]
	private static void merge(int[] tr2, int[] tr, int i, int m, int t) {
		int j, k;
		for (j = i, k = m + 1; i <= m && k <= t; j++) {
			if (tr2[i] < tr2[k]) {
				tr[j] = tr2[i++];
			} else {
				tr[j] = tr2[k++];
			}
		}
		while (i <= m) {
			tr[j++] = tr2[i++];
		}
		while (k <= t) {
			tr[j++] = tr2[k++];
		}
	}

Seven, Quick Sort


Quick sort is an upgraded version of bubble sort, and currently the most widely used sorting method;

Idea: select a pivot and move it to its correct position, so that everything to its left is smaller and everything to its right is larger, then sort the two sides in the same way;

Note: if the sequence is basically ordered or short, simple insertion sort can be used instead, because quick sort is inefficient in those situations;

The implementation code is as follows:

	// Unstable sorting algorithm
	// Time complexity: best O(nlogn), worst O(n^2)
	// Space complexity: O(logn)
	public static int[] quick_sort(int[] arr) {
		qsort(arr, 0, arr.length - 1);
		return arr;
	}

	private static void qsort(int[] arr, int low, int high) {
		if (low < high) {
			int pivot = partition(arr, low, high);
			qsort(arr, low, pivot - 1);
			qsort(arr, pivot + 1, high);
		}
	}

	// Place the pivot at its final position and return that position
	private static int partition(int[] arr, int low, int high) {
		int pivotkey = arr[low];	// select the pivot; this choice can be optimized
		while (low < high) {
			while (low < high && arr[high] >= pivotkey) {
				high--;
			}
			swap(arr, low, high);	// exchange; this can be optimized to a plain assignment
			while (low < high && arr[low] <= pivotkey) {
				low++;
			}
			swap(arr, low, high);
		}
		return low;
	}


Optimization Scenarios


(1) Pivot selection: choosing the pivot is essential for quick sort; ideally the pivot should be close to the median of the sequence;

Above we simply took the first element as the pivot, which can be optimized;

Optimization: sample a few elements (for example first, middle, last) and take their median as the pivot; a sketch of this is given after the code below;

(2) Use insertion sort for small arrays: quick sort is suited to sorting large arrays; for a small array it may well be slower than a simple insertion sort;


If you want to optimize, you can use the following code:

public static int[] quick_sort(int[] arr) {
	if (arr.length > 10) {
		qsort(arr, 0, arr.length - 1);
	} else {
		insertion_sort(arr);
	}
	return arr;
}
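
For optimization (1) the original post gives no code; a minimal median-of-three sketch (the helper name medianOfThree and its placement are my own assumption, not from the original post):

// Hypothetical helper: move the median of arr[low], arr[mid], arr[high] into arr[low],
// so that partition picks a pivot closer to the middle of the value range.
private static void medianOfThree(int[] arr, int low, int high) {
	int mid = low + (high - low) / 2;
	if (arr[low] > arr[high]) {
		swap(arr, low, high);	// now arr[high] >= arr[low]
	}
	if (arr[mid] > arr[high]) {
		swap(arr, mid, high);	// now arr[high] is the largest of the three
	}
	if (arr[mid] > arr[low]) {
		swap(arr, mid, low);	// now arr[low] is the median of the three
	}
}

partition would then call medianOfThree(arr, low, high) once before reading pivotkey = arr[low].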


Eight, Counting Sort

Counting sort is the typical example of a sorting algorithm that is not based on comparisons. Any comparison-based sort needs at least O(nlogn) time, so a linear-time sort is only possible with a non-comparison algorithm;

This algorithm is suitable when the data range of the array is small, for example 0~100;

Complexity: O(n + k), where n is the length of the original array and k is the size of the data range;


Idea:


(1) First find the maximum value max in the array, then create a count array of length max + 1 to record the number of occurrences of each value. For example, for the array {1,1,2,3,4,5} we create a count array count[] of length 6, where count[1] holds the number of occurrences of the value 1, i.e. 2;

(2) Populate the count array by iterating over the original array: count[arr[i]]++;

(3) Accumulate the count array: count[i] = count[i] + count[i-1]; after this step, count[v] is the number of elements less than or equal to v;

(4) Fill the result array in reverse order: result[count[arr[i]] - 1] = arr[i], followed by count[arr[i]]--; traversing in reverse keeps equal elements in their original order, i.e. the sort stays stable;


The code is as follows:

import java.util.ArrayList;
import java.util.Scanner;

// Counting sort applies when: (1) the data range is small (recommended: less than 1000)
// and (2) every value is greater than or equal to 0.  @author xiazdong
public class Count_sort {
	public static void main(String[] args) {
		int[] array = readArray();
		System.out.print("Array before sorting is: ");
		print(array);
		int[] result = count_sort(array);
		System.out.print("Sorted array is: ");
		print(result);
	}

	// Read an array from standard input; entering -1 ends the input
	private static int[] readArray() {
		Scanner in = new Scanner(System.in);
		ArrayList<Integer> list = new ArrayList<Integer>();
		while (true) {
			System.out.print("input number: ");
			int element = in.nextInt();
			if (element == -1) break;
			list.add(element);
		}
		Integer[] arr = list.toArray(new Integer[0]);
		int[] array = new int[arr.length];
		for (int i = 0; i < arr.length; i++) {
			array[i] = arr[i];
		}
		return array;
	}

	// Counting sort: O(n + k)
	public static int[] count_sort(int[] arr) {
		int gap = findGap(arr);
		int[] count = new int[gap];
		int[] result = new int[arr.length];
		for (int i = 0; i < arr.length; i++) {
			count[arr[i]]++;
		}
		for (int i = 1; i < count.length; i++) {
			count[i] = count[i] + count[i - 1];
		}
		// Fill the result array in reverse order (keeps the sort stable)
		for (int i = arr.length - 1; i >= 0; i--) {
			result[count[arr[i]] - 1] = arr[i];
			count[arr[i]]--;
		}
		return result;
	}

	public static void print(int[] result) {
		for (int a : result) {
			System.out.print(a + " ");
		}
		System.out.println();
	}

	// Data range of the array, i.e. the maximum value + 1
	private static int findGap(int[] arr) {
		int max = arr[0];
		for (int i = 1; i < arr.length; i++) {
			if (max < arr[i]) max = arr[i];
		}
		return max + 1;
	}
}


Nine, Radix Sort

Radix sort is also a non-comparison sorting algorithm. It sorts on one digit at a time, starting from the least significant digit. The complexity is O(kn), where n is the array length and k is the number of digits of the largest element;

For example, for {987, 789}: sorted by the ones digit: {987, 789}; sorted by the tens digit: {987, 789}; sorted by the hundreds digit: {789, 987};


Idea:

(1) Find the largest number in the array and determine its number of digits;

(2) Starting from the lowest digit, take the current digit of every element of the original array arr to build a radix array;

(3) Counting-sort arr by the radix array (this exploits the fact that counting sort works well on numbers in a small range, here 0~9); repeat for each digit;


// Radix sort applies to non-negative integers; it sorts on one digit at a time,
// starting from the least significant digit, using a counting sort per digit.  @author xiazdong
public class Radix_sort {
	public static void main(String[] args) {
		int[] array = new int[]{1046, 2084, 9046, 12074, 56, 7026, 8099, 17059, 33, 1};
		System.out.print("Array before sorting is: ");
		print(array);
		int[] result = radix_sort(array);
		System.out.print("Sorted array is: ");
		print(result);
	}

	// Radix sort: O(kn)
	public static int[] radix_sort(int[] arr) {
		int count = 1;
		int n = findMaxLength(arr);
		for (int i = 0; i < n; i++) {
			int[] radix = getRadix(arr, count);	// the current digit of every element
			arr = count_sort(arr, radix);		// stable sort by that digit
			count *= 10;
		}
		return arr;
	}

	// Number of digits of the largest element
	private static int findMaxLength(int[] arr) {
		int max = arr[0];
		for (int i = 1; i < arr.length; i++) {
			if (max < arr[i]) max = arr[i];
		}
		int count = 0;
		while (max != 0) {
			max /= 10;
			count++;
		}
		return count;
	}

	// Extract the digit to sort on for every element (count = 1, 10, 100, ...): O(n)
	private static int[] getRadix(int[] arr, int count) {
		int[] radix = new int[arr.length];
		for (int i = 0; i < arr.length; i++) {
			radix[i] = arr[i] / count % 10;
		}
		return radix;
	}

	// Like the counting sort above, but sorts arr by the digits in radix;
	// arr is the original array, radix holds the current digit of each element
	public static int[] count_sort(int[] arr, int[] radix) {
		int gap = findGap(radix);
		int[] count = new int[gap];
		int[] result = new int[radix.length];
		for (int i = 0; i < radix.length; i++) {
			count[radix[i]]++;
		}
		for (int i = 1; i < count.length; i++) {
			count[i] = count[i] + count[i - 1];
		}
		// Fill the result array in reverse order (keeps the sort stable)
		for (int i = radix.length - 1; i >= 0; i--) {
			result[count[radix[i]] - 1] = arr[i];
			count[radix[i]]--;
		}
		return result;
	}

	public static void print(int[] result) {
		for (int a : result) {
			System.out.print(a + " ");
		}
		System.out.println();
	}

	// Data range of the digit array, i.e. the maximum value + 1
	private static int findGap(int[] arr) {
		int max = arr[0];
		for (int i = 1; i < arr.length; i++) {
			if (max < arr[i]) max = arr[i];
		}
		return max + 1;
	}
}




Comparison Chart

(Image omitted.) The comparison chart was taken from http://www.cnblogs.com/cj723/archive/2011/04/29/2033000.html

Summary: each sort has its own strengths, and we need to use the appropriate algorithm in the appropriate situation;

For example, when the sequence is basically ordered or the array is small, use direct insertion sort;

For example, for large arrays, use quick sort;

For example, if stability is required, use merge sort;
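
As a rough illustration of this advice (the threshold of 10 and the method name chooseSort are arbitrary assumptions, not from the original post):

// Hypothetical dispatcher following the rules of thumb above.
public static int[] chooseSort(int[] arr, boolean needStable) {
	if (needStable) {
		return merge_sort(arr);		// stability required
	} else if (arr.length <= 10) {
		return insertion_sort(arr);	// small or basically ordered input
	} else {
		return quick_sort(arr);		// large input
	}
}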



(A complexity comparison image excerpted from Wikipedia is omitted here.)

