Comparison of several sorting algorithms

| Name | Worst time complexity | Best time complexity | Space complexity | Note |
| --- | --- | --- | --- | --- |
| Selection sort | O(n^2) | O(n^2) | O(1) | |
| Insertion sort | O(n^2) | O(n) | O(1) | |
| Bubble sort | O(n^2) | O(n) | O(1) | |
| Merge sort | O(n log n) | O(n log n) | O(n) | Time-efficient, but merging requires a temporary array, so the space cost is high |
| Quick sort | O(n^2) | O(n log n) (average) | O(log n) | |
| Heap sort | O(n log n) | O(n log n) | O(1) | |
| Bucket sort | O(n+N) | O(n+N) | O(n+N) | Limited to small integer keys; faster than the comparison sorts above |
| Radix sort | O(dn) | O(dn) | O(n) | Limited to integer keys; faster than the comparison sorts above |
1. Selection sort
The idea of selection sort is to repeatedly pick the smallest element from the unsorted part of the array (the smallest, then the second smallest, and so on) and append it to the sorted part at the front.
Code:
package sort;

public class SelectionSort {
    public static void selectionSort(int[] list) {
        for (int i = 0; i < list.length - 1; i++) {
            int index = i;
            int currentElement = list[i];
            // Find the smallest element in list[i..length-1]
            for (int j = i + 1; j < list.length; j++) {
                if (list[j] < currentElement) {
                    currentElement = list[j];
                    index = j;
                }
            }
            // Swap it into position i
            if (index != i) {
                list[index] = list[i];
                list[i] = currentElement;
            }
        }
    }

    public static void main(String[] args) {
        int[] list = {7, 3, 5, 7, 3, 2, 1, 2};
        selectionSort(list);
        for (int i = 0; i < list.length; i++) {
            System.out.println(list[i]);
        }
    }
}
2. Insertion sort
The idea of insertion sort is to start with the second element and insert each element into the sorted subsequence in front of it. Note that the insertion step traverses the array from back to front, and the index must be checked so that it does not run off the front of the array. The code is relatively simple.
Code:
package sort;

public class InsertionSort {
    public static int[] insertionSort(int[] list) {
        for (int i = 1; i < list.length; i++) {
            int k;
            int currentElement = list[i];
            // Shift larger elements one position to the right, back to front
            for (k = i - 1; k >= 0 && list[k] > currentElement; k--) {
                list[k + 1] = list[k];
            }
            list[k + 1] = currentElement;
        }
        return list;
    }

    public static void main(String[] args) {
        int[] list = {7, 3, 5, 7, 3, 2, 1, 2};
        list = insertionSort(list);
        for (int i = 0; i < list.length; i++) {
            System.out.println(list[i]);
        }
    }
}
3. Bubble sort
Bubble sort traverses the array several times and compares successive adjacent elements in each pass. If a pair of elements is in descending order, their positions are swapped. Bubble sort needs n-1 passes over the array. In the best case no swap happens during the first pass and the time complexity is O(n); in the worst case a swap happens on every comparison, with n-1-k swaps on the k-th pass, and the complexity is O(n^2).
Trick: if no swap occurs during a pass, all elements are already sorted and no further passes are needed. This property can be used to improve the algorithm.
Code:
package test;

public class Test {
    public static void main(String[] args) {
        int[] list = {3, 2, 1};
        list = bubbleSort(list);
        for (int i = 0; i < list.length; i++) {
            System.out.println(list[i]);
        }
    }

    public static int[] bubbleSort(int[] list) {
        boolean nextPass = true;
        for (int i = 0; i < list.length - 1 && nextPass; i++) {
            nextPass = false; // if no swap happens in this pass, stop early
            for (int j = 0; j < list.length - 1 - i; j++) {
                if (list[j] > list[j + 1]) {
                    int tmp = list[j];
                    list[j] = list[j + 1];
                    list[j + 1] = tmp;
                    nextPass = true;
                }
            }
        }
        return list;
    }
}
4. Merge sort (mergesort)
Merge sort can be described recursively: the algorithm divides the array into two halves and recursively applies merge sort to each half. After the two halves are sorted, it merges them. The complexity of merge sort is O(n log n).
Code:
package sort;

public class MergeSort {
    public static int[] mergeSort(int[] list) {
        if (list.length <= 1) return list;
        // Sort the first half recursively
        int[] firstHalf = new int[list.length / 2];
        System.arraycopy(list, 0, firstHalf, 0, firstHalf.length);
        firstHalf = mergeSort(firstHalf);
        // Sort the second half recursively
        int[] secondHalf = new int[list.length - list.length / 2];
        System.arraycopy(list, list.length / 2, secondHalf, 0, secondHalf.length);
        secondHalf = mergeSort(secondHalf);
        // Merge the two sorted halves
        return merge(firstHalf, secondHalf);
    }

    public static int[] merge(int[] list1, int[] list2) {
        int[] tmp = new int[list1.length + list2.length];
        int current1 = 0, current2 = 0, current3 = 0;
        while (current1 < list1.length && current2 < list2.length) {
            if (list1[current1] < list2[current2])
                tmp[current3++] = list1[current1++];
            else
                tmp[current3++] = list2[current2++];
        }
        // Copy whatever remains of either half
        while (current1 < list1.length) tmp[current3++] = list1[current1++];
        while (current2 < list2.length) tmp[current3++] = list2[current2++];
        return tmp;
    }

    public static void main(String[] args) {
        int[] list = {4, 2, 7, 4, 7, 9, 8, 1};
        list = mergeSort(list);
        for (int i = 0; i < list.length; i++) {
            System.out.println(list[i]);
        }
    }
}
5. Quick sort (quicksort)
The quick sort algorithm selects an element of the array called the pivot and divides the array into two parts, so that all elements in the first part are less than or equal to the pivot while all elements in the second part are greater than the pivot. It then applies quick sort recursively to the first part, and then to the second part.
In the worst case, partitioning an array of n elements requires n comparisons and n moves, so partitioning takes O(n) time. Also in the worst case, each pivot splits the array into one large subarray and one empty one, and the large subarray is one element smaller than the array partitioned before it. The algorithm then needs (n-1) + (n-2) + ... + 2 + 1 = O(n^2) time.
In the best case, each pivot divides the array into two parts of roughly equal size. Let T(n) denote the time required to sort an array of n elements with quick sort; by an analysis similar to that of merge sort, T(n) = O(n log n).
package sort;

public class QuickSort {
    public static int[] quickSort(int[] list) {
        quickSort(list, 0, list.length - 1);
        return list;
    }

    public static void quickSort(int[] list, int first, int last) {
        if (first < last) { // recursively sort the parts before and after the pivot
            int pivotIndex = partition(list, first, last);
            quickSort(list, first, pivotIndex - 1);
            quickSort(list, pivotIndex + 1, last);
        }
    }

    public static int partition(int[] list, int first, int last) {
        int pivot = list[first];
        int low = first + 1, high = last;
        while (high > low) {
            // Find an element larger than the pivot in the first half,
            // and one less than or equal to the pivot in the second half
            while (low <= high && list[low] <= pivot) low++;
            while (low <= high && list[high] > pivot) high--;
            // Swap the two elements
            if (low < high) {
                int tmp = list[low];
                list[low] = list[high];
                list[high] = tmp;
            }
        }
        // Insert the pivot into its proper position
        while (high > first && list[high] >= pivot) high--;
        if (list[high] < pivot) {
            list[first] = list[high];
            list[high] = pivot;
            return high;
        } else {
            return first;
        }
    }

    public static void main(String[] args) {
        int[] list = {2, 6, 3, 5, 4, 1, 8, 45, 2};
        list = quickSort(list);
        for (int i = 0; i < list.length; i++) {
            System.out.println(list[i]);
        }
    }
}
--------------------------------- modified 2016/5/21 ---------------------------------
Looking back, the quick sort code above is not very clear, so here is a cleaner version:
public class QuickSort {
    public static void main(String[] args) {
        int[] array = {1, 1, 1, 1, 1, 1, 1, 1};
        QuickSort qs = new QuickSort();
        qs.quickSort(array);
        for (int ele : array)
            System.out.print(ele + " ");
    }

    public void quickSort(int[] array) {
        quickSort(array, 0, array.length - 1);
    }

    public void quickSort(int[] array, int start, int end) {
        if (start > end) // pay attention to the termination condition
            return;
        int index = partition(array, start, end); // final position of the pivot
        quickSort(array, start, index - 1); // recursively sort the part before the pivot
        quickSort(array, index + 1, end);   // recursively sort the part after the pivot
    }

    public int partition(int[] array, int start, int end) {
        int pivot = array[start];
        int low = start;
        int high = end + 1;
        while (high > low) {
            // Note: first make sure neither pointer runs off the array bounds,
            // and stop on elements equal to the pivot so equal keys cannot
            // cause an infinite loop
            while (low < end && array[++low] < pivot) {}
            while (high > start && array[--high] > pivot) {}
            /* If the following two lines replaced the two lines above, the loop
               would never terminate when array[low] == array[high] == pivot:
               while (array[low] < pivot) low++;
               while (array[high] > pivot) high--;
            */
            if (high > low)
                swap(array, low, high);
        }
        swap(array, start, high); // put the pivot in its final position
        return high;
    }

    public void swap(int[] array, int a, int b) {
        int tmp = array[a];
        array[a] = array[b];
        array[b] = tmp;
    }
}
Analysis: merge sort vs. quick sort
Both merge sort and quick sort use divide and conquer.
For merge sort, the bulk of the work is merging the two sublists, and the merge is done after the sublists are sorted.
For quick sort, the bulk of the work is partitioning the list into two sublists, and the partition is done before the sublists are sorted.
In the worst case, merge sort is more time-efficient than quick sort, but on average the two are equally efficient.
In terms of space, merge sort is less space-efficient than quick sort: merge sort needs a temporary array when merging two subarrays, while quick sort needs no extra array space.
6. Heap sort (heapsort)
Heap sort uses a binary heap, which is a complete binary tree. Understanding the structure and properties of the heap is important for understanding heap sort.
====================================== extension =============================================
Extension: a heap is a binary tree with the following properties:
It is a complete binary tree: every level is full except possibly the last, and the leaves on the last level are placed to the left. Each node is greater than or equal to any of its children.
Index properties of the heap: for the node at position i, its left child is at position 2i+1, its right child at position 2i+2, and its parent at (i-1)/2.
Adding a new node to the heap: first append the new node at the end of the heap, then compare it with its parent and swap the two if it is larger; repeat until it is less than or equal to its parent.
Deleting the root node: remove the root, replace it with the last element of the heap, compare the replacing element with its children and swap it with the larger child; repeat until the replacing element is in its proper position.
=======================================================================================================
Back to the topic: because a heap maintains the property that a parent is always greater than or equal to its children, we can add the elements of the list to the heap one by one, rebuilding the heap as we go, and then repeatedly delete the root node to obtain the elements from largest to smallest.
The time complexity of heap sort is the same as that of merge sort, O(n log n), but heap sort needs no extra array space, so its space efficiency is higher than merge sort's.
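The process described above (build a max-heap, then repeatedly delete the root) can be sketched in place as follows. This is a minimal illustration, not code from any reference; the class and method names are my own:

```java
public class HeapSort {
    public static void heapSort(int[] list) {
        // Build a max-heap: sift down every non-leaf node, last parent first
        for (int i = list.length / 2 - 1; i >= 0; i--)
            siftDown(list, i, list.length);
        // Repeatedly move the current maximum (the root) to the end of the array
        for (int end = list.length - 1; end > 0; end--) {
            int tmp = list[0]; list[0] = list[end]; list[end] = tmp;
            siftDown(list, 0, end); // restore the heap property on the shrunken heap
        }
    }

    // Restore the heap property for the subtree rooted at i within list[0..size)
    private static void siftDown(int[] list, int i, int size) {
        while (2 * i + 1 < size) {
            int child = 2 * i + 1;                 // left child is at 2i+1
            if (child + 1 < size && list[child + 1] > list[child])
                child++;                           // right child at 2i+2 is larger
            if (list[i] >= list[child]) break;     // parent already >= both children
            int tmp = list[i]; list[i] = list[child]; list[child] = tmp;
            i = child;
        }
    }

    public static void main(String[] args) {
        int[] list = {4, 1, 9, 2, 8, 3};
        heapSort(list);
        for (int v : list) System.out.print(v + " "); // 1 2 3 4 8 9
    }
}
```

Because the heap lives inside the input array itself, no extra array is needed, which is the space advantage over merge sort noted above.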
7. Bucket sort (bucketsort)
The algorithms discussed above work on any comparable key type (such as integers, strings, and arbitrary comparable objects), but no comparison-based sorting algorithm has time complexity better than O(n log n). Bucket sort is suitable for sorting small integers and does not compare key values.
Bucket sort works as follows. Suppose the key values range from 0 to N-1. We need N buckets labeled 0, 1, ..., N-1. If the key value of an element is i, that element is put into bucket i. Each bucket holds the elements whose key value equals its label, and a bucket can be implemented with an ArrayList.
The time complexity of bucket sort is O(n+N) and the space complexity is O(n+N), where n is the size of the list.
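The description above can be sketched as follows; `bucketSort` and the parameter `N` (the key range) are my own names for illustration, not from any reference:

```java
import java.util.ArrayList;

public class BucketSort {
    // Sort integer keys in the range [0, N-1] without comparing them
    public static int[] bucketSort(int[] list, int N) {
        ArrayList<ArrayList<Integer>> buckets = new ArrayList<>();
        for (int i = 0; i < N; i++)
            buckets.add(new ArrayList<>());   // one bucket per possible key value
        for (int key : list)
            buckets.get(key).add(key);        // an element with key i goes into bucket i
        int[] result = new int[list.length];
        int k = 0;
        for (ArrayList<Integer> bucket : buckets) // reading buckets in order yields sorted output
            for (int key : bucket)
                result[k++] = key;
        return result;
    }

    public static void main(String[] args) {
        int[] sorted = bucketSort(new int[]{3, 1, 4, 1, 5, 2, 0}, 6);
        for (int v : sorted) System.out.print(v + " "); // 0 1 1 2 3 4 5
    }
}
```

The single distribution pass is O(n) and the read-back pass is O(N+n), which matches the O(n+N) bound stated above.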
8. Radix sort
When N is too large, the bucket sort described in section 7 is not very practical. In that case you can use radix sort.
Radix sort is based on bucket sort, but it needs only 10 buckets.
In general, radix sort takes O(dn) time to sort n elements with integer keys, where d is the maximum number of digits among all keys.
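A sketch of the idea, using 10 buckets and one pass per decimal digit, least significant digit first; the names are my own for illustration:

```java
import java.util.ArrayList;

public class RadixSort {
    // Sort non-negative integers digit by digit with 10 buckets
    public static void radixSort(int[] list) {
        int max = 0;
        for (int v : list) max = Math.max(max, v);
        // d passes in total, where d is the number of digits of the largest key
        for (int exp = 1; max / exp > 0; exp *= 10) {
            ArrayList<ArrayList<Integer>> buckets = new ArrayList<>();
            for (int i = 0; i < 10; i++) buckets.add(new ArrayList<>());
            for (int v : list)
                buckets.get((v / exp) % 10).add(v); // bucket by the current digit
            int k = 0;
            for (ArrayList<Integer> bucket : buckets)
                for (int v : bucket)
                    list[k++] = v; // stable read-back keeps earlier passes valid
        }
    }

    public static void main(String[] args) {
        int[] list = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(list);
        for (int v : list) System.out.print(v + " "); // 2 24 45 66 75 90 170 802
    }
}
```

Each pass is a bucket sort on one digit, so the total cost is d passes of O(n) work, giving the O(dn) bound above. The read-back must be stable (preserve order within a bucket), otherwise the ordering established by earlier passes would be destroyed.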