Overview
Quicksort is a sorting algorithm whose worst-case running time is O(n^2) for an input array of n numbers. Despite this poor worst-case bound, quicksort is often the best practical choice for sorting because of its remarkable average performance: its expected running time is O(n lg n), the constant factors hidden in the O(n lg n) are small, and it sorts in place, which also lets it work well in virtual-memory environments.
Principle
Like merge sort, quicksort is based on the divide-and-conquer paradigm and proceeds in three steps: divide, conquer, and combine.
Divide: the array array[low...high] is partitioned into two (possibly empty) subarrays array[low...temp-1] and array[temp+1...high] such that every element of array[low...temp-1] is less than or equal to array[temp], while every element of array[temp+1...high] is greater than array[temp]; the index temp is computed as part of this step.
Conquer: sort the two subarrays array[low...temp-1] and array[temp+1...high] by recursive calls to quicksort.
Combine: because the subarrays are sorted in place, no work is needed to combine them; the entire array array[low...high] is now sorted.
This chapter introduces the principle of the quicksort algorithm, its implementation (including a randomized version), and its performance analysis.
Quicksort implementation
#include <iostream>
#include <ctime>
#include <cstdlib>
#define N 10        // array size (assumed; the value was lost in the original)
using namespace std;

// Quick sort, recursive
void quickSort(int* array, int low, int high);
// Partition the segment and return the split point
int partition(int* array, int low, int high);
// Swap the values of two variables
void exchange(int& a, int& b);

int main()
{
    int array[N];                       // the array to be sorted
    srand(time(0));                     // seed the generator so each run differs
    for (int i = 0; i < N; i++)
    {
        array[i] = rand() % 101;        // fill with random numbers in [0, 100]
    }
    cout << "before sorting:" << endl;
    for (int j = 0; j < N; j++)
    {
        cout << array[j] << " ";
    }
    cout << endl << "after sorting:" << endl;
    quickSort(array, 0, N - 1);         // sort the whole array
    for (int k = 0; k < N; k++)
    {
        cout << array[k] << " ";
    }
    cout << endl;
    return 0;
} // main

void quickSort(int* array, int low, int high)
{
    if (low < high)
    {
        int temp = partition(array, low, high);
        quickSort(array, low, temp - 1);
        quickSort(array, temp + 1, high);
    }
}

int partition(int* array, int low, int high)
{
    int i = low - 1;
    int x = array[high];                // the last element of the segment is the pivot
    for (int j = low; j < high; j++)
    {
        // everything at or left of array[i] is <= x; everything right of it is > x
        if (array[j] <= x)
        {
            i += 1;
            exchange(array[i], array[j]);
        }
    }
    exchange(array[i + 1], array[high]);
    return i + 1;                       // when the loop finishes, i + 1 is the split point
}

void exchange(int& a, int& b)
{
    int temp = a;
    a = b;
    b = temp;
}
Randomized version of quicksort
In the implementation above, partition(array, low, high) always takes array[high] as the pivot against which the other elements are compared. By adding random sampling to the pivot choice, we obtain good expected behavior on all inputs and make the average-case analysis easier. Below is an implementation of the randomized quicksort algorithm:
#include <iostream>
#include <ctime>
#include <cstdlib>
#define N 10        // array size (assumed; the value was lost in the original)
using namespace std;

// Quick sort, recursive
void quickSort(int* array, int low, int high);
// Partition the segment and return the split point
int partition(int* array, int low, int high);
// Pick a random element between low and high as the pivot, then find the split point
int randomPartition(int* array, int low, int high);
// Swap the values of two variables
void exchange(int& a, int& b);

int main()
{
    int array[N];                       // the array to be sorted
    srand(time(0));                     // seed the generator so each run differs
    for (int i = 0; i < N; i++)
    {
        array[i] = rand() % 101;        // fill with random numbers in [0, 100]
    }
    cout << "before sorting:" << endl;
    for (int j = 0; j < N; j++)
    {
        cout << array[j] << " ";
    }
    cout << endl << "after sorting:" << endl;
    quickSort(array, 0, N - 1);         // sort the whole array
    for (int k = 0; k < N; k++)
    {
        cout << array[k] << " ";
    }
    cout << endl;
    system("pause");
    return 0;
} // main

void quickSort(int* array, int low, int high)
{
    if (low < high)
    {
        int temp = randomPartition(array, low, high);
        quickSort(array, low, temp - 1);
        quickSort(array, temp + 1, high);
    }
}

int partition(int* array, int low, int high)
{
    int i = low - 1;
    int x = array[high];                // the last element of the segment is the pivot
    for (int j = low; j < high; j++)
    {
        // everything at or left of array[i] is <= x; everything right of it is > x
        if (array[j] <= x)
        {
            i += 1;
            exchange(array[i], array[j]);
        }
    }
    exchange(array[i + 1], array[high]);
    return i + 1;                       // when the loop finishes, i + 1 is the split point
}

int randomPartition(int* array, int low, int high)
{
    // Pick a random index between low and high
    int i = rand() % (high - low + 1) + low;
    // Swap the randomly chosen pivot to the end of the segment
    exchange(array[i], array[high]);
    return partition(array, low, high);
}

void exchange(int& a, int& b)
{
    int temp = a;
    a = b;
    b = temp;
}
The randomized version differs very little from the plain one: the only change is in how the pivot is chosen during partitioning. The added randomPartition function picks a random index i within the segment, swaps that element to the end of the segment, and then calls the ordinary partition function to find the split point.
Quick Sort Performance Analysis
The running time of quicksort depends on whether the partitioning is balanced, which in turn depends on which element is chosen as the pivot. If the partitioning is balanced, the algorithm runs asymptotically as fast as merge sort; if it is unbalanced, it runs asymptotically as slowly as insertion sort. The following sections discuss worst-case partitioning, best-case partitioning, and balanced partitioning.
Worst-case partitioning: the worst-case behavior of quicksort occurs when every partitioning step produces two regions containing n-1 elements and 0 elements respectively. Assume this unbalanced split arises in every recursive call. Partitioning costs O(n), and a recursive call on an array of size 0 simply returns, so T(0) = O(1); the algorithm's running time can therefore be expressed by the recurrence:
T(n) = T(n-1) + T(0) + O(n) = T(n-1) + O(n)
Intuitively, if we add up the cost incurred at each level of the recursion, we get an arithmetic series, whose value is O(n^2). The substitution method can be used to prove directly that the recurrence T(n) = T(n-1) + O(n) has the solution T(n) = O(n^2).
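Written out, summing the per-level costs of this worst-case recursion gives the arithmetic series:

```latex
T(n) = T(n-1) + O(n)
     = O(n) + O(n-1) + \cdots + O(1)
     = O\!\left(\sum_{k=1}^{n} k\right)
     = O\!\left(\frac{n(n+1)}{2}\right)
     = O(n^2)
```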
Therefore, if the partitioning is maximally unbalanced at every level of the recursion, the running time is O(n^2); that is, the worst-case running time of quicksort is no better than that of insertion sort. Moreover, the worst case arises even when the input array is already completely sorted: quicksort then takes O(n^2) time, while insertion sort takes only O(n).
Best-case partitioning: in the most even split that partition can produce, neither subproblem has size greater than ⌊n/2⌋, since one subproblem has size ⌊n/2⌋ and the other has size ⌈n/2⌉ - 1. In this case quicksort runs much faster, and the recurrence for its running time is:
T(n) <= 2T(n/2) + O(n)
This recurrence has the solution T(n) = O(n lg n), for example by case 2 of the master theorem. Because the two sides of the partition are balanced at every level of the recursion, the algorithm runs asymptotically much faster.
Balanced partitioning: the average-case running time of quicksort is much closer to its best case than to its worst case (for the detailed proof see Introduction to Algorithms, second edition, p. 88). The reason is that any split of constant proportionality yields a recursion tree of depth O(lg n) in which the cost at each level is O(n), so the total running time is O(n lg n) whenever the split has constant proportionality.
Copyright notice: this is the blogger's original article; please do not reproduce it without the blogger's permission.
Introduction to Algorithms-chapter 7 Quick Sort