Quicksort is an important sorting algorithm based on the divide-and-conquer technique. Unlike mergesort, which divides an array by the positions of its elements, quicksort divides it according to element values. Specifically, it rearranges the elements of a given array to obtain a partition: all elements before index s are less than or equal to A[s], and all elements after index s are greater than or equal to A[s].
Obviously, once a partition is established, A[s] is already in its final position in the sorted array, and we can then continue to sort the subarrays before A[s] and after A[s] using the same method.
To sort all the elements of an array A, the initial invocation is QUICKSORT(A, 1, A.length).
The following algorithm partitions A[p..r]; the pseudocode comes first, and its meaning is discussed afterwards.
PARTITION(A, p, r)
    x = A[r]
    i = p - 1
    for j = p to r - 1
        if A[j] ≤ x
            i = i + 1
            exchange A[i] with A[j]
    exchange A[i+1] with A[r]
    return i + 1
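The pseudocode above translates almost line for line into Go; the only real change is 0-based slice indexing (the function names partition and quicksort below are my own, not from the original text):

```go
package main

import "fmt"

// partition rearranges a[p..r] around the pivot a[r], mirroring the
// pseudocode above, and returns the pivot's final index.
func partition(a []int, p, r int) int {
	x := a[r] // pivot value
	i := p - 1
	for j := p; j < r; j++ {
		if a[j] <= x {
			i++
			a[i], a[j] = a[j], a[i] // exchange a[i] with a[j]
		}
	}
	a[i+1], a[r] = a[r], a[i+1] // put the pivot between the two groups
	return i + 1
}

// quicksort sorts a[p..r] by partitioning and recursing on both sides.
func quicksort(a []int, p, r int) {
	if p < r {
		s := partition(a, p, r)
		quicksort(a, p, s-1)
		quicksort(a, s+1, r)
	}
}

func main() {
	a := []int{5, 3, 1, 9, 8, 2, 4, 7}
	quicksort(a, 0, len(a)-1)
	fmt.Println(a) // [1 2 3 4 5 7 8 9]
}
```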
Efficiency of the quicksort algorithm:
In the best case, the number of key comparisons Cbest(n) satisfies the following recurrence:

Cbest(n) = 2Cbest(n/2) + n for n > 1, Cbest(1) = 0.

By the master theorem, Cbest(n) ∈ Θ(n log n); for n = 2^k, we obtain Cbest(n) = n log2 n.
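For n = 2^k the exact solution can be checked by backward substitution, unrolling the recurrence k = log2 n times:

\[
C_{best}(n) = 2\,C_{best}(n/2) + n = 4\,C_{best}(n/4) + 2n = \cdots = 2^k C_{best}(1) + kn = n\log_2 n .
\]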
In the worst case, all the split points are extreme: one of the two subarrays is empty, and the other is only one element smaller than the array being partitioned. This regrettable situation occurs, in particular, on an ascending array, i.e., when the input array is already sorted. If A[0..n-1] is strictly increasing, then after n+1 comparisons the algorithm swaps A[0] with itself and is left to sort the strictly increasing array A[1..n-1]. The same happens on ever smaller strictly increasing arrays until the last subarray A[n-2..n-1]. In this case, the total number of key comparisons is:
Cworst(n) = (n+1) + n + ... + 3 = (n+1)(n+2)/2 - 3 ∈ Θ(n^2)
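The closed form is just the arithmetic series running from 3 up to n+1, i.e., the full sum 1 + 2 + ... + (n+1) with the first two terms removed:

\[
C_{worst}(n) = \sum_{k=3}^{n+1} k = \frac{(n+1)(n+2)}{2} - (1+2) = \frac{(n+1)(n+2)}{2} - 3 \in \Theta(n^2).
\]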
Now it is time to discuss the efficiency of quicksort on average. Let Cavg(n) be the average number of key comparisons made by quicksort on a randomly ordered array of size n. Assuming that the partition's split point s (0 ≤ s ≤ n-1) lands in each position with equal probability 1/n, we get the following recurrence:

Cavg(n) = (1/n) * sum over s = 0..n-1 of [ (n+1) + Cavg(s) + Cavg(n-1-s) ] for n > 1,
Cavg(0) = 0, Cavg(1) = 0.

Its solution turns out to be

Cavg(n) ≈ 2n ln n ≈ 1.39 n log2 n
As a result, quicksort performs on average only about 39% more comparisons than in the best case. Moreover, its innermost loop is so efficient that, on randomly ordered arrays, it runs faster than mergesort.
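The contrast between the average and worst cases can be observed directly by counting comparisons. The sketch below instruments the PARTITION-based quicksort from above (one comparison per inner-loop iteration, so n-1 per call rather than the n+1 used in the analysis; the function name countingQuicksort is mine). On an already-sorted array every split is maximally unbalanced, giving (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons:

```go
package main

import "fmt"

// countingQuicksort sorts a[p..r] with the partition scheme shown earlier
// and returns the number of key comparisons performed.
func countingQuicksort(a []int, p, r int) int {
	if p >= r {
		return 0
	}
	x := a[r] // pivot value
	i := p - 1
	comps := 0
	for j := p; j < r; j++ {
		comps++ // the comparison a[j] <= x
		if a[j] <= x {
			i++
			a[i], a[j] = a[j], a[i]
		}
	}
	a[i+1], a[r] = a[r], a[i+1]
	return comps + countingQuicksort(a, p, i) + countingQuicksort(a, i+2, r)
}

func main() {
	n := 100
	sorted := make([]int, n)
	for i := range sorted {
		sorted[i] = i // already ascending: the worst-case input
	}
	// Expect n(n-1)/2 = 4950 comparisons for n = 100.
	fmt.Println(countingQuicksort(sorted, 0, n-1)) // 4950
}
```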
Here is a Go implementation of quicksort:
func QuickSort(sliceArg []int, iLeft int, iRight int) {
	if iLeft < iRight {
		tmpVal := sliceArg[iLeft] // pivot: taking it out leaves a "hole" at iLeft
		i, j := iLeft, iRight
		for i < j {
			// scan from the right for an element smaller than the pivot
			for i < j && sliceArg[j] > tmpVal {
				j--
			}
			if i < j {
				sliceArg[i] = sliceArg[j] // fill the hole on the left; the hole moves to j
				i++
			}
			// scan from the left for an element larger than the pivot
			for i < j && sliceArg[i] < tmpVal {
				i++
			}
			if i < j {
				sliceArg[j] = sliceArg[i] // fill the hole on the right; the hole moves to i
				j--
			}
		}
		sliceArg[i] = tmpVal // drop the pivot into its final position
		QuickSort(sliceArg, iLeft, i-1)
		QuickSort(sliceArg, j+1, iRight)
	}
}