The Algorithm Idea Behind Quicksort
1. The Divide-and-Conquer Idea
The basic idea of divide and conquer is to break the original problem into several subproblems that are smaller in size but have the same structure as the original problem, solve these subproblems recursively, and then combine their solutions into a solution of the original problem.
We can apply the divide-and-conquer idea to sort an unordered array arr[p..r] in the following steps:
1. Decompose:
Choose a pivot element x in the array arr[p..r] and, based on x, split the original array into two subarrays arr[p..q-1] and arr[q+1..r], such that every element of arr[p..q-1] is less than or equal to arr[q], and every element of arr[q+1..r] is greater than arr[q].
2. Solve recursively:
Sort the subarrays arr[p..q-1] and arr[q+1..r] with recursive calls to quicksort.
3. Combine:
Because quicksort sorts in place, no merge step is needed; once both recursive calls return, the whole array arr[p..r] is already sorted.
Code implementation:
Pseudocode:
quickSort(A, p, r)
    if p < r
        q = partition(A, p, r)
        quickSort(A, p, q - 1)
        quickSort(A, q + 1, r)
Here, to sort the entire array, the initial call (with 0-based indexing and an inclusive right bound, as in the Go code below) is quickSort(A, 0, A.length - 1).
In fact, the real work of the decomposition step is done by partition, which is also the key to quicksort.
Go implementation:
package main

import "fmt"

// partition rearranges A[p..r] (inclusive) around the pivot A[r] and
// returns the final index of the pivot.
func partition(A []int, p int, r int) int {
    x := A[r]  // pivot: the last element of the subarray
    i := p - 1 // right edge of the "<= pivot" region
    for j := p; j < r; j++ {
        if A[j] <= x {
            i++
            A[i], A[j] = A[j], A[i]
        }
    }
    A[i+1], A[r] = A[r], A[i+1] // move the pivot between the two regions
    return i + 1
}

// quickSort sorts A[p..r] (inclusive) in place.
func quickSort(A []int, p int, r int) {
    if p < r {
        q := partition(A, p, r)
        quickSort(A, p, q-1)
        quickSort(A, q+1, r)
    }
}

func main() {
    arr := []int{2, 8, 7, 1, 3, 5, 6, 4}
    quickSort(arr, 0, len(arr)-1)
    for _, v := range arr {
        fmt.Println(v)
    }
}
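As a quick check of the code above: with the sample array {2, 8, 7, 1, 3, 5, 6, 4}, the first partition call picks x = 4 as the pivot, rearranges the slice into {2, 1, 3, 4, 7, 5, 6, 8}, and returns pivot index 3; the recursive calls then sort {2, 1, 3} and {7, 5, 6, 8} independently, and the program prints the values 1 through 8 in ascending order.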
Performance of quicksort
The running time of quicksort depends on whether the partitions it produces are balanced. When the partitioning is balanced, its time complexity is O(n log n), the same as merge sort; when the partitioning is unbalanced, it degrades to O(n^2), the same as insertion sort. Even so, once the number of elements exceeds about 20, quicksort (with balanced partitions) is in practice far faster than merge sort and heap sort, which is why its average performance is regarded as excellent.
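To get a feel for the size of that gap (a rough back-of-the-envelope estimate, not a measured benchmark): for n = 1,000,000 elements, n log2 n is about 2 x 10^7 basic operations, while n^2 is 10^12, a factor of roughly 50,000.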
1. Worst case:
When the partition splits array arr[p..r] into two subarrays of sizes n - 1 and 0, every level of the recursion is maximally unbalanced. The partition step itself costs Θ(n), and the recursive call on the empty subarray returns immediately, so T(0) = Θ(1).
The running time of the algorithm then satisfies the recurrence T(n) = T(n-1) + Θ(n).
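Expanding that recurrence makes the quadratic behavior explicit (a quick sketch, writing c for the constant hidden in Θ(n)): T(n) = cn + c(n-1) + ... + c·1 = c·n(n+1)/2, so the worst-case running time is Θ(n^2). This worst case occurs, for example, when the input array is already sorted, since the last element is always chosen as the pivot and every split is n - 1 versus 0.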
2. Best case (balanced partitions):
In practice, most runs of quicksort behave much closer to the best case than to the worst case.
As long as each partition splits the array in some constant proportion, the time complexity of the algorithm is O(n log n); even a split as lopsided as 9999:1 still gives O(n log n), provided that ratio stays constant.
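A sketch of why any constant ratio is enough, using the 9:1 split analyzed in Introduction to Algorithms: the recurrence T(n) = T(9n/10) + T(n/10) + cn produces a recursion tree of depth about log_{10/9} n, and the total cost of each level of the tree is at most cn, so T(n) = O(n log n). A different constant ratio only changes the base of the logarithm, which is absorbed into the constant factor.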
Note:
This article is original. It may be reproduced, but please include a link to the original post.
Reference: Introduction to Algorithms