(1) Quick sort:
Quicksort, also known as partition-exchange sort, divides the data to be sorted into two parts in a single pass, such that every element in one part is smaller than every element in the other. Each part is then sorted by the same method, and the whole process proceeds recursively until the entire data set becomes an ordered sequence.
The steps are:
1. Select an element from the sequence, called the "datum" (pivot).
2. Reorder the sequence so that all elements smaller than the pivot come before it, and all elements larger than the pivot come after it (elements equal to the pivot can go to either side). When this pass finishes, the pivot is in its final position. This is called the partition operation.
3. Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot.
In the base case of the recursion, the sequence has size zero or one and is therefore already sorted. Although the method keeps recursing, the algorithm always terminates, because each iteration puts at least one element into its final position.
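The three steps above can be sketched as a minimal, non-in-place quicksort. This sketch (with the illustrative name `quick_sort_simple`, introduced here) trades memory for clarity; the in-place implementation in the code section below avoids the extra lists.

```python
def quick_sort_simple(series):
    """Minimal quicksort sketch: pick a pivot, partition, recurse."""
    if len(series) <= 1:            # base case: size 0 or 1 is already sorted
        return series
    pivot = series[0]               # step 1: select the datum (pivot)
    smaller = [x for x in series[1:] if x < pivot]    # step 2: partition
    larger = [x for x in series[1:] if x >= pivot]
    # step 3: recursively sort both sub-sequences, pivot in its final spot
    return quick_sort_simple(smaller) + [pivot] + quick_sort_simple(larger)

print(quick_sort_simple([54, 26, 93, 17, 77, 31, 44, 55, 20]))
```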
(2) Quick sort analysis:
(3) Code implementation:
def quick_sort(alist, first, last):
    """Quick sort"""
    if first >= last:   # overall recursion end condition
        return
    mid = alist[first]  # the pivot value
    low = first
    high = last
    while low < high:
        # the high cursor shifts left
        while low < high and alist[high] >= mid:
            # note: only one of the two inner whiles has an "=" sign,
            # so that equal elements end up on the same side
            high -= 1
        alist[low] = alist[high]
        # the low cursor shifts right
        while low < high and alist[low] < mid:
            low += 1
        alist[high] = alist[low]
    # when the loop exits, low == high
    alist[low] = mid    # equivalent to alist[high] = mid

    # quick sort the part of the list to the left of low
    quick_sort(alist, first, low - 1)
    # quick sort the part of the list to the right of low
    quick_sort(alist, low + 1, last)


if __name__ == "__main__":
    li = [54, 26, 93, 17, 77, 31, 44, 55, 20]
    print(li)
    quick_sort(li, 0, len(li) - 1)
    print(li)
(4) Run result:
[54, 26, 93, 17, 77, 31, 44, 55, 20]
[17, 20, 26, 31, 44, 54, 55, 77, 93]
(5) Complexity of Time:
Best time complexity: O(n log n)
Worst time complexity: O(n²)
Stability: unstable
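The instability can be seen concretely by running the same in-place algorithm on (key, tag) pairs while comparing only the key. The variant `quick_sort_by_key` and the sample records below are introduced here for illustration; they are not part of the original code.

```python
def quick_sort_by_key(alist, first, last):
    """The in-place quicksort above, but comparing only item[0] (the key)."""
    if first >= last:
        return
    mid = alist[first]
    low, high = first, last
    while low < high:
        while low < high and alist[high][0] >= mid[0]:
            high -= 1
        alist[low] = alist[high]
        while low < high and alist[low][0] < mid[0]:
            low += 1
        alist[high] = alist[low]
    alist[low] = mid
    quick_sort_by_key(alist, first, low - 1)
    quick_sort_by_key(alist, low + 1, last)

records = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]
quick_sort_by_key(records, 0, len(records) - 1)
print(records)  # the two key-1 records no longer appear in their input order
```

The keys come out sorted, but 'b' and 'd' (both with key 1) have swapped relative order: the hole-filling moves during partitioning do not preserve the original order of equal keys.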
It is not obvious at first that quicksort takes O(n log n) time on average. But it is not hard to see that the partition operation visits each element of the array once per pass, using O(n) time; in versions that use concatenation, this operation is also O(n).
In the best case, each partition splits the sequence into two nearly equal halves, so each recursive call handles a sequence half the size of its parent. Only log n nested calls are therefore needed before reaching sequences of size 1, which means the depth of the call tree is O(log n). Calls at the same level of the tree process disjoint parts of the original sequence, so each level requires only O(n) time in total (each call carries some constant overhead, but since there are only O(n) calls per level, this is absorbed into the O(n) factor). The result is that the algorithm uses only O(n log n) time.
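The best-case versus worst-case behavior can be observed with a small experiment, introduced here for illustration: `quick_sort_counting` is the in-place algorithm above instrumented to count cursor moves (a rough proxy for pivot comparisons). Because the pivot is taken from the front, an already-sorted input is the worst case and should show roughly n²/2 moves, while a shuffled input should stay near n log n.

```python
import random

def quick_sort_counting(alist, first, last, stats):
    """The in-place quicksort above, instrumented to count cursor moves."""
    if first >= last:
        return
    mid = alist[first]
    low, high = first, last
    while low < high:
        while low < high and alist[high] >= mid:
            high -= 1
            stats["moves"] += 1
        alist[low] = alist[high]
        while low < high and alist[low] < mid:
            low += 1
            stats["moves"] += 1
        alist[high] = alist[low]
    alist[low] = mid
    quick_sort_counting(alist, first, low - 1, stats)
    quick_sort_counting(alist, low + 1, last, stats)

random.seed(42)
results = {}
for name, data in [("shuffled", random.sample(range(300), 300)),
                   ("already sorted", list(range(300)))]:
    stats = {"moves": 0}
    quick_sort_counting(data, 0, len(data) - 1, stats)
    results[name] = stats["moves"]
    print(name, results[name])
```

For the sorted input of 300 elements the count is exactly 299 + 298 + … + 1 = 44850, far above the shuffled count, matching the O(n²) worst case.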
(6) Quick Sort Demo: