This article presents Python implementations of eight sorting algorithms, with the following details:
1. Insertion Sort
Description
The basic operation of insertion sort is to insert one element into an already-sorted sequence, yielding a new sorted sequence whose length is one greater. The algorithm suits small amounts of data, runs in O(n^2) time, and is a stable sort. It treats the array as two parts: a sorted part and the remaining unsorted elements. On each step, the next unsorted element is taken and inserted into its correct position within the sorted part, until the whole array is sorted.
Code implementation
def insert_sort(lists):
    # Insertion sort
    count = len(lists)
    for i in range(1, count):
        key = lists[i]
        j = i - 1
        # Shift elements of the sorted prefix right until the slot for key is found
        while j >= 0 and lists[j] > key:
            lists[j + 1] = lists[j]
            j -= 1
        lists[j + 1] = key
    return lists
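As a cross-check, the same idea can be expressed with the standard library's `bisect` module, which finds each insertion point in the sorted prefix by binary search. This is a sketch for comparison, not part of the original implementation; the function name is illustrative:

```python
import bisect

def insert_sort_bisect(lists):
    """Insertion-sort variant: build a new sorted list one element at a time."""
    result = []
    for item in lists:
        # bisect.insort inserts item so that result stays sorted
        # (it inserts after equal elements, so the sort remains stable)
        bisect.insort(result, item)
    return result

print(insert_sort_bisect([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

Note that this variant returns a new list rather than sorting in place, and the binary search only speeds up finding the position; the insertion itself still shifts elements, so the overall cost stays O(n^2).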
2. Shell Sort
Description
Shell sort is a variant of insertion sort. Also known as diminishing-increment sort, it is a more efficient improvement on direct insertion sort. Shell sort is not a stable sorting algorithm. The method is named after D. L. Shell, who published it in 1959. Shell sort groups the records by a fixed gap (increment) between their indices and sorts each group with direct insertion sort; as the gap is gradually reduced, each group contains more and more records, and when the gap reaches 1 the whole sequence forms a single group and the algorithm terminates.
Code implementation
def shell_sort(lists):
    # Shell sort: gapped insertion sort with the gap halved each round
    count = len(lists)
    group = count // 2
    while group > 0:
        for i in range(group, count):
            key = lists[i]
            j = i - group
            # Insertion sort within the elements that are `group` apart
            while j >= 0 and lists[j] > key:
                lists[j + group] = lists[j]
                j -= group
            lists[j + group] = key
        group //= 2
    return lists
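To make the diminishing-increment idea concrete, the following sketch records the gap sequence the halving scheme produces for a list of length 10 and returns the sorted result. The function name is illustrative, not from the original article:

```python
def shell_sort_traced(lists):
    """Shell sort that also records the gap used at each stage (illustrative helper)."""
    gaps = []
    count = len(lists)
    gap = count // 2
    while gap > 0:
        gaps.append(gap)
        for i in range(gap, count):
            key = lists[i]
            j = i - gap
            while j >= 0 and lists[j] > key:
                lists[j + gap] = lists[j]
                j -= gap
            lists[j + gap] = key
        gap //= 2
    return gaps, lists

gaps, data = shell_sort_traced([49, 38, 65, 97, 76, 13, 27, 49, 55, 4])
print(gaps)  # [5, 2, 1] -- the final gap of 1 is a plain insertion-sort pass
print(data)  # [4, 13, 27, 38, 49, 49, 55, 65, 76, 97]
```

Other gap sequences (e.g. Knuth's 1, 4, 13, 40, ...) are known to perform better in practice; halving is simply the scheme the implementation above uses.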
3. Bubble Sort
Description
Bubble sort repeatedly walks through the sequence to be sorted, comparing two adjacent elements at a time and swapping them if they are in the wrong order. These passes over the sequence are repeated until no more swaps are needed, at which point the sequence is sorted.
Code implementation
def bubble_sort(lists):
    # Bubble sort: each pass bubbles the largest remaining element to the end
    count = len(lists)
    for i in range(count):
        for j in range(count - i - 1):
            if lists[j] > lists[j + 1]:
                lists[j], lists[j + 1] = lists[j + 1], lists[j]
    return lists
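A common refinement, suggested by the "until no more swaps are needed" wording in the description, is to stop early once a full pass performs no swap. This sketch adds that flag (the function name is illustrative):

```python
def bubble_sort_early_exit(lists):
    """Bubble sort that stops as soon as a pass performs no swaps."""
    count = len(lists)
    for i in range(count):
        swapped = False
        for j in range(count - i - 1):
            if lists[j] > lists[j + 1]:
                lists[j], lists[j + 1] = lists[j + 1], lists[j]
                swapped = True
        if not swapped:
            # The sequence is already ordered; the remaining passes are unnecessary
            break
    return lists

print(bubble_sort_early_exit([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

With this change an already-sorted input is detected in a single O(n) pass, while the worst case remains O(n^2).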
4. Quick Sort
Description
Quick sort partitions the data to be sorted into two parts, such that every element of one part is smaller than every element of the other, then sorts each part recursively by the same method, so that the whole sequence becomes an ordered one.
Code implementation
def quick_sort(lists, left, right):
    # Quick sort: partition around lists[left], then recurse on both halves
    if left >= right:
        return lists
    key = lists[left]
    low, high = left, right
    while left < right:
        # Move right inward past elements >= key, then fill the hole on the left
        while left < right and lists[right] >= key:
            right -= 1
        lists[left] = lists[right]
        # Move left inward past elements <= key, then fill the hole on the right
        while left < right and lists[left] <= key:
            left += 1
        lists[right] = lists[left]
    lists[left] = key
    quick_sort(lists, low, left - 1)
    quick_sort(lists, left + 1, high)
    return lists
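For comparison, the partition-then-recurse idea can also be written without in-place index juggling, at the cost of extra memory. This list-comprehension sketch picks the first element as the pivot, as the in-place version does; the function name is illustrative:

```python
def quick_sort_simple(lists):
    """Out-of-place quick sort: partition around the first element, then recurse."""
    if len(lists) <= 1:
        return lists
    pivot = lists[0]
    smaller = [x for x in lists[1:] if x <= pivot]  # elements no larger than the pivot
    larger = [x for x in lists[1:] if x > pivot]    # elements strictly larger
    return quick_sort_simple(smaller) + [pivot] + quick_sort_simple(larger)

print(quick_sort_simple([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```

This variant is easier to read but allocates new lists at every level; the in-place version above uses O(1) extra space per partition.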
5. Selection Sort
Description
Basic idea: on the 1st pass, select the smallest record among R[1]..R[n] and swap it with R[1]; on the 2nd pass, select the smallest record among R[2]..R[n] and swap it with R[2]; and so on. On the i-th pass, the smallest record among R[i]..R[n] is selected and swapped with R[i], so the sorted prefix keeps growing until the whole sequence is sorted.
Code implementation
def select_sort(lists):
    # Selection sort: find the minimum of the unsorted suffix and swap it into place
    count = len(lists)
    for i in range(count):
        min_index = i
        for j in range(i + 1, count):
            if lists[j] < lists[min_index]:
                min_index = j
        lists[min_index], lists[i] = lists[i], lists[min_index]
    return lists
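The "select the smallest and swap" step can also be expressed with the built-in `min` over a range of indices, which makes the selection explicit. A sketch with an illustrative name:

```python
def select_sort_builtin(lists):
    """Selection sort using min() with a key to locate the smallest remaining element."""
    count = len(lists)
    for i in range(count):
        # Index of the minimum element in the unsorted suffix lists[i:]
        min_index = min(range(i, count), key=lambda k: lists[k])
        lists[i], lists[min_index] = lists[min_index], lists[i]
    return lists

print(select_sort_builtin([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```

Either way, selection sort performs exactly n - 1 swaps, which can matter when swaps are expensive, but it is not stable because a swap can jump an element over equal keys.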
6. Heap Sort
Description
Heap sort (heapsort) is a selection-style sorting algorithm built on the heap data structure, a complete binary tree stored in an array, whose layout lets the children and parent of any index be located quickly. Heaps come in two kinds, max-heaps and min-heaps, both complete binary trees. A max-heap requires that the value of every node be no greater than the value of its parent, i.e. a[parent(i)] >= a[i]. To sort an array in non-descending order a max-heap is used, because by the max-heap property the maximum value must be at the top of the heap.
Code implementation
# Adjust the heap: sift the element at index i down to restore the max-heap property
def adjust_heap(lists, i, size):
    lchild = 2 * i + 1
    rchild = 2 * i + 2
    largest = i
    if i < size // 2:
        if lchild < size and lists[lchild] > lists[largest]:
            largest = lchild
        if rchild < size and lists[rchild] > lists[largest]:
            largest = rchild
        if largest != i:
            lists[largest], lists[i] = lists[i], lists[largest]
            adjust_heap(lists, largest, size)

# Build the heap: heapify from the last internal node up to the root
def build_heap(lists, size):
    for i in range(size // 2 - 1, -1, -1):
        adjust_heap(lists, i, size)

# Heap sort: repeatedly move the maximum to the end and re-adjust the shrunken heap
def heap_sort(lists):
    size = len(lists)
    build_heap(lists, size)
    for i in range(size - 1, 0, -1):
        lists[0], lists[i] = lists[i], lists[0]
        adjust_heap(lists, 0, i)
    return lists
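The same selection-by-heap idea is available in the standard library's `heapq` module, which implements a min-heap rather than the max-heap above, so elements come out smallest first. A short sketch (the function name is illustrative):

```python
import heapq

def heap_sort_stdlib(lists):
    """Heap sort via the stdlib min-heap: heapify, then pop the minimum repeatedly."""
    heap = list(lists)          # copy so the caller's list is untouched
    heapq.heapify(heap)         # O(n) bottom-up heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort_stdlib([16, 7, 3, 20, 17, 8]))  # [3, 7, 8, 16, 17, 20]
```

The hand-rolled version sorts in place in O(1) extra space by reusing the tail of the array; the `heapq` sketch trades that for brevity and a copy of the input.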
7. Merge Sort
Description
Merge sort is an efficient sorting algorithm based on the merge operation and a very typical application of divide and conquer: already-ordered subsequences are merged to obtain a fully ordered sequence. That is, each subsequence is first made ordered, and the ordered subsequences are then merged. Combining two ordered lists into a single ordered list is called a two-way merge.
The merge step works as follows: compare a[i] with a[j]; if a[i] <= a[j], copy the element a[i] of the first ordered list into r[k] and increment both i and k; otherwise copy the element a[j] of the second list into r[k] and increment both j and k. This loop continues until one of the two lists is exhausted, after which the remaining elements of the other list are copied into r. The sort itself is usually implemented recursively: split the interval [s, t] at its midpoint, sort the left subinterval, sort the right subinterval, and finally merge the two ordered halves into the ordered interval [s, t].
Code implementation
def merge(left, right):
    # Merge two sorted lists into one sorted list
    i, j = 0, 0
    result = []
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One list is exhausted; append the remainder of the other
    result += left[i:]
    result += right[j:]
    return result

def merge_sort(lists):
    # Merge sort: split at the midpoint, sort each half, merge the results
    if len(lists) <= 1:
        return lists
    num = len(lists) // 2
    left = merge_sort(lists[:num])
    right = merge_sort(lists[num:])
    return merge(left, right)
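The two-way merge step described above can be checked on its own with the standard library's `heapq.merge`, which lazily performs exactly this merge of already-sorted inputs:

```python
import heapq

# Two already-sorted lists, as in the description of the merge step
a = [1, 3, 5, 7]
b = [2, 4, 6]

# heapq.merge yields elements from both inputs in sorted order
merged = list(heapq.merge(a, b))
print(merged)  # [1, 2, 3, 4, 5, 6, 7]
```

`heapq.merge` also accepts more than two inputs (a k-way merge), which is the same primitive used when merge-sorting data too large to fit in memory.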
8. Radix Sort
Description
Radix sort belongs to the family of distribution sorts, also known as bucket sort or bin sort. As the name implies, it distributes the elements to be sorted into "buckets" according to part of the key value, one digit at a time, thereby achieving an ordering. Radix sort is a stable sort, with time complexity O(n * log_r(m)), where r is the radix chosen and m is the maximum key value (log_r(m) being the number of digit passes); in some situations it is more efficient than other stable sorting methods.
Code implementation
def radix_sort(lists, radix=10):
    # Radix sort (least-significant digit first)
    # k = number of digits of the largest element in the chosen radix,
    # computed by exact integer comparison to avoid floating-point log errors
    k = 1
    while max(lists) >= radix ** k:
        k += 1
    for i in range(1, k + 1):
        bucket = [[] for _ in range(radix)]
        for j in lists:
            # Select the i-th digit (i = 1 is the least significant)
            bucket[(j // radix ** (i - 1)) % radix].append(j)
        # Rebuild the list by reading the buckets in order
        del lists[:]
        for z in bucket:
            lists += z
    return lists
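To see what a single distribution pass does, this sketch buckets numbers by their i-th digit for radix 10; the function name is illustrative:

```python
def digit_buckets(values, i, radix=10):
    """Distribute values into radix buckets by their i-th digit (i = 1 is least significant)."""
    buckets = [[] for _ in range(radix)]
    for v in values:
        buckets[(v // radix ** (i - 1)) % radix].append(v)
    return buckets

# First pass: bucket by the ones digit; within each bucket, input order is preserved,
# which is what makes radix sort stable across passes
for digit, bucket in enumerate(digit_buckets([170, 45, 75, 90, 802, 24, 2, 66], 1)):
    if bucket:
        print(digit, bucket)
```

Reading the buckets back in order after each pass yields a list sorted by all digits processed so far, which is why the passes must run from the least significant digit upward.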
The above is a detailed walkthrough of eight sorting algorithms implemented in Python; I hope it helps with your learning.