Eight Sorting Methods in Python

Source: Internet
Author: User


I. Insertion Sort

'''
Basic idea: insert one value at a time into an already-sorted prefix,
yielding a sorted sequence one element longer at each step. The algorithm
suits small amounts of data; its time complexity is O(n^2) and it is
stable. The array is treated as two parts: a sorted prefix and the next
element to insert; after each insertion the sorted prefix grows by one.
'''

def insert_sort(lists):
    count = len(lists)
    for i in range(1, count):
        key = lists[i]
        j = i - 1
        # shift larger elements of the sorted prefix one slot to the right
        while j >= 0 and lists[j] > key:
            lists[j + 1] = lists[j]
            j -= 1
        lists[j + 1] = key
    return lists

lst = [int(x) for x in input().split()]
insert_sort(lst)
print(' '.join(map(str, lst)))
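As a side note, Python's standard library already implements the core of this idea: a minimal sketch using `bisect.insort`, which binary-searches each element's slot in a growing sorted list (the helper name `insort_all` is ours, for illustration).

```python
import bisect

def insort_all(values):
    """Insertion sort via the stdlib: insert each element into a growing
    sorted list, letting bisect find the position in O(log n)."""
    result = []
    for v in values:
        bisect.insort(result, v)  # binary-search the slot, then insert
    return result
```

The shifting inside the list is still O(n) per insertion, so the overall cost remains O(n^2); only the comparisons are reduced.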

 

II. Shell Sort

'''
Shell sort is a variant of insertion sort, also known as diminishing-
increment sort; it is a more efficient improvement on direct insertion
sort and is not stable. It is named after D. L. Shell, who proposed it in
1959. Records are grouped by an increment (gap) and each group is sorted
by direct insertion; as the increment shrinks, each group contains more
elements, and when the increment reaches 1 the whole sequence forms a
single group and the algorithm terminates.
'''

def shell_sort(lists):
    count = len(lists)
    step = 2
    group = count // step
    while group > 0:
        # insertion-sort each subsequence whose elements are `group` apart
        for i in range(group):
            j = i + group
            while j < count:
                k = j - group
                key = lists[j]
                while k >= 0 and lists[k] > key:
                    lists[k + group] = lists[k]
                    k -= group
                lists[k + group] = key
                j += group
        group //= step
    return lists

lst = [int(x) for x in input().split()]
shell_sort(lst)
print(' '.join(map(str, lst)))

 

III. Bubble Sort

'''
Repeatedly walk through the sequence to be sorted, comparing two elements
at a time and exchanging them if their order is wrong. The passes are
repeated until no exchange is needed, at which point the sequence is
sorted.
'''

def bubble_sort(lists):
    count = len(lists)
    for i in range(count):
        for j in range(i + 1, count):
            if lists[i] > lists[j]:
                lists[i], lists[j] = lists[j], lists[i]
    return lists

lst = [int(x) for x in input().split()]
bubble_sort(lst)
print(' '.join(map(str, lst)))

 

IV. Selection Sort

'''
Basic idea: on the 1st pass, select the smallest record among r[1]..r[n]
and exchange it with r[1]; on the 2nd pass, select the smallest among
r[2]..r[n] and exchange it with r[2]; and so on. On the i-th pass the
smallest record among r[i]..r[n] is exchanged with r[i], so the ordered
prefix keeps growing until all records are sorted.
'''

def select_sort(lists):
    count = len(lists)
    for i in range(count):
        min_index = i
        # find the smallest element in the unsorted suffix
        for j in range(i + 1, count):
            if lists[min_index] > lists[j]:
                min_index = j
        lists[min_index], lists[i] = lists[i], lists[min_index]
    return lists

lst = [int(x) for x in input().split()]
select_sort(lst)
print(' '.join(map(str, lst)))

 

V. Quick Sort

'''
(Recursive version.) One pass of partitioning splits the data into two
independent parts, with everything in one part no larger than everything
in the other; each part is then sorted the same way. Applied recursively,
the whole sequence becomes ordered.
'''

def quick_sort(lists, left, right):
    if left >= right:
        return lists
    key = lists[left]          # pivot
    low, high = left, right
    while left < right:
        while left < right and lists[right] >= key:
            right -= 1
        lists[left] = lists[right]
        while left < right and lists[left] <= key:
            left += 1
        lists[right] = lists[left]
    lists[right] = key         # pivot lands in its final position
    quick_sort(lists, low, left - 1)
    quick_sort(lists, left + 1, high)
    return lists

lst = [int(x) for x in input().split()]
quick_sort(lst, 0, len(lst) - 1)
print(' '.join(map(str, lst)))

 

VI. Heap Sort

'''
Heapsort uses the heap data structure (a complete binary tree stored in
an array) and is a form of selection sort; the array layout lets a node
locate its children by index arithmetic. Heaps come in two kinds, max-
heaps and min-heaps. A max-heap requires every node's value to be no less
than its children's, i.e. A[PARENT[i]] >= A[i]. Sorting an array in
non-descending order uses a max-heap, since by the max-heap property the
maximum is always at the top of the heap.
'''

def adjust_heap(lists, i, size):
    # sift the element at index i down until the max-heap property holds
    lchild = 2 * i + 1
    rchild = 2 * i + 2
    largest = i
    if lchild < size and lists[lchild] > lists[largest]:
        largest = lchild
    if rchild < size and lists[rchild] > lists[largest]:
        largest = rchild
    if largest != i:
        lists[largest], lists[i] = lists[i], lists[largest]
        adjust_heap(lists, largest, size)

def build_heap(lists, size):
    # heapify from the last internal node up to the root
    for i in range(size // 2 - 1, -1, -1):
        adjust_heap(lists, i, size)

def heap_sort(lists):
    size = len(lists)
    build_heap(lists, size)
    for i in range(size - 1, 0, -1):
        # move the current maximum to the end, then re-heapify the rest
        lists[0], lists[i] = lists[i], lists[0]
        adjust_heap(lists, 0, i)
    return lists

lst = [int(x) for x in input().split()]
heap_sort(lst)
print(' '.join(map(str, lst)))
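For comparison, Python's standard library ships a binary heap in the `heapq` module; a minimal sketch of heap sort built on it (the function name `heap_sort_stdlib` is ours). Note that `heapq` is a min-heap, so popping yields the elements directly in ascending order.

```python
import heapq

def heap_sort_stdlib(values):
    """Heap sort using the standard-library heapq module (a min-heap):
    heapify in O(n), then pop the minimum n times."""
    heap = list(values)          # copy, so the input is left untouched
    heapq.heapify(heap)          # O(n) bottom-up heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]
```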

 

VII. Merge Sort

'''
(Recursive.) Merge sort is an effective algorithm built on the merge
operation and a very typical application of divide and conquer: make each
subsequence ordered, then merge ordered subsequences into a fully ordered
sequence. Merging two ordered lists into one is called a two-way merge.
The merge compares a[i] with a[j]: if a[i] <= a[j], copy a[i] into r[k]
and add 1 to i and k; otherwise copy a[j] into r[k] and add 1 to j and k.
When one list is exhausted, the remaining elements of the other are
copied over. Merge sort is usually implemented recursively: split the
interval [s, t] at its midpoint, sort the left half, sort the right half,
then merge the two ordered halves into the ordered interval [s, t].
'''

def merge(left, right):
    # merge two already-sorted lists into one sorted list
    i, j = 0, 0
    result = []
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def merge_sort(lists):
    if len(lists) <= 1:
        return lists
    mid = len(lists) // 2
    left = merge_sort(lists[:mid])
    right = merge_sort(lists[mid:])
    return merge(left, right)

lst = [int(x) for x in input().split()]
tt = merge_sort(lst)
print(' '.join(map(str, tt)))

 

VIII. Radix Sort

'''
Radix sort is a distribution sort, also known as bucket sort or bin sort.
As the name suggests, elements are distributed into "buckets" according
to part of their key value, one digit at a time. Radix sort is stable,
with time complexity O(n * log_r(m)), where r is the radix and m the
maximum key; in some cases it is more efficient than other stable sorts.
'''

import math

def radix_sort(lists, radix=10):
    # number of digit passes needed for the largest key
    k = int(math.ceil(math.log(max(lists) + 1, radix)))
    bucket = [[] for _ in range(radix)]
    for i in range(1, k + 1):
        for j in lists:
            # distribute by the i-th digit, counting from the least significant
            bucket[j // (radix ** (i - 1)) % radix].append(j)
        del lists[:]
        for z in bucket:
            lists += z   # collect the buckets back, preserving their order
            del z[:]
    return lists

lst = [int(x) for x in input().split()]
radix_sort(lst)
print(' '.join(map(str, lst)))

The following is a comparison of the time complexity and stability of each sort algorithm:

 

Sorting method   Average            Best         Worst        Aux. space      Stability
Bubble sort      O(n^2)             O(n)         O(n^2)       O(1)            stable
Selection sort   O(n^2)             O(n^2)       O(n^2)       O(1)            unstable
Insertion sort   O(n^2)             O(n)         O(n^2)       O(1)            stable
Shell sort       O(n log n)~O(n^2)  O(n^1.3)     O(n^2)       O(1)            unstable
Heap sort        O(n log n)         O(n log n)   O(n log n)   O(1)            unstable
Merge sort       O(n log n)         O(n log n)   O(n log n)   O(n)            stable
Quick sort       O(n log n)         O(n log n)   O(n^2)       O(log n)~O(n)   unstable

With a small optimization, bubble sort's best-case time reaches O(n): set a flag, and if a full pass performs no exchange, terminate early, so an already-sorted input costs O(n). Selection sort must select the smallest (or largest) of the remaining sequence on every pass and exchange it with the element following the sorted prefix, so its best and worst cases are both O(n^2). Insertion sort inserts each element into an already-sorted prefix at the proper position; on an already-sorted input its time complexity is O(n). A heap is a complete binary tree, so the tree's depth is floor(log2(n)) + 1 and the best and worst cases are both O(n log n). Merge sort splits the array into two halves and recurses on each in turn, which corresponds to a binary tree of depth log2(n) + 1, so its best and worst cases are likewise O(n log n). For quick sort on input already in ascending or descending order, each partition yields a subsequence only one record shorter than the last; the recursion tree degenerates into a skewed tree requiring n-1 levels, and the i-th partition needs n-i key comparisons to place the i-th record, so the total is sum_{i=1}^{n-1} (n - i) = n(n-1)/2, i.e. O(n^2).
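The bubble-sort flag optimization mentioned above can be sketched as follows (a minimal illustration; the function name `bubble_sort_flag` is ours):

```python
def bubble_sort_flag(lists):
    """Bubble sort with an exchange flag: if a full pass makes no swap,
    the list is already sorted and we stop early (best case O(n))."""
    n = len(lists)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if lists[j] > lists[j + 1]:
                lists[j], lists[j + 1] = lists[j + 1], lists[j]
                swapped = True
        if not swapped:  # no exchange in this pass: already sorted
            break
    return lists
```

On an already-sorted input the first pass makes no swap, so the loop exits after one O(n) scan, which is where the O(n) best-case figure in the table comes from.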
