Python's Eight Sorting Methods

Source: Internet
Author: User
Tags: sorts
I. Insertion Sort

The basic operation of insertion sort is to insert one element into an already-sorted sequence, yielding a new ordered sequence that is one element longer. The algorithm divides the array into two parts: the sorted prefix and the element still to be inserted; once the prefix is in order, that element is inserted at its proper position. Insertion sort suits small amounts of data, its time complexity is O(n^2), and it is a stable sort.

    # -*- coding: utf-8 -*-
    def insert_sort(lists):
        count = len(lists)
        for i in range(1, count):
            key = lists[i]          # element to insert into the sorted prefix
            j = i - 1
            while j >= 0:
                if lists[j] > key:  # shift the larger element right, carry key left
                    lists[j + 1] = lists[j]
                    lists[j] = key
                j -= 1
        return lists

    lst = [int(i) for i in input().split()]
    insert_sort(lst)
    print(' '.join(str(x) for x in lst))
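For comparison, here is a minimal sketch (function name is mine, not from the original) of the same "insert into a sorted prefix" idea using the standard library's bisect module, which locates the insertion point by binary search:

```python
import bisect

def insertion_sort_bisect(items):
    """Build a sorted list by binary-inserting each element.

    Same idea as insertion sort; bisect.insort finds the insertion
    point in O(log n), though shifting elements still costs O(n).
    """
    result = []
    for x in items:
        bisect.insort(result, x)  # insert x, keeping result sorted
    return result

print(insertion_sort_bisect([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```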

II. Shell Sort

Shell sort, also known as diminishing-increment sort, is a variant of insertion sort and a more efficient improvement on direct insertion sort. It is not a stable sort. The method is named after D. L. Shell, who introduced it in 1959. Shell sort groups the records by a fixed increment between indices and sorts each group with direct insertion sort; as the increment decreases, each group contains more and more elements, and when the increment reaches 1 the whole file forms a single group and the algorithm terminates.

    # -*- coding: utf-8 -*-
    def shell_sort(lists):
        count = len(lists)
        step = 2
        group = count // step           # current increment (gap)
        while group > 0:
            for i in range(group):      # insertion-sort each gap-separated group
                j = i + group
                while j < count:
                    k = j - group
                    key = lists[j]
                    while k >= 0:
                        if lists[k] > key:
                            lists[k + group] = lists[k]
                            lists[k] = key
                        k -= group
                    j += group
            group //= step              # shrink the increment
        return lists

    lst = [int(i) for i in input().split()]
    shell_sort(lst)
    print(' '.join(str(x) for x in lst))
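To make the diminishing increments concrete, a small sketch (helper name is mine) that lists the gap sequence the code above steps through, count // 2, count // 4, ..., 1:

```python
def shell_gaps(n, step=2):
    """Gap sequence used by the shell_sort above: n//2, n//4, ..., 1."""
    gaps = []
    group = n // step
    while group > 0:
        gaps.append(group)
        group //= step
    return gaps

print(shell_gaps(16))  # → [8, 4, 2, 1]
print(shell_gaps(10))  # → [5, 2, 1]
```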

III. Bubble Sort

Bubble sort repeatedly walks through the sequence to be sorted, comparing two adjacent elements at a time and swapping them if they are in the wrong order. The passes are repeated until no more swaps are needed, at which point the sequence is sorted. After each pass the largest remaining element has "bubbled" to the end.

    # -*- coding: utf-8 -*-
    def bubble_sort(lists):
        count = len(lists)
        for i in range(count - 1):
            for j in range(count - 1 - i):   # last i elements are already in place
                if lists[j] > lists[j + 1]:
                    lists[j], lists[j + 1] = lists[j + 1], lists[j]
        return lists

    lst = [int(i) for i in input().split()]
    bubble_sort(lst)
    print(' '.join(str(x) for x in lst))

IV. Selection Sort

The basic idea of selection sort: on the 1st pass, select the smallest record among r[1]..r[n] and swap it with r[1]; on the 2nd pass, select the smallest among r[2]..r[n] and swap it with r[2]; on the i-th pass, select the smallest among r[i]..r[n] and swap it with r[i]. The ordered prefix keeps growing until the whole sequence is sorted.

    # -*- coding: utf-8 -*-
    def select_sort(lists):
        count = len(lists)
        for i in range(count):
            min_idx = i                      # index of the smallest element seen so far
            for j in range(i + 1, count):
                if lists[min_idx] > lists[j]:
                    min_idx = j
            lists[min_idx], lists[i] = lists[i], lists[min_idx]
        return lists

    lst = [int(i) for i in input().split()]
    select_sort(lst)
    print(' '.join(str(x) for x in lst))
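The "find the minimum of the remainder" step can also be expressed with the built-in min; a sketch of that variant (function name is mine):

```python
def select_sort_min(lists):
    """Selection sort where each pass locates the minimum with min()."""
    n = len(lists)
    for i in range(n - 1):
        # index of the smallest element in lists[i:]
        m = min(range(i, n), key=lists.__getitem__)
        lists[i], lists[m] = lists[m], lists[i]
    return lists

print(select_sort_min([64, 25, 12, 22, 11]))  # → [11, 12, 22, 25, 64]
```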

V. Quick Sort

Quick sort (implemented recursively) partitions the data into two parts in one pass, with every element of one part smaller than every element of the other; each part is then quick-sorted the same way. The whole process is carried out recursively until the entire sequence is ordered.

    # -*- coding: utf-8 -*-
    def quick_sort(lists, left, right):
        if left >= right:
            return lists
        key = lists[left]                # pivot
        low = left
        high = right
        while left < right:
            while left < right and lists[right] >= key:
                right -= 1
            lists[left] = lists[right]   # move a smaller element to the left side
            while left < right and lists[left] <= key:
                left += 1
            lists[right] = lists[left]   # move a larger element to the right side
        lists[right] = key               # pivot lands in its final position
        quick_sort(lists, low, left - 1)
        quick_sort(lists, left + 1, high)
        return lists

    lst = [int(i) for i in input().split()]
    quick_sort(lst, 0, len(lst) - 1)
    print(' '.join(str(x) for x in lst))
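The partitioning idea can be written more transparently, at the cost of extra memory, as an out-of-place sketch (this is not the in-place version above; names are mine):

```python
def quick_sort_simple(items):
    """Out-of-place quicksort: pick a pivot, partition, recurse.

    Allocates new lists at every recursion level, trading memory
    for readability compared with the in-place version.
    """
    if len(items) <= 1:
        return items
    pivot = items[0]
    smaller = [x for x in items[1:] if x <= pivot]
    larger = [x for x in items[1:] if x > pivot]
    return quick_sort_simple(smaller) + [pivot] + quick_sort_simple(larger)

print(quick_sort_simple([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 1, 2, 3, 4, 5, 6, 9]
```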
  

VI. Heap Sort

Heap sort (somewhat harder to grasp) is a selection-style sorting algorithm built on the heap, a complete binary tree stored in an array, whose indexing lets the children of a node be located quickly. Heaps come in two kinds, max-heaps and min-heaps. A max-heap requires that each node's value be no greater than its parent's, i.e. a[parent[i]] >= a[i]. To sort an array in non-descending order we use a max-heap, because by the max-heap property the maximum must be at the top of the heap.

    # -*- coding: utf-8 -*-
    # sift a node down until the max-heap property holds
    def adjust_heap(lists, i, size):
        lchild = 2 * i + 1
        rchild = 2 * i + 2
        largest = i
        if i < size // 2:                # only non-leaf nodes need adjusting
            if lchild < size and lists[lchild] > lists[largest]:
                largest = lchild
            if rchild < size and lists[rchild] > lists[largest]:
                largest = rchild
            if largest != i:
                lists[largest], lists[i] = lists[i], lists[largest]
                adjust_heap(lists, largest, size)

    # build a max-heap bottom-up
    def build_heap(lists, size):
        for i in range(size // 2 - 1, -1, -1):
            adjust_heap(lists, i, size)

    # repeatedly move the maximum to the end and re-heapify the rest
    def heap_sort(lists):
        size = len(lists)
        build_heap(lists, size)
        for i in range(size - 1, 0, -1):
            lists[0], lists[i] = lists[i], lists[0]
            adjust_heap(lists, 0, i)
        return lists

    lst = [int(i) for i in input().split()]
    heap_sort(lst)
    print(' '.join(str(x) for x in lst))
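The same select-via-heap idea is available through the standard library's heapq module (a min-heap, so it yields ascending order directly); a sketch, not the in-place max-heap variant above:

```python
import heapq

def heap_sort_heapq(items):
    """Heap sort via the stdlib min-heap.

    heapq.heapify builds a heap in O(n); popping the minimum
    n times yields ascending order in O(n log n) overall.
    """
    heap = list(items)           # copy so the input is untouched
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort_heapq([9, 4, 7, 1, 3]))  # → [1, 3, 4, 7, 9]
```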

VII. Merge Sort

Merge sort (implemented recursively) is an efficient sorting algorithm based on the merge operation and a very typical application of divide and conquer: first make each subsequence ordered, then merge the ordered subsequences into a fully ordered sequence. Combining two ordered tables into one is called a two-way merge. The merge proceeds as follows: compare a[i] with a[j]; if a[i] <= a[j], copy a[i] to r[k] and increment i and k; otherwise copy a[j] to r[k] and increment j and k; repeat until one table is exhausted, then copy the remaining elements of the other table into r. The recursive implementation splits the interval [s,t] at its midpoint, sorts the left half, sorts the right half, and finally merges the two ordered halves into one ordered interval [s,t].

    # -*- coding: utf-8 -*-
    def merge(left, right):                  # two-way merge of two ordered lists
        i, j = 0, 0
        result = []
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        result.extend(left[i:])              # copy whatever remains
        result.extend(right[j:])
        return result

    def merge_sort(lists):
        if len(lists) <= 1:
            return lists
        mid = len(lists) // 2
        left = merge_sort(lists[:mid])
        right = merge_sort(lists[mid:])
        return merge(left, right)

    lst = [int(i) for i in input().split()]
    tt = merge_sort(lst)
    print(' '.join(str(x) for x in tt))
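The a[i] <= a[j] comparison (taking from the left run on ties) is exactly what makes merge sort stable; a small demonstration on (key, label) pairs (the pairs and the function name are made up for the demo):

```python
def merge_pairs(left, right):
    """Two-way merge of (key, label) pairs; <= on keys keeps it stable."""
    i, j, result = 0, 0, []
    while i < len(left) and j < len(right):
        if left[i][0] <= right[j][0]:   # tie: take from the left run first
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]

# each run is already sorted; both contain the key 2
print(merge_pairs([(1, 'a'), (2, 'b')], [(2, 'c'), (3, 'd')]))
# → [(1, 'a'), (2, 'b'), (2, 'c'), (3, 'd')]
```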

VIII. Radix Sort

Radix sort (I had not come across this one before) is a "distribution sort", also called "bucket sort" or bin sort. As the name suggests, it distributes the elements into "buckets" according to part of the key information, pass by pass, to achieve sorting. Radix sort is stable, with time complexity O(n*log_r(m)), where r is the radix chosen and m is the maximum key; in some situations radix sort is more efficient than other stable sorting methods.

    # -*- coding: utf-8 -*-
    import math

    # assumes non-negative integer keys
    def radix_sort(lists, radix=10):
        # number of digit passes needed for the largest key
        k = int(math.ceil(math.log(max(lists) + 1, radix)))
        bucket = [[] for _ in range(radix)]
        for i in range(1, k + 1):
            for j in lists:
                # the i-th digit from the right selects the bucket
                bucket[j // radix ** (i - 1) % radix].append(j)
            del lists[:]
            for z in bucket:
                lists += z               # gather buckets back, preserving order
                del z[:]
        return lists

    lst = [int(i) for i in input().split()]
    radix_sort(lst)
    print(' '.join(str(x) for x in lst))
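The bucket index on pass i is the i-th digit of the key counted from the right, j // radix**(i-1) % radix; a quick check of that expression (helper name is mine):

```python
def digit(n, i, radix=10):
    """The i-th digit of n from the right (1-based), in the given radix."""
    return n // radix ** (i - 1) % radix

print(digit(4731, 1), digit(4731, 2), digit(4731, 3), digit(4731, 4))  # → 1 3 7 4
```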

The time complexity and stability of each sorting algorithm are compared below:

Which sorting algorithm is fastest on average?

Sorting method   Average case         Best case      Worst case    Aux. space   Stability
Bubble sort      O(n^2)               O(n)           O(n^2)        O(1)         stable
Selection sort   O(n^2)               O(n^2)         O(n^2)        O(1)         unstable
Insertion sort   O(n^2)               O(n)           O(n^2)        O(1)         stable
Shell sort       O(n*log(n))~O(n^2)   O(n^1.3)       O(n^2)        O(1)         unstable
Heap sort        O(n*log(n))          O(n*log(n))    O(n*log(n))   O(1)         unstable
Merge sort       O(n*log(n))          O(n*log(n))    O(n*log(n))   O(n)         stable
Quick sort       O(n*log(n))          O(n*log(n))    O(n^2)        O(log(n))    unstable

After optimization, bubble sort's best-case time complexity is O(n): set a flag, and if a complete pass performs no swap, the sort can terminate early, so on already-sorted input the complexity is O(n). Selection sort must, in both the best and worst cases, select the smallest (largest) element of the remaining sequence and swap it with the element just after the sorted prefix, so its best and worst time complexity are both O(n^2). Insertion sort inserts the element following the sorted prefix into that already-ordered sequence (searching for the proper position); in the already-sorted case each insertion takes constant work, giving O(n). A heap is a complete binary tree, so the tree's depth is log(n)+1, and heap sort's best and worst time complexity are both O(n*log(n)). Merge sort recursively splits the array into two halves, which corresponds to a binary tree of depth log(n)+1, so its best and worst time complexity are also O(n*log(n)). Quick sort, on input in forward or reverse order, yields at each partition a subsequence only one record shorter than the previous one; the recursion tree degenerates into a chain requiring n-1 levels, and the i-th partition needs n-i key comparisons to place its pivot, so the total is \sum_{i=1}^{n-1}(n-i) = n(n-1)/2, i.e. O(n^2).
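The early-exit optimization described above can be sketched with a swap flag (function name is mine):

```python
def bubble_sort_optimized(lists):
    """Bubble sort with an early-exit flag.

    If a full pass performs no swap, the list is already sorted,
    so an already-sorted input costs a single O(n) pass.
    """
    count = len(lists)
    for i in range(count - 1):
        swapped = False
        for j in range(count - 1 - i):
            if lists[j] > lists[j + 1]:
                lists[j], lists[j + 1] = lists[j + 1], lists[j]
                swapped = True
        if not swapped:          # no swaps this pass: terminate early
            break
    return lists

print(bubble_sort_optimized([4, 2, 3, 1]))  # → [1, 2, 3, 4]
```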
