Python Sorting Algorithms: Summary and Examples

This article summarizes several common sorting algorithms and gives a Python implementation of each.

Merge Sort

Merge sort is a typical application of the divide-and-conquer approach: split the problem into smaller subproblems, solve each subproblem, and then merge the results.

Concretely, merge sort recursively splits an unordered sequence in half (around n/2) until the sub-sequences contain only one element each; a single element is trivially sorted. These ordered sub-sequences are then merged back together.

The merge step compares two already-sorted sub-sequences: repeatedly take the smaller of the two front elements, append it to the result, and remove it from its sub-sequence, until both sub-sequences have been consumed.

The Code is as follows:

#!/usr/bin/python
import sys


def merge(nums, first, middle, last):
    '''Merge two sorted sub-sequences.'''
    # Slices are left-closed and right-open; indices start at 0.
    lnums = nums[first:middle + 1]
    rnums = nums[middle + 1:last + 1]
    lnums.append(sys.maxsize)  # sentinel so a sub-sequence never runs out first
    rnums.append(sys.maxsize)
    l = 0
    r = 0
    for i in range(first, last + 1):
        if lnums[l] < rnums[r]:
            nums[i] = lnums[l]
            l += 1
        else:
            nums[i] = rnums[r]
            r += 1


def merge_sort(nums, first, last):
    '''Merge sort. merge_sort takes indices (subscripts), not element counts.'''
    if first < last:
        middle = (first + last) // 2
        merge_sort(nums, first, middle)
        merge_sort(nums, middle + 1, last)
        merge(nums, first, middle, last)


if __name__ == '__main__':
    nums = [10, 4, -3, 6, 7, 3]
    print('nums is:', nums)
    merge_sort(nums, 0, len(nums) - 1)
    print('merge sort:', nums)

Stable; time complexity O(n log n).

Insertion Sort

The Code is as follows:

#!/usr/bin/python


def insert_sort(a):
    '''Insertion sort: keep an ordered prefix and insert each new element into it so
    that the prefix stays ordered. A single element is trivially ordered; insert the
    second element into its proper position, then the third, and so on.'''
    a_len = len(a)
    for i in range(1, a_len):
        key = a[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a


if __name__ == '__main__':
    nums = [10, 4, -3, 6, 7, 3]
    print('nums is:', nums)
    insert_sort(nums)
    print('insert sort:', nums)

Stable; time complexity O(n^2).

In Python you can swap the values of two variables with a, b = b, a. This works because both sides of the assignment are actually tuples (in Python it is the comma that creates a tuple, not the parentheses).
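As a quick illustration (the variable names here are just examples), the swap and the underlying tuple packing and unpacking look like this:

x, y = 1, 2
t = y, x          # the comma builds the tuple (2, 1); parentheses are optional
x, y = t          # tuple unpacking assigns 2 to x and 1 to y
print(x, y)       # 2 1

a, b = 3, 5
a, b = b, a       # the usual one-line swap
print(a, b)       # 5 3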

Selection Sort

Selection sort is a simple and intuitive sorting algorithm. It works as follows: first, find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence. Then repeatedly find the smallest (or largest) element among the remaining unsorted elements and append it to the end of the sorted sequence, and so on, until all elements are sorted.

def select_sort(a):
    '''Selection sort: repeatedly pick the smallest (or largest) remaining element and
    place it at the end of the already-sorted prefix until every element has been
    placed. Selection sort is not a stable sort.'''
    a_len = len(a)
    for i in range(a_len):
        # Position i receives the smallest element of a[i:].
        min_index = i  # index of the smallest element found so far
        for j in range(i + 1, a_len):
            if a[j] < a[min_index]:
                min_index = j
        if min_index != i:
            # Swap the smallest element into place.
            a[i], a[min_index] = a[min_index], a[i]


if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print('before sort:', A)
    select_sort(A)
    print('after sort:', A)

Unstable; time complexity O(n^2).
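To make "unstable" concrete, here is a minimal sketch, assuming the select_sort function above; the Item class is only an illustration. Two elements with equal keys end up in the opposite relative order after sorting:

class Item:
    '''Tiny record compared by key only; the tag just marks the original order.'''
    def __init__(self, key, tag):
        self.key = key
        self.tag = tag

    def __lt__(self, other):          # select_sort only uses the < comparison
        return self.key < other.key

    def __repr__(self):
        return '%s%s' % (self.key, self.tag)


A = [Item(3, 'a'), Item(3, 'b'), Item(1, 'c')]
select_sort(A)
print(A)   # [1c, 3b, 3a] -- the two equal keys swapped their relative order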

Shell Sort

Shell sort, also known as the diminishing increment sort, is an unstable sorting algorithm. It is named after D. L. Shell, who proposed it in 1959.

Take an integer d1 smaller than n as the first increment and split the records into d1 groups: all records whose indices differ by a multiple of d1 fall into the same group, and each group is sorted separately. Then take a second increment d2 < d1 and repeat the grouping and sorting, and so on, until the increment reaches 1 and the whole sequence is sorted as a single group.

def shell_sort(a):
    '''Shell sort: selection sort within each gap-separated group, with a shrinking gap.'''
    a_len = len(a)
    gap = a_len // 2  # initial increment
    while gap > 0:
        for i in range(a_len):
            # Selection sort within the group {i, i + gap, i + 2 * gap, ...}.
            m = i
            j = i + gap
            while j < a_len:
                if a[j] < a[m]:
                    m = j
                j += gap  # step j forward by gap
            if m != i:
                a[m], a[i] = a[i], a[m]
        gap //= 2


if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print('before sort:', A)
    shell_sort(A)
    print('after sort:', A)

Unstable; average time complexity O(n log n), worst case O(n^s) with 1 < s < 2.

Heap Sort

Definition of "heap": in "heap" where the starting index is 0:

The right sub-node of node I is in the position 2 * I + 24) the parent node of node I is in the position floor (I-1)/2): Note floor indicates the "round" operation.
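As a quick check of these index formulas, a minimal sketch (the helper names left, right and parent are only illustrative):

def left(i):
    return 2 * i + 1


def right(i):
    return 2 * i + 2


def parent(i):
    return (i - 1) // 2      # floor division performs the rounding down


heap = [20, 12, 15, 7, 9]    # a small max heap stored in a list
i = 1                        # the node holding 12
print(heap[left(i)], heap[right(i)])   # its children: 7 9
print(heap[parent(i)])                 # its parent: 20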

Heap property:

The key of every node is always greater than or equal to (or always less than or equal to) the keys of its children.

Max heap:

The root of a max heap holds the largest key; that is, the key of every node in the heap is greater than or equal to the keys of its children.

Move up, move down:

When a node's key becomes greater than its parent's key, we need to "move it up": swap the node with its parent, then keep comparing it with its new parent, until the node no longer exceeds its parent.

The "move down" operation is the mirror image: when a node's key is reduced, we move it down by repeatedly swapping it with its larger child until it is no smaller than either child.
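The heap sort code below only needs the move-down operation (max_heapify). As a complement, here is a minimal sketch of a move-up step for a max heap stored in a 0-indexed list; the name sift_up is only illustrative:

def sift_up(heap, i):
    '''Move the node at index i up until it no longer exceeds its parent.'''
    while i > 0:
        p = (i - 1) // 2                      # parent index
        if heap[i] <= heap[p]:                # heap property holds, stop
            break
        heap[i], heap[p] = heap[p], heap[i]   # swap with the parent
        i = p                                 # keep following the node upward


heap = [20, 12, 15, 7, 9]
heap.append(18)                # appending a new key may violate the max-heap property
sift_up(heap, len(heap) - 1)
print(heap)                    # [20, 12, 18, 7, 9, 15]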

Method:

We first build a max heap (time complexity O(n)). Then we swap the root with the last node, exclude that last node from the heap, and re-heapify from the root (time complexity O(lg n)), i.e. "move down" the root. Repeating this for every node gives heap sort a total time complexity of O(n lg n).

The Code is as follows:

#!/usr/bin/env python
# Array indices start at 0.


def left(i):
    return 2 * i + 1


def right(i):
    return 2 * i + 2


def max_heapify(A, i, heap_size):
    '''Maintain the max-heap property so the subtree rooted at i becomes a max heap.'''
    if heap_size <= 0:
        return
    l = left(i)
    r = right(i)
    largest = i
    # Pick the larger of the two children.
    if l < heap_size and A[l] > A[largest]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if i != largest:
        # The current node is not the largest: move it down.
        A[i], A[largest] = A[largest], A[i]   # swap
        max_heapify(A, largest, heap_size)    # keep following the moved-down node


def build_max_heap(A):
    '''Build a max heap.'''
    heap_size = len(A)
    if heap_size > 1:
        node = heap_size // 2 - 1
        while node >= 0:
            max_heapify(A, node, heap_size)
            node -= 1


def heap_sort(A):
    '''Heap sort; indices start at 0.'''
    build_max_heap(A)
    heap_size = len(A)
    i = heap_size - 1
    while i > 0:
        A[0], A[i] = A[i], A[0]   # move the current maximum to its final position
        heap_size -= 1            # shrink the heap by one
        i -= 1                    # next final position
        max_heapify(A, 0, heap_size)


if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print('before sort:', A)
    heap_sort(A)
    print('after sort:', A)

Unstable; time complexity O(n log n).

Quick Sort

Like merge sort, quick sort is based on divide and conquer. Sorting a subarray A[p..r] involves the following three steps:

Divide: partition A[p..r] into A[p..q-1] and A[q+1..r] such that every element of A[p..q-1] is less than or equal to A[q], and every element of A[q+1..r] is greater than or equal to A[q];

Conquer: sort the subarrays A[p..q-1] and A[q+1..r] by recursive calls to quick sort;

Combine: because the two subarrays are sorted in place, no extra work is needed.

At the start of each iteration of partition, with x = A[r], the following holds for any array index k:

1) if p ≤ k ≤ i, then A[k] ≤ x;

2) if i + 1 ≤ k ≤ j - 1, then A[k] > x;

3) if k = r, then A[k] = x.

The Code is as follows:

#!/usr/bin/env python
# Quick sort
'''Partition the array around a pivot: after partitioning, elements smaller than the
pivot sit on its left and larger elements on its right. Two partition schemes are
common: one sweeps a pair of indices from left to right (partition1), the other
scans two pointers from both ends toward the middle (partition2).'''


# p, r are indices into array A.
def partition1(A, p, r):
    '''Scheme 1: single left-to-right sweep, pivot A[r].'''
    x = A[r]
    i = p - 1
    j = p
    while j < r:
        if A[j] < x:
            i += 1
            A[i], A[j] = A[j], A[i]
        j += 1
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1


def partition2(A, p, r):
    '''Scheme 2: scan two pointers from both ends toward the middle, pivot A[p].'''
    i = p
    j = r
    x = A[p]
    while i < j:
        while A[j] >= x and i < j:
            j -= 1
        A[i] = A[j]
        while A[i] <= x and i < j:
            i += 1
        A[j] = A[i]
    A[i] = x
    return i


def quick_sort(A, p, r):
    '''Quick sort: worst-case time complexity O(n^2), usual case O(n lg n).'''
    if p < r:
        q = partition2(A, p, r)
        quick_sort(A, p, q - 1)
        quick_sort(A, q + 1, r)


if __name__ == '__main__':
    A = [5, -3, 1, 2]
    print('before sort:', A)
    quick_sort(A, 0, len(A) - 1)
    print('after sort:', A)

Unstable; time complexity O(n log n) in the typical case, O(n^2) in the worst case.

Finally, a few words about sequences in Python:

Lists, tuples, and strings are all sequences. But what is a sequence, and what makes it special? The two main features of a sequence are the index operator and the slice operator. The index operator lets us fetch a specific item from the sequence, while the slice operator lets us obtain a slice of the sequence, i.e. a part of it. For example, with a = ['aa', 'bb', 'cc'], print(a[0]) is an index operation and print(a[0:2]) is a slice operation.
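A short sketch of both operators on the three sequence types (the values are just examples):

a = ['aa', 'bb', 'cc']      # list
t = (1, 2, 3, 4)            # tuple
s = 'python'                # string

print(a[0])      # index operation -> 'aa'
print(a[0:2])    # slice operation -> ['aa', 'bb']
print(t[-1])     # negative index counts from the end -> 4
print(t[1:3])    # slice of a tuple -> (2, 3)
print(s[0:2])    # slice of a string -> 'py'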

I hope this article helps you learn about sorting algorithms in Python. Thank you for your support!
