Python's top ten classic sorting algorithms

Source: Internet
Author: User
Tags: sorting algorithms

Sorting algorithms can be divided into internal and external sorts. An internal sort keeps all the records being sorted in memory; an external sort is used when the data set is too large to hold all the records at once, so external storage must be accessed during the sort. Common internal sorting algorithms include insertion sort, Shell sort, selection sort, bubble sort, merge sort, quick sort, heap sort, radix sort, and so on.

About the complexity of time:

    1. Quadratic order, O(n²): the simple sorts, i.e. direct insertion, direct selection, and bubble sort.

    2. Linearithmic order, O(n log n): quick sort, heap sort, and merge sort.

    3. O(n^(1+ε)) order, where ε is a constant between 0 and 1: Shell sort.

    4. Linear order, O(n): radix sort, as well as bucket and bin sort.

About Stability:

Stable sorting algorithms: bubble sort, insertion sort, merge sort, and radix sort.

Unstable sorting algorithms: selection sort, quick sort, Shell sort, and heap sort.

Noun Explanation:

n: the size of the data

k: the number of "buckets"

In-place: uses only a constant amount of extra memory

Out-of-place: uses extra memory that grows with the input

Stability: a sort is stable if two records with equal keys keep the same relative order after sorting as they had before; for example, two equal values that start in positions 2 and 7 still appear in that order in the output.

Bubble sort

Bubble sort is a simple, intuitive sorting algorithm. It repeatedly walks through the sequence to be sorted, comparing adjacent elements two at a time and swapping them if they are in the wrong order. The passes over the sequence are repeated until no more swaps are needed, which means the sequence is sorted. The algorithm gets its name from the way smaller elements slowly "float" to the top of the sequence through these swaps.

As one of the simplest sorting algorithms, bubble sort feels to me like the word "abandon" in a vocabulary book: always on the first page in the first spot, and therefore the most familiar. Bubble sort also has an optimized variant: set a flag, and if a full pass over the sequence produces no swaps, the sequence is already ordered and the sort can stop. In practice this improvement does not help performance much; a sketch of the flag variant appears after the code below.

1. Algorithm steps

  1. Compare adjacent elements. If the first is bigger than the second, swap them.

  2. Do the same for every pair of adjacent elements, from the first pair to the last pair. When this pass is done, the last element is the largest.

  3. Repeat the above steps for all elements except the last one.

  4. Keep repeating with fewer and fewer elements each pass, until there are no pairs left to compare.

    2. Animation Demo

    3. Python Code Implementation

    def bubbleSort(arr):
       for i in range(1, len(arr)):
           for j in range(0, len(arr)-i):
               if arr[j] > arr[j+1]:
                   arr[j], arr[j + 1] = arr[j + 1], arr[j]
       return arr
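
    The flag optimization mentioned above is not shown in the listing; a minimal sketch of it, keeping the same in-place interface (the name bubbleSortWithFlag is just illustrative), could look like this:

    def bubbleSortWithFlag(arr):
        # Sketch of the flag optimization: stop as soon as a full pass
        # makes no swaps, since the sequence is then already ordered.
        for i in range(1, len(arr)):
            swapped = False
            for j in range(0, len(arr) - i):
                if arr[j] > arr[j + 1]:
                    arr[j], arr[j + 1] = arr[j + 1], arr[j]
                    swapped = True
            if not swapped:
                break
        return arr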

    Selection sort

    Selection sort is a simple, intuitive sorting algorithm with a time complexity of O(n²) no matter what data it is given, so the smaller the data set, the better. Its only real advantage is that it takes no extra memory space.

    1. Algorithm steps

      1. First find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence.

      2. Then keep finding the smallest (or largest) element among the remaining unsorted elements and place it at the end of the sorted sequence.

      3. Repeat the second step until all elements are sorted.

    2. Animation Demo

    3. Python Code Implementation

    def selectionSort(arr):
       for i in range(len(arr) - 1):
           # record the index of the smallest element
           minIndex = i
           for j in range(i + 1, len(arr)):
               if arr[j] < arr[minIndex]:
                   minIndex = j
           # if i is not the index of the smallest element, swap arr[i] with the smallest
           if i != minIndex:
               arr[i], arr[minIndex] = arr[minIndex], arr[i]
       return arr

    Insertion Sort

    The code for insertion sort is as simple and blunt as bubble sort and selection sort, but the principle should be the easiest to understand: anyone who has played poker will get it in seconds. Insertion sort is one of the simplest and most intuitive sorting algorithms. It works by building up an ordered sequence: for each unsorted element, scan the already-sorted part from back to front, find the appropriate position, and insert it.

    Like bubble sort, insertion sort also has an optimized variant, called binary (split-half) insertion; a sketch of it appears after the code below.

    1. Algorithm steps

      1. Treat the first element of the sequence as an ordered sequence, and the second through last elements as the unordered sequence.

      2. Scan the unordered sequence from beginning to end, inserting each scanned element into its appropriate position in the ordered sequence. (If the element to insert is equal to an element already in the ordered sequence, insert it after that equal element.)

    2. Animation Demo

    3. Python Code Implementation

    def insertionSort(arr):
       for i in range(len(arr)):
           preIndex = i-1
           current = arr[i]
           while preIndex >= 0 and arr[preIndex] > current:
               arr[preIndex+1] = arr[preIndex]
               preIndex-=1
           arr[preIndex+1] = current
       return arr
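
    The binary (split-half) insertion variant mentioned above is not shown here; a minimal sketch of it, using the standard bisect module to find the insertion point (the name binaryInsertionSort is just illustrative), could look like this:

    import bisect

    def binaryInsertionSort(arr):
        # Sketch: locate the insertion point with binary search instead of
        # scanning backwards one element at a time.
        for i in range(1, len(arr)):
            current = arr[i]
            # bisect_right places equal elements after their duplicates,
            # which keeps the sort stable.
            pos = bisect.bisect_right(arr, current, 0, i)
            # Shift arr[pos:i] one slot to the right and drop current in.
            arr[pos + 1:i + 1] = arr[pos:i]
            arr[pos] = current
        return arr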

    Shell sort

    Shell sort, also called diminishing increment sort, is a more efficient, improved version of insertion sort. However, Shell sort is not a stable sorting algorithm.

    Shell sort improves on insertion sort based on the following two properties:

      • Insertion sort is efficient on data that is already almost sorted, approaching linear running time;

      • But insertion sort is generally inefficient, because it can move data only one position at a time;

    The basic idea of Shell sort is: first split the whole record sequence into several subsequences and insertion-sort each of them; then, once the records are "basically ordered", insertion-sort the whole sequence in one pass.

    1. Algorithm steps

      1. Choose an increment sequence t1, t2, ......, tk, where ti > tj for i < j, and tk = 1;

      2. Sort the sequence in k passes, one pass per increment;

      3. In each pass, using the corresponding increment ti, split the sequence to be sorted into several subsequences of length m and run a direct insertion sort on each sub-table. Only when the increment factor is 1 is the whole sequence treated as a single table, whose length is the length of the whole sequence.

    2. Python Code Implementation

    def shellSort(arr):
        import math
        gap = 1
        while gap < len(arr) / 3:
            gap = gap * 3 + 1
        while gap > 0:
            for i in range(gap, len(arr)):
                temp = arr[i]
                j = i - gap
                while j >= 0 and arr[j] > temp:
                    arr[j + gap] = arr[j]
                    j -= gap
                arr[j + gap] = temp
            gap = math.floor(gap / 3)
        return arr

    Merge sort

    Merge sort is an efficient sorting algorithm built on the merge operation. It is a very typical application of the divide-and-conquer approach.

    As a typical application of divide-and-conquer thinking, merge sort can be implemented in two ways:

      • top-down recursion (any recursive method can be rewritten iteratively, hence the second way);

      • bottom-up iteration (a sketch of this version appears after the recursive code below);

    In the "Data structure and algorithm JavaScript description", the author gives a bottom-up iterative method. But for the recursive method, the author thinks:

    However, it is not possible to do so in JavaScript, as the recursion goes too deep for the language to handle.

    However, this approach is not feasible in JavaScript because the recursive depth of the algorithm is too deep for it.

    To tell you the truth, I don't quite understand this remark. Does it mean that the JavaScript compiler has too little memory and is too recursive to be too deep to cause memory overflow? Also hope that the great God can teach.

    Like selection sort, merge sort's performance is not affected by the input data, but it performs much better than selection sort, because its time complexity is always O(n log n). The cost is that it requires additional memory space.

    1. Algorithm steps

      1. Allocate space whose size is the sum of the two sorted sequences, to hold the merged sequence;

      2. Set two pointers, initially at the starting positions of the two sorted sequences;

      3. Compare the elements the two pointers point to, place the smaller one into the merge space, and advance that pointer to the next position;

      4. Repeat step 3 until one pointer reaches the end of its sequence;

      5. Copy all remaining elements of the other sequence directly to the end of the merged sequence.

    2. Animation Demo

    3. Python Code Implementation

    def mergeSort(arr):
        import math
        if len(arr) < 2:
            return arr
        # Split the list in half and sort each half recursively.
        middle = math.floor(len(arr) / 2)
        left, right = arr[0:middle], arr[middle:]
        return merge(mergeSort(left), mergeSort(right))

    def merge(left, right):
        result = []
        # Take the smaller head element from either half until one runs out.
        while left and right:
            if left[0] <= right[0]:
                result.append(left.pop(0))
            else:
                result.append(right.pop(0))
        while left:
            result.append(left.pop(0))
        while right:
            result.append(right.pop(0))
        return result
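
    The bottom-up iterative version mentioned above is not shown here; a minimal sketch of it, reusing the merge helper defined above (the name mergeSortIterative is just illustrative), could look like this:

    def mergeSortIterative(arr):
        # Sketch of the bottom-up approach: merge runs of width 1, 2, 4, ...
        # until a single run covers the whole list, with no recursion.
        result = list(arr)
        width = 1
        while width < len(result):
            merged = []
            for start in range(0, len(result), 2 * width):
                left = result[start:start + width]
                right = result[start + width:start + 2 * width]
                merged.extend(merge(left, right))
            result = merged
            width *= 2
        return result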

    Quick Sort

    Quick sort is a sorting algorithm developed by Tony Hoare. On average, sorting n items takes O(n log n) comparisons. In the worst case O(n²) comparisons are needed, but that situation is uncommon. In fact, quick sort is usually noticeably faster than other O(n log n) algorithms, because its inner loop can be implemented very efficiently on most architectures.

    Quick sort uses a divide-and-conquer strategy to split a list into two sub-lists.

    Quick sort is a typical application of divide-and-conquer thinking among sorting algorithms. In essence, it can be viewed as a recursive divide-and-conquer refinement built on top of bubble sort.

    The name "quick sort" is simple and blunt: you know what it means the moment you hear it - fast and efficient! It is one of the fastest sorting algorithms for handling big data. Although its worst-case time complexity reaches O(n²), in most cases it still outperforms other sorting algorithms whose average time complexity is O(n log n). Why that is, I didn't know at first; luckily my obsessiveness kicked in again, and after checking many references I finally found a satisfying answer in The Art of Algorithms and Informatics Competitions:

    The worst case of quick sort is O(n²), for example when the input sequence is already in order. But its expected (average) running time is O(n log n), and the constant factor hidden in the O(n log n) notation is very small, much smaller than that of merge sort, whose complexity is a stable O(n log n). Therefore, for most random sequences with little pre-existing order, quick sort is always better than merge sort.

    1. Algorithm steps

      1. Pick an element from the sequence, called the "pivot";

      2. Reorder the sequence so that all elements smaller than the pivot come before it and all elements larger than the pivot come after it (equal elements can go on either side). After this partition exits, the pivot is in its final position in the middle of the sequence. This is called the partition operation;

      3. Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot;

    The base case of the recursion is a sequence of size zero or one, which is already sorted. Although the algorithm keeps recursing, it always terminates, because each iteration puts at least one element into its final position.

    2. Animation Demo

    3. Python Code Implementation

    def quickSort(arr, left=None, right=None):
        left = 0 if not isinstance(left, (int, float)) else left
        right = len(arr) - 1 if not isinstance(right, (int, float)) else right
        if left < right:
            partitionIndex = partition(arr, left, right)
            quickSort(arr, left, partitionIndex - 1)
            quickSort(arr, partitionIndex + 1, right)
        return arr

    def partition(arr, left, right):
        pivot = left
        index = pivot + 1
        i = index
        while i <= right:
            if arr[i] < arr[pivot]:
                swap(arr, i, index)
                index += 1
            i += 1
        swap(arr, pivot, index - 1)
        return index - 1

    def swap(arr, i, j):
        arr[i], arr[j] = arr[j], arr[i]

    Heap Sort

    Heap sort (heapsort) is a sorting algorithm designed around the heap data structure. A heap is a structure that approximates a complete binary tree while satisfying the heap property: the key of each child node is always less than (or greater than) that of its parent. Heap sort can be seen as a selection sort that uses the concept of a heap. It comes in two variants:

      1. Max heap: the value of each node is greater than or equal to the values of its children; used in heap sort for ascending order;

      2. Min heap: the value of each node is less than or equal to the values of its children; used in heap sort for descending order;

    The average time complexity of heap sort is O(n log n).

    1. Algorithm steps

      1. Build a heap H[0......n-1];

      2. Swap the head of the heap (the maximum) with the last element of the heap;

      3. Reduce the size of the heap by 1 and call shift_down(0) to sift the new top element down to its proper position;

      4. Repeat step 2 until the size of the heap is 1.

    2. Animation Demo

    3. Python Code Implementation

    def buildMaxHeap(arr):
        import math
        # Sift down every non-leaf node, starting from the last parent.
        for i in range(math.floor(len(arr) / 2), -1, -1):
            heapify(arr, i)

    def heapify(arr, i):
        left = 2 * i + 1
        right = 2 * i + 2
        largest = i
        if left < arrLen and arr[left] > arr[largest]:
            largest = left
        if right < arrLen and arr[right] > arr[largest]:
            largest = right
        if largest != i:
            swap(arr, i, largest)
            heapify(arr, largest)

    def swap(arr, i, j):
        arr[i], arr[j] = arr[j], arr[i]

    def heapSort(arr):
        global arrLen
        arrLen = len(arr)
        buildMaxHeap(arr)
        # Repeatedly move the current maximum to the end and shrink the heap.
        for i in range(len(arr) - 1, 0, -1):
            swap(arr, 0, i)
            arrLen -= 1
            heapify(arr, 0)
        return arr

    Counting sort

    The core of counting sort is to convert the input data values into keys stored in extra array space. As a sort with linear time complexity, counting sort requires the input data to be integers within a definite range.

    1. Animation Demo

    2. Python Code Implementation

    def countingSort(arr, maxValue):
        bucketLen = maxValue + 1
        bucket = [0] * bucketLen
        sortedIndex = 0
        arrLen = len(arr)
        for i in range(arrLen):
            if not bucket[arr[i]]:
                bucket[arr[i]] = 0
            bucket[arr[i]] += 1
        for j in range(bucketLen):
            while bucket[j] > 0:
                arr[sortedIndex] = j
                sortedIndex += 1
                bucket[j] -= 1
        return arr
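
    A quick usage note: maxValue is the largest value expected in the input, so a call might look like the following (the sample list is illustrative, not from the original):

    print(countingSort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]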

    Bucket sort

    Bucket sort is an upgraded version of counting sort. It makes use of a mapping function, and the key to its efficiency is choosing that mapping function well. To make bucket sort efficient, we need to do two things:

      1. Increase the number of buckets as much as the extra space allows;

      2. Use a mapping function that distributes the n input items as evenly as possible across the k buckets.

    At the same time, the choice of which comparison sorting algorithm to use on the elements inside each bucket matters a great deal for performance.

    1. When is it fastest?

    When the input data can be distributed evenly across all the buckets.

    2. When is it slowest?

    When all the input data ends up assigned to the same bucket.
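
    A minimal sketch of bucket sort, assuming integer input and a fixed bucket width (bucketSize is an illustrative parameter, not fixed by the article), and reusing the insertionSort function from the earlier section inside each bucket, could look like this:

    def bucketSort(arr, bucketSize=5):
        # Sketch: bucketSize is the range of values each bucket covers;
        # the default of 5 is an arbitrary assumption.
        if not arr:
            return arr
        minValue, maxValue = min(arr), max(arr)
        bucketCount = (maxValue - minValue) // bucketSize + 1
        buckets = [[] for _ in range(bucketCount)]
        # The mapping function: each value lands in the bucket covering
        # its range of size bucketSize.
        for value in arr:
            buckets[(value - minValue) // bucketSize].append(value)
        result = []
        for bucket in buckets:
            # Any comparison sort works inside a bucket; insertionSort
            # from the earlier section is reused here.
            result.extend(insertionSort(bucket))
        return result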

    Radix sort

    Radix sort is a non-comparative integer sorting algorithm. It works by cutting numbers into their digits and comparing them digit by digit. Because integers can also represent strings (such as names or dates) and floating-point numbers in particular formats, radix sort is not limited to integers.

    1. Radix sort vs. counting sort vs. bucket sort

    Radix sort can proceed in two ways: LSD (starting from the least significant digit) or MSD (starting from the most significant digit).

    All three of these sorting algorithms use the concept of buckets, but they differ significantly in how the buckets are used:

      • Radix sort: buckets are allocated according to each digit of the key value;

      • Counting sort: each bucket stores only a single key value;

      • Bucket sort: each bucket stores a range of values;

    2. LSD Radix Sort Animation Demo
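
    A minimal sketch of LSD radix sort for non-negative integers, using decimal digits as the buckets (the name radixSort is illustrative, not from the source), could look like this:

    def radixSort(arr):
        # Sketch of LSD radix sort: distribute by the least significant
        # digit first, then the next digit, reassembling after each pass.
        if not arr:
            return arr
        maxValue = max(arr)
        digit = 1  # current digit weight: 1, 10, 100, ...
        while digit <= maxValue:
            buckets = [[] for _ in range(10)]  # one bucket per decimal digit
            for value in arr:
                buckets[(value // digit) % 10].append(value)
            # Collect the buckets in order; keeping this step stable is what
            # makes the overall sort correct.
            arr = [value for bucket in buckets for value in bucket]
            digit *= 10
        return arr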
