Python Development [Chapter 28]: Algorithms (II)

Source: Internet
Author: User

Heap sorting

1. Trees and Binary Trees: Introduction

A tree is a data structure; a directory structure is a typical example.

A tree is a data structure that can be recursively defined.

A tree is a collection of n nodes:

  • If n = 0, it is an empty tree;
  • If n > 0, one node is the root of the tree, and the remaining nodes can be divided into m disjoint sets, each of which is itself a tree, i.e. a subtree of the root (see the sketch after this list).
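
As a minimal sketch of this recursive definition (the class name and fields below are illustrative, not from the original text), a tree node can be modelled as a value plus a list of child subtrees:

class TreeNode:
    """A node whose children are themselves roots of subtrees."""
    def __init__(self, value):
        self.value = value
        self.children = []    # each element is again a TreeNode, i.e. a subtree


# A tiny tree: 'root' has two subtrees, and the first subtree has one child of its own.
root = TreeNode('root')
a, b = TreeNode('a'), TreeNode('b')
root.children = [a, b]
a.children = [TreeNode('a1')]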

Concepts

  • Root node
  • Leaf node
  • Depth (height) of the tree
  • Degree of the tree
  • Child node / parent node
  • Subtree

Diagram:

 

2. Binary Tree

Binary tree: a tree whose degree is at most 2 (each node has at most two children).
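
A minimal sketch of a binary-tree node (the names data, lchild and rchild are illustrative, not from the original code):

class BiTreeNode:
    def __init__(self, data):
        self.data = data
        self.lchild = None   # left child, or None
        self.rchild = None   # right child, or None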

Diagram:

 

3. Two Special Binary Trees

  • Full binary tree
  • Complete binary tree

Diagram:

 

4. Binary Tree Storage

  • Chained Storage
  • Sequential storage (list)

Diagram:

What is the relationship between the index of a parent node and the index of its left child?

  • 0 → 1, 1 → 3, 2 → 5, 3 → 7, 4 → 9
  • In general: parent i → left child 2i + 1

What is the relationship between the index of a parent node and the index of its right child?

  • 0 → 2, 1 → 4, 2 → 6, 3 → 8, 4 → 10
  • In general: parent i → right child 2i + 2 (a small check follows these lists)
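
A small check of these index relations on a list-stored complete binary tree (the list contents are made up purely for illustration):

heap = [9, 8, 7, 6, 5, 4, 3]     # a complete binary tree stored level by level

for i in range(len(heap) // 2):          # only the non-leaf nodes have children
    left, right = 2 * i + 1, 2 * i + 2
    print(i, '->', left, right)          # parent index -> left and right child indices
# 0 -> 1 2
# 1 -> 3 4
# 2 -> 5 6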

 

5. Heap

Heap:

  • Big root heap (max-heap): a complete binary tree in which every node is no smaller than its children
  • Small root heap (min-heap): a complete binary tree in which every node is no larger than its children (a small check of the property follows this list)
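
A minimal check of the big root heap property on a list-stored complete binary tree (the helper name is mine, not from the original post):

def is_big_root_heap(data):
    # every parent must be at least as large as each of its children
    n = len(data)
    for i in range(n // 2):               # non-leaf nodes only
        for j in (2 * i + 1, 2 * i + 2):  # left and right child indices
            if j < n and data[i] < data[j]:
                return False
    return True


print(is_big_root_heap([9, 7, 8, 3, 5, 4, 6]))   # True
print(is_big_root_heap([3, 9, 1, 7, 5]))         # False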

Big root heap:

Small root heap:

 

 

6. Heap sorting process

 

7. Create a heap

def sift(data, low, high):
    # low: index of the root of the subtree to be adjusted
    # high: index of the last node in the range being adjusted
    i = low              # i points at the currently "empty" position
    j = 2 * i + 1        # left child of i
    tmp = data[i]        # take the root value out
    while j <= high:     # the left child is inside the range, so i has a child
        if j + 1 <= high and data[j] < data[j + 1]:
            # there is a right child and it is larger than the left child
            j = j + 1
        if data[j] > tmp:
            data[i] = data[j]   # move the larger child up
            i = j
            j = 2 * i + 1
        else:
            break
    data[i] = tmp        # put the saved root value into its final position


def heap_sort(data):
    n = len(data)
    for i in range(n // 2 - 1, -1, -1):   # n // 2 - 1: index of the last non-leaf node
        sift(data, i, n - 1)              # build the heap
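
As a quick check (the sample list is made up for illustration), running this heap-building version on a small list rearranges it so that every parent is at least as large as its children:

data = [3, 9, 1, 7, 5]
heap_sort(data)      # at this stage heap_sort only builds the big root heap
print(data)          # [9, 7, 1, 3, 5]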

 

8. Heap sorting

Complete code:

import time
import random


def call_time(func):
    def inner(*args, **kwargs):
        t1 = time.time()
        re = func(*args, **kwargs)
        t2 = time.time()
        print('Time cost:', func.__name__, t2 - t1)
        return re
    return inner


def sift(data, low, high):
    # low: index of the root of the subtree to be adjusted
    # high: index of the last node in the range being adjusted
    i = low
    j = 2 * i + 1        # left child
    tmp = data[i]        # take the root value out
    while j <= high:     # the left child is inside the range, so i has a child
        if j + 1 <= high and data[j] < data[j + 1]:
            # there is a right child and it is larger than the left child
            j = j + 1
        if data[j] > tmp:
            data[i] = data[j]
            i = j
            j = 2 * i + 1
        else:
            break
    data[i] = tmp


@call_time
def heap_sort(data):
    n = len(data)
    for i in range(n // 2 - 1, -1, -1):   # n // 2 - 1: index of the last non-leaf node
        sift(data, i, n - 1)              # build the heap
    for i in range(n):                    # n rounds, one number per round
        data[0], data[n - 1 - i] = data[n - 1 - i], data[0]   # move the current max to the end
        sift(data, 0, n - 1 - i - 1)      # re-adjust the shrunken heap


data = list(range(10000))
random.shuffle(data)
heap_sort(data)
# Time cost: heap_sort 0.08801126480102539
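
As a sanity check (not part of the original post), the result can be compared against Python's built-in sorted(); each round moves the current maximum to the end, so the list ends up in ascending order:

data = list(range(100))
random.shuffle(data)
expected = sorted(data)
heap_sort(data)          # sorts in place, ascending
assert data == expected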

Time complexity: O(n log n). Building the heap takes O(n), and each of the n sift operations after a swap costs O(log n).

 

 

Merge Sorting

Given a list that consists of two ordered segments, merge it into a single ordered list.

Example:

[2, 5, 7, 8, 9, 1, 3, 4, 6] (the first five elements and the last four elements are each already ordered)

Idea:

Decompose: keep splitting the list into smaller and smaller pieces until each piece contains a single element.

A single element is, by itself, ordered.

Merge: repeatedly merge pairs of ordered lists into longer ordered lists, until the whole list is ordered; a standalone sketch of the merge step follows.
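
As a standalone sketch of the merge step (the helper merge_two is written here for illustration; the in-place version used by the full algorithm is in the code below):

def merge_two(a, b):
    # merge two ordered lists into one ordered list
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    result.extend(a[i:])     # at most one of these two tails is non-empty
    result.extend(b[j:])
    return result


print(merge_two([2, 5, 7, 8, 9], [1, 3, 4, 6]))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]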

Code:

import time
import random


def call_time(func):
    def inner(*args, **kwargs):
        t1 = time.time()
        re = func(*args, **kwargs)
        t2 = time.time()
        print('Time cost:', func.__name__, t2 - t1)
        return re
    return inner


def merge(li, low, mid, high):
    # li[low:mid+1] and li[mid+1:high+1] are each ordered; merge them
    i = low
    j = mid + 1
    ltmp = []
    while i <= mid and j <= high:
        if li[i] < li[j]:
            ltmp.append(li[i])
            i += 1
        else:
            ltmp.append(li[j])
            j += 1
    while i <= mid:          # copy whatever is left of the first half
        ltmp.append(li[i])
        i += 1
    while j <= high:         # copy whatever is left of the second half
        ltmp.append(li[j])
        j += 1
    li[low:high + 1] = ltmp  # write the merged result back in place


def _mergesort(li, low, high):
    if low < high:
        mid = (low + high) // 2
        _mergesort(li, low, mid)
        _mergesort(li, mid + 1, high)
        merge(li, low, mid, high)


@call_time
def mergesort(li):
    _mergesort(li, 0, len(li) - 1)


data = list(range(10000))
random.shuffle(data)
mergesort(data)
# Time cost: mergesort 0.0835103988647461
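
A quick run on the small example list from above (output written out by hand):

data = [2, 5, 7, 8, 9, 1, 3, 4, 6]
mergesort(data)
print(data)    # [1, 2, 3, 4, 5, 6, 7, 8, 9]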

Time complexity: O(n log n). The list is split O(log n) times, and each level of merging does O(n) work.

 

 

Shell sort

Shell sort is a grouped insertion sort algorithm.

First take an integer d1 = n / 2 and split the elements into d1 groups, where the elements within a group are d1 positions apart; insertion-sort each group on its own.

Then take a second integer d2 = d1 / 2 and repeat the grouped sorting, and so on, until di = 1, at which point all elements are insertion-sorted as a single group.

Shell sort does not fully order particular elements in the early passes; instead it brings the data closer and closer to order overall, and the final pass (gap 1) makes everything ordered. A concrete grouping example follows.
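
To make the grouping concrete (the sample list below is made up for illustration), with n = 8 the first gap is d1 = 4, and the elements whose indices differ by 4 form one group:

li = [55, 27, 83, 16, 19, 70, 8, 42]
d1 = len(li) // 2                        # first gap: 4

groups = [li[start::d1] for start in range(d1)]
print(groups)    # [[55, 19], [27, 70], [83, 8], [16, 42]]
# Each of these groups is insertion-sorted on its own, then the gap is halved.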

Code:

import time
import random


def call_time(func):
    def inner(*args, **kwargs):
        t1 = time.time()
        re = func(*args, **kwargs)
        t2 = time.time()
        print('Time cost:', func.__name__, t2 - t1)
        return re
    return inner


@call_time
def shell_sort(li):
    gap = len(li) // 2
    while gap >= 1:
        # insertion sort within each group of elements that are `gap` apart
        for i in range(gap, len(li)):
            tmp = li[i]
            j = i - gap
            while j >= 0 and tmp < li[j]:
                li[j + gap] = li[j]
                j -= gap
            li[j + gap] = tmp
        gap = gap // 2


data = list(range(10000))
random.shuffle(data)
shell_sort(data)
# Time cost: shell_sort 0.1275160312652588
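
A quick run on the small grouping example from above (output written out by hand):

data = [55, 27, 83, 16, 19, 70, 8, 42]
shell_sort(data)
print(data)    # [8, 16, 19, 27, 42, 55, 70, 83]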

Time complexity: it depends on the gap sequence. With the simple gap-halving sequence used here the worst case is O(n²); in practice it usually runs slower than the O(n log n) sorts (compare the 0.127 s above with roughly 0.08 s for heap sort and merge sort on the same input size).

 

 

Comparison of quick sort, heap sort, and merge sort:

All three algorithms have an average time complexity of O(n log n).

Generally, in terms of running time:

  • quick sort < merge sort < heap sort

Drawbacks of the three algorithms:

  • Quick sort: inefficient in extreme cases (for example, a poor pivot choice on already sorted data)
  • Merge sort: requires additional memory for the temporary lists
  • Heap sort: relatively slow among these fast sorting algorithms

Timing comparison:

quick_sort(data1)
heap_sort(data2)
mergesort(data3)
sys_sort(data4)
# Time cost: quick_sort 0.053006649017333984
# Time cost: heap_sort 0.08601117134094238
# Time cost: mergesort 0.08000993728637695
# Time cost: sys_sort 0.004500627517700195
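
quick_sort and sys_sort are not defined in this post; the sketch below is one plausible way the comparison could have been set up (the quick_sort body and the sys_sort helper are assumptions, and the timings will differ from machine to machine). It reuses call_time, heap_sort and mergesort from the code above:

import copy
import random


def _quick_sort(li, left, right):
    # plain first-element-pivot quick sort, used here only so the comparison runs
    if left < right:
        tmp = li[left]
        low, high = left, right
        while low < high:
            while low < high and li[high] >= tmp:
                high -= 1
            li[low] = li[high]
            while low < high and li[low] <= tmp:
                low += 1
            li[high] = li[low]
        li[low] = tmp
        _quick_sort(li, left, low - 1)
        _quick_sort(li, low + 1, right)


@call_time
def quick_sort(li):
    _quick_sort(li, 0, len(li) - 1)


@call_time
def sys_sort(li):
    li.sort()            # assumed: a thin wrapper around the built-in sort


data1 = list(range(10000))
random.shuffle(data1)
data2, data3, data4 = copy.copy(data1), copy.copy(data1), copy.copy(data1)

quick_sort(data1)
heap_sort(data2)
mergesort(data3)
sys_sort(data4)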

