Fundamentals of Algorithm Theory

Algorithm: a computational process, a method for solving a problem. An algorithm exists independently of any particular problem; it captures the method and the ideas for solving it.

Five features of an algorithm

    1. Input: an algorithm has 0 or more inputs
    2. Output: an algorithm has at least 1 output
    3. Finiteness: an algorithm terminates automatically after a finite number of steps without an infinite loop, and each step completes within an acceptable time
    4. Determinism: each step of the algorithm has a definite meaning, with no ambiguity
    5. Feasibility: every step of the algorithm is feasible; that is, each step can be completed in a finite number of operations

Time complexity is a function used to estimate the running time of an algorithm.

Time complexity: suppose there is a function g such that the time for algorithm A to process a problem instance of size n is T(n) = O(g(n)). Then O(g(n)) is the asymptotic time complexity of algorithm A, abbreviated as the time complexity and written T(n).

"Big O notation": For monotone integer function f, if there is an integer function g and a real constant c>0, so that for a sufficiently large n always have f (n) <=c*g (n), it is said that the function g is an asymptotic function of f (ignoring the constant), recorded as f (n) =o (g (n)). In other words, the growth rate of function f is constrained by the function g in the limit meaning of infinite, that is, function f is similar to function G.

Several basic rules for calculating time complexity (illustrated in the sketch after this list)
    1. A basic operation, i.e. one involving only constant terms, has time complexity O(1)
    2. For a sequential structure, time complexities are added
    3. For a loop structure, time complexities are multiplied
    4. For a branch structure, take the maximum time complexity among the branches
    5. When judging the efficiency of an algorithm, it is usually enough to look at the highest-order term of the operation count; lower-order terms and constant factors can be ignored
    6. Unless otherwise specified, the time complexity we analyze is the worst-case time complexity
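
To see these rules in action, here is a minimal sketch; the function and its name are ours, invented for illustration, not part of the original text:

def count_pairs(items):
    n = len(items)                       # basic operation: O(1)
    total = 0                            # sequential structure: complexities add
    for i in range(n):                   # loop: multiply the body's complexity by n
        for j in range(n):               # nested loop: O(n) * O(n) = O(n²)
            if items[i] == items[j]:     # branch: take the costlier branch, here O(1)
                total += 1
    return total                         # overall: O(1) + O(n²) = O(n²), keeping only the highest-order term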

A sorting algorithm (English: sorting algorithm) is an algorithm that arranges a sequence of data in a particular order.

    • The three simple sorts:
      • Bubble sort
      • Selection sort
      • Insertion sort
    • Quick sort
    • The two powerful sorts:
      • Heap sort
      • Merge sort
    • The rarely used sorts:
      • Radix sort
      • Shell sort
      • Bucket sort

Bubble sort (English: bubble sort) is a simple sorting algorithm. It repeatedly steps through the sequence to be sorted, compares two adjacent elements at a time, and swaps them if they are in the wrong order. The passes through the sequence are repeated until no more swaps are needed, which means the sequence is sorted. The algorithm is named for the way smaller elements slowly "bubble" up to the top of the sequence through these swaps.

The bubble sort algorithm works as follows:

    • Compare adjacent elements. If the first is larger than the second (for ascending order), swap them.
    • Do the same for every pair of adjacent elements, from the first pair to the last pair. When this pass is done, the last element is the largest.
    • Repeat the above steps for all elements except the last one.
    • Keep repeating over fewer and fewer elements each time, until there is no pair of numbers left to compare. If a pass of bubble sort completes without performing any swap, the list is already sorted and the algorithm can end immediately.
def bubble_sort(li):
    for i in range(len(li) - 1):            # number of passes
        exchange = False                    # whether this pass swapped anything
        for j in range(len(li) - 1 - i):    # the last i elements are already in place
            if li[j] > li[j + 1]:
                li[j], li[j + 1] = li[j + 1], li[j]
                exchange = True
        if not exchange:                    # a full pass with no swap: already sorted
            return
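
A quick usage check, with an arbitrary sample list:

li = [54, 26, 93, 17, 77, 31, 44, 55, 20]
bubble_sort(li)
print(li)   # [17, 20, 26, 31, 44, 54, 55, 77, 93]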

Time complexity

    • Best-case time complexity: O(n) (a pass finds no elements to swap, and the sort ends)
    • Worst-case time complexity: O(n²)
    • Stability: stable
Selection sort

Selection sort (English: selection sort) is a simple and intuitive sorting algorithm. It works as follows: first find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence; then keep finding the smallest (or largest) of the remaining unsorted elements and placing it at the end of the sorted sequence; and so on, until all elements are sorted.

The main advantage of selection sort has to do with data movement. If an element is already in its correct final position, it is not moved. Each time selection sort swaps a pair of elements, at least one of them moves to its final position, so sorting a table of n elements uses at most n-1 swaps in total. Among the sorting methods that move elements entirely by swapping, selection sort is a very good one.

def select_sort(li):
    for i in range(len(li) - 1):
        for j in range(i + 1, len(li)):
            if li[i] > li[j]:               # a smaller element was found: move it forward
                li[i], li[j] = li[j], li[i]
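
Note that the implementation above swaps whenever it finds a smaller element, so on its own it does not achieve the "at most n-1 swaps" property described earlier. A variant that does, as a minimal sketch (the name min_loc is ours, not from the original):

def select_sort_min(li):
    for i in range(len(li) - 1):
        min_loc = i                          # index of the smallest element seen so far
        for j in range(i + 1, len(li)):
            if li[j] < li[min_loc]:
                min_loc = j
        if min_loc != i:
            li[i], li[min_loc] = li[min_loc], li[i]   # at most one swap per pass
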
Time complexity
    • Best-case time complexity: O(n²)
    • Worst-case time complexity: O(n²)
    • Stability: unstable (consider the case of selecting the maximum element each time while sorting in ascending order)
Insertion sort

Insertion sort (English: insertion sort) is a simple and intuitive sorting algorithm. It works by building up an ordered sequence: for each unsorted element, it scans the sorted sequence from back to front, finds the appropriate position, and inserts the element there. In implementation, during the back-to-front scan, the sorted elements have to be shifted backwards step by step to make room for the newest element.

def insert_sort(li):
    for i in range(1, len(li)):
        temp = li[i]                        # the element to insert
        j = i - 1
        while j >= 0 and li[j] > temp:      # shift larger sorted elements backwards
            li[j + 1] = li[j]
            j -= 1
        li[j + 1] = temp                    # insert into the freed position
Time complexity
    • Best-case time complexity: O(n) (for ascending order, when the sequence is already ascending)
    • Worst-case time complexity: O(n²)
    • Stability: stable
Quick Sort

Quicksort (English: quicksort), also called partition-exchange sort, divides the data to be sorted into two independent parts in a single pass, such that every element of one part is smaller than every element of the other. Each part is then sorted quickly by the same method, so the whole sorting process can be carried out recursively until the entire data set becomes an ordered sequence.

The steps are:

    1. Pick an element from the sequence, called the "pivot";
    2. Reorder the sequence so that all elements smaller than the pivot are placed before it and all elements larger than the pivot are placed after it (equal elements can go to either side). When this partition finishes, the pivot is in its final position within the sequence. This is called the partition operation;
    3. Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot.

In the base case of the recursion, the sequence has size zero or one and is therefore already sorted. Although the algorithm keeps recursing, it always terminates, because each iteration puts at least one element into its final position.

def quick_sort(li, start, end):
    """Quick sort"""
    # condition for exiting the recursion
    if start >= end:
        return
    # use the starting element as the pivot element to be positioned
    mid = li[start]
    # low is the left-to-right cursor on the left side of the sequence
    low = start
    # high is the right-to-left cursor on the right side of the sequence
    high = end
    while low < high:
        # while low and high have not met, if the element high points to is
        # not smaller than the pivot, move high to the left
        while low < high and li[high] >= mid:
            high -= 1
        # place the element high points to at the low position
        li[low] = li[high]
        # while low and high have not met, if the element low points to is
        # smaller than the pivot, move low to the right
        while low < high and li[low] < mid:
            low += 1
        # place the element low points to at the high position
        li[high] = li[low]
    # after the loop exits, low and high coincide: that is the pivot's
    # correct position, so place the pivot element there
    li[low] = mid
    # quick sort the sub-sequence to the left of the pivot
    quick_sort(li, start, low - 1)
    # quick sort the sub-sequence to the right of the pivot
    quick_sort(li, low + 1, end)

li = [54, 26, 93, 17, 77, 31, 44, 55, 20]
quick_sort(li, 0, len(li) - 1)
print(li)
Time complexity
    • Best-case time complexity: O(n log n)
    • Worst-case time complexity: O(n²)
    • Stability: unstable
Merge sort

Merge sort is a very typical application of divide and conquer. The idea of merge sort is to recursively decompose the array and then merge the pieces back together.

Once the array has been decomposed down to single elements, pairs of ordered arrays are merged. The basic idea is to compare the first elements of the two arrays, take whichever is smaller, and then advance that array's pointer by one. The comparison continues until one array is empty, and finally the remainder of the other array is copied to the result.

def merge(left, right):
    l, r = 0, 0
    result = []
    while l < len(left) and r < len(right):
        if left[l] < right[r]:
            result.append(left[l])
            l += 1
        else:
            result.append(right[r])
            r += 1
    result += left[l:]      # copy whatever remains of either array
    result += right[r:]
    return result

def merge_sort(li):
    if len(li) <= 1:
        return li
    num = len(li) // 2
    left = merge_sort(li[:num])
    right = merge_sort(li[num:])
    return merge(left, right)

li = [54, 26, 93, 17, 77, 31, 44, 55, 20]
print(merge_sort(li))
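
Note that, unlike the in-place sorts above, merge_sort returns a new sorted list rather than modifying its argument: each level of the recursion builds its result by merging two smaller sorted lists.
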
The concept of a tree

A tree is an abstract data type (ADT), or a data structure implementing that abstract data type, used to simulate a collection of data with a tree-like structure. It is a set of n (n >= 1) finite nodes with a hierarchical relationship. It is called a "tree" because it looks like an upside-down tree: the root is at the top and the leaves face down. It has the following characteristics:

    • Each node has 0 or more child nodes;
    • A node without a parent node is called the root node;
    • Every non-root node has exactly one parent node;
    • Apart from the root node, the remaining nodes can be divided into multiple disjoint subtrees;

Terminology of trees
    • Degree of a node: the number of subtrees a node has is called the degree of the node;
    • Degree of a tree: the maximum degree of any node in a tree is called the degree of the tree;
    • Leaf node or terminal node: a node of degree zero;
    • Parent node: if a node has child nodes, it is called the parent node of those children;
    • Child node: the root node of a subtree of a node is called a child node of that node;
    • Sibling nodes: nodes with the same parent node are called sibling nodes;
    • Level of a node: defined starting from the root; the root is level 1, the root's children are level 2, and so on;
    • Height or depth of a tree: the maximum level of any node in the tree;
    • Cousin nodes: nodes whose parents are on the same level are cousins of each other;
    • Ancestors of a node: all nodes on the path from the root down to that node;
    • Descendants: any node in the subtree rooted at a node is called a descendant of that node;
    • Forest: a collection of m (m >= 0) disjoint trees is called a forest;
Types of trees
    • Unordered tree: a tree with no order relationship among the children of any node, also called a free tree;
    • Ordered tree: a tree with an order relationship among the children of any node;
      • Binary tree: a tree in which each node has at most two subtrees;
        • Complete binary tree: for a binary tree of depth d (d > 1), if every level except level d holds the maximum possible number of nodes, and all the nodes on level d are arranged continuously from left to right, the tree is called a complete binary tree; a full binary tree is the special case of a complete binary tree in which all the leaf nodes are on the bottom level;
        • Balanced binary tree (AVL tree): a binary tree in which, for every node, the heights of its two subtrees differ by at most 1;
        • Sorted binary tree (binary search tree (English: binary search tree), also called an ordered binary tree);
      • Huffman tree (used for information coding): the binary tree with the shortest weighted path length, also called the optimal binary tree;
      • B-tree: a self-balancing search tree, optimized for read and write operations, that keeps data in order and allows a node to have more than two subtrees.

Two special kinds of binary trees are the full binary tree and the complete binary tree, defined above.

How to store a binary tree: linked (chain-type) storage, or sequential (array-based) storage.
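
A minimal sketch of the two representations in Python (the class and field names are ours, invented for illustration):

class TreeNode:
    """A node in the linked (chain-type) representation of a binary tree."""
    def __init__(self, value):
        self.value = value
        self.left = None     # left child, or None
        self.right = None    # right child, or None

# linked storage: build a three-node tree with 1 at the root
root = TreeNode(1)
root.left = TreeNode(2)
root.right = TreeNode(3)

# sequential storage: the same tree laid out level by level in a list
tree_as_list = [1, 2, 3]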

Heap Sort:

Heap

    • Max-heap (big-root heap): a complete binary tree in which every node is larger than its child nodes
    • Min-heap (small-root heap): a complete binary tree in which every node is smaller than its child nodes
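
In sequential storage, a complete binary tree maps directly onto a list: the children of the node at index i sit at indexes 2*i + 1 and 2*i + 2, and its parent sits at (i - 1) // 2. A minimal sketch of that arithmetic, which the sift function below relies on (the helper names are ours):

def left_child(i):
    return 2 * i + 1

def right_child(i):
    return 2 * i + 2

def parent(i):
    return (i - 1) // 2

# e.g. [93, 77, 44, 54, 26, 31] is a valid max-heap: the node at index 0 (93)
# has children at indexes 1 (77) and 2 (44), both smaller than it.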

The heap sort process:

    1. Build the heap.
    2. The element at the top of the heap is the largest element.
    3. Remove the heap top and move the heap's last element to the top; the heap can then be restored with a single sift-down adjustment.
    4. The new top of the heap is the second-largest element.
    5. Repeat step 3 until the heap becomes empty.
def sift(data, low, high):
    """Sift down: restore the heap property in the subtree rooted at low."""
    i = low
    j = 2 * i + 1                       # left child of i
    temp = data[i]
    while j <= high:
        if j < high and data[j] < data[j + 1]:
            j += 1                      # move to the larger of the two children
        if temp < data[j]:
            data[i] = data[j]           # promote the larger child
            i = j
            j = 2 * i + 1
        else:
            break
    data[i] = temp

def heap_sort(data):
    num = len(data)
    # build the heap: sift down every non-leaf node, from the bottom up
    for i in range(num // 2 - 1, -1, -1):
        sift(data, i, num - 1)
    # repeatedly swap the heap top with the last unsorted element, then re-sift
    for i in range(num - 1, -1, -1):
        data[0], data[i] = data[i], data[0]
        sift(data, 0, i - 1)
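
A quick usage check, with the same arbitrary sample list:

li = [54, 26, 93, 17, 77, 31, 44, 55, 20]
heap_sort(li)
print(li)   # [17, 20, 26, 31, 44, 54, 55, 77, 93]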
