Python data structures and algorithms

Source: Internet
Author: User
Tags: sorts

Data Structures and Algorithms (Python): Bubble Sort

Bubble sort is a simple sorting algorithm. It repeatedly steps through the list to be sorted, compares adjacent elements two at a time, and swaps them if they are in the wrong order. The pass through the list is repeated until no more swaps are needed, which means the list is sorted. The algorithm is named for the way smaller elements gradually "bubble" to the top of the list through successive swaps.

The bubble sort algorithm works as follows:

    • Compare adjacent elements. If the first is larger than the second (for ascending order), swap them.
    • Do the same for each pair of adjacent elements, starting with the first pair and ending with the last pair. When this pass is done, the last element will be the largest.
    • Repeat the above steps for all elements except the last one.
    • Keep repeating, each time over fewer elements, until there are no more pairs to compare.
Bubble Sort Analysis

Swap process diagram (first pass): (figure not shown)

We then need to perform n-1 bubbling passes, each with a decreasing number of comparisons:

def bubble_sort(alist):
    for j in range(len(alist) - 1, 0, -1):
        # j is the number of comparisons needed on each pass,
        # and decreases with every pass
        for i in range(j):
            if alist[i] > alist[i + 1]:
                alist[i], alist[i + 1] = alist[i + 1], alist[i]

li = [54, 26, 93, 17, 77, 31, 44, 55, 20]
bubble_sort(li)
print(li)

Time Complexity
    • Optimal time complexity: O(n) (a pass that finds no elements to swap ends the sort early; see the sketch below)
    • Worst time complexity: O(n²)
    • Stability: Stable
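The O(n) best case only materializes if the implementation stops once a pass makes no swaps; the code above never checks for this. A minimal sketch of the early-exit variant (the swapped flag and the function name are additions of this sketch, not part of the original code):

def bubble_sort_early_exit(alist):
    for j in range(len(alist) - 1, 0, -1):
        swapped = False  # tracks whether this pass changed anything
        for i in range(j):
            if alist[i] > alist[i + 1]:
                alist[i], alist[i + 1] = alist[i + 1], alist[i]
                swapped = True
        if not swapped:
            # a full pass with no swaps: the list is already sorted
            return

li = [17, 20, 26, 31, 44, 54, 55, 77, 93]  # already sorted: a single O(n) pass
bubble_sort_early_exit(li)
print(li)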
Bubble Sort Demo


Selection Sort

Selection sort is a simple and intuitive sorting algorithm. It works as follows: first find the smallest (or largest) element in the unsorted sequence and place it at the beginning of the sorted sequence; then continue to find the smallest (or largest) element among the remaining unsorted elements and place it at the end of the sorted sequence. Repeat until all elements are sorted.

The main advantage of selection sort relates to data movement. If an element is already in its correct final position, it will not be moved. Each time selection sort swaps a pair of elements, at least one of them moves to its final position, so sorting a list of n elements requires at most n-1 swaps. Among all sorting methods that rely entirely on swaps to move elements, selection sort is a very good one.

Selection Sort Analysis

Sorting process:

Red indicates the current minimum value, yellow indicates the sorted sequence, and blue indicates the current position.

def selection_sort(alist):
    n = len(alist)
    # n-1 selection passes are needed
    for i in range(n - 1):
        # record the position of the minimum
        min_index = i
        # select the smallest element from position i+1 to the end
        for j in range(i + 1, n):
            if alist[j] < alist[min_index]:
                min_index = j
        # if the selected element is not already in place, swap it in
        if min_index != i:
            alist[i], alist[min_index] = alist[min_index], alist[i]

alist = [54, 226, 93, 17, 77, 31, 44, 55, 20]
selection_sort(alist)
print(alist)

Time Complexity
    • Optimal time complexity: O(n²)
    • Worst time complexity: O(n²)
    • Stability: Unstable (a swap can carry an element past an equal element; a short demonstration follows below)
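A quick sketch of that instability: tag equal keys with letters and sort by the numeric key using the same selection logic as above (the pairs and the helper name are illustrative, not from the original article):

def selection_sort_pairs(alist):
    # same selection logic as above, comparing only the numeric key
    n = len(alist)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if alist[j][0] < alist[min_index][0]:
                min_index = j
        if min_index != i:
            alist[i], alist[min_index] = alist[min_index], alist[i]

pairs = [(5, 'a'), (5, 'b'), (3, 'c')]
selection_sort_pairs(pairs)
print(pairs)  # [(3, 'c'), (5, 'b'), (5, 'a')] -- the two 5s changed order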
Selection Sort Demo

Insertion Sort

Insertion sort is a simple and intuitive sorting algorithm. It works by building an ordered sequence: for each unsorted element, it scans from back to front through the already-sorted part, finds the appropriate position, and inserts the element there. In implementation, during the back-to-front scan, the sorted elements are repeatedly shifted backwards to make room for the newest element.

Insertion Sort Analysis

def insert_sort(alist):
    # start from the second position (index 1) and insert each
    # element into the sorted part in front of it
    for i in range(1, len(alist)):
        # scan backwards from position i, swapping while the element
        # is smaller than the one before it
        for j in range(i, 0, -1):
            if alist[j] < alist[j - 1]:
                alist[j], alist[j - 1] = alist[j - 1], alist[j]
            else:
                # the element is in place; stopping early gives the O(n) best case
                break

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
insert_sort(alist)
print(alist)

Time Complexity
    • Optimal time complexity: O(n) (for ascending order, when the sequence is already in ascending order)
    • Worst time complexity: O(n²)
    • Stability: Stable
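As an aside, the search for the insertion point in the sorted prefix can be done with binary search. A sketch using the standard bisect module (the name binary_insert_sort is mine, not from the article; comparisons drop to O(log n) per element, but the shifting keeps the worst case at O(n²)):

import bisect

def binary_insert_sort(alist):
    for i in range(1, len(alist)):
        value = alist[i]
        # binary-search the sorted prefix alist[0:i] for the insertion point
        pos = bisect.bisect_right(alist, value, 0, i)
        # shift the tail of the prefix right by one and drop the value in
        alist[pos + 1:i + 1] = alist[pos:i]
        alist[pos] = value

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
binary_insert_sort(alist)
print(alist)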
Insertion Sort Demo

Quick Sort

Quicksort, also known as partition-exchange sort, divides the data to be sorted into two independent parts in a single pass, with all the data in one part smaller than all the data in the other. Each part is then quicksorted in the same way, and the whole process proceeds recursively until the entire sequence is ordered.

The steps are:

    1. Pick an element from the sequence, called the pivot.
    2. Reorder the sequence so that all elements smaller than the pivot come before it and all elements larger than the pivot come after it (equal elements can go to either side). After this pass, the pivot is in its final position. This is called the partition operation.
    3. Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot.

At the bottom of the recursion, the sequence has size zero or one and is therefore already sorted. Although the algorithm keeps recursing, it always terminates, because each iteration puts at least one element into its final position.

Quick Sort Analysis

def quick_sort(alist, start, end):
    """Quick sort"""
    # recursion exit criterion
    if start >= end:
        return
    # take the starting element as the pivot
    mid = alist[start]
    # low is the left-to-right cursor on the left of the sequence
    low = start
    # high is the right-to-left cursor on the right of the sequence
    high = end
    while low < high:
        # while low and high have not met, and the element at high
        # is not smaller than the pivot, move high to the left
        while low < high and alist[high] >= mid:
            high -= 1
        # place the element at high into the low position
        alist[low] = alist[high]
        # while low and high have not met, and the element at low
        # is smaller than the pivot, move low to the right
        while low < high and alist[low] < mid:
            low += 1
        # place the element at low into the high position
        alist[high] = alist[low]
    # after the loop, low and high coincide at the pivot's correct position,
    # so place the pivot there
    alist[low] = mid
    # quicksort the sub-sequence to the left of the pivot
    quick_sort(alist, start, low - 1)
    # quicksort the sub-sequence to the right of the pivot
    quick_sort(alist, low + 1, end)

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
quick_sort(alist, 0, len(alist) - 1)
print(alist)

Time Complexity
    • Optimal time complexity: O(n log n)
    • Worst time complexity: O(n²)
    • Stability: Unstable

That quicksort takes O(n log n) time on average is not obvious. But it is not hard to see that the partition operation visits each element of the array once per pass, using O(n) time. In versions that use concatenation, this operation is also O(n).

In the best case, each partition divides the sequence into two nearly equal fragments, so each recursive call handles a sequence half the size. We therefore need only log n nested calls before reaching a sequence of size one, which means the depth of the call tree is O(log n). No two calls at the same level of the tree process the same part of the original sequence, so each level of calls takes only O(n) time in total (each call has some constant overhead, but since there are only O(n) calls at each level, this is absorbed into the O(n) factor). The result is that the algorithm uses only O(n log n) time.
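The concatenation-based version mentioned above can be sketched in a few lines. This is not the in-place code from the analysis section; it builds new lists on every call, trading O(n) extra memory for brevity:

def quick_sort_concat(alist):
    # base case: sequences of size 0 or 1 are already sorted
    if len(alist) <= 1:
        return alist
    pivot = alist[0]
    smaller = [x for x in alist[1:] if x < pivot]
    larger = [x for x in alist[1:] if x >= pivot]
    # partition, recurse, then concatenate the results
    return quick_sort_concat(smaller) + [pivot] + quick_sort_concat(larger)

print(quick_sort_concat([54, 26, 93, 17, 77, 31, 44, 55, 20]))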

Quick Sort Demo

Shell Sort

Shell sort is a kind of insertion sort, also known as diminishing increment sort, and is a more efficient improvement of the direct insertion sorting algorithm. Shell sort is not a stable sorting algorithm. The method is named after D. L. Shell, who proposed it in 1959. Shell sort groups records by a certain increment (gap) between indices and sorts each group using direct insertion sort; as the increment gradually decreases, each group contains more and more elements. When the increment is reduced to 1, the entire list forms a single group and the algorithm terminates.

Shell Sort Process

The basic idea of Shell sort is to arrange the array in a table and sort its columns separately, repeating the process with a narrower table (longer columns, fewer of them) each time, until the table has only one column. Converting the array to a table is only a way to understand the algorithm; the algorithm itself sorts the array in place.

For example, suppose we have the list [13 14 94 33 82 25 59 94 65 23 45 27 73 25 39 10]. If we start with a gap of 5, we can describe the algorithm better by arranging the list in a table with 5 columns, like this (vertically adjacent elements are one gap apart in the list):

13 14 94 33 82
25 59 94 65 23
45 27 73 25 39
10

Then we sort each column:

10 14 73 25 23
13 27 94 33 39
25 59 94 65 82
45

Reading the four rows one after another, we get: [10 14 73 25 23 13 27 94 33 39 25 59 94 65 82 45]. At this point 10 has moved to its correct position. We then sort with a gap of 3:

10 14 73
25 23 13
27 94 33
39 25 59
94 65 82
45

After sorting the columns, it becomes:

10 14 13
25 23 33
27 25 59
39 65 73
45 94 82
94

Finally, sort with a gap of 1 (this is just a simple insertion sort).
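To reproduce the table view in code: elements whose indices differ by the gap form one column, so a slice with a stride picks out each column. A small illustrative helper (gap_columns is my name, not from the article):

def gap_columns(alist, gap):
    # column k holds the elements at indices k, k+gap, k+2*gap, ...
    return [alist[start::gap] for start in range(gap)]

data = [13, 14, 94, 33, 82, 25, 59, 94, 65, 23, 45, 27, 73, 25, 39, 10]
for column in gap_columns(data, 5):
    print(column)  # e.g. the first column is [13, 25, 45, 10]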

Shell Sort Analysis

def shell_sort(alist):
    n = len(alist)
    # initial gap
    gap = n // 2
    while gap > 0:
        # insertion sort over the elements that are `gap` apart
        for i in range(gap, n):
            j = i
            while j >= gap and alist[j - gap] > alist[j]:
                alist[j - gap], alist[j] = alist[j], alist[j - gap]
                j -= gap
        # halve the gap for the next round
        gap = gap // 2

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
shell_sort(alist)
print(alist)

Time Complexity
    • Optimal time complexity: varies with the gap sequence
    • Worst time complexity: O(n²)
    • Stability: Unstable
Shell Sort Demo

Merge Sort

Merge sort is a very typical application of the divide-and-conquer method. The idea of merge sort is to recursively decompose the array and then merge the pieces back together.

After the array has been decomposed down to minimal size, pairs of ordered arrays are merged. The basic idea is to compare the first elements of the two arrays, take the smaller one, and advance the corresponding pointer by one. Continue comparing until one array is empty, then copy the remainder of the other array.

Merge Sort Analysis

def merge_sort(alist):
    if len(alist) <= 1:
        return alist
    # binary decomposition
    num = len(alist) // 2
    left = merge_sort(alist[:num])
    right = merge_sort(alist[num:])
    # merge
    return merge(left, right)

def merge(left, right):
    """Merge operation: combine the two ordered arrays left and right
    into one large ordered array."""
    # index pointers into left and right
    l, r = 0, 0
    result = []
    while l < len(left) and r < len(right):
        if left[l] < right[r]:
            result.append(left[l])
            l += 1
        else:
            result.append(right[r])
            r += 1
    # one side is exhausted; copy the remainder of the other
    result += left[l:]
    result += right[r:]
    return result

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
sorted_alist = merge_sort(alist)
print(sorted_alist)

Time Complexity
    • Optimal time complexity: O(n log n)
    • Worst time complexity: O(n log n)
    • Stability: Stable
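As an aside, Python's standard library exposes this same merge step: heapq.merge lazily combines already-sorted inputs and returns an iterator, so it could stand in for the merge helper above:

import heapq

left = [17, 26, 54, 93]
right = [20, 31, 44, 55, 77]
# heapq.merge assumes both inputs are already sorted
print(list(heapq.merge(left, right)))  # [17, 20, 26, 31, 44, 54, 55, 77, 93]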
Search

A search is the algorithmic process of finding a particular item in a collection of items. The answer to a search is typically True or False, according to whether the item exists. Common search methods include sequential search, binary search, binary tree search, and hash search. A sketch of sequential search follows; binary search is covered next.
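Sequential lookup, the simplest of the four, just scans the collection item by item. A minimal sketch (worst case O(n)):

def sequential_search(alist, item):
    # examine every element until the item is found or the list ends
    for element in alist:
        if element == item:
            return True
    return False

print(sequential_search([54, 26, 93, 17], 93))  # True
print(sequential_search([54, 26, 93, 17], 99))  # False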

Binary Search

Binary search, also known as half-interval search, has the advantages of few comparisons, fast lookup, and good average performance; its disadvantages are that it requires the table to be ordered, and insertion and deletion are difficult. Binary search is therefore suitable for ordered lists that change rarely but are searched frequently. Suppose the elements in the table are arranged in ascending order. Compare the key at the middle position of the table with the search key; if they are equal, the search succeeds. Otherwise, the middle record divides the table into a front sub-table and a back sub-table: if the middle key is greater than the search key, search the front sub-table further; otherwise, search the back sub-table. Repeat until a record satisfying the condition is found (the search succeeds), or until the sub-table no longer exists (the search fails).

Binary search implementation (non-recursive):

def binary_search(alist, item):
    first = 0
    last = len(alist) - 1
    while first <= last:
        # compare the middle element with the search item
        midpoint = (first + last) // 2
        if alist[midpoint] == item:
            return True
        elif item < alist[midpoint]:
            last = midpoint - 1
        else:
            first = midpoint + 1
    return False

testlist = [0, 1, 2, 8, 13, 17, 19, 32, 42]
print(binary_search(testlist, 3))
print(binary_search(testlist, 13))

Recursive implementation:

def binary_search(alist, item):
    if len(alist) == 0:
        return False
    midpoint = len(alist) // 2
    if alist[midpoint] == item:
        return True
    elif item < alist[midpoint]:
        # search the left half
        return binary_search(alist[:midpoint], item)
    else:
        # search the right half
        return binary_search(alist[midpoint + 1:], item)

testlist = [0, 1, 2, 8, 13, 17, 19, 32, 42]
print(binary_search(testlist, 3))
print(binary_search(testlist, 13))

Time Complexity
    • Optimal time complexity: O(1)
    • Worst time complexity: O(log n)
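In practice, the standard bisect module implements the same search. A membership test built on bisect_left (a sketch; the name binary_search_bisect is mine):

import bisect

def binary_search_bisect(alist, item):
    # bisect_left returns the leftmost position where item could be inserted
    index = bisect.bisect_left(alist, item)
    return index < len(alist) and alist[index] == item

testlist = [0, 1, 2, 8, 13, 17, 19, 32, 42]
print(binary_search_bisect(testlist, 3))   # False
print(binary_search_bisect(testlist, 13))  # True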
