This article is a summary of the sorting part of data structures, covering the time and space complexity of the common sorting methods, analyzed in four categories: direct insertion, exchange (bubble, quick), selection (direct selection, heap sort), and merge.
Direct insertion:
Insert each record in turn into an already sorted table, obtaining a new ordered table with one more record. When the i-th record is inserted, records 1 through i-1 are already in order; the key of the i-th record is compared with records i-1, i-2, ... to find its insertion position.
Example: 9 6 8
Complexity: in the best case, direct insertion needs only one comparison and no element moves per pass, so for n elements the total number of comparisons is n-1 and the total number of moves is 0. In the worst case, the j-th pass is compared with every preceding record; with the double loop this is about n^2, i.e. O(n^2). The sort needs only one element of auxiliary space, so the space complexity is O(1), and it is a stable sort.
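To make the procedure concrete, here is a minimal C sketch of direct insertion sort; the function name and test data are illustrative and not taken from the article's example:

```c
#include <stdio.h>

/* Direct insertion sort: insert a[i] into the sorted prefix a[0..i-1]. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];              /* record to insert */
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];         /* shift larger records one slot right */
            j--;
        }
        a[j + 1] = key;              /* drop the record into place */
    }
}

int main(void) {
    int a[] = {9, 6, 8, 3, 11, 5};   /* hypothetical example data */
    int n = (int)(sizeof a / sizeof a[0]);
    insertion_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```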
Exchange:
Bubble: compare the key of the first record with the key of the second; if R[1].key > R[2].key, swap them, then continue by comparing the second and third record, and so on. After n-1 comparisons the record with the largest key is in position n; this is the first bubbling pass. The second pass then moves the next largest record into position n-1. Repeat this process until a pass performs no exchange at all.
Example: 9 6 8
Bubble sort, best case: for an already sorted sequence only n-1 comparisons and a single pass are needed, so the time complexity is O(n). Worst case: for n elements in reverse order, the passes make n-1, n-2, ..., 1 comparisons, so the time complexity is O(n^2). The sort needs only one element of auxiliary space for the exchanges, space complexity O(1), and it is a stable sort.
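A minimal C sketch of bubble sort with the early-exit check described above (stop when a pass makes no exchanges); the data is again illustrative:

```c
#include <stdio.h>
#include <stdbool.h>

/* Bubble sort: after each pass the largest remaining key sinks to the end. */
void bubble_sort(int a[], int n) {
    for (int pass = 0; pass < n - 1; pass++) {
        bool swapped = false;
        for (int i = 0; i < n - 1 - pass; i++) {
            if (a[i] > a[i + 1]) {
                int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                swapped = true;
            }
        }
        if (!swapped) break;         /* no exchange in this pass: already ordered, best case O(n) */
    }
}

int main(void) {
    int a[] = {9, 6, 8, 3, 11, 5};   /* hypothetical example data */
    int n = (int)(sizeof a / sizeof a[0]);
    bubble_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```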
Quick: one partition pass splits the records to be sorted into two independent parts, such that no key in one part is greater than any key in the other; the two parts are then quick-sorted in the same way. Procedure: set two pointers i and j, initially pointing to the first and last record, and usually take the key of the first record as the pivot key. First scan from position j toward the front to find the first record whose key is less than key and exchange it; then scan from position i toward the back to find the first record whose key is greater than key and exchange it; repeat these two steps until i and j meet.
Take the first record, 9, as the pivot; after the first partition pass, quick-sort the two parts to the left and right of 9.
Time complexity: by the master theorem, a recurrence of the form T(n) = aT(n/b) + f(n), where a >= 1 and b > 1 are constants and f(n) is an asymptotically positive function, falls into one of three cases depending on how f(n) compares with n^(log_b a): if f(n) is polynomially smaller, T(n) = Θ(n^(log_b a)); if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) * log n); if f(n) is polynomially larger (and satisfies a regularity condition), T(n) = Θ(f(n)).
Each partition of quick sort splits one problem into two sub-problems, so the relationship can be expressed as T(n) = 2T(n/2) + O(n), where O(n) is the time complexity of partition(). Comparing with the master form T(n) = aT(n/b) + f(n), for quick sort a = 2, b = 2, f(n) = O(n). Since f(n) = Θ(n) = Θ(n^(log_2 2)), the second case applies and T(n) = O(n log2 n) on average; in the worst case (for example, an already ordered sequence with the first record as pivot) quick sort degrades to O(n^2).
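A minimal C sketch of the partition scheme described above (first record's key as pivot, scanning inward from both ends until i and j meet); the function name and test data are illustrative:

```c
#include <stdio.h>

/* Quick sort with the first record's key as the pivot. */
void quick_sort(int a[], int low, int high) {
    if (low >= high) return;
    int i = low, j = high;
    int pivot = a[low];                       /* pivot key */
    while (i < j) {
        while (i < j && a[j] >= pivot) j--;   /* scan from the right for a smaller key */
        a[i] = a[j];
        while (i < j && a[i] <= pivot) i++;   /* scan from the left for a larger key */
        a[j] = a[i];
    }
    a[i] = pivot;                             /* pivot lands in its final position */
    quick_sort(a, low, i - 1);                /* quick-sort the left part */
    quick_sort(a, i + 1, high);               /* quick-sort the right part */
}

int main(void) {
    int a[] = {9, 6, 8, 3, 11, 5};            /* hypothetical example data */
    int n = (int)(sizeof a / sizeof a[0]);
    quick_sort(a, 0, n - 1);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```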
Selection:
Direct selection: in the i-th selection pass, n-i key comparisons pick the record with the smallest key out of the remaining n-i+1 records, and that record is exchanged with the i-th record (1 <= i <= n-1).
Example: 9 6 8
Complexity analysis: direct selection consists of two nested for loops, so the time complexity is O(n^2), and it is an unstable sort.
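A minimal C sketch of direct selection sort as described above; names and data are illustrative:

```c
#include <stdio.h>

/* Direct selection sort: pass i selects the smallest remaining key for slot i. */
void selection_sort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++)   /* n-1-i comparisons on this pass */
            if (a[j] < a[min]) min = j;
        if (min != i) {                   /* exchange the smallest record into position i */
            int t = a[i]; a[i] = a[min]; a[min] = t;
        }
    }
}

int main(void) {
    int a[] = {9, 6, 8, 3, 11, 5};        /* hypothetical example data */
    int n = (int)(sizeof a / sizeof a[0]);
    selection_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```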
Heap sort: start sifting from node n/2, then step through nodes n/2-1, n/2-2, ..., 1.
Example: 9 6 8
Placed into a complete binary tree:
Build the heap: sifting starts from node n/2, here the second node, 6; then the first node is sifted and the heap is built as follows:
Heap sort process: the heap-top element, 6, is exchanged with the last node, 11, and the heap is rebuilt:
Then the heap-top element, 8, is exchanged with the second-to-last node and the heap is rebuilt, and so on, until only the first two nodes remain to be exchanged.
Time complexity: heap sort consists of two parts, sifting and exchanging the heap top with the last unsorted node. How far can one sift go? Each sift lets a node sink toward the leaves, so the number of steps does not exceed the depth of the complete binary tree, floor(log2 n) + 1, where n is the number of nodes; a single sift therefore costs O(log2 n). The heap top is exchanged in turn with the last node, the second-to-last node, and so on, which is a linear O(n) sequence of steps, so the overall time complexity is O(n log2 n).
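A minimal C sketch of heap sort. It uses a max-heap and 0-based array indexing (so building starts at node n/2 - 1 rather than n/2) and produces ascending order; the article's example, with the small key 6 at the heap top, corresponds to the symmetric min-heap variant. Names and data are illustrative:

```c
#include <stdio.h>

/* Sift a[root] down within a[0..n-1] so that the subtree rooted there is a max-heap. */
void sift_down(int a[], int n, int root) {
    int key = a[root];
    int child = 2 * root + 1;
    while (child < n) {
        if (child + 1 < n && a[child + 1] > a[child]) child++;   /* pick the larger child */
        if (a[child] <= key) break;
        a[root] = a[child];          /* move the child up, the key keeps sinking */
        root = child;
        child = 2 * root + 1;
    }
    a[root] = key;
}

void heap_sort(int a[], int n) {
    /* Build the heap: sift from node n/2 - 1 (0-based) down to the root. */
    for (int i = n / 2 - 1; i >= 0; i--) sift_down(a, n, i);
    /* Exchange the heap top with the last unsorted node and rebuild the heap. */
    for (int last = n - 1; last > 0; last--) {
        int t = a[0]; a[0] = a[last]; a[last] = t;
        sift_down(a, last, 0);
    }
}

int main(void) {
    int a[] = {9, 6, 8, 3, 11, 5};   /* hypothetical example data */
    int n = (int)(sizeof a / sizeof a[0]);
    heap_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```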
Merge (two-way merge): a sorting method that repeatedly merges two ordered sequences into a single ordered sequence.
Example: 9 6 8
Time complexity is O(n log2 n), and it is a stable sort. I do not yet fully understand the complexity analysis; discussion is welcome.
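A minimal C sketch of two-way merge sort; the <= comparison in the merge keeps it stable, and the O(n) auxiliary buffer reflects the extra space this method needs. Names and data are illustrative:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Merge the ordered runs a[lo..mid] and a[mid+1..hi] using buffer tmp. */
static void merge(int a[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];   /* <= preserves stability */
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof(int));
}

static void msort(int a[], int tmp[], int lo, int hi) {
    if (lo >= hi) return;
    int mid = lo + (hi - lo) / 2;
    msort(a, tmp, lo, mid);          /* sort the left half */
    msort(a, tmp, mid + 1, hi);      /* sort the right half */
    merge(a, tmp, lo, mid, hi);      /* merge the two ordered halves */
}

void merge_sort(int a[], int n) {
    int *tmp = malloc((size_t)n * sizeof(int));   /* O(n) auxiliary space */
    if (!tmp) return;
    msort(a, tmp, 0, n - 1);
    free(tmp);
}

int main(void) {
    int a[] = {9, 6, 8, 3, 11, 5};   /* hypothetical example data */
    int n = (int)(sizeof a / sizeof a[0]);
    merge_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```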
References: quick sort complexity analysis, merge sort complexity analysis.
Soft exam: sorting