I. Insertion Sort
1. Basic idea: insert each data element to be sorted into its proper position within the already sorted subsequence, so that the subsequence remains ordered, until all data elements have been inserted.
2. Reason for stability:
Insertion sort inserts one element at a time into an already ordered small sequence; at the start this ordered sequence contains only one element, the first one. The comparison starts from the end of the ordered sequence: the element to be inserted is compared with the already sorted elements, and if it is larger than the last of them it is placed directly after it; otherwise the search continues toward the front until the insertion position is found. If an element equal to the one being inserted is encountered, the new element is placed after that equal element. The relative order of equal elements is therefore unchanged, and the order they had in the original unsorted sequence is the order they have after sorting, so insertion sort is stable.
3. Sorting Process:
Example:
Initial keywords: 49 38 65 97 76 13 27 49
J = 1 (49) [49] 38 65 97 76 13 27 49
J = 2 (38) [38 49] 65 97 76 13 27 49
J = 3 (65) [38 49 65] 97 76 13 27 49
J = 4 (97) [38 49 65 97] 76 13 27 49
J = 5 (76) [38 49 65 76 97] 13 27 49
J = 6 (13) [13 38 49 65 76 97] 27 49
J = 7 (27) [13 27 38 49 65 76 97] 49
J = 8 (49) [13 27 38 49 49 65 76 97]
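As a minimal sketch of the idea above (the function name insertion_sort and the driver line are illustrative, not from the original text), the following Python code shifts only elements strictly greater than the key, which is exactly why equal elements keep their relative order:

def insertion_sort(a):
    # Insert each element into its proper position within the sorted prefix a[0..i-1].
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift only elements strictly greater than key, so equal keys keep their order (stable).
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([49, 38, 65, 97, 76, 13, 27, 49]))  # [13, 27, 38, 49, 49, 65, 76, 97]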
II. Selection Sort
1. Basic idea:
On each pass, select the smallest (or largest) element from the data elements that have not yet been sorted and append it to the end of the already sorted sequence, until all data elements have been sorted.
2. Reason for instability: selection sort places the minimum remaining element at each position in turn: the smallest element at the first position, the second smallest among the remaining elements at the second position, and so on; the n-th position needs no selection, because only the largest element is left. The problem is that the selected minimum may be swapped with an element that has an equal counterpart later in the sequence, and that swap destroys the relative order of the equal elements. For example, in the sequence 5 8 5 2 9, the first pass swaps the first 5 with 2, so the relative order of the two 5s in the original sequence is broken. Selection sort is therefore not a stable sorting algorithm.
3. Sorting Process:
Example:
Initial keywords: [49 38 65 97 76 13 27 49]
After the 1st pass: 13 [38 65 97 76 49 27 49]
After the 2nd pass: 13 27 [65 97 76 49 38 49]
After the 3rd pass: 13 27 38 [97 76 49 65 49]
After the 4th pass: 13 27 38 49 [76 97 65 49]
After the 5th pass: 13 27 38 49 49 [97 65 76]
After the 6th pass: 13 27 38 49 49 65 [97 76]
After the 7th pass: 13 27 38 49 49 65 76 [97]
Final result: 13 27 38 49 49 65 76 97
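A minimal Python sketch of selection sort (the identifiers and driver line are mine, not from the original). The swap at the end of each pass is the step that can jump an element over an equal one, as in the 5 8 5 2 9 example above:

def selection_sort(a):
    n = len(a)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted part a[i..n-1].
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        # This swap may move a[i] past an equal element further right, so stability is lost.
        a[i], a[min_idx] = a[min_idx], a[i]
    return a

print(selection_sort([5, 8, 5, 2, 9]))  # [2, 5, 5, 8, 9], but the two 5s have exchanged their original order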
III. Bubble Sort
1. Basic idea: compare adjacent pairs of elements in the data to be sorted; whenever the two elements are in the wrong order, exchange them, and repeat until no inversions remain.
2. Reason for stability: bubble sort moves small elements forward (or large elements backward). Comparisons are made only between adjacent elements, and exchanges also happen only between those two adjacent elements. If two adjacent elements are equal, there is no reason to exchange them; and if two equal elements are not adjacent, then even when earlier exchanges eventually make them adjacent, they still will not be swapped. The relative order of equal elements is therefore never changed, and bubble sort is a stable sorting algorithm.
3. Sorting Process:
Example:
Initial keywords: 49 38 65 97 76 13 27 49
After the 1st exchange: 38 49 65 97 76 13 27 49
After the 2nd exchange: 38 49 65 76 97 13 27 49
After the 3rd exchange: 38 49 65 76 13 97 27 49
After the 4th exchange: 38 49 65 76 13 27 97 49
After the 5th exchange: 38 49 65 76 13 27 49 97
......
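A minimal Python sketch of bubble sort (names and the example call are illustrative). Only strictly out-of-order neighbours are exchanged, so equal elements are never swapped past each other:

def bubble_sort(a):
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            # Exchange only on a strict inversion; equal neighbours are left alone (stable).
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:      # no inversion left, the sequence is sorted
            break
    return a

print(bubble_sort([49, 38, 65, 97, 76, 13, 27, 49]))  # [13, 27, 38, 49, 49, 65, 76, 97]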
IV. Quick Sort
1. Basic idea: in the current unsorted region R[1..h], take some data element as the pivot (benchmark) for comparison, and use it to divide the current unsorted region into two smaller unsorted regions R[1..i-1] and R[i+1..h], such that every data element in the left subregion is less than or equal to the pivot and every data element in the right subregion is greater than or equal to it; the pivot X then occupies its final sorted position, i.e. R[1..i-1] ≤ X.key ≤ R[i+1..h] (1 ≤ i ≤ h). When R[1..i-1] and R[i+1..h] are non-empty, each of them is partitioned in the same way, until all data elements in the unsorted subregions are sorted.
2. Reason for instability: quick sort scans from two directions. The left index i moves to the right while a[i] <= a[center_index], where center_index is the array index of the pivot element, usually taken to be the 0th element of the array. The right index j moves to the left while a[j] > a[center_index]. When i and j can no longer move and i <= j, a[i] and a[j] are exchanged; this is repeated until i > j, and then a[j] is exchanged with a[center_index] to complete one partition. That final exchange of the pivot with a[j] can easily disrupt the relative order of equal elements that appear earlier. For example, in the sequence 5 3 3 4 3 8 9 10 11, exchanging the pivot 5 with the 3 in the 5th position (subscripts starting from 1) destroys the relative order of the 3s. Quick sort is therefore an unstable sorting algorithm, and the instability arises exactly when the pivot is exchanged with a[j].
3. Sorting Process:
Example:
Initial keywords: [49 38 65 97 76 13 27 49]
After the 1st exchange: [27 38 65 97 76 13 49 49]
After the 2nd exchange: [27 38 49 97 76 13 65 49]
j scans left; after the 3rd exchange: [27 38 13 97 76 49 65 49]
i scans right; after the 4th exchange: [27 38 13 49 76 97 65 49]
j scans left until it meets i: [27 38 13 49 76 97 65 49]
(One partition pass)
Initial keywords: [49 38 65 97 76 13 27 49]
After the 1st pass: [27 38 13] 49 [76 97 65 49]
After the 2nd pass: [13] 27 [38] 49 [49 65] 76 [97]
After the 3rd pass: 13 27 38 49 49 [65] 76 97
Final result: 13 27 38 49 49 65 76 97
(State after each sorting pass)
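A Python sketch of the two-pointer partition described above, using the first element as the pivot (the function name and the driver code are mine, not from the source). The final swap of the pivot with a[j] is the step where stability is lost:

def quick_sort(a, low, high):
    # Sort a[low..high] in place; a[low] is used as the pivot of each partition.
    if low >= high:
        return
    pivot = a[low]
    i, j = low + 1, high
    while True:
        while i <= j and a[i] <= pivot:   # i moves right past elements <= pivot
            i += 1
        while i <= j and a[j] > pivot:    # j moves left past elements > pivot
            j -= 1
        if i > j:
            break
        a[i], a[j] = a[j], a[i]
    # Swap the pivot with a[j]; this jump over intervening elements breaks stability.
    a[low], a[j] = a[j], a[low]
    quick_sort(a, low, j - 1)
    quick_sort(a, j + 1, high)

data = [49, 38, 65, 97, 76, 13, 27, 49]
quick_sort(data, 0, len(data) - 1)
print(data)  # [13, 27, 38, 49, 49, 65, 76, 97]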
V. Heap Sort
1. Basic idea: heap sort is a tree-based selection sort. During sorting, R[1..n] is treated as the sequential storage of a complete binary tree, and the relationship between parent and child nodes in the complete binary tree is used to select the smallest element.
2. Heap definition: a sequence of n elements k1, k2, k3, ..., kn is called a heap if and only if it satisfies ki ≤ k2i and ki ≤ k2i+1 (1 ≤ i ≤ ⌊n/2⌋). A heap is essentially a complete binary tree with the following property: the keyword of any non-leaf node is no greater than the keywords of its children. For example, the sequence 10, 15, 56, 25, 30, 70 is a heap, and it corresponds to such a complete binary tree. In a heap of this kind the root node (called the heap top) has the smallest keyword, and we call it a small root heap (min-heap). Conversely, if the keyword of any non-leaf node in a complete binary tree is greater than or equal to the keywords of its children, it is called a big root heap (max-heap).
3. Sorting Process:
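The original text does not include a trace here; purely as an illustration, the Python sketch below sorts in ascending order using a big root heap (max-heap). The helper name sift_down, the 0-based indexing, and the example list are my own choices rather than part of the source:

def sift_down(a, start, end):
    # Restore the max-heap property for the subtree rooted at index start (0-based).
    root = start
    while 2 * root + 1 <= end:
        child = 2 * root + 1
        # Pick the larger of the two children.
        if child + 1 <= end and a[child] < a[child + 1]:
            child += 1
        if a[root] < a[child]:
            a[root], a[child] = a[child], a[root]
            root = child
        else:
            return

def heap_sort(a):
    n = len(a)
    # Build a max-heap from the bottom up.
    for start in range(n // 2 - 1, -1, -1):
        sift_down(a, start, n - 1)
    # Repeatedly move the heap top (maximum) to the end of the unsorted region.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end - 1)
    return a

print(heap_sort([49, 38, 65, 97, 76, 13, 27, 49]))  # [13, 27, 38, 49, 49, 65, 76, 97]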
VI. Merge Sort
1. Basic idea: two ordered subfiles (the input runs) are stored in adjacent positions of the same vector, R[low..m] and R[m+1..high]. They are first merged into a local temporary vector R1 (the output run), and after the merge is complete R1 is copied back into R[low..high].
2. Reason for stability: merge sort recursively splits the sequence into shorter subsequences; the recursion bottoms out when a subsequence has only one element (which is trivially ordered) or two elements (one comparison and, if needed, one exchange). The ordered segments are then merged into longer ordered sequences until the whole original sequence is sorted. With one or two elements, equal elements are never exchanged, so stability is not damaged there. Is stability damaged while merging the short ordered sequences? No: during the merge, whenever the two current elements are equal, we can always take the element from the earlier subsequence first and place it into the result sequence, which preserves stability. Merge sort is therefore also a stable sorting algorithm.
3. Example:
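No example follows in the original text; the Python sketch below (identifiers are mine) shows the merge step, in particular the <= comparison that takes elements from the left half on ties and thereby preserves stability:

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves; on ties take from the left half to keep stability.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([49, 38, 65, 97, 76, 13, 27, 49]))  # [13, 27, 38, 49, 49, 65, 76, 97]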
VII. Radix Sort
1. Basic idea: the sorting process does not compare keywords directly; instead, the data are sorted through repeated "distribution" and "collection" passes.
2. Reason for stability: radix sort sorts first by the lowest digit, collects the elements, then sorts by the next higher digit, and so on until the highest digit. More generally, when keys consist of attributes with different priorities, the data are sorted first by the low-priority attribute and then by the high-priority one; in the final order the high-priority attribute dominates, and keys with equal high priority are ordered by their low-priority attribute. Because each pass distributes and collects the elements separately while preserving their order of arrival, radix sort is a stable sorting algorithm.
3. Example:
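No example is given in the original; as an illustration, here is a Python sketch of least-significant-digit radix sort for non-negative integers, where each pass performs the "distribution" into buckets and the "collection" described above (the function name, base parameter, and sample data are mine):

def radix_sort(a, base=10):
    # LSD radix sort for non-negative integers: distribute into buckets by the
    # current digit, then collect; repeat from the lowest digit to the highest.
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // exp) % base].append(x)       # distribution pass (stable)
        a = [x for bucket in buckets for x in bucket]  # collection pass
        exp *= base
    return a

print(radix_sort([329, 457, 657, 839, 436, 720, 355]))  # [329, 355, 436, 457, 657, 720, 839]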
VIII. Shell Sort
1. Basic idea: first take an integer d1 less than n as the first increment and divide all records of the file into d1 groups, where all records whose positions differ by a multiple of d1 fall into the same group. Perform direct insertion sort within each group; then take a second increment d2 < d1 and repeat the grouping and sorting, and so on, until the increment dt = 1 (dt < dt-1 < ... < d2 < d1), at which point all records are placed in a single group and sorted by direct insertion.
2. Reason for instability: Shell sort performs insertion sorts on the elements using different step sizes (increments). When the elements are largely unordered at the beginning the step size is large, so each insertion sort involves only a few elements and is fast; when the elements are roughly ordered the step size is small, and insertion sort is very efficient on nearly sorted sequences, so the overall time complexity of Shell sort is better than O(n^2). A single insertion sort is stable and does not change the relative order of equal elements, but in the successive insertion sorts with different increments, equal elements may move within their own groups, and their relative order can eventually be disturbed, so Shell sort is unstable.
3. Algorithm demonstration:
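The demonstration itself is not reproduced in the original text; the following Python sketch uses a simple halving increment sequence (my own choice) to show the gapped insertion sorts that the description refers to:

def shell_sort(a):
    n = len(a)
    gap = n // 2
    # Run an insertion sort over the elements spaced gap apart, shrinking the gap each round.
    while gap > 0:
        for i in range(gap, n):
            key = a[i]
            j = i - gap
            while j >= 0 and a[j] > key:
                a[j + gap] = a[j]
                j -= gap
            a[j + gap] = key
        gap //= 2
    return a

print(shell_sort([49, 38, 65, 97, 76, 13, 27, 49]))  # [13, 27, 38, 49, 49, 65, 76, 97]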