Simple Sorting Algorithms, Explained Simply
Recently I have been busy preparing for job hunting. For programmers, the standard sorting algorithms, both their implementation and their complexity analysis, are required knowledge before any interview, so I began studying them as well. After going through these algorithms a few times I found that their principles are not complicated, which made me wonder: if these algorithms are so important, how are they actually used to solve problems? In this article I try to describe the principles of the basic sorting algorithms as vividly and briefly as I can, analyze their complexity, and finally talk about their practical use. I hope the article is as accessible as possible. Let's use our imagination!
I. Algorithm Principle Analysis:
1. Bubble Sort:
First, let's understand the principle from the name. Imagine 10 (n) bubbles lined up side by side from left to right at the bottom of a lake. We swim along the bottom from left to right, comparing only two adjacent bubbles at a time; if the left one is larger than the right one, the two bubbles swap positions. After one pass of n-1 comparisons, the largest bubble has reached the right end, and we let it pop out of the water. We repeat this n-1 times in total (on the last pass only two bubbles remain), and the i-th pass needs n-i comparisons.
The pseudocode is as follows:
for i = 1 : n-1              // n-1 passes
    for j = 1 : n-i          // compare n-i adjacent pairs
        if a[j] > a[j+1]
            swap(a[j], a[j+1])
        endif
    endfor
endfor
Of course, since array indices start at 0 in most languages, you simply shift both loop bounds down by one when you actually program it.
Complexity analysis: the algorithm has two nested loops; the outer loop runs n-1 times, and the inner loop runs n-i times on the i-th pass. The total number of comparisons is (n-1) + (n-2) + ... + 1 = n(n-1)/2 = O(n^2).
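To make this concrete, here is a minimal C sketch of the bubble sort pseudocode above (0-based indices; the sample array is just for illustration):

#include <stdio.h>

/* Bubble sort: after each pass the largest remaining element
   "floats" to the end of the unsorted part. */
void bubble_sort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {          /* n-1 passes */
        for (int j = 0; j < n - 1 - i; j++) {  /* compare n-1-i adjacent pairs */
            if (a[j] > a[j + 1]) {
                int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
            }
        }
    }
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7};                 /* sample data */
    int n = sizeof a / sizeof a[0];
    bubble_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}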
2. Insertion Sort:
This algorithm can also be understood from its name. This time, imagine students lining up, each with a different height, and we want them to form a line quickly in ascending order of height. Our strategy: compare the person to be inserted with the people already in line, from back to front; whenever someone in line is taller than the newcomer, that person steps one position back, and the newcomer is inserted into the gap that opens at the right place. We can treat the first element as already in line (there is nobody to compare with), so we only need to insert the remaining n-1 people. When the first i people are already in order, person i+1 is compared against them one by one.
for i = 2 : n                       // insert the i-th element
    key = a[i];                     // the value to be inserted
    j = i - 1;
    while j >= 1 and a[j] > key     // anyone taller than key steps one position back
        a[j+1] = a[j];
        j = j - 1;
    endwhile
    a[j+1] = key;                   // drop key into the gap that opened up
endfor
Complexity analysis: this algorithm also has two nested loops. The outer loop runs n-1 times, and in the worst case the inner loop runs i-1 times for the i-th element, giving 1 + 2 + ... + (n-1) = n(n-1)/2 = O(n^2).
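Here is a corresponding C sketch of insertion sort, using the same shift-and-insert idea with 0-based indices (the heights in main are made-up sample values):

#include <stdio.h>

/* Insertion sort: elements before index i are already sorted;
   a[i] is shifted left until it reaches its place. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {  /* shift larger elements one step right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;                 /* drop key into the gap */
    }
}

int main(void) {
    int a[] = {170, 162, 175, 168, 180};   /* sample heights */
    int n = sizeof a / sizeof a[0];
    insertion_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}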
Take a break. Looking at the two algorithms above: although they differ, both rely on nested comparison loops. When the input is already nearly in order, the inner loop of insertion sort can exit early, so it performs better, whereas bubble sort still compares from the beginning on every pass.
3. Merge Sort:
After two nested-loop algorithms, let's look at a recursive one. Merge sort is a typical divide-and-conquer algorithm, and its idea is in its name: to merge, we must first split. Continuing the line-up example, we first divide the line into two halves, sort each half, and then merge the two sorted halves. Each half can in turn be divided into two smaller halves, sorted and merged, so the natural way to implement this is recursion.
Merge(a, low, med, high):           // merge the two sorted halves a[low..med] and a[med+1..high]
    n1 = med - low + 1;
    n2 = high - med;
    L[1:n1] = a[low : med];
    R[1:n2] = a[med+1 : high];
    L[n1+1] = INF;                  // sentinels: once one half runs out,
    R[n2+1] = INF;                  // the rest of the other half drains into a
    i = 1;
    j = 1;
    for k = low : high
        if L[i] <= R[j]
            a[k] = L[i];
            i++;
        else
            a[k] = R[j];
            j++;
        endif
    endfor
MergeSort(a, low, high):
    if low < high
        q = (low + high) / 2;       // split point
        MergeSort(a, low, q);
        MergeSort(a, q+1, high);
        Merge(a, low, q, high);
    endif
Complexity analysis: to analyze recursive calls you need the master method. For a recurrence of the form T(n) = a*T(n/b) + f(n) there are three cases: 1) if f(n) is (polynomially) smaller than n^(log_b a), then T(n) = O(n^(log_b a)); 2) if f(n) is (polynomially) larger than n^(log_b a), then T(n) = O(f(n)); 3) if f(n) is of the same order as n^(log_b a), then T(n) = O(n^(log_b a) * lg n).
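For completeness, the precise textbook statement of the master theorem that the informal description above paraphrases is the following ($\epsilon$ denotes some positive constant):
$T(n) = a\,T(n/b) + f(n)$, with $a \ge 1$ and $b > 1$:
1) If $f(n) = O(n^{\log_b a - \epsilon})$, then $T(n) = \Theta(n^{\log_b a})$.
2) If $f(n) = \Omega(n^{\log_b a + \epsilon})$ and $a f(n/b) \le c f(n)$ for some constant $c < 1$, then $T(n) = \Theta(f(n))$.
3) If $f(n) = \Theta(n^{\log_b a})$, then $T(n) = \Theta(n^{\log_b a} \lg n)$.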
In this algorithm a = 2 and b = 2, so n^(log_b a) = n = f(n); case 3) applies and T(n) = O(n lg n). Although merge sort is fast, it needs two extra arrays L and R to hold the halves, a typical example of trading space for time.
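Below is a small C sketch of merge sort. Instead of the sentinel trick it simply drains whichever half still has elements, and it reuses one shared temporary buffer; the array in main is just sample data:

#include <stdio.h>
#include <string.h>

/* Merge the sorted ranges a[low..mid] and a[mid+1..high] (inclusive). */
void merge(int a[], int low, int mid, int high, int tmp[]) {
    int i = low, j = mid + 1, k = low;
    while (i <= mid && j <= high)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid)  tmp[k++] = a[i++];   /* drain the left half  */
    while (j <= high) tmp[k++] = a[j++];   /* drain the right half */
    memcpy(a + low, tmp + low, (high - low + 1) * sizeof(int));
}

void merge_sort(int a[], int low, int high, int tmp[]) {
    if (low >= high) return;               /* one element: already sorted */
    int mid = low + (high - low) / 2;
    merge_sort(a, low, mid, tmp);
    merge_sort(a, mid + 1, high, tmp);
    merge(a, low, mid, high, tmp);
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7, 3};          /* sample data */
    int tmp[6];
    merge_sort(a, 0, 5, tmp);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}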
4. Quick Sort:
Hearing the name, everyone immediately suspects that this algorithm must be fast, and it is. It also follows the divide-and-conquer, recursive idea. The main idea: pick one person as the pivot (ideally at random), move everyone shorter to the left and everyone taller to the right, and then recurse on the two remaining groups in the same way.
Partition(a, low, high):            // put the pivot in its final position and return that position
    pivot = a[low];                 // use the first person as the pivot
    p_low = low;                    // boundary of the group shorter than the pivot
    for i = low+1 : high
        if a[i] < pivot
            p_low++;
            swap(a[i], a[p_low]);
        endif
    endfor
    swap(a[low], a[p_low]);         // move the pivot between the two groups
    return p_low;
QuickSort(a, low, high):
    if low < high
        p = Partition(a, low, high);
        QuickSort(a, low, p-1);
        QuickSort(a, p+1, high);
    endif
Complexity analysis: like merge sort, this algorithm splits the original problem into two parts: T(n) = T(p*n) + T((1-p)*n) + O(n), where p is the fraction of elements that end up left of the pivot. In the best case T(n) = 2T(n/2) + n = O(n lg n); in the worst case T(n) = T(n-1) + O(n) = O(n^2). A more careful analysis shows that any split by a fixed proportion, even an uneven one, still gives T(n) = O(n lg n).
Although its average complexity matches merge sort's, an input that is already (almost) sorted is the worst case for this version of quick sort, because the first element is always chosen as the pivot. So use it with care, or pick the pivot at random.
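Here is a C sketch of quick sort following the corrected partition above (first element as pivot, 0-based indices; sample data only). In practice you would choose the pivot randomly to avoid the sorted-input worst case:

#include <stdio.h>

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Partition a[low..high] around a[low]; returns the pivot's final index. */
int partition(int a[], int low, int high) {
    int pivot = a[low];
    int p = low;                       /* boundary of the "< pivot" region */
    for (int i = low + 1; i <= high; i++) {
        if (a[i] < pivot) {
            p++;
            swap(&a[i], &a[p]);
        }
    }
    swap(&a[low], &a[p]);              /* put the pivot between the two groups */
    return p;
}

void quick_sort(int a[], int low, int high) {
    if (low >= high) return;
    int p = partition(a, low, high);
    quick_sort(a, low, p - 1);
    quick_sort(a, p + 1, high);
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7, 3};      /* sample data */
    quick_sort(a, 0, 5);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}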
5. Heap Sort:
Next comes heap sort, which is a little harder than the sorts above, but easy to understand once you grasp its essence. A real-life analogy is hard to find here, so instead just picture a binary tree.
Heap sort relies on the max-heap (or min-heap), which is essentially a complete binary tree stored in an array.
The sorting process can be roughly divided into three steps: 1) treat the array as a binary tree; 2) turn that tree into a max-heap (i.e. move the maximum to the top of the heap), adjusting from the last non-leaf node upward; 3) swap the top of the heap with the last element, shrink the heap by one, and re-adjust it so it is a max-heap again, repeating until the heap is empty (the heap length keeps changing).
1) No actual work is needed here; it just reminds you that the array elements are stored in the tree level by level (root first, then its left and right children, and so on).
2)
BuildMaxHeap(a, n):
    // the last non-leaf node is at index n/2 (1-based)
    for i = n/2 : 1                 // adjust from the last non-leaf node down to the root
        HeapAdjust(a, i, n);
    endfor
HeapAdjust(a, i, n):                // sift a[i] down so the subtree rooted at i is a max-heap
    l = LEFT(i);                    // left child, 2*i with 1-based indexing
    r = RIGHT(i);                   // right child, 2*i + 1
    largest = i;
    if l <= n and a[l] > a[largest]
        largest = l;
    endif
    if r <= n and a[r] > a[largest]
        largest = r;
    endif
    if largest != i
        swap(a[i], a[largest]);
        HeapAdjust(a, largest, n);  // keep sifting down the subtree we disturbed
    endif
HeapSort(a, n):
    BuildMaxHeap(a, n);
    for i = n : 2                   // count down from n to 2
        swap(a[1], a[i]);           // move the current maximum to the end
        HeapAdjust(a, 1, i-1);      // restore the max-heap on the remaining i-1 elements
    endfor
Complexity analysis: the algorithm has two parts. One heap adjustment costs O(lg n); building the heap calls it about n/2 times, so that phase is O(n lg n) (a tighter analysis shows heap construction is actually only O(n)). The sorting loop calls the adjustment n-1 more times, another O(n lg n), so the total is O(n lg n).
Heap sort needs neither deep recursion nor large temporary arrays, which makes it suitable for sequences with a very large amount of data, say millions of records. Quick sort and merge sort are designed recursively, and with huge inputs the recursion can overflow the stack; a badly chosen pivot can also push quick sort toward O(n^2), whereas heap sort sorts in place and stays at O(n lg n).
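Finally, a C sketch of heap sort with 0-based indices, where the children of node i are at 2*i+1 and 2*i+2; the array in main is sample data:

#include <stdio.h>

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Sift a[i] down so the subtree rooted at i obeys the max-heap property;
   n is the current heap size. */
void heap_adjust(int a[], int i, int n) {
    int largest = i;
    int l = 2 * i + 1, r = 2 * i + 2;
    if (l < n && a[l] > a[largest]) largest = l;
    if (r < n && a[r] > a[largest]) largest = r;
    if (largest != i) {
        swap(&a[i], &a[largest]);
        heap_adjust(a, largest, n);
    }
}

void heap_sort(int a[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--)   /* build the max-heap */
        heap_adjust(a, i, n);
    for (int i = n - 1; i > 0; i--) {      /* move the max to the end, shrink the heap */
        swap(&a[0], &a[i]);
        heap_adjust(a, 0, i);
    }
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7, 3};          /* sample data */
    heap_sort(a, 6);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}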