Algorithm Series Note 1 (sort)


This note records some of the classic sorting algorithms: bubble sort, direct selection sort, insertion sort, merge sort, quick sort, heap sort, Shell sort, bucket sort, counting sort, and radix sort. For each algorithm the basic idea is given first, then the implementation code, and finally the time complexity.

1: Bubble sort

Thought:

(1): Compare adjacent elements; if the later element is smaller than the earlier one, exchange the two. After one full pass the smallest element ends up at position 0; it has "bubbled" to the front.

(2): Repeat the first step on the remaining elements, so that the second smallest, third smallest, and so on move to the front of the array in turn.

Three ways to exchange data:

// swap with a temporary variable
void swap(int &t1, int &t2)
{
    int tmp = t1;
    t1 = t2;
    t2 = tmp;
}

// use XOR to swap
void swap2(int &a, int &b)
{
    if (a != b)    // prevent swap2(a, a) from exchanging a value with itself (which would zero it)
    {
        a ^= b;
        b ^= a;
        a ^= b;
    }
}

// use addition and subtraction to swap
void swap3(int &a, int &b)
{
    if (a != b)    // same guard as swap2
    {
        a = a + b;
        b = a - b;
        a = a - b;
    }
}

#include <iostream>
using namespace std;

// can be optimized by setting a flag and exiting when a pass makes no exchange; time complexity is n^2
void swap(int &t1, int &t2)
{
    int tmp = t1;
    t1 = t2;
    t2 = tmp;
}

void BubbleSort(int r1[], int n)
{
    for (int i = 0; i < n; i++)
    {
        for (int j = n-1; j > i; j--)
        {
            if (r1[j] < r1[j-1])
                swap(r1[j], r1[j-1]);
        }
    }
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    BubbleSort(a, 10);
    cout << "bubble sorted: ";
    for (int i = 0; i < 10; i++)
        cout << a[i] << " ";
    cout << endl;
    return 0;
}

Time complexity: O(n^2). It is a stable sorting algorithm.

Note: the sort can be optimized by setting a flag and exiting early when a full pass makes no exchange, as in the sketch below.
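A minimal sketch of that early-exit optimization (the bool flag and the name BubbleSortOpt are illustrative additions, not part of the original code):

#include <iostream>
using namespace std;

// bubble sort with an early-exit flag: stop as soon as a full pass performs no swap
void BubbleSortOpt(int r1[], int n)
{
    for (int i = 0; i < n; i++)
    {
        bool swapped = false;              // did this pass swap anything?
        for (int j = n-1; j > i; j--)
        {
            if (r1[j] < r1[j-1])
            {
                int tmp = r1[j]; r1[j] = r1[j-1]; r1[j-1] = tmp;
                swapped = true;
            }
        }
        if (!swapped)                      // no exchange in this pass: already sorted
            break;
    }
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    BubbleSortOpt(a, 10);
    for (int i = 0; i < 10; i++)
        cout << a[i] << " ";
    cout << endl;
    return 0;
}

The worst case is still O(n^2), but an already sorted input now finishes after a single pass.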

2: Direct Selection sort

Thought: The first pass selects the minimum value from r[0]~r[n-1] and exchanges it with r[0]; the second pass selects the minimum value from r[1]~r[n-1] and exchanges it with r[1]; ...; the i-th pass selects the minimum value from r[i-1]~r[n-1] and exchanges it with r[i-1]; ...; the (n-1)-th pass selects the minimum value from r[n-2]~r[n-1] and exchanges it with r[n-2]. After n-1 passes in total, the sequence is ordered from small to large.

Code:

#include <iostream>
using namespace std;

// use a temporary variable to swap
void swap(int &a, int &b)
{
    int tmp = a;
    a = b;
    b = tmp;
}

void display(int a[], int n)
{
    for (int i = 0; i < n; i++)
        cout << a[i] << " ";
    cout << endl;
}

void SelectSort(int a[], int n)
{
    for (int i = 0; i < n-1; i++)
    {
        int minIndex = i;
        int minElement = a[minIndex];
        for (int j = i+1; j < n; j++)
        {
            if (a[j] < minElement)
            {
                minElement = a[j];
                minIndex = j;
            }
        }
        swap(a[i], a[minIndex]);
    }
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    SelectSort(a, 10);
    display(a, 10);
    return 0;
}


Time complexity: O(n^2). It is an unstable sorting algorithm.

3: Insertion Sort

Thought: (1) Initially a[0] by itself forms the ordered region, and a[1..n-1] is the disordered region. Let i = 1.

(2) Let temp = a[i]. For j from i-1 down to 0, if a[j] > temp then a[j+1] = a[j]; finally place temp at a[j+1]. This inserts a[i] into the current ordered region a[0..i-1], forming the ordered region a[0..i].

(3) Increment i and repeat step (2) until i == n-1 has been processed; the sort is then complete.

InsertSort.h

#ifndef INSERTSORT
#define INSERTSORT
#include "SortAlg.h"

class InsertSort : public SortAlg
{
public:
    InsertSort(int *is, const int &len) : SortAlg(is, len) {}
    void sort()    // sortelement and length are members of the SortAlg base class
    {
        for (int i = 1; i < length; i++)
        {
            int j = i-1, tmp = sortelement[i];
            // test j >= 0 before reading sortelement[j] to avoid an out-of-bounds access
            while (j >= 0 && sortelement[j] > tmp)
            {
                sortelement[j+1] = sortelement[j];
                j--;
            }
            sortelement[j+1] = tmp;
        }
    }
};
#endif

The average time complexity is O(n^2); if the sequence is already sorted, the best case is O(n). It is a stable sorting algorithm.

4: Merge sort

Thought: (1) If there is only one element, it is already sorted; return. (2) Merge sort a[1..n/2] and a[n/2+1..n] recursively.

(3) Merge the two sorted subsequences in O(n) time.

Code:

#include <iostream>
using namespace std;

// merge the sorted runs r1[s..m] and r1[m+1..t] into r2, then copy back
void MergeSort(int r1[], int r2[], int s, int m, int t)
{
    int i = s, j = m+1;
    int k = s;
    while (i <= m && j <= t)
    {
        if (r1[i] <= r1[j])
            r2[k++] = r1[i++];
        else
            r2[k++] = r1[j++];
    }
    if (i > m)
    {
        while (j <= t)
            r2[k++] = r1[j++];
    }
    if (j > t)
    {
        while (i <= m)
            r2[k++] = r1[i++];
    }
    for (int n = s; n <= t; n++)    // note: copying back into r1 is essential
        r1[n] = r2[n];
}

void Merge(int r1[], int r2[], int s, int t)
{
    if (s < t)
    {
        int m = (s+t)/2;
        Merge(r1, r2, s, m);          // merge sort a[1..n/2]
        Merge(r1, r2, m+1, t);        // merge sort a[n/2+1..n]
        MergeSort(r1, r2, s, m, t);   // merge the two sorted subsequences
    }
}

void display(int *a, int length)
{
    for (int i = 0; i < length; i++)
        cout << a[i] << " ";
    cout << endl;
}

int main()
{
    int r1[10] = {3,2,5,1,7,4,9,7,8,7}, r2[10];
    Merge(r1, r2, 0, 9);
    display(r2, 10);
    return 0;
}


Time complexity: O(n lg n). It is a stable sorting algorithm.

5: Quick Sort

Thought: (1) Select an element as the pivot x (usually the first element; the randomized version picks a random element and swaps it with the first). Place the elements smaller than x to its left and the elements greater than or equal to x to its right. This partition step is usually done in one of two ways: the textbook method works from both ends ("dig a pit and fill in numbers"), while the other method scans from one side only; the code below gives both.

(2) Quick sort the left and right subsequences recursively.

Code:

#include <iostream>
#include <cstdlib>
#include <ctime>
using namespace std;

void swap(int &t1, int &t2)
{
    int tmp = t1;
    t1 = t2;
    t2 = tmp;
}

// the partition method in the MIT video: scans from one side only
int Partition(int r1[], int p, int q)
{
    int i = p, j = p+1;
    int pivot = r1[p];
    for (j = p+1; j <= q; j++)
    {
        if (r1[j] < pivot)
        {
            i++;
            swap(r1[i], r1[j]);
        }
    }
    swap(r1[i], r1[p]);
    return i;
}

// the partition method in the textbook: works from both ends ("dig a pit and fill in numbers")
int Partition2(int r1[], int p, int q)
{
    int i = p, j = q;
    int pivot = r1[p];
    while (i < j)
    {
        while (j > i && r1[j] > pivot) j--;
        if (j > i) r1[i++] = r1[j];
        while (i < j && r1[i] < pivot) i++;
        if (i < j) r1[j--] = r1[i];
    }
    r1[i] = pivot;
    return i;
}

// randomized partition: randomly selects an element as the pivot, then partitions as above
int RandomPartition(int r1[], int p, int q)
{
    srand((unsigned)time(0));
    int randIndex = rand() % (q-p+1) + p;
    swap(r1[p], r1[randIndex]);
    return Partition(r1, p, q);
}

void QuickSort(int r1[], int p, int q)
{
    if (p < q)
    {
        int m = Partition(r1, p, q);
        // int m = RandomPartition(r1, p, q);
        QuickSort(r1, p, m-1);
        QuickSort(r1, m+1, q);
    }
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    cout << "before sort: ";
    for (int i = 0; i < 10; i++)
        cout << a[i] << " ";
    cout << endl;
    QuickSort(a, 0, 9);
    cout << "after quick sort: ";
    for (int i = 0; i < 10; i++)
        cout << a[i] << " ";
    cout << endl;
    return 0;
}


The average time complexity is O(n lg n). When the sequence is already ordered (the worst case for this choice of pivot), the time complexity degenerates to O(n^2). It is an unstable sorting algorithm.

6: Heap Sort

To understand heap sort, first understand how a heap is stored and how insertion, deletion, and heapify work (see: http://blog.csdn.net/morewindows/article/details/6709644). The idea of heap sort (taking a min-heap as the example) is to swap the root of the already built heap with the last element, then re-heapify the remaining elements (excluding that last one) into a binary heap, so the last position holds the smallest element. Repeating this operation produces an array sorted in descending order.

Code:

#include <iostream>
using namespace std;

void display(int a[], int n)
{
    for (int i = 0; i < n; i++)
        cout << a[i] << " ";
    cout << endl;
}

void swap(int &a, int &b)
{
    int tmp = a;
    a = b;
    b = tmp;
}

// The heap is stored in an array. Inserting an element means placing it at the end
// of the array and sifting it up to restore the heap property.
void MinHeapFixup(int a[], int i)
{
    int tmp = a[i];
    int j = (i-1)/2;                 // parent
    while (j >= 0 && i != 0)         // without i != 0 this would loop forever, since (0-1)/2 == 0
    {
        if (a[j] <= tmp) break;
        a[i] = a[j];
        i = j;
        j = (i-1)/2;
    }
    a[i] = tmp;
}

// add an element to the min-heap
void MinHeapAddNumber(int a[], int n, int x)
{
    a[n-1] = x;
    MinHeapFixup(a, n-1);
}

// Deleting means removing element 0, moving the last element into position 0,
// and sifting it down to restore the heap property.
void MinHeapFixdown(int a[], int i, int n)
{
    int j = 2*i + 1;
    int tmp = a[i];
    while (j < n)
    {
        if (j+1 < n && a[j+1] < a[j]) j++;   // pick the smaller child
        if (tmp <= a[j]) break;
        a[i] = a[j];
        i = j;
        j = 2*j + 1;
    }
    a[i] = tmp;
}

// delete the minimum from the min-heap
void MinHeapDeleteNumber(int a[], int n)
{
    swap(a[0], a[n-1]);
    MinHeapFixdown(a, 0, n-2);
}

// build a min-heap from the array
void MakeMinHeap(int a[], int n)
{
    for (int i = n/2 - 1; i >= 0; i--)
        MinHeapFixdown(a, i, n);
}

// heap sort (descending, because a min-heap is used)
void MinHeapSortToDes(int a[], int n)
{
    for (int i = n-1; i > 0; i--)
    {
        swap(a[0], a[i]);
        MinHeapFixdown(a, 0, i);
    }
}

int main()
{
    int a[11] = {9, 12, 17, 30, 50, 20, 60, 65, 4, 19};
    // build the heap
    MakeMinHeap(a, 10);
    display(a, 10);
    // add an element
    MinHeapAddNumber(a, 11, 3);
    display(a, 11);
    // delete an element
    // MinHeapDeleteNumber(a, 11);
    // display(a, 11);
    // heap sort
    MinHeapSortToDes(a, 11);
    display(a, 11);
    return 0;
}


The time complexity is O(n lg n). It is an unstable sorting algorithm.

7: Shell Sort

Thought: First divide the whole sequence into several subsequences (formed by elements a fixed "increment" apart) and run a direct insertion sort on each; then reduce the increment and sort again. When the increment is small enough, the whole sequence is basically ordered, and a final direct insertion sort over all the elements finishes the job. Because direct insertion sort is very efficient when the elements are almost ordered (close to its best case), Shell sort is more efficient in practice than the simple quadratic methods above.

The gap usually starts at n/2, is halved each pass, and ends at 1.

Code:

#include <iostream>
using namespace std;

void ShellSort(int a[], int n)
{
    for (int gap = n/2; gap > 0; gap /= 2)   // gap starts at n/2 and is halved each pass
    {
        for (int i = gap; i < n; i++)        // direct insertion sort within each gap subsequence
        {
            int temp = a[i];
            int j = i - gap;
            while (j >= 0 && a[j] > temp)
            {
                a[j+gap] = a[j];
                j -= gap;
            }
            a[j+gap] = temp;
        }
    }
}

void display(int a[], int n)
{
    for (int i = 0; i < n; i++)
        cout << a[i] << " ";
    cout << endl;
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    ShellSort(a, 10);
    display(a, 10);
    return 0;
}


The time complexity depends on the gap sequence and lies between O(n lg n) and O(n^2). It is an unstable sorting algorithm.

8: Bucket Sort

Bucket sort (also called bin sort) works by distributing the elements of the array into a finite number of buckets. Each bucket is then sorted individually, either with a different sorting algorithm or by applying bucket sort recursively. Bucket sort is a generalization of pigeonhole sort. When the values to be sorted are evenly distributed, bucket sort runs in linear time (Θ(n)). Bucket sort is not a comparison sort, so it is not subject to the O(n log n) lower bound.

For example, to sort n integers a[1..n] in the range [1..1000], you can give each bucket a range of size 10: bucket b[1] stores integers in [1..10], b[2] stores integers in (10..20], ..., b[i] stores integers in ((i-1)*10, i*10], for i = 1, ..., 100, giving 100 buckets in total. Scan a[1..n] from start to finish and put each a[i] into its corresponding bucket b[j]. Then sort the numbers inside each of the 100 buckets (bubble sort, selection sort, even quick sort; in general any sorting method works). Finally, output the numbers bucket by bucket, from small to large within each bucket, which yields a fully ordered sequence.

Suppose there are n numbers and m buckets. If the numbers are evenly distributed, each bucket holds n/m numbers on average. If the numbers in each bucket are sorted with quick sort, the complexity of the whole algorithm is O(n + m*(n/m)*log(n/m)) = O(n + n*log n - n*log m).

As seen from the above, when m approaches n, the complexity of bucket sort approaches O(n).
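To make that last step explicit (a short sketch of the same calculation, under the same uniform-distribution assumption, with m buckets):

m \cdot \frac{n}{m}\log\frac{n}{m} \;=\; n\log\frac{n}{m} \;=\; n\log n - n\log m

m = n \;\Rightarrow\; n\log\frac{n}{n} = 0, \quad\text{so}\quad T(n) = O(n + 0) = O(n)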

Of course, this complexity calculation is based on the assumption that the n input numbers are evenly distributed. That assumption is quite strong, and in practice the results are often not that good. If all the numbers fall into the same bucket, bucket sort degenerates into whatever ordinary sort is used inside the bucket.

#include <iostream>
using namespace std;

void swap(int &t1, int &t2)
{
    int tmp = t1;
    t1 = t2;
    t2 = tmp;
}

// the partition method from the MIT video: scans from one side only
int Partition(int r1[], int p, int q)
{
    int i = p, j = p+1;
    int pivot = r1[p];
    for (j = p+1; j <= q; j++)
    {
        if (r1[j] < pivot)
        {
            i++;
            swap(r1[i], r1[j]);
        }
    }
    swap(r1[i], r1[p]);
    return i;
}

void QuickSort(int r1[], int p, int q)
{
    if (p < q)
    {
        int m = Partition(r1, p, q);
        QuickSort(r1, p, m-1);
        QuickSort(r1, m+1, q);
    }
}

// each bucket holds an array of elements and records how many are stored in it
struct Bucket
{
    int node[10];   // the array sorted here has only 10 elements, so a fixed size of 10 is enough (very limited)
    int count;
};

void BucketSort(int r1[], int n)
{
    // use 10 buckets
    Bucket* pBucket = new Bucket[10];
    // initialize the buckets
    for (int i = 0; i < 10; i++)
        (pBucket+i)->count = 0;
    int maxNum = r1[0], minNum = r1[0];
    for (int i = 1; i < n; i++)
    {
        if (r1[i] > maxNum) maxNum = r1[i];
        if (r1[i] < minNum) minNum = r1[i];
    }
    // put the data into the buckets (multiply before dividing to avoid integer-division truncation)
    for (int i = 0; i < n; i++)
    {
        int index = (r1[i] - minNum) * 9 / (maxNum - minNum);
        (pBucket+index)->node[(pBucket+index)->count] = r1[i];
        (pBucket+index)->count++;
    }
    // quick sort (or any other sort) inside each bucket, then output the buckets in order
    int k = 0;
    for (int i = 0; i < 10; i++)
    {
        QuickSort((pBucket+i)->node, 0, (pBucket+i)->count - 1);
        for (int j = 0; j < (pBucket+i)->count; j++)
            r1[k++] = (pBucket+i)->node[j];
    }
    delete []pBucket;
}

void display(int a[], int n)
{
    for (int i = 0; i < n; i++)
        cout << a[i] << " ";
    cout << endl;
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    BucketSort(a, 10);
    display(a, 10);
    return 0;
}


Time complexity: O(n) under the uniform-distribution assumption. It is an unstable sorting algorithm (here because quick sort is used inside the buckets).

9: Counting Sort and Radix Sort

Both are linear-time sorting algorithms. Counting sort assumes the input data are integers within a small range. Radix sort is generally used to handle data spanning a larger range of values, still in linear time.

Counting sort code:

#include <iostream>
using namespace std;

void CountSort(int r1[], int c[], int n, int k, int r2[])
{
    for (int i = 0; i < k; i++)
        c[i] = 0;
    for (int i = 0; i < n; i++)
        c[r1[i]]++;                    // count the occurrences of each element
    for (int i = 1; i < k; i++)
        c[i] = c[i-1] + c[i];          // c[i] is now the number of elements <= i, i.e. where i goes in r2
    // for (int i = n-1; i >= 0; i--)  // iterate backwards to make the algorithm stable
    for (int i = 0; i < n; i++)
    {
        r2[c[r1[i]] - 1] = r1[i];
        c[r1[i]] = c[r1[i]] - 1;
    }
}

int main()
{
    int a[10] = {2,3,1,4,5,4,2,3,0,3}, b[10];
    int c[6];
    cout << "before sort: ";
    for (int i = 0; i < 10; i++)
        cout << a[i] << " ";
    cout << endl;
    CountSort(a, c, 10, 6, b);
    cout << "after counting sort: ";
    for (int i = 0; i < 10; i++)
        cout << b[i] << " ";
    cout << endl;
    return 0;
}


Radix sort code is not given here; in practice it processes the keys digit by digit, starting from the ones digit, then the tens digit, and so on up to the highest digit, and each pass can be done with a bucket (or counting) sort.
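A minimal LSD radix sort sketch along those lines (the name RadixSort and the counting pass per digit are illustrative choices, not code from the original post; it assumes non-negative integers):

#include <iostream>
#include <vector>
using namespace std;

// LSD radix sort: one stable counting pass per decimal digit, from the ones digit upward
void RadixSort(int a[], int n)
{
    int maxNum = a[0];
    for (int i = 1; i < n; i++)
        if (a[i] > maxNum) maxNum = a[i];

    vector<int> output(n);
    for (int exp = 1; maxNum / exp > 0; exp *= 10)    // exp = 1, 10, 100, ...
    {
        int count[10] = {0};
        for (int i = 0; i < n; i++)
            count[(a[i] / exp) % 10]++;               // count occurrences of each digit
        for (int d = 1; d < 10; d++)
            count[d] += count[d-1];                   // prefix sums give final positions
        for (int i = n-1; i >= 0; i--)                // walk backwards to keep the pass stable
            output[--count[(a[i] / exp) % 10]] = a[i];
        for (int i = 0; i < n; i++)
            a[i] = output[i];
    }
}

int main()
{
    int a[10] = {6,5,4,7,2,4,1,8,5,10};
    RadixSort(a, 10);
    for (int i = 0; i < 10; i++)
        cout << a[i] << " ";
    cout << endl;
    return 0;
}

Because every per-digit pass is stable, the order established by lower digits is preserved while higher digits are processed.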

Both have a time complexity of O(n).

Summary: (1) Bubble sort, direct selection sort, insertion sort, merge sort, quick sort, heap sort, and Shell sort are comparison-based sorting algorithms, so their average time complexity cannot be better than O(n lg n). Counting sort and radix sort are linear-time sorting algorithms. Bucket sort is a special case (linear only under the uniform-distribution assumption).

(2) The average time complexity of bubble sort, direct selection sort, and insertion sort is O(n^2). The average time complexity of merge sort, quick sort, and heap sort is O(n lg n), which is asymptotically optimal among comparison sorts. Bucket sort, counting sort, and radix sort run in O(n).

(3) Direct selection sort, quick sort, heap sort, Shell sort, and bucket sort are unstable sorting algorithms; they all exchange elements across non-adjacent positions. The others are stable sorting algorithms.

Reference documents

1: http://blog.csdn.net/column/details/algorithm-easyword.html (the "plain-language classical algorithms" blog series)

2: http://blog.csdn.net/houapple/article/details/6480100 (bucket sort)

