There is one more worth noting: binary insertion sort, with average time O(n²), stable (see section 8).
1. Insertion Sort
In the set of numbers to be sorted, assume that the first (n-1) numbers [n >= 2] are already in order; insert the n-th number into the ordered numbers in front of it so that all n numbers are in order. Repeat this cycle until the whole sequence is ordered.
Direct insertion sort is stable. Time complexity: O(n²).
int main()
{
    int a[10] = {3, 1, 4, 1, 5, 9, 2, 6, 5, 8};   /* sample data (illustrative) */
    int j, i, m;
    for (j = 1; j < 10; j++)
    {
        m = a[j];                 /* the element to insert */
        for (i = j - 1; i >= 0; i--)
        {
            if (a[i] <= m)        /* insertion point found */
                break;
            else
                a[i + 1] = a[i];  /* shift the larger element right */
        }
        a[i + 1] = m;
    }
    return 0;
}
Annotated version:
void InsertSort(SeqList R)
{ /* sort R[1..n] in the sequence list R into ascending order by insertion sort */
    int i, j;
    for (i = 2; i <= n; i++)               /* insert R[2], ..., R[n] in turn */
        if (R[i].key < R[i-1].key) {       /* if R[i].key is greater than or equal to every key in the
                                              sorted area, R[i] stays where it is */
            R[0] = R[i]; j = i - 1;        /* R[0] is the sentinel and a copy of R[i] */
            do {                           /* scan the sorted area R[1..i-1] from right to left for R[i]'s insertion position */
                R[j+1] = R[j];             /* shift records whose key is greater than R[i].key */
                j--;
            } while (R[0].key < R[j].key); /* terminate when R[i].key >= R[j].key */
            R[j+1] = R[0];                 /* insert R[i] into its correct position */
        } /* endif */
} /* InsertSort */
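The listing above (and the merge routine in section 7) relies on textbook-style types SeqList and RecType and a global length n that are never defined in this article. A minimal set of definitions consistent with how the code uses them is sketched below; the field names, key type, and MAXSIZE value are assumptions, not taken from the original.
#define MAXSIZE 100                     /* assumed maximum number of records */

typedef int KeyType;                    /* assumed key type */

typedef struct {
    KeyType key;                        /* the sort key */
    /* ... other data fields would go here ... */
} RecType;

typedef RecType SeqList[MAXSIZE + 1];   /* R[0] is reserved as the sentinel; records live in R[1..n] */

int n;                                  /* number of records actually stored */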
2. Shell Sort
D. L. Shell proposed this idea in 1959 in a sorting algorithm named after him. The algorithm first divides the numbers to be sorted into groups by an increment d, the subscripts of the records within each group differing by d. The elements in each group are sorted, then a smaller increment is chosen and the elements are sorted within each group again. When the increment is reduced to 1, the whole sequence to be sorted forms a single group and the sort is complete.
The following function implements a Shell sort in which the first increment is half the length of the sequence,
and each later increment is halved, until the increment is 1.
Shell sort is not stable.
void shell_sort(int *x, int n)
{
    int h, j, k, t;
    for (h = n / 2; h > 0; h = h / 2)   /* control the increment */
    {
        for (j = h; j < n; j++)         /* this is just the direct insertion sort above, with step h */
        {
            t = *(x + j);
            for (k = j - h; (k >= 0 && t < *(x + k)); k -= h)
            {
                *(x + k + h) = *(x + k);  /* shift the larger element h positions right */
            }
            *(x + k + h) = t;
        }
    }
}
3. Bubble Sort
In the set of numbers to be sorted, compare adjacent pairs from top to bottom over the range that is not yet sorted and adjust them in turn, so that larger numbers sink and smaller numbers rise. That is, whenever two adjacent numbers are compared and found to be in the wrong order, they are exchanged.
Bubble sort is stable. Time complexity: O(n²).
int main()
{
    int a[10] = {3, 1, 4, 1, 5, 9, 2, 6, 5, 8};   /* sample data (illustrative) */
    int i, j, k;
    for (i = 0; i < 9; i++)
        for (j = 0; j < 9 - i; j++)     /* compare adjacent pairs in the unsorted range */
            if (a[j] > a[j + 1])
            {
                k = a[j];               /* swap so the larger value sinks */
                a[j] = a[j + 1];
                a[j + 1] = k;
            }
    return 0;
}
4. Quick Sort
Quick sort is a substantial improvement on bubble sort. Its basic idea is that a single scan can greatly reduce the length of the sequence still to be sorted. In bubble sort, one scan only guarantees that the maximum value moves to its correct position, so the length of the unsorted sequence may shrink by only 1. In quick sort, one scan ensures that every number to the left of a chosen pivot is smaller than it and every number to its right is larger. The same method is then applied to the parts on each side of the pivot, until each part contains at most one element.
Clearly, quick sort can be implemented recursively; it can of course also be implemented with an explicit stack to eliminate the recursion (a sketch of that variant follows the recursive version below).
Quick sort is not stable. Best-case time complexity O(n log2 n), worst case O(n²).
void quick_sort(int l[], int first, int end);
int quick(int first, int end, int l[]);

int main()
{
    int a[10] = {3, 1, 4, 1, 5, 9, 2, 6, 5, 8};   /* sample data (illustrative) */
    quick_sort(a, 0, 9);
    return 0;
}

void quick_sort(int l[], int first, int end)
{
    int split;
    if (end > first)
    {
        split = quick(first, end, l);    /* one partition pass; returns the subscript of the pivot */
        quick_sort(l, first, split - 1); /* apply the same operation to the parts on each side of the pivot */
        quick_sort(l, split + 1, end);
    }
}
int quick(int first, int end, int l[])
{
    int left = first, right = end;
    int key = l[first];                  /* the pivot value */
    while (left < right)
    {
        while ((left < right) && (l[right] >= key))
            right--;
        if (left < right)
            l[left++] = l[right];
        while ((left < right) && (l[left] <= key))
            left++;
        if (left < right)
            l[right--] = l[left];
    }
    l[left] = key;                       /* the pivot lands in its final position */
    return left;
}
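As mentioned above, the recursion can also be removed with an explicit stack. The following is only a rough sketch of that idea (the function name quick_sort_iter and the fixed stack size are assumptions, not from the original); it reuses the quick() partition function just shown.
#define STACK_MAX 64                     /* illustrative; a robust version would push the larger
                                            sub-range first and size the stack accordingly */

void quick_sort_iter(int l[], int first, int end)
{
    int stack[STACK_MAX][2];             /* each entry holds a (first, end) pair */
    int top = 0;
    int split;

    stack[top][0] = first;               /* push the whole range */
    stack[top][1] = end;
    top++;

    while (top > 0)
    {
        top--;                           /* pop one pending range */
        first = stack[top][0];
        end = stack[top][1];
        if (end > first)
        {
            split = quick(first, end, l);                                 /* one partition pass */
            stack[top][0] = first;     stack[top][1] = split - 1; top++;  /* push the left part */
            stack[top][0] = split + 1; stack[top][1] = end;       top++;  /* push the right part */
        }
    }
}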
5. Selection Sort
In the set of numbers to be sorted, select the smallest number and exchange it with the number in the first position; then, among the remaining numbers, find the smallest and exchange it with the number in the second position; continue this way until the second-to-last number is compared with the last.
Selection sort is not stable. Time complexity: O(n²).
int main()
{
    int t, k, i, j, a[10] = {3, 1, 4, 1, 5, 9, 2, 6, 5, 8};   /* sample data (illustrative) */
    for (i = 0; i < 9; i++)
    {
        k = i;                           /* index of the smallest value found so far */
        for (j = i + 1; j < 10; j++)
            if (a[k] > a[j])
                k = j;
        t = a[i];                        /* swap it into position i */
        a[i] = a[k];
        a[k] = t;
    }
    return 0;
}
6. Heap Sort
Heap sort is a tree-based selection sort and an effective improvement on direct selection sort. A heap is defined as follows: a sequence of n elements (h1, h2, ..., hn) is a heap if and only if it satisfies (h(i) >= h(2i) and h(i) >= h(2i+1)) or (h(i) <= h(2i) and h(i) <= h(2i+1)) for i = 1, 2, ..., n/2.
Only heaps that satisfy the former condition (max-heaps) are discussed here.
As can be seen from the definition, the top element of the heap (the first element) must be the largest. A complete binary tree can represent the structure of a heap intuitively: the heap top is the root, and the rest form the left and right subtrees. For example, the sequence (96, 83, 27, 38, 11, 9) is a heap, since 96 >= 83, 96 >= 27, 83 >= 38, 83 >= 11 and 27 >= 9. The sequence of numbers to be sorted is initially treated as a sequentially stored binary tree, and their storage order is adjusted to turn it into a heap, at which point the root node holds the largest number. The root node is then exchanged with the last node of the heap, and the first (n-1) numbers are readjusted into a heap. This continues until only two nodes remain in the heap; they are exchanged, and finally an ordered sequence of n nodes is obtained.
As this description shows, heap sort involves two processes: building the heap, and exchanging the heap top with the last element of the heap. Heap sort therefore uses two functions: a sift (percolate-down) function that builds the heap, and a function that calls the sift function repeatedly to carry out the sort. Heaps are further divided into max-heaps and min-heaps.
Heap sort is not stable. Time complexity: O(n log2 n).
Function: sift (percolate down), used to build the heap
void sift(int *x, int n, int s)
{
    int t, k, j;
    t = *(x + s);                       /* stash the starting element */
    k = s;                              /* subscript of the starting element */
    j = 2 * k + 1;                      /* subscript of its left child */
    while (j < n)
    {
        /* Check whether the heap condition is satisfied: if it is, exit;
           otherwise adjust and continue with the next level. */
        if (j < n - 1 && *(x + j) < *(x + j + 1))
        {
            j++;                        /* take the larger of the two children */
        }
        if (t < *(x + j))               /* adjust */
        {
            *(x + k) = *(x + j);
            k = j;                      /* the start position moves down after the adjustment */
            j = 2 * k + 1;
        }
        else                            /* no adjustment needed: it is already a heap, exit the loop */
        {
            break;
        }
    }
    *(x + k) = t;                       /* put the starting element into its correct position */
}
Function: Heap Sort
void heap_sort(int *x, int n)
{
    int i, k, t;
    for (i = n / 2 - 1; i >= 0; i--)
    {
        sift(x, n, i);                   /* build the initial heap */
    }
    for (k = n - 1; k >= 1; k--)
    {
        t = *(x + 0);                    /* swap the heap top with the last element */
        *(x + 0) = *(x + k);
        *(x + k) = t;
        sift(x, k, 0);                   /* rebuild the heap from the remaining numbers */
    }
}
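For completeness, a minimal driver for the pointer-based routines in this article could look like the sketch below; the sample data is illustrative and not from the original, and the sift() and heap_sort() definitions above are assumed to be in the same file.
#include <stdio.h>

int main(void)
{
    int x[10] = {49, 38, 65, 97, 76, 13, 27, 49, 55, 4};   /* illustrative data */
    int i;

    heap_sort(x, 10);                    /* sort the 10 numbers in ascending order */

    for (i = 0; i < 10; i++)
        printf("%d ", x[i]);             /* prints the sorted sequence */
    printf("\n");
    return 0;
}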
7. Merge Sort
Merge sort uses the "merge" technique: merging several sorted sub-files into one ordered file.
1. Basic idea of the algorithm
Two ordered sub-files (which can be thought of as the input piles) occupy adjacent positions in the same vector: R[low..m] and R[m+1..high]. They are merged into a local temporary vector R1 (the output pile), and when the merge is complete R1 is copied back to R[low..high].
(1) Merging process
During the merge, three pointers i, j and p are set, with initial values pointing to the starting positions of the three record areas. At each step the keys of R[i] and R[j] are compared, the record with the smaller key is copied to R1[p], the pointer i or j of the copied record is incremented by 1, and the pointer p to the copy position is incremented by 1.
This process is repeated until one of the two input sub-files has been copied completely (it becomes empty), and then the remaining records of the other, non-empty sub-file are copied to R1 in order.
(2) Dynamic allocation of R1
In the implementation, R1 is allocated dynamically. Since the requested space may be large, the allocation must be checked for success before continuing.
2. Merging algorithm
void Merge(SeqList R, int low, int m, int high)
{ /* merge the two ordered sub-files R[low..m] and R[m+1..high] into one ordered
     sub-file R[low..high] */
    int i = low, j = m + 1, p = 0;       /* set the initial values */
    RecType *R1;                         /* R1 is a local vector; it is faster if p is defined as a pointer of this type */
    R1 = (RecType *)malloc((high - low + 1) * sizeof(RecType));
    if (!R1)                             /* allocation failed */
        Error("Insufficient memory available!");
    while (i <= m && j <= high)          /* while neither sub-file is empty, output the smaller record to R1[p] */
        R1[p++] = (R[i].key <= R[j].key) ? R[i++] : R[j++];
    while (i <= m)                       /* if the 1st sub-file is not empty, copy its remaining records into R1 */
        R1[p++] = R[i++];
    while (j <= high)                    /* if the 2nd sub-file is not empty, copy its remaining records into R1 */
        R1[p++] = R[j++];
    for (p = 0, i = low; i <= high; p++, i++)
        R[i] = R1[p];                    /* when the merge is complete, copy the result back to R[low..high] */
    free(R1);                            /* release the temporary vector */
} /* Merge */
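The listing above gives only the merge step; the original does not show the top-level routine that splits the file and calls Merge. A minimal sketch of such a driver (the name MergeSort and the midpoint computation are assumptions, following the usual textbook formulation) is:
void MergeSort(SeqList R, int low, int high)
{ /* sort R[low..high] by splitting it in half, sorting each half, and merging
     the two ordered halves */
    int m;
    if (low < high)                      /* more than one record remains */
    {
        m = (low + high) / 2;            /* split point */
        MergeSort(R, low, m);            /* sort the left half R[low..m] */
        MergeSort(R, m + 1, high);       /* sort the right half R[m+1..high] */
        Merge(R, low, m, high);          /* merge the two ordered halves */
    }
}
Calling MergeSort(R, 1, n) would then sort R[1..n], assuming the SeqList definitions sketched in section 1.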
8. Binary Search and Binary Insertion Sort
First of all, binary search applies only to a sorted sequence; for an unordered sequence there is nothing it can do.
Suppose there is an array v already sorted in ascending order, with n = 20 elements, and an element x. How do we find where x is located in the array?
A common way to solve this problem is binary search: keep low and high bounds (initially 0 and n - 1) and return -1 if x is not found. A sketch of the program is given after the explanation below.
The idea is simple: first compare the input value x with the middle element of the array v. If x is less than the middle element, set high to the middle index minus 1; similarly, if x is greater than the middle element, set low to the middle index plus 1, and keep searching between low and high.
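A minimal sketch of the binary search program described above (the function name binary_search is an assumption): it returns the index of x in the ascending array v[0..n-1], or -1 if x is not found.
int binary_search(int v[], int n, int x)
{
    int low = 0, high = n - 1, mid;

    while (low <= high)
    {
        mid = (low + high) / 2;          /* middle element of the current range */
        if (x < v[mid])
            high = mid - 1;              /* x lies in the left half */
        else if (x > v[mid])
            low = mid + 1;               /* x lies in the right half */
        else
            return mid;                  /* found */
    }
    return -1;                           /* not found, return -1 */
}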
Binary insertion sort
The idea of the algorithm, briefly:
When inserting element i, do a binary search over the preceding elements 0..i-1: first compare it with the middle element of that range; if it is smaller, continue in the first half, otherwise in the second half, and keep halving until left > right. Then shift every element from the target position up to position i-1 back by one, and place element i at the target position.
Binary search itself only locates the insertion position; it does not sort or move anything. So once the insertion position has been found, the shifting must start from the last record: move it back one position, then the second-to-last, and so on, until the record at the insertion position has been moved.
Binary insertion sort is stable, with average time O(n²).
1. Find the insertion position by binary search
If R[i] < R[middle], move the right pointer to one position left of the middle pointer; otherwise move the left pointer to one position right of the middle pointer. Repeat the search until the left pointer is greater than the right pointer.
2. The shifting step: which records need to be moved back?
Although it is clear that the records whose sort key is greater than that of R[i] must be moved back, this question naturally arises. In fact, it amounts to moving the records from position i-1 down to the left pointer back by one position.
3. Insert
The left pointer obtained in step 1 is exactly the position where the element is to be inserted.
4. Algorithm
void BinSort(ref int[] data1)
{
    int left, right, num;
    int middle, j;
    for (int i = 1; i < data1.Length; i++)
    {
        // prepare
        left = 0;
        right = i - 1;
        num = data1[i];
        // find the insertion position by binary search
        while (right >= left)
        {
            // point to the middle of the sorted part
            middle = (left + right) / 2;
            if (num < data1[middle])
                // the element to insert lies in the left interval
                right = middle - 1;
            else
                // the element to insert lies in the right interval
                left = middle + 1;
        }
        // move back the records whose sort key is greater than that of data1[i]
        for (j = i - 1; j >= left; j--)
        {
            data1[j + 1] = data1[j];
        }
        // insert
        data1[left] = num;
    }
}