A Summary of the Eight Major Sorting Algorithms Implemented in C/C++


Overview: sorting divides into internal sorting and external sorting. Internal sorting sorts the data records entirely in memory, while external sorting is used when the data is so large that not all records fit in memory at once, so the sorting process must access external storage.

The eight sorts discussed here are all internal sorts.


When n is large, use a sorting method with O(n·log2 n) time complexity: quick sort, heap sort, or merge sort.

Quick sort: currently considered the best comparison-based method; when the keys to be sorted are randomly distributed, quick sort has the shortest average time.

1. Insertion sort - direct insertion sort (straight insertion sort)

Basic idea:

Insert a record into an already sorted table, producing a new ordered table whose record count is one larger. That is, first treat the 1st record of the sequence as an ordered subsequence, then insert the records one by one starting from the 2nd record, until the entire sequence is ordered.

Key point: set up a sentinel, which serves both as temporary storage and as a guard for the array boundary.

Example of direct insertion sort: (figure omitted)


If an element equal to the one being inserted is encountered, the element being inserted is placed after the equal element. Equal elements therefore keep their relative order from the original unordered sequence, so direct insertion sort is stable.

Implementation of the algorithm:

Insert sort
void insert_sort(int *list, int count)
{
 int temp; /* plays the role of the sentinel here, kept in a separate
              variable instead of a dedicated cell of the list array */
 int i, j;
 for (i = 1; i < count; i++)
 {
  if (list[i] < list[i - 1])
  {
   temp = list[i];
   for (j = i - 1; j >= 0 && list[j] > temp; j--)
   {
    list[j + 1] = list[j];
   }
   list[j + 1] = temp;
  }
 }
}
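
For comparison, a minimal sketch of the textbook sentinel variant (an illustration, not code from the original article), which assumes the data lives in list[1..n] with list[0] reserved, so the sentinel removes the j >= 0 boundary test:

void insert_sort_sentinel(int *list, int n) /* valid data in list[1..n] */
{
 int i, j;
 for (i = 2; i <= n; i++)
 {
  if (list[i] < list[i - 1])
  {
   list[0] = list[i]; /* the sentinel: the scan below cannot run past cell 0 */
   for (j = i - 1; list[j] > list[0]; j--)
    list[j + 1] = list[j];
   list[j + 1] = list[0];
  }
 }
}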


Efficiency:

Time complexity: O(n^2).

Other insertion sorts include binary insertion sort and 2-way insertion sort.
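
As a sketch of the first of these (an illustration, not code from the article), binary insertion sort locates the insertion point with binary search, cutting the comparisons to O(n·log n), although the record moves still cost O(n^2):

void binary_insert_sort(int *list, int count)
{
 int i, j, low, high, mid, temp;
 for (i = 1; i < count; i++)
 {
  temp = list[i];
  low = 0;
  high = i - 1;
  while (low <= high) /* binary-search the leftmost position where temp fits */
  {
   mid = (low + high) / 2;
   if (list[mid] > temp)
    high = mid - 1;
   else
    low = mid + 1; /* on equal keys keep moving right, so the sort stays stable */
  }
  for (j = i - 1; j >= low; j--) /* shift the tail one place to the right */
   list[j + 1] = list[j];
  list[low] = temp;
 }
}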

2. Insertion sort - Shell sort (Shell's sort)

Shell sort was proposed by D. L. Shell in 1959 and is a great improvement over direct insertion sort. It is also called diminishing-increment sort. Shell sort is not a stable sorting algorithm.

Basic idea:

Shell sort groups the records by a certain increment of the subscript and applies direct insertion sort within each group. As the increment gradually decreases, each group contains more and more elements; when the increment is reduced to 1, the whole file forms a single group and the algorithm terminates.

Procedure:

Sorting process: first choose a positive integer d1 < n and put all elements whose subscripts differ by d1 into the same group; apply direct insertion sort within each group. Then take d2 < d1 and repeat the grouping and sorting, until di = 1, that is, until all records fall into one group and are sorted.

Algorithm implementation:

Here we simply use the increment sequence d = {n/2, n/4, n/8, ..., 1}, where n is the number of elements to sort.

That is, the records to be sorted are first divided into groups by an increment d (initially n/2); the subscripts of records in the same group differ by d. Direct insertion sort is applied within each group; then the records are regrouped with a smaller increment (d/2) and each group is sorted again by direct insertion. This continues, shrinking the increment, until it reaches 1, when one final direct insertion sort completes the ordering.

Shell sort
void shell_sort(int *list, int count)
{
 int i, j;
 int temp;
 int increment = count;
 do
 {
  increment = increment / 2;
  for (i = increment; i < count; i++)
  {
   if (list[i] < list[i - increment])
   {
    temp = list[i];
    for (j = i - increment; j >= 0 && list[j] > temp; j -= increment)
    {
     list[j + increment] = list[j];
    }
    list[j + increment] = temp;
   }
  }
 } while (increment > 1);
}

The efficiency analysis of Shell sort is difficult: the number of key comparisons and record moves depends on the choice of the increment sequence, and it can be estimated accurately only in particular cases. No one has yet determined the best increment sequence. Various increment sequences have been proposed (odd numbers, primes), but note that the increment factors should have no common divisor other than 1, and the last increment must be 1. Shell sort is an unstable sorting method. A lower bound on Shell sort's time complexity is n·log2 n, so it performs well on medium-sized inputs.

3. Selection sort - simple selection sort (plain selection sort)

Basic idea:

Among the numbers to be sorted, select the smallest (or largest) one and exchange it with the number in the 1st position; then find the smallest (or largest) of the remaining numbers and exchange it with the number in the 2nd position, and so on, until the (n-1)-th element (the penultimate number) is compared with the n-th element (the last number).

Example of simple selection sort: (figure omitted)

The comparison operations take O(n^2) time, and the move operations take O(n) time. Simple selection sort is an unstable sort.

Procedure:

The first pass: from the n records, find the record with the smallest key and exchange it with the first record;

The second pass: from the n-1 records beginning with the second record, find the record with the smallest key and exchange it with the second record;

And so on...

The i-th pass: from the n-i+1 records beginning with the i-th record, select the record with the smallest key and exchange it with the i-th record,

until the entire sequence is ordered by key.

Algorithm implementation:

Selection sort
/* helper used by this and the later routines: exchange two ints */
void swap(int *a, int *b)
{
 int t = *a;
 *a = *b;
 *b = t;
}

void select_sort(int *list, int count)
{
 int min, i, j;
 for (i = 0; i < count; i++)
 {
  min = i;
  for (j = i + 1; j < count; j++)
  {
   if (list[min] > list[j])
   {
    min = j;
   }
  }
  if (min != i)
  {
   swap(&list[i], &list[min]);
  }
 }
}


An improvement on simple selection sort: two-element selection sort

In simple selection sort, each pass fixes the position of only one element. We can improve this by fixing the positions of two elements per pass (the current smallest and largest records), thereby reducing the number of passes needed. With this improvement, sorting n data items requires at most ⌈n/2⌉ passes.
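
A minimal sketch of this two-element variant (my own illustration, not code from the article): each pass finds both the minimum and the maximum of the remaining range and fixes them at its two ends.

void select_sort_2(int *list, int count)
{
 int left, right, i, min, max, t;
 for (left = 0, right = count - 1; left < right; left++, right--)
 {
  min = left;
  max = right;
  for (i = left; i <= right; i++) /* find both extremes in one scan */
  {
   if (list[i] < list[min]) min = i;
   if (list[i] > list[max]) max = i;
  }
  t = list[left]; list[left] = list[min]; list[min] = t;
  if (max == left) max = min; /* the maximum was just moved by the first swap */
  t = list[right]; list[right] = list[max]; list[max] = t;
 }
}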

4. Selection sort - heap sort

Heap sort is a tree-based selection sort and an effective improvement of direct selection sort.

Basic idea:

The heap is defined as follows: a sequence of n elements (k1, k2, ..., kn) is called a heap if and only if it satisfies

 ki <= k2i and ki <= k2i+1 (or ki >= k2i and ki >= k2i+1), for i = 1, 2, ..., ⌊n/2⌋.

From this definition, the top element of the heap (that is, the first element) must be the smallest item in the sequence (in a small-top heap).
If a heap is stored in a one-dimensional array, it corresponds to a complete binary tree in which the value of every non-leaf node is no greater than (or no less than) the values of its children, and the value of the root node (the top of the heap) is the smallest (or largest). For example:

(a) Large top heap sequence: (96,83,27,38,11,09)

(b) Small top heap sequence: (12,36,24,85,47,30,53,91)


Initially, the sequence of n numbers to be sorted is viewed as a sequentially stored complete binary tree (a binary tree stored in a one-dimensional array). Adjust their storage order to turn it into a heap, then output the heap's top element, obtaining the smallest (or largest) of the n elements: the heap's root node holds the minimum (or maximum). Then readjust the front n-1 elements into a heap and output its top element, obtaining the second smallest (or second largest) of the n elements. Repeat until only two nodes remain in the heap; exchange them, and the result is an ordered sequence of n nodes. This process is called heap sort.

Therefore, implementing heap sort requires solving two problems:
1. How to build a heap from the n numbers to be sorted;
2. After the top element of the heap is output, how to adjust the remaining n-1 elements into a new heap.

We first discuss the second problem: after the top element is output, adjusting the remaining n-1 elements back into a heap.
How to adjust a small-top heap:

1) With a heap of m elements, output the top element, leaving m-1 elements. Move the bottom element of the heap to the top (exchange the last element with the top); the heap is now broken, because the root node no longer satisfies the heap property.

2) Exchange the root node with the smaller of its left and right children.

3) If it was exchanged with the left child and the left subtree's heap property is now broken, that is, the left subtree's root no longer satisfies the heap property, repeat method (2) on the left subtree.

4) If it was exchanged with the right child and the right subtree's heap property is broken, that is, the right subtree's root no longer satisfies the heap property, repeat method (2) on the right subtree.

5) Continue these exchanges on whichever subtree violates the heap property, down to a leaf node; the heap is then rebuilt.

This adjustment process from the root node down to a leaf is called sifting. (Figure omitted.)


Now consider the process of building the initial heap on n elements.
Heap-building method: building a heap from an initial sequence is a process of repeated sifting.

1) In a complete binary tree of n nodes, the last node is a child of the ⌊n/2⌋-th node.

2) Sifting starts with the subtree rooted at the ⌊n/2⌋-th node, turning that subtree into a heap.

3) Then the subtrees rooted at each preceding node are sifted in turn to make them heaps, until the root node is reached.

Initial heap-building process for the unordered sequence (49, 38, 65, 97, 76, 13, 27, 49): (figure omitted)



Implementation of the algorithm:

From the description above, heap sort requires two processes: building the heap, and exchanging the top of the heap with the last element of the heap. So heap sort consists of two functions: a sifting function that maintains the heap, and a sorting function that repeatedly calls the sifting function.

Adjust to a heap (sift down)
void heap_adjust(int *list, int s, int m)
{
 int temp = list[s];
 for (int j = 2 * s + 1; j <= m; j = 2 * j + 1) /* children of j are 2j+1 and 2j+2 */
 {
  if (j < m && list[j] < list[j + 1])
  {
   j++; /* j now points to the larger child */
  }
  if (temp >= list[j])
   break;
  list[s] = list[j]; /* move the child up */
  s = j;
 }
 list[s] = temp;
}

Heap sort
void heap_sort(int *list, int len)
{
 /* build a big-top heap */
 for (int s = len / 2 - 1; s >= 0; s--)
 {
  heap_adjust(list, s, len - 1);
 }
 /* sort: move the top to the end, then re-sift the rest */
 for (int i = len - 1; i >= 1; i--)
 {
  swap(&list[0], &list[i]);
  heap_adjust(list, 0, i - 1);
 }
}

Analysis:

Let the tree depth be k. One sift from the root to a leaf makes at most 2(k-1) comparisons and at most k record moves. Therefore, after the heap is built, the total number of comparisons made by all the sifts during the sorting passes is at most

 2(⌊log2(n-1)⌋ + ⌊log2(n-2)⌋ + ... + ⌊log2 2⌋) < 2n⌊log2 n⌋,

and building the initial heap takes no more than 4n comparisons. Hence even in the worst case the time complexity of heap sort is O(n·log n).

Heap sort is not appropriate for files with few records, because building the initial heap requires a fair number of comparisons. Heap sort is an in-place sort with O(1) auxiliary space, and it is an unstable sorting method.

5. Exchange sort - bubble sort

Basic idea:

Among the numbers to be sorted, compare adjacent pairs from top to bottom over the range that is not yet sorted, letting the larger numbers sink and the smaller numbers rise. That is, whenever two adjacent numbers are compared and found to be in the wrong order, they are exchanged.

Example of bubble sort: (figure omitted)

Implementation of the algorithm:

Bubble sort
void bubble_sort(int *list, int count)
{
 int flag = 1; /* 1 while the previous pass made at least one swap */
 int i = 0, j = 0;
 for (i = 0; i < count && flag; i++)
 {
  flag = 0;
  for (j = count - 1; j > i; j--)
  {
   if (list[j] < list[j - 1])
   {
    swap(&list[j], &list[j - 1]);
    flag = 1;
   }
  }
 }
}


Improvements to the bubble sort algorithm

A common improvement of bubble sort is to add a flag variable, exchange, indicating whether a pass performed any swap. If a pass completes without any data exchange, the data is already in order and the sort can stop immediately, avoiding unnecessary comparison passes. Two further improvements follow:

1. Keep a flag variable pos that records the position of the last swap in each pass. Since the records after position pos are already in place, the next pass only needs to scan up to pos.

2. Traditional bubble sort fixes only one extreme value (the maximum or the minimum) per pass. We can instead bubble in both directions in each pass, forward and backward, fixing two final values (the largest and the smallest) at a time and thus roughly halving the number of passes. A sketch combining both improvements is given below.
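
A minimal sketch combining both ideas (an illustration, not code from the article): each round makes a forward pass that sinks the largest value and a backward pass that raises the smallest, while pos remembers the last swap so already-sorted ends are skipped.

void bubble_sort_2(int *list, int count)
{
 int low = 0, high = count - 1, i, pos, t;
 while (low < high)
 {
  pos = low; /* forward pass: larger values sink to the back */
  for (i = low; i < high; i++)
  {
   if (list[i] > list[i + 1])
   {
    t = list[i]; list[i] = list[i + 1]; list[i + 1] = t;
    pos = i;
   }
  }
  high = pos; /* everything after pos is already in place */

  pos = high; /* backward pass: smaller values rise to the front */
  for (i = high; i > low; i--)
  {
   if (list[i] < list[i - 1])
   {
    t = list[i]; list[i] = list[i - 1]; list[i - 1] = t;
    pos = i;
   }
  }
  low = pos; /* everything before pos is already in place */
 }
}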

6. Exchange sort - quick sort

Basic idea:

The quick sort algorithm is:

1) Set two variables i and j; at the start of the sort, i = 0 and j = n-1.

2) Take the first array element as the pivot value, assigning it to key, that is, key = a[0].

3) Starting from j, search forward from the back of the array (j--) to find the first element a[j] less than key, and exchange a[j] with a[i].

4) Starting from i, search backward from the front of the array (i++) to find the first element a[i] greater than key, and exchange a[i] with a[j].

5) Repeat steps 3 and 4 until i == j. (In steps 3 and 4 no exchange happens while no matching value is found, that is, while a[j] is not less than key in step 3 or a[i] is not greater than key in step 4; just keep moving the pointers, j = j-1 or i = i+1, until a match is found. When an exchange is made, the pointer that found the match stays where it is. The loop ends exactly when an i++ or j-- step makes i meet j.)

One pass of partitioning: (figure omitted)


Implementation of the algorithm:

Quick sort
/* partition list[low..high] around pivot list[low]; returns the pivot's final position */
int partition(int *list, int low, int high)
{
 int pivotkey = list[low];
 while (low < high)
 {
  while (low < high && list[high] >= pivotkey) high--;
  list[low] = list[high]; /* smaller element goes to the front */
  while (low < high && list[low] <= pivotkey) low++;
  list[high] = list[low]; /* larger element goes to the back */
 }
 list[low] = pivotkey; /* the pivot lands in its final position */
 return low;
}
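
The article's listing is cut off above; a minimal recursive driver for this partition function, as a sketch, might be:

void quick_sort(int *list, int low, int high)
{
 if (low < high)
 {
  int pivot = partition(list, low, high); /* the pivot is fixed in place */
  quick_sort(list, low, pivot - 1);
  quick_sort(list, pivot + 1, high);
 }
}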
 

Analysis:

Quick sort is generally considered the best among the sorting methods of the same order of magnitude, O(n·log2 n). However, if the initial sequence is ordered or nearly ordered by key, quick sort degenerates into bubble sort. To improve on this, the pivot record is usually chosen by the "median-of-three" rule: the median of the keys at the two endpoints and the midpoint of the interval being sorted becomes the pivot. Quick sort is an unstable sorting method.

An improvement of quick sort

In this improved algorithm, quick sort is called recursively only on subsequences longer than k, which leaves the whole sequence nearly ordered; a final pass of the insertion sort algorithm over the nearly ordered sequence completes the sort. Practice shows the improved algorithm has a lower time constant, with the best performance when k is about 8. A sketch follows.
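
A minimal sketch of this scheme (an illustration; the cutoff constant K and the function names are my own, reusing partition from section 6 and insert_sort from section 1):

#define K 8 /* the article suggests a cutoff of about 8 */

/* recurse only on subsequences longer than K, leaving short runs unsorted */
static void qsort_rough(int *list, int low, int high)
{
 if (high - low + 1 > K)
 {
  int pivot = partition(list, low, high);
  qsort_rough(list, low, pivot - 1);
  qsort_rough(list, pivot + 1, high);
 }
}

void quick_insert_sort(int *list, int count)
{
 qsort_rough(list, 0, count - 1); /* the array is now nearly ordered */
 insert_sort(list, count); /* cheap on a nearly ordered array */
}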

7. Merge sort

Basic idea:

Merge sort combines two (or more) ordered tables into one new ordered table: the sequence to be sorted is first divided into several subsequences, each of which is ordered; the ordered subsequences are then merged into an overall ordered sequence.

Merge sort example: (figure omitted)

Merging method:

Suppose r[i...n] consists of two ordered sublists r[i...m] and r[m+1...n], of lengths m-i+1 and n-m respectively.

1. j = m+1; k = i; // j is the starting subscript of the second sublist, k the starting subscript of the auxiliary array (i already marks the first sublist)

2. If i > m or j > n, go to step 4. // one sublist is fully merged; the comparison phase ends

3. // copy the smaller of r[i] and r[j] into the auxiliary array rf
If r[i] <= r[j]: rf[k] = r[i]; i++; k++; go to step 2.
Otherwise: rf[k] = r[j]; j++; k++; go to step 2.

4. // copy the unprocessed elements of the remaining sublist into rf
If i <= m, copy r[i...m] into rf[k...n]. // the first sublist is not empty
If j <= n, copy r[j...n] into rf[k...n]. // the second sublist is not empty

5. The merge is complete.

Merge sort
/* merge two ordered runs list[start..mid] and list[mid+1..end]
   (requires <stdlib.h> for malloc/free) */
void merge(int *list, int start, int mid, int end)
{
 const int len1 = mid - start + 1;
 const int len2 = end - mid;
 int i, j, k;

 int *front = (int *)malloc(sizeof(int) * len1);
 int *back = (int *)malloc(sizeof(int) * len2);

 for (i = 0; i < len1; i++)
  front[i] = list[start + i];
 for (j = 0; j < len2; j++)
  back[j] = list[mid + j + 1];

 for (i = 0, j = 0, k = start; i < len1 && j < len2; k++)
 {
  if (front[i] <= back[j]) /* <= keeps equal elements in order: stable */
  {
   list[k] = front[i];
   i++;
  }
  else
  {
   list[k] = back[j];
   j++;
  }
 }
 while (i < len1)
 {
  list[k++] = front[i++];
 }
 while (j < len2)
 {
  list[k++] = back[j++];
 }

 free(front);
 free(back);
}

void msort(int *list, int start, int end)
{
 if (start < end)
 {
  int mid = (start + end) / 2;
  msort(list, start, mid);
  msort(list, mid + 1, end);
  merge(list, start, mid, end);
 }
}

void merge_sort(int *list, int count)
{
 msort(list, 0, count - 1);
}

8. Bucket sort / radix sort

Before discussing radix sort, let us first look at bucket sort:

Basic idea: divide the array into a finite number of buckets, then sort each bucket individually (possibly with a different sorting algorithm, or by applying bucket sort recursively). Bucket sort is a generalization of pigeonhole sort. When the values in the array are uniformly distributed, bucket sort runs in linear time, Θ(n). Bucket sort is not a comparison sort, so it is not bound by the Ω(n·log n) lower limit.

In short, it groups the data into buckets and then sorts the contents of each bucket.

For example, to sort n integers a[1..n] whose values lie in the range [1..1000]:

First, let each bucket cover a range of size 10. Specifically, let b[1] store the integers in [1..10], b[2] store the integers in (10..20], ..., and b[i] store the integers in ((i-1)*10, i*10], for i = 1, 2, ..., 100. There are 100 buckets in all.

Then scan a[1..n] from start to finish, putting each a[i] into its bucket b[j]. Next sort the numbers inside each of the 100 buckets; bubble sort, selection sort, or even quick sort can be used: in general, any sorting method will do.

Finally, output the numbers in the buckets in sequence, and within each bucket from small to large, obtaining a sequence in which all the numbers are ordered. A sketch follows.
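
A minimal sketch of this scheme in C (my own illustration, not code from the article): 100 buckets of width 10 for values in 1..1000, with direct insertion sort inside each bucket; the flat bucket layout is an assumption made for simplicity.

#include <stdlib.h>

#define NBUCKETS 100 /* buckets of width 10 covering 1..1000 (assumed range) */
#define WIDTH 10

void bucket_sort(int *a, int n)
{
 /* flat layout: bucket b occupies buf[b*n .. b*n+n-1] (worst case: all values in one bucket) */
 int *buf = (int *)malloc(sizeof(int) * NBUCKETS * n);
 int *size = (int *)calloc(NBUCKETS, sizeof(int));
 int i, j, k, b, t;

 for (i = 0; i < n; i++) /* distribute: value v goes to bucket (v-1)/WIDTH */
 {
  b = (a[i] - 1) / WIDTH;
  buf[b * n + size[b]++] = a[i];
 }

 for (b = 0, k = 0; b < NBUCKETS; b++)
 {
  int *bucket = buf + b * n;
  for (i = 1; i < size[b]; i++) /* direct insertion sort inside the bucket */
  {
   t = bucket[i];
   for (j = i - 1; j >= 0 && bucket[j] > t; j--)
    bucket[j + 1] = bucket[j];
   bucket[j + 1] = t;
  }
  for (i = 0; i < size[b]; i++) /* collect the buckets in order */
   a[k++] = bucket[i];
 }

 free(buf);
 free(size);
}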

Suppose there are n numbers and m buckets. If the numbers are uniformly distributed, each bucket holds n/m numbers on average. If quick sort is used inside each bucket, the complexity of the whole algorithm is O(n + m * (n/m) * log(n/m)) = O(n + n·log n - n·log m).

From the above, when m approaches n, the complexity of bucket sort approaches O(n).

Of course, this calculation assumes the n numbers are uniformly distributed, which is a strong assumption; in practice the effect may not be so good. If all the numbers fall into the same bucket, bucket sort degenerates into an ordinary sort.

Among the sorting algorithms above, most have time complexity O(n^2), and some have O(n·log n). Bucket sort, however, can achieve O(n). Its disadvantages are:

1) The space complexity is comparatively high, and the extra overhead required is large. Sorting uses two arrays: the array to be sorted and the buckets; for example, if the values to be sorted range from 0 to m-1, then m buckets are needed, and the bucket array requires at least m slots.

2) The elements to be sorted must lie within a known, bounded range.

Bucket sort is a distribution sort. Distribution sorts do not require key comparisons, but they are applicable only when some specifics of the sequence to be sorted are known in advance.

The basic idea of distribution sorting: frankly, it is bucket sorting performed many times.

The radix sort process requires no keyword comparisons; it achieves ordering through repeated "distribution" and "collection" passes. Its time complexity can reach linear order: O(n).

Example:

The 52 cards in a poker deck can be ordered by two fields, suit and face value, with the relations:
Suit: clubs < diamonds < hearts < spades
Face value: 2 < 3 < 4 < 5 < 6 < 7 < 8 < 9 < 10 < J < Q < K < A

If the cards are sorted in ascending order by suit and then by face value, the resulting sequence is: clubs 2, ..., clubs A, diamonds 2, ..., diamonds A, hearts 2, ..., hearts A, spades 2, ..., spades A.

That is, for two cards of different suits, the one with the lower suit is smaller regardless of face value; only between two cards of the same suit does the face value decide the order. This is sorting on multiple keys.

To obtain this result, we discuss two sorting methods.
Method 1: sort by suit first, dividing the cards into 4 groups: clubs, diamonds, hearts, spades. Sort each group by face value, and finally concatenate the 4 groups.
Method 2: set up 13 groups numbered by face value (2, 3, ..., A) and deal the cards into the 13 corresponding piles. Then set up 4 groups numbered by suit (clubs, diamonds, hearts, spades); take the cards of the '2' pile and deal them into the corresponding suit groups, then the '3' pile, and so on, so that each of the 4 suit groups ends up ordered by face value; finally concatenate the 4 suit groups in turn.

Suppose each record in a sequence of n elements carries d keys {k1, k2, ..., kd}. The sequence is said to be ordered on the keys {k1, k2, ..., kd} if, for any two records r[i] and r[j] (1 <= i < j <= n) in the sequence, their key tuples satisfy the lexicographic relation

 (k1[i], k2[i], ..., kd[i]) <= (k1[j], k2[j], ..., kd[j]).

Here k1 is called the most significant key and kd the least significant key.

Two methods of multi-key sorting:

A multi-key sequence can be sorted either from the most significant key to the least significant key, or from the least significant key to the most significant key, giving two methods:

Most significant digit first, abbreviated MSD:

1) First sort and group by k1, dividing the sequence into several subsequences; records in the same group have equal k1.

2) Then sort each group into subgroups by k2, and continue in this way with the subsequent keys until the subgroups are sorted by the least significant key kd.

3) Then concatenate the groups to obtain an ordered sequence. Arranging the cards by suit and then by face value is the MSD method.

Least significant digit first, abbreviated LSD:

1) First sort by kd, then by kd-1, and repeat until the sequence has finally been sorted and grouped by k1.

2) Concatenating the subsequences then yields an ordered sequence. Arranging the cards by face value and then by suit is the LSD method.

The basic idea of chained radix sort based on the LSD method

The idea of "multi-keyword sorting" can also implement "single-keyword sorting": a numeric or character key can be viewed as a composite key made up of several digits or characters, which can then be sorted by the distribute-and-collect method known as radix sort. The number of possible values of each digit or character position is called the radix; for example, a playing card's suit has radix 4 and its face value radix 13. When arranging cards, you may sort by suit first or by face value first. To arrange by suit, first deal the cards into 4 piles by suit (distribution), then stack the piles together in suit order (collection); next deal them into 13 piles by face value (distribution) and stack those together in order (collection). In this way the cards are put in order by two distributions and two collections.

Radix sort:

Sort by the lowest digit first and collect; then sort by the next higher digit and collect; and so on up to the highest digit. Sometimes attributes have a priority order: sort by the low-priority attribute first, then by the high-priority one. The final order puts higher priority first, and among records with equal high priority, the one with higher low priority comes first. Because each pass of radix sort distributes and collects records separately without reordering equal keys, it is stable.

Algorithm implementation:

#define MAX 100 /* capacity of the auxiliary array; an assumed bound on n */
#define BASE 10 /* decimal radix */

void radix_sort(int *a, int n) {
 int i, b[MAX], m = a[0], exp = 1;

 /* find the maximum; it decides how many digit passes are needed */
 for (i = 1; i < n; i++) {
  if (a[i] > m) {
   m = a[i];
  }
 }

 while (m / exp > 0) {
  int bucket[BASE] = {0};

  /* count the keys having each value of the current digit */
  for (i = 0; i < n; i++) {
   bucket[(a[i] / exp) % BASE]++;
  }
  /* prefix sums: bucket[i] becomes the number of keys with digit <= i */
  for (i = 1; i < BASE; i++) {
   bucket[i] += bucket[i - 1];
  }
  /* distribute from right to left so each pass stays stable */
  for (i = n - 1; i >= 0; i--) {
   b[--bucket[(a[i] / exp) % BASE]] = a[i];
  }
  /* collect */
  for (i = 0; i < n; i++) {
   a[i] = b[i];
  }

  exp *= BASE;
 }
}


Summary

A summary of the stability and time complexity of the sorts above:

 Sort method        Average time    Stable?
 direct insertion   O(n^2)          yes
 Shell sort         O(n^(1+§))      no
 simple selection   O(n^2)          no
 heap sort          O(n·log2 n)     no
 bubble sort        O(n^2)          yes
 quick sort         O(n·log2 n)     no
 merge sort         O(n·log2 n)     yes
 radix sort         O(n)            yes

(figure omitted: growth of the time complexity functions)

So for sorting a large number n of records, an O(n·log2 n) method is generally chosen.

In terms of time complexity:

(1) Square order, O(n^2):
the simple sorts: direct insertion, direct selection, and bubble sort;
(2) Linear-logarithmic order, O(n·log2 n):
quick sort, heap sort, and merge sort;
(3) O(n^(1+§)) order, where § is a constant between 0 and 1:
Shell sort;
(4) Linear order, O(n):
radix sort, as well as bucket and bin sorting.

Notes:

When the original table is ordered or nearly ordered, direct insertion sort and bubble sort greatly reduce the number of comparisons and record moves, and their time complexity drops to O(n);

Quick sort is the opposite: when the original table is nearly ordered, it degenerates into bubble sort and its time complexity rises to O(n^2);

Whether the original table is ordered has little effect on the time complexity of simple selection sort, heap sort, merge sort, and radix sort.

Stability:

Stability of a sorting algorithm: if the sequence to be sorted contains several records with the same key and their relative order is unchanged after sorting, the algorithm is stable; if the relative order of such records may change, the algorithm is unstable.
The benefit of stability: if the sorting algorithm is stable, you can sort by one key and then by another, and the result of the first sort is preserved among records equal on the second key. Radix sort works exactly this way: it sorts by the low digit first and then successively by higher digits; records equal in a higher digit keep the order established by the lower digits. In addition, a stable algorithm can avoid redundant comparisons.

Stable sorting algorithms: bubble sort, insertion sort, merge sort, and radix sort

Unstable sorting algorithms: selection sort, quick sort, Shell sort, heap sort

Guidelines for choosing a sorting algorithm:

Every sorting algorithm has its own advantages and disadvantages. In practice, choose one appropriate to the situation, or even combine several methods.

The basis for choosing a sorting algorithm

Many factors affect sorting, and the algorithm with the lowest average time complexity is not necessarily optimal. Conversely, an algorithm with a higher average time complexity may suit some special situations better. The algorithm's readability must also be considered, to ease software maintenance. In general, four points need to be weighed:

1. the number of records to be sorted, n;

2. the size of each record's data, that is, the amount of information the record carries besides its key;

3. the structure and distribution of the keys;

4. the requirements on the stability of the sort.

Let the number of elements to be sorted be n.

1) When n is large, use a sorting method with O(n·log2 n) time complexity: quick sort, heap sort, or merge sort.

Quick sort: currently considered the best comparison-based method; when the keys are randomly distributed, quick sort has the shortest average time.
Heap sort: suitable when memory space is tight, since it sorts in place with O(1) auxiliary space; note that it is not stable.

Merge sort: involves a certain amount of data movement, so it can be combined with insertion sort: first obtain ordered runs of a certain length by insertion sort, then merge them, which improves efficiency.

2) When n is large, memory space allows, and stability is required: use merge sort.

3) When n is small, direct insertion sort or direct selection sort can be used.

Direct insertion sort: when the elements are nearly ordered, direct insertion sort greatly reduces the number of comparisons and record moves.

Direct selection sort: when the elements are nearly ordered and stability is not required, choose direct selection sort.

4) In general, do not use, or do not directly use, the traditional bubble sort.

5) Radix sort
It is a stable sorting algorithm, but it has some limitations:
1. the keys must be decomposable;
2. the number of distinct key values should be small, and preferably dense;
3. if the keys are numbers, they are best unsigned; otherwise the mapping becomes correspondingly more complex, though the positive and negative numbers can be sorted separately first.

The above is the entire content of this article; I hope it helps your study.
