Java Quicksort: Time Complexity, Space Complexity, and Stability

Source: Internet
Author: User

Transferred from: http://blog.csdn.net/believejava/article/details/38434471

——————————————————————————————————————————————

1. The basic idea of quicksort:

A single partitioning pass splits the records to be sorted into two parts such that every key in one part is smaller than every key in the other; each part is then sorted in the same way, recursively, until the whole sequence is in order.


Treat the whole sequence as an array and take the element at index 0 as the pivot. Scan from the high end toward the pivot: elements larger than the pivot are left alone, but as soon as an element smaller than the pivot is found it is moved into the pivot's slot. Then scan from the low end: elements smaller than the pivot are left alone, and the first element larger than the pivot is moved into the slot just vacated at the high end. The two scans alternate until they meet, which completes one pass: everything to the left of the pivot's final position is smaller than it, and everything to the right is larger. The same procedure is then applied recursively to the two sub-arrays.
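To make this concrete, here is how the first partition pass plays out on the sample array used in the code below, {3, 2, 7, 10}, with the first element 3 as the pivot. The pivot is lifted out, leaving a hole at index 0. Scanning from the right, 10 and 7 are both larger than 3 and stay where they are, but 2 is smaller, so it is moved into the hole, giving {2, 2, 7, 10} with the hole now at index 1. Scanning from the left, the 2 at index 0 is smaller than the pivot and stays put; the two scans have now met at index 1, so the pivot 3 is dropped into that hole, giving {2, 3, 7, 10}. Everything left of index 1 is smaller than 3 and everything right of it is larger, and the sub-arrays {2} and {7, 10} are then sorted recursively.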

2. Java code:

package com.ynu.www.tool;

public class QuickSort {

    // Partition: take arrayInt[low] as the pivot and return its final index.
    public int getMiddle(int[] arrayInt, int low, int high) {
        int tmp = arrayInt[low]; // the pivot leaves a "hole" at the low end
        while (low < high) {
            while (low < high && tmp <= arrayInt[high]) {
                high--;
            }
            arrayInt[low] = arrayInt[high]; // record smaller than the pivot moves to the low end
            while (low < high && arrayInt[low] <= tmp) {
                low++;
            }
            arrayInt[high] = arrayInt[low]; // record larger than the pivot moves to the high end
        }
        arrayInt[low] = tmp; // the pivot drops into its final position
        return low;
    }

    public void quickSort(int[] arrayInt, int low, int high) {
        if (low < high) {
            int mid = getMiddle(arrayInt, low, high);
            quickSort(arrayInt, low, mid - 1);  // sort the part smaller than the pivot
            quickSort(arrayInt, mid + 1, high); // sort the part larger than the pivot
        }
    }

    public static void main(String[] args) {
        QuickSort qs = new QuickSort();
        int[] testArray = {3, 2, 7, 10};
        qs.quickSort(testArray, 0, testArray.length - 1);
        for (int i = 0; i < testArray.length; i++) {
            System.out.print(testArray[i] + " ");
        }
    }
}
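Running the main method sorts {3, 2, 7, 10} in place and prints: 2 3 7 10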

3. Algorithm analysis:

Average time complexity: O(n log n). Worst-case time complexity: O(n^2). Auxiliary space: O(log n), since each level of recursion uses a constant amount of extra space and there are about log n levels.
Each pass splits the sequence into two parts, so there are about log n levels of recursion, and the partitioning work at each level totals O(n); together this gives O(n log n).
On average, i.e. assuming all input orderings are equally likely, the complexity is O(n log n): a single element ends up in its correct position after about log n moves.
The worst case is O(n^2). It occurs, for example, when the array is already in reverse order, so that the pivot chosen on every pass is the maximum or minimum of the remaining elements.
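A rough way to see both bounds: one partition pass over a sub-array of length n costs about n comparisons, after which the two sides are sorted independently. If the pivot lands near the middle each time, the cost satisfies roughly T(n) = 2T(n/2) + n, which solves to O(n log n). If the pivot is always the smallest or largest remaining element, one side is empty and the recurrence degenerates to T(n) = T(n - 1) + n, i.e. n + (n - 1) + ... + 1 = n(n + 1)/2, which is O(n^2).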

Stability: unstable.
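For example, run the partition routine above on the keys {3a, 3b, 1}, where 3a and 3b are equal keys tagged only to track their original order, and 3a is the pivot. The 1 at the high end fills the hole at index 0, the low scan then moves past both the 1 and 3b, and the pivot 3a is finally dropped at index 2, giving {1, 3b, 3a}: the two equal keys have exchanged their relative order, which is why quicksort is not stable.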

4. Supplementary explanation:

1. Time complexity
(1) Time frequency. The time an algorithm takes cannot, in general, be computed theoretically; it can only be known by running the algorithm on a computer and measuring it. But we neither can nor need to test every algorithm: it is enough to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of statement executions it performs, so the algorithm that executes more statements takes more time. The number of times the statements in an algorithm are executed is called its statement frequency, or time frequency, written T(n).
(2) Time complexity. In the time frequency T(n), n is the problem size, and T(n) changes as n changes. What we usually want to know is the pattern of that change, and for this the concept of time complexity is introduced. In general, the number of times the basic operation of an algorithm is repeated is a function T(n) of the problem size n. If there is an auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is of the same order of magnitude as T(n); we write T(n) = O(f(n)) and call O(f(n)) the asymptotic time complexity of the algorithm, or simply its time complexity.
If the number of statement executions in an algorithm is a constant, its time complexity is O(1). Different time frequencies may also have the same time complexity: for example, T(n) = n^2 + 3n + 4 and T(n) = 4n^2 + 2n + 1 have different frequencies but the same complexity, O(n^2). In increasing order of magnitude, the common time complexities are: constant O(1), logarithmic O(log2 n), linear O(n), linearithmic O(n log2 n), quadratic O(n^2), cubic O(n^3), ..., k-th power O(n^k), and exponential O(2^n). As the problem size n grows, a higher time complexity means a less efficient algorithm.
(3) Asymptotic time complexity. The time performance of an algorithm is evaluated mainly by the order of magnitude of its time complexity, that is, by its asymptotic time complexity.
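As a small illustration of counting statement frequency (the method below is hypothetical, written only for this example): the innermost statement runs n * n times, so T(n) is n^2 plus some lower-order loop-control work, and the time complexity is O(n^2).

public class FrequencyExample {
    // Basic operation: the addition inside the inner loop.
    // It executes n * n times, so T(n) = n^2 + (lower-order terms) = O(n^2).
    public static long sumOfProducts(int[] a) {
        int n = a.length;
        long sum = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                sum += (long) a[i] * a[j]; // executed n * n times
            }
        }
        return sum;
    }
}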

2. Space complexity. Similar to the discussion of time complexity, the space complexity S(n) of an algorithm is defined as the storage the algorithm consumes, and it is likewise a function of the problem size n, written S(n) = O(f(n)). Asymptotic space complexity is often referred to simply as space complexity, and what is counted is normally the auxiliary storage used beyond the ordinary memory overhead; the analysis parallels that of time complexity.
Space complexity measures the storage an algorithm temporarily occupies while it runs. The memory an algorithm occupies consists of three parts: the space taken by the algorithm itself, the space taken by its input and output data, and the space occupied temporarily during execution. The space taken by the input and output data is determined by the problem being solved and is passed in through the parameter list of the calling function; it does not vary with the algorithm. The space taken by the algorithm itself is proportional to the length of its code; to compress this part one must write a shorter algorithm. The temporary working space varies from algorithm to algorithm: some algorithms need only a small, constant number of working units that do not grow with the problem size (such algorithms are called in-place and are the memory-saving ones), while others need a number of temporary working units that grows with the problem size n, so that for large n they occupy considerably more storage; quicksort and merge sort are examples of the latter.

If the space an algorithm needs is a constant, i.e. it does not change with the size n of the data being processed, its space complexity can be written O(1). When the space required is proportional to the base-2 logarithm of n, the space complexity is O(log2 n); when it grows linearly with n, it is O(n). If a formal parameter is an array, only enough space to hold the address pointer passed in from the actual argument (one machine word) needs to be allocated for it; if a formal parameter is a reference, only enough space to store the address of the corresponding argument variable is needed, so that the system can access the argument variable indirectly through it.
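To make quicksort's O(log n) auxiliary space concrete, here is a minimal sketch (not from the original article) of a variant that manages the sub-array ranges itself. It pushes the larger half onto an explicit stack and keeps working on the smaller half, so the stack never holds more than about log2 n pending ranges; the recursive version above gets the same bound on average from its call stack.

import java.util.ArrayDeque;
import java.util.Deque;

public class QuickSortIterative {
    public static void quickSort(int[] a) {
        // Each stack entry is a pending {low, high} range.
        Deque<int[]> stack = new ArrayDeque<>();
        stack.push(new int[] {0, a.length - 1});
        while (!stack.isEmpty()) {
            int[] range = stack.pop();
            int low = range[0], high = range[1];
            while (low < high) {
                int mid = partition(a, low, high);
                // Defer the larger half and keep partitioning the smaller one,
                // so the stack depth stays O(log n).
                if (mid - low < high - mid) {
                    stack.push(new int[] {mid + 1, high});
                    high = mid - 1;
                } else {
                    stack.push(new int[] {low, mid - 1});
                    low = mid + 1;
                }
            }
        }
    }

    // Same fill-the-hole partition as getMiddle above.
    private static int partition(int[] a, int low, int high) {
        int tmp = a[low];
        while (low < high) {
            while (low < high && tmp <= a[high]) high--;
            a[low] = a[high];
            while (low < high && a[low] <= tmp) low++;
            a[high] = a[low];
        }
        a[low] = tmp;
        return low;
    }
}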

