On Time Complexity

1. About time complexity

In computer science, the time complexity of an algorithm is a function that quantitatively describes the algorithm's running time, expressed as a function of the length of the string representing the algorithm's input. Time complexity is commonly written in big O notation, which discards lower-order terms and leading coefficients. Expressed this way, the time complexity is called asymptotic: it describes the behavior as the input size tends to infinity. For example, if an algorithm needs at most 5n^3 + 3n time on any input of size n, then its asymptotic time complexity is O(n^3).
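To see why lower-order terms and coefficients are discarded, here is a minimal sketch (the step-count function T(n) = 5n^3 + 3n is a hypothetical example, not taken from any particular algorithm): as n grows, the ratio T(n) / n^3 approaches the constant 5, so only the n^3 term determines the growth class.

```javascript
// Hypothetical step count for illustration: T(n) = 5n^3 + 3n.
function steps(n) {
    return 5 * Math.pow(n, 3) + 3 * n;
}

// As n grows, steps(n) / n^3 approaches the constant 5, so the
// lower-order term 3n and the coefficient 5 do not affect the
// growth class: T(n) = O(n^3).
for (const n of [10, 100, 1000]) {
    console.log(n, steps(n) / Math.pow(n, 3));
}
```

The printed ratios settle toward 5, which is exactly the information big O notation throws away.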

2. Common time complexity list

| Name | Complexity class | Running time T(n) | Examples of running times | Example algorithms |
|---|---|---|---|---|
| Constant time | | O(1) | 10 | Determining the parity of a binary number |
| Inverse Ackermann time | | O(α(n)) | | Amortized time per operation of a disjoint-set (union-find) structure |
| Iterated logarithmic time | | O(log* n) | | En:Cole-Vishkin algorithm (distributed coloring of cycles) |
| Log-logarithmic time | | O(log log n) | | Single operation of a bounded priority queue [1] |
| Logarithmic time | DLOGTIME | O(log n) | log n | Binary search |
| Polylogarithmic time | | poly(log n) | (log n)^2 | |
| Fractional power time (exponent less than 1) | | O(n^c), where 0 < c < 1 | n^(1/2), n^(2/3) | Search operations in k-d trees |
| Linear time | | O(n) | n | Searching an unsorted array |
| "n log-star n" time | | O(n log* n) | | Raimund Seidel's polygon triangulation algorithm |
| Linearithmic time | | O(n log n) | n log n | Fastest possible comparison sort |
| Quadratic time | | O(n^2) | n^2 | Bubble sort, insertion sort |
| Cubic time | | O(n^3) | n^3 | Naive matrix multiplication; computing partial correlations |
| Polynomial time | P | poly(n) = 2^O(log n) | n^2, n^3 | En:Karmarkar's algorithm for linear programming; AKS primality test |
| Quasi-polynomial time | QP | 2^poly(log n) | n^(log log n) | Best-known approximation algorithm for the directed Steiner tree problem |
| Sub-exponential time (first definition) | SUBEXP | O(2^(n^ε)) for every ε > 0 | | Assuming complexity-theoretic conjectures, BPP is contained in SUBEXP [2] |
| Sub-exponential time (second definition) | | 2^o(n) | 2^(n^(1/3)) | Best-known algorithms for integer factorization and graph isomorphism |
| Exponential time (with linear exponent) | E | 2^O(n) | 1.1^n, 10^n | Solving the travelling salesman problem with dynamic programming |
| Factorial time | | O(n!) | n! | Solving the travelling salesman problem by brute-force search |
| Exponential time | EXPTIME | 2^poly(n) | 2^n, 2^(n^2) | |
| Double exponential time | 2-EXPTIME | 2^(2^poly(n)) | 2^(2^n) | Deciding the truth of a given statement in Presburger arithmetic |

3. Examples of complexity

* O(1): constant complexity, meaning the program's running time is independent of the size of the input. Simple operations such as comparison, addition, and subtraction are usually treated as constant time. Note that when dealing with big numbers (longer than 32 bits in binary, or more than about 8 digits in decimal), operations such as addition and subtraction are no longer constant time.

* O(log n): converting a decimal integer n to binary notation

* O(n): determining whether an element belongs to a set/list of size n; finding the maximum of n numbers

* O(n log n): quicksort (average case)

* O(n^2): the most straightforward sorting methods that compare every pair of elements need n(n-1)/2 comparisons, so they are O(n^2)

* O(2^n): exhaustively enumerating all 0-1 strings of length n

* O(n!): exhaustive computations such as enumerating all permutations of n elements
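To make two of these classes concrete, here is a minimal JavaScript sketch (the function names `maxOf` and `selectionSort` are chosen for this example, not taken from the text): finding the maximum is a single O(n) pass, while a straightforward pairwise-comparison sort such as selection sort performs n(n-1)/2 comparisons, i.e. O(n^2).

```javascript
// O(n): one pass over the array to find the maximum of n numbers.
function maxOf(arr) {
    let max = arr[0];
    for (let i = 1; i < arr.length; i++) {
        if (arr[i] > max) max = arr[i];
    }
    return max;
}

// O(n^2): selection-style sort comparing every pair of elements,
// n(n-1)/2 comparisons in total.
function selectionSort(arr) {
    const a = arr.slice(); // copy, leave the input untouched
    for (let i = 0; i < a.length - 1; i++) {
        for (let j = i + 1; j < a.length; j++) {
            if (a[j] < a[i]) [a[i], a[j]] = [a[j], a[i]];
        }
    }
    return a;
}

console.log(maxOf([3, 1, 4, 1, 5]));         // 5
console.log(selectionSort([3, 1, 4, 1, 5])); // [1, 1, 3, 4, 5]
```

Doubling the input roughly doubles the work of `maxOf` but quadruples the comparisons made by `selectionSort`, which is the practical difference between the two classes.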

4. Logarithmic time

If an algorithm has T(n) = O(log n), it is said to run in logarithmic time. Since computers use a binary number system, the logarithm is usually taken base 2 (i.e. log2 n, sometimes written lg n). However, by the change-of-base formula, log_a n and log_b n differ only by a constant factor, and that factor is discarded in big O notation. So O(log n), regardless of the base of the logarithm, is the standard notation for logarithmic-time algorithms.
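The change-of-base formula log_a n = log_b n / log_b a can be checked numerically; the following sketch shows that the ratio of base-2 to base-10 logarithms is the same constant for every n, which is exactly why the base can be dropped in big O notation.

```javascript
// Change of base: log_a(n) = log_b(n) / log_b(a), so logarithms in any
// two bases differ only by a constant factor.
const log2 = n => Math.log(n) / Math.log(2);
const log10 = n => Math.log(n) / Math.log(10);

// The ratio log2(n) / log10(n) equals log(10)/log(2), about 3.32,
// for every n -- a constant, so big O notation can ignore the base.
for (const n of [8, 1024, 1e6]) {
    console.log(n, log2(n) / log10(n));
}
```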

Common logarithmic-time algorithms include operations on binary trees and binary search.
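Binary search is the classic example: each comparison halves the remaining search range, so a sorted array of n elements needs at most about log2(n) + 1 comparisons. A minimal sketch (the function name `binarySearch` is chosen for this example):

```javascript
// Binary search on a sorted array: each comparison halves the
// remaining range, so the running time is O(log n).
function binarySearch(sorted, target) {
    let lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        const mid = Math.floor((lo + hi) / 2);
        if (sorted[mid] === target) return mid; // found
        if (sorted[mid] < target) lo = mid + 1; // discard left half
        else hi = mid - 1;                      // discard right half
    }
    return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // 3
console.log(binarySearch([1, 3, 5, 7, 9, 11], 4)); // -1
```

Note that the input must already be sorted; on an unsorted array only the O(n) linear scan from section 3 applies.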

Logarithmic-time algorithms are very efficient because the extra computation needed per additional input element shrinks as the input grows.

A simple example of this class is a function that recursively cuts a string in half and prints one half. It takes O(log n) time because the string is halved before each output; this means that to get one more line of output, the string length must double.

```javascript
// Recursively output the right half of a string
var right = function (str) {
    var length = str.length;
    // helper function
    var help = function (index) {
        // recursive case: output the right half
        if (index < length) {
            // output the portion of the string from index to the end
            console.log(str.substring(index, length));
            // recursive call: pass the right half as the argument
            help(Math.ceil((length + index) / 2));
        }
        // base case: do nothing
    };
    help(0);
};
```

This article is based on the description of "time complexity" on Wikipedia.
