Time complexity and space complexity of data structures and algorithms

Source: Internet
Author: User

Objective

In the previous article, "Data structures and algorithms", I introduced the basic concepts of data structures, and explained that data structures can generally be divided into logical structures and physical structures. Logical structures are divided into set structures, linear structures, tree structures, and graph structures; physical structures are divided into sequential storage structures and chained storage structures. The characteristics of each were described. I then introduced the concept of an algorithm and its five basic properties: input, output, finiteness, determinacy, and feasibility. Finally, I stated that a good algorithm should satisfy correctness, readability, robustness, high time efficiency, and low storage requirements. Time efficiency and storage requirements are, in fact, exactly time complexity and space complexity. In this article we will focus on these two "complexities". In real development, time complexity is particularly important, so we will not go into too much detail on space complexity.

Time complexity and space complexity are the measures of an algorithm's efficiency. In other words, how good an algorithm is depends on its time complexity and space complexity.

Asymptotic growth of functions: given two functions f(n) and g(n), if there exists an integer N such that f(n) is always greater than g(n) for all n > N, then we say that f(n) grows asymptotically faster than g(n).

In other words, if there is a critical value beyond which f(n) > g(n) always holds, we say that "f(n) grows asymptotically faster than g(n)".

Here I use the growth curves of three functions to illustrate the idea:

Function One: X = 3*n

Function Two: Y = 2*n*n

Function Three: Z = 2*n*n + 3*n

When n = 1, Y < X < Z.

When n = 2, X < Y < Z.

So there is a critical value, somewhere between 1 and 2, beyond which X < Y < Z always holds. We say that Y grows asymptotically faster than X, that Z grows asymptotically faster than Y, and of course, by transitivity, that Z grows asymptotically faster than X.

Definition

Definition of algorithm time complexity: when analyzing an algorithm, the total number of statement executions T(n) is a function of the problem size n. We analyze how T(n) changes with n and determine the order of magnitude of T(n).

The time complexity of an algorithm, i.e. its time measure, is written: T(n) = O(f(n)).

This means that as the problem size n increases, the growth rate of the algorithm's execution time is the same as the growth rate of f(n). This is called the asymptotic time complexity of the algorithm, or time complexity for short, where f(n) is some function of the problem size n.

How to calculate time complexity

1. Replace all additive constants in the running-time count with the constant 1.

2. In the modified count function, keep only the highest-order term.

3. If the highest-order term exists and its coefficient is not 1, remove that coefficient.

The result is the time complexity in big-O notation.

Common time complexities, listed in increasing order of magnitude, are: constant order O(1), logarithmic order O(log n), linear order O(n), linearithmic order O(n log n), square order O(n^2), cubic order O(n^3), ..., k-th order O(n^k), and exponential order O(2^n). As the problem size n increases, the time complexity grows and the efficiency of the algorithm decreases. That is, the time consumed, from small to large, is: O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n).

Constant order

// constant order
int n = 0;
printf("Onesong");
printf("Onesong");
printf("Onesong");
printf("Onesong");
printf("Onesong");
printf("Onesong");
printf("Onesong");
printf("Onesong");
printf("Onesong");

The time complexity of the code above is O(1), because by the definition of time complexity the number of executed statements is a constant, unrelated to the problem size n. Of course, the same conclusion follows from rule 1 of the calculation method above.

Linear order
// linear order
int i, sum = 0, n = 100;
for (i = 0; i < n; i++) {
    sum = sum + i;
}

The time complexity of the code above is O(n), because the number of executions grows as n grows, and the growth is linear.

Square Order
// square order
int i, j, n = 100;
for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++) {
        printf("Onesong");
    }
}
In the code above, the outer loop executes n times, and for each single pass of the outer loop, the inner loop executes n times. So to get out of both loops, the program must execute the inner statement n*n times in total, i.e. n squared. The time complexity of this code is therefore O(n^2).

Logarithmic order
// logarithmic order
int i = 1, n = 100;
while (i < n) {
    i = i * 2;
}

Each time i is multiplied by 2, it gets one step closer to n; once i, multiplied by 2 some number of times x, becomes greater than or equal to n, the loop exits. From 2^x = n we get x = log2(n), so the time complexity of this loop is O(log n).

The space complexity of the algorithm

The space complexity of an algorithm is obtained by calculating the storage space the algorithm requires. The formula is: S(n) = O(f(n)), where n is the problem size and f(n) is a function describing the storage space the statements occupy with respect to n.
During development, when we mention "complexity" without further qualification, we mean time complexity. The speed of hardware development now makes it possible for us to worry less about an algorithm's memory use, and we usually trade space for time. In addition, the space complexity of an algorithm is harder to calculate. So, whether in exams or in project development, we focus on time complexity, and space complexity is usually skipped over.

The pictures are referenced from Fish C Studio. Thanks to Fish C Studio for contributing such good pictures.
Unless otherwise stated, all of this author's articles are original. If you like this article, please indicate the source when reprinting. If you are interested in data structures, please follow me; a large number of quality articles will follow for your reference!

PS: This article is also updated synchronously on Jianshu, where you can follow me as well; more articles will follow!

Jianshu address: Http://www.jianshu.com/users/93131dfba96a/latest_articles

