[C++] Time Complexity and Space Complexity in C++

Source: Internet
Author: User
Time complexity and space complexity

Time complexity is a common tool in algorithm analysis: it gives a mathematical way to analyze how efficiently an algorithm executes. This article discusses how to calculate time complexity and gives corresponding examples. Space complexity is the memory cost a program consumes. Together, these two measures determine a program's running efficiency and whether an algorithm is efficient.

Definition of Time Complexity

Time Frequency

The time an algorithm takes cannot, in theory, be calculated exactly; it can only be measured by running the program on a machine. But we neither can nor need to test every algorithm on a machine: we only need to know which algorithm takes more time and which takes less. The time an algorithm spends is proportional to the number of statement executions it performs; the algorithm that executes more statements takes more time. The number of statement executions in an algorithm is called its statement frequency or time frequency, written T(n).

Time Complexity

If the scale of a problem is n, the time an algorithm needs to solve the problem is T(n), a function of n; T(n) is called the "time complexity" of this algorithm.

As the input size n grows without bound, the limiting behavior of the time complexity is called the asymptotic time complexity of the algorithm.

Different time frequencies may share the same time complexity. For example, T(n) = n^2 + 3n + 4 and T(n) = 4n^2 + 2n + 1 have different frequencies but the same time complexity: both are O(n^2).

In increasing order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log n), linear order O(n), linearithmic order O(n log n), quadratic order O(n^2), cubic order O(n^3), ..., polynomial order O(n^k), and exponential order O(2^n). As the problem scale n grows, a higher time complexity means the algorithm's execution efficiency is lower.

Big O Notation

We usually use big O notation to express the time complexity of an algorithm. Note that big O denotes only an upper bound: by definition, if f(n) = O(n), then it is also true that f(n) = O(n^2). The notation gives you an upper bound, not necessarily the least upper bound, but by convention people write the tightest familiar one.

In addition, a problem itself has an inherent complexity. If an algorithm's complexity matches the lower bound of the problem's complexity, the algorithm is called an optimal algorithm.

"Big O notation": the basic parameter in this description is n, the size of the problem instance, and complexity or running time is expressed as a function of n. The "O" here denotes order of magnitude; for example, "binary search is O(log n)" means it retrieves an array of n elements in about log n steps. The notation O(f(n)) indicates that as n increases, the running time grows at most in proportion to f(n).
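As a sketch of that example, a classic binary search over a sorted array halves the remaining range each iteration, so it runs in O(log n) time (the helper name is ours, for illustration):

```cpp
#include <vector>

// Binary search over a sorted vector. Each iteration halves the search
// range, so at most about log2(n) + 1 iterations run: O(log n) time.
int binary_search_idx(const std::vector<int>& a, int target) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;       // avoids overflow of lo + hi
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;  // discard the left half
        else hi = mid - 1;                  // discard the right half
    }
    return -1; // not found
}
```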

This kind of asymptotic estimate is very valuable for theoretical analysis and rough comparison of algorithms, but in practice the details can still make a difference. For example, an O(n^2) algorithm with low overhead may run faster than a high-overhead O(n log n) algorithm when n is small. Of course, once n is large enough, the algorithm whose cost function grows more slowly must be faster.

Worst-Case and Average-Case Time

The time complexity in the worst case is called the worst-case time complexity. In general, unless stated otherwise, the time complexity under discussion is the worst-case time complexity. The reason is that the worst-case time complexity is an upper bound on the algorithm's running time over any input instance, which guarantees the algorithm never runs longer than that.
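As a hypothetical sketch of the worst-case versus average-case distinction, consider linear search: in the worst case (target absent, or in the last slot) it makes n comparisons, while with the target equally likely at any position the expected count is about (n + 1)/2; both are O(n):

```cpp
#include <vector>

// Linear search: the worst case scans all n elements, so T(n) = O(n).
// If the target is equally likely to be at any position, the expected
// number of comparisons is (n + 1) / 2, which is still O(n) on average.
int linear_search(const std::vector<int>& a, int target) {
    for (int i = 0; i < static_cast<int>(a.size()); i++) {
        if (a[i] == target) return i; // best case: found immediately, O(1)
    }
    return -1; // worst case: n comparisons made
}
```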

For example, if the worst-case time complexity is T(n) = O(n), then for any input instance the algorithm's running time cannot exceed O(n). The average time complexity is the expected running time of the algorithm when all possible input instances occur with equal probability.

Calculation Method

The specific steps to solve the algorithm's time complexity are:

⑴ Find the basic statement in the algorithm.

The basic statement is the statement executed most often in the algorithm, usually the body of the innermost loop.

⑵ Calculate the order of magnitude of the number of times the basic statement executes.

Only the order of magnitude of the basic statement's execution count is needed: as long as the highest power of n in the counting function is correct, all lower-order terms and the coefficient of the highest-order term can be ignored. For example, if the basic statement executes 2n^2 + 3n + 1 times, the time complexity is O(n^2). This simplifies the analysis and focuses attention on the most important point: the growth rate.

⑶ Use big O notation to express the algorithm's time performance.

Substitute the order of magnitude of the basic statement's execution count into big O notation.

Concrete Examples

O(1)

Constant order: the time complexity does not grow as the size n increases. If an algorithm's execution time does not grow with the problem scale n, then even if the algorithm contains thousands of statements, its execution time is just a large constant.

int x = 91;
int y = 100;
while (y > 0) {
    if (x > 100) {
        x = x - 10;
        y--;
    } else {
        x++;
    }
}

T(n) = O(1)

This program looks a bit scary: the loop runs about 1,100 times in total, but do you see an n anywhere? The running time of this program has nothing to do with n; even if it looped for ten thousand years we would not care, because it is just a constant-order function.

O(n)

First-order time complexity: the time spent grows linearly with the scale n.

int n;
cin >> n;
while (n--) {
    cout << n << endl;
}
O(n^3)

A cubic relationship with the scale n (and similarly for higher orders). For example, naive matrix multiplication is O(n^3).

int total = 0;
int n;
cin >> n;
for (int i = 0; i != n; i++) {
    for (int j = 0; j != i; j++) {
        for (int k = 0; k != j; k++) {
            total++;
        }
    }
}

Although the number of executions of the inner loop is not directly tied to the problem scale n, it depends on the loop variables of the outer loops, and the outermost loop is directly tied to n. So we can analyze statement by statement from the innermost layer outward: the time complexity of this program segment is T(n) = O(n^3/6 + lower-order terms) = O(n^3).
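A small sketch can confirm that count: the innermost statement runs C(n, 3) = n(n-1)(n-2)/6 times, whose leading term is n^3/6, hence O(n^3) (the function name is ours, for illustration):

```cpp
// Counts how many times the innermost statement of the triple loop runs.
// The total equals C(n, 3) = n(n-1)(n-2)/6, so the leading term is n^3/6
// and the time complexity is O(n^3).
long long triple_loop_count(int n) {
    long long total = 0;
    for (int i = 0; i != n; i++)
        for (int j = 0; j != i; j++)
            for (int k = 0; k != j; k++)
                total++;
    return total;
}
```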

The common algorithm time complexities, ordered from small to large, are:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < ... < O(2^n) < O(n!)

Analysis Rules

When calculating the time complexity of an algorithm, there are several simple program-analysis rules:

1. Simple input/output statements and assignment statements can be taken to require O(1) time.

2. For a sequential structure, the time to execute a series of statements one after another is obtained using the big-O "sum rule".

3. Sum rule: if two parts of an algorithm have time complexities T1(n) = O(f(n)) and T2(n) = O(g(n)), then T1(n) + T2(n) = O(max(f(n), g(n))).

In particular, if T1(m) = O(f(m)) and T2(n) = O(g(n)), then T1(m) + T2(n) = O(f(m) + g(n)).

4. For a selection structure such as an if statement, the main time cost is executing the then or else branch; note that evaluating the condition also takes O(1) time.

5. For a loop structure, the running time is mainly the cost of executing the loop body over many iterations plus the cost of checking the loop condition; in general this is handled with the big-O "multiplication rule".

Multiplication rule: if two parts of an algorithm have time complexities T1(n) = O(f(n)) and T2(n) = O(g(n)), then T1(n)*T2(n) = O(f(n)*g(n)).

6. For a complex algorithm, divide it into several parts that are easy to estimate, then combine them with the sum rule and the multiplication rule to obtain the time complexity of the whole algorithm.
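The sum rule and multiplication rule above can be sketched in one hypothetical function (the name and operation counts are ours, for illustration):

```cpp
// Demonstrates the sum and multiplication rules on operation counts.
long long rules_demo(int n) {
    long long ops = 0;
    // Part 1: a single loop, T1(n) = O(n).
    for (int i = 0; i < n; i++)
        ops++;
    // Part 2: nested loops; by the multiplication rule the inner body
    // runs n * n times, so T2(n) = O(n) * O(n) = O(n^2).
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops++;
    // By the sum rule, T1(n) + T2(n) = O(max(n, n^2)) = O(n^2).
    return ops; // n + n^2 operations in total
}
```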

There are also the following two simplification rules:

(1) If g(n) = O(f(n)), then O(f(n)) + O(g(n)) = O(f(n)).

(2) O(C*f(n)) = O(f(n)), where C is a positive constant.

Space Complexity

The space complexity of a program is the amount of memory required to run it. With the space complexity, you can estimate in advance how much memory the program will need. Besides the storage for its instructions, constants, variables, and input data, a program also needs working storage for operating on the data and some auxiliary space for the information required during the actual computation. The storage required for program execution consists of the following two parts.

(1) Fixed part. The size of this part is independent of the number and values of the input/output data. It mainly includes the instruction space (i.e., code space) and the data space for constants and simple variables. This part is static.

(2) Variable part. This part mainly includes dynamically allocated space and the space required by the recursion stack. Its size depends on the algorithm.

If the storage space required by an algorithm is f(n), then S(n) = O(f(n)), where n is the scale of the problem and S(n) denotes the space complexity.
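As a hypothetical sketch of the two parts, compare two ways of summing 1..n: the recursive version needs one stack frame per call, so its variable part is O(n), while the iterative version uses a fixed set of variables, O(1):

```cpp
// Recursive sum: recursion depth is n, so the stack (variable part)
// grows linearly with n -> S(n) = O(n).
long long sum_recursive(int n) {
    if (n == 0) return 0;
    return n + sum_recursive(n - 1);
}

// Iterative sum: a fixed number of variables regardless of n -> S(n) = O(1).
long long sum_iterative(int n) {
    long long s = 0;
    for (int i = 1; i <= n; i++) s += i;
    return s;
}
```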
