**1. Space complexity**

The space complexity of a program is the amount of memory it needs to run. Knowing it lets you estimate in advance how much memory a program will require. Besides the space for the program itself (its instructions, constants, variables, and input data), a program also needs working storage for manipulating data and auxiliary space for holding intermediate results of the computation. The storage required during execution consists of the following two parts.

(1) Fixed part. The size of this part is independent of the amount of input/output data. It mainly includes the instruction space (i.e., code space) and the data space for constants and simple variables. This part is static.

(2) Variable part. This part mainly includes dynamically allocated space and the space needed for the recursion stack; its size depends on the algorithm.

If the storage space required by an algorithm is some function f(n) of the problem size n, the space complexity is written S(n) = O(f(n)).

**2. Time complexity**

(1) Time frequency. The time an algorithm takes cannot, in general, be computed theoretically; it has to be measured by running the program on a machine. But we neither can nor need to test every algorithm; we only need to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of times its statements are executed: the algorithm whose statements execute more times takes more time. The number of times a statement is executed in an algorithm is called the statement frequency, or time frequency, written T(n). (**The basic operations of an algorithm generally mean the statements in its innermost loop.**)

(2) Time complexity. In the time frequency just defined, n is called the problem size, and as n varies, the time frequency T(n) varies with it. We often want to know the pattern of this variation, and for that purpose we introduce the concept of time complexity.

**If the frequency T(n) of the basic operation repeated in the algorithm is some function f(n) of the problem size n, the time complexity is written T(n) = O(f(n)).** Different time frequencies can share the same time complexity. For example, T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 have different frequencies but the same time complexity: both are O(n²). In increasing order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log₂n), linear order O(n), linear-logarithmic order O(n·log₂n), square order O(n²), cubic order O(n³), ..., k-th power order O(nᵏ), and exponential order O(2ⁿ). As the problem size n increases, the time complexity grows and the algorithm becomes less efficient.

(3) Worst-case and average time complexity. The time complexity in the worst case is called the

**worst-case time complexity**. Unless stated otherwise, the time complexity under discussion is the worst-case time complexity. The reason is that the worst-case time complexity is an upper bound on the algorithm's running time over all input instances: it guarantees that the algorithm never runs longer than this bound. For example, a worst-case time complexity of T(n) = O(n) means that for any input instance the running time is at most of order n. The average time complexity is the expected running time of the algorithm when all possible input instances occur with equal probability. As for the exponential order O(2ⁿ): algorithms of exponential time complexity are extremely inefficient and become unusable as soon as n grows even slightly large.

(4) Time complexity examples.

[1] If the execution time of an algorithm does not grow as the problem size n increases, then even if the algorithm contains thousands of statements, its execution time is only some large constant, and its time complexity is O(1).

```c
x = 91; y = 100;
while (y > 0)
    if (x > 100) { x = x - 10; y--; }
    else x++;
```

Here T(n) = O(1). The program looks a little scary and its loop body runs about 1,100 times in total, but do we see n anywhere? No. The running time of this program is independent of n; even if it looped for another ten thousand years we would not care, because it is still just a constant-order function.

[2] When there are several loop statements, the time complexity of the algorithm is determined by the frequency f(n) of the innermost statement of the most deeply nested loop.

```c
x = 1;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        for (k = 1; k <= j; k++)
            x++;
```

The innermost statement x++ executes f(n) = n(n+1)(n+2)/6 times, so T(n) = O(n³).

[3] The time complexity of an algorithm depends not only on the problem size but also on the initial state of the input instance. An algorithm that searches for a given value k in the array a[0..n-1] is roughly:

```c
i = n - 1;                      /* (1) */
while (i >= 0 && a[i] != k)     /* (2) */
    i--;                        /* (3) */
return i;
```

The frequency of statement (3) depends not only on the problem size n but also on the element values of a and the value of k in the input instance: ① if no element of a equals k, the frequency of statement (3) is f(n) = n; ② if the last element of a equals k, the frequency of statement (3) is the constant 0.

(5) Evaluating performance by time complexity. Suppose two algorithms A1 and A2 solve the same problem with time complexities T1(n) = 100n² and T2(n) = 5n³.

(1) When the input size is n < 20, T1(n) > T2(n), so A2 takes less time.

(2) As the problem size n grows, the ratio of the two running times, 5n³ / 100n² = n/20, grows with n. That is, when the problem size is large, algorithm A1 is more efficient than algorithm A2.

Their asymptotic time complexities O(n²) and O(n³) evaluate the time quality of the two algorithms at a macroscopic level. In algorithm analysis, the time complexity and the asymptotic time complexity of an algorithm are often not distinguished; the asymptotic time complexity T(n) = O(f(n)) is simply called the time complexity, where f(n) is generally the frequency of the most frequently executed statement in the algorithm.
