Algorithm complexity refers to the resources an algorithm needs at run time once it has been written as an executable program. These resources include time resources and memory (space) resources.
The same problem can be solved by different algorithms, and the quality of an algorithm affects the efficiency of the algorithm and even of the program as a whole. The purpose of algorithm analysis is to select a suitable algorithm and to improve existing ones. An algorithm is evaluated mainly in terms of its time complexity and its space complexity.
Time complexity

(1) Time frequency. The time an algorithm takes to execute cannot be computed theoretically; it can only be found by running the algorithm on a machine and measuring it. But we cannot, and need not, test every algorithm in this way; it is enough to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of times its statements are executed: the algorithm whose statements are executed more often takes more time. The number of times the statements in an algorithm are executed is called its statement frequency or time frequency, written T(n). The time complexity of an algorithm refers to the computational work required to execute it.

(2) Time complexity. In the time frequency just defined, n is called the problem size, and as n changes, the time frequency T(n) changes with it. Often, however, we want to know the pattern by which T(n) changes, and for this we introduce the concept of time complexity. In general, the number of times the basic operations of an algorithm are repeated is a function of the problem size n, written T(n). If there is an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be a function of the same order of magnitude as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or simply its time complexity. If an algorithm executes only a constant number of statements, its time complexity is O(1). Algorithms with different time frequencies can have the same time complexity: for example, T(n) = n^2 + 3n + 4 and T(n) = 4n^2 + 2n + 1 have different frequencies but the same time complexity, O(n^2); in the second case, for instance, the limit of (4n^2 + 2n + 1)/n^2 as n approaches infinity is 4, a nonzero constant. In increasing order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log2 n) (the base-2 logarithm of n), linear order O(n), linear-logarithmic order O(n log2 n), quadratic order O(n^2), cubic order O(n^3), ..., k-th power order O(n^k), and exponential order O(2^n). As the problem size n increases, a higher time complexity means a less efficient algorithm.

Time performance analysis of the algorithm

(1) The time spent by an algorithm and statement frequency. The time spent by an algorithm = the sum of the execution times of all statements in the algorithm, and the execution time of a statement = (the number of times the statement is executed, i.e. its frequency (frequency count)) × (the time required to execute the statement once). Once the algorithm is turned into a program, the time required to execute each statement once depends on factors that are hard to pin down, such as the machine's instruction performance and speed and the quality of the code produced by the compiler. To analyze the time cost of an algorithm independently of the machine's hardware and software, we take the time required to execute a statement once as one unit of time, so that the time spent by an algorithm is the sum of the frequencies of all its statements.
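As a minimal warm-up sketch before the matrix example that follows (the code and the function name sum below are illustrative assumptions, not part of the original text), applying this counting rule to a single loop that sums an array of length n gives:

    /* Hypothetical example: summing an array of length n.
       The comment on each line gives that statement's frequency. */
    int sum(int a[], int n)
    {
        int s = 0;                       /* 1                                   */
        for (int i = 0; i < n; i++)      /* n+1 (the loop test runs once extra) */
            s += a[i];                   /* n                                   */
        return s;                        /* 1                                   */
    }

Summing the frequencies gives T(n) = 1 + (n + 1) + n + 1 = 2n + 3, so this segment is O(n).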
As a fuller example, the product C = A×B of two matrices of order n is computed by the following algorithm, in which the right-hand comment on each numbered statement gives its frequency:

    #define n 100                    /* n can be defined as needed; assumed here to be 100 */
    void MatrixMultiply(int a[n][n], int b[n][n], int c[n][n])
    {                                /* the right-hand column gives each statement's frequency */
        int i, j, k;
    (1)   for (i = 0; i < n; i++)                               /* n+1      */
    (2)     for (j = 0; j < n; j++) {                           /* n(n+1)   */
    (3)       c[i][j] = 0;                                      /* n^2      */
    (4)       for (k = 0; k < n; k++)                           /* n^2(n+1) */
    (5)         c[i][j] = c[i][j] + a[i][k] * b[k][j];          /* n^3      */
            }
    }

The sum of the frequencies of all the statements in the algorithm (i.e. the time spent by the algorithm) is

    T(n) = 2n^3 + 3n^2 + 2n + 1.    (1.1)

(A short instrumented sketch that checks this count numerically is given after this subsection.) Analysis: the loop control variable i of statement (1) counts up to n, and the loop terminates only when the test at i = n fails, so the frequency of statement (1) is n+1 even though its loop body executes only n times. Statement (2) lies in the body of loop (1) and is therefore reached n times, but on each of those passes its own test is executed n+1 times, so its frequency is n(n+1). Similarly, the frequencies of statements (3), (4) and (5) are n^2, n^2(n+1) and n^3, respectively. The time T(n) spent by the algorithm MatrixMultiply is thus a function of the order n of the matrices.

(2) Problem size and the time complexity of an algorithm. The amount of input to a problem is called the problem size, and it is generally expressed as an integer. The size of the matrix product problem is the order of the matrices; the size of a graph-theory problem is the number of vertices or edges of the graph. The time complexity (also called temporal complexity) T(n) of an algorithm is the time spent by the algorithm, expressed as a function of the size n of the problem it solves. When the problem size n tends to infinity, the order of magnitude of the time complexity T(n) is called the asymptotic time complexity of the algorithm. For MatrixMultiply, T(n) is given by formula (1.1); as n tends to infinity, clearly T(n) ~ O(n^3), which means that for sufficiently large n the ratio of T(n) to n^3 is a nonzero constant, i.e. T(n) and n^3 are of the same order of magnitude. We write T(n) = O(n^3), and O(n^3) is the asymptotic time complexity of the algorithm MatrixMultiply.

(3) Evaluating time performance by asymptotic time complexity. The time performance of an algorithm is evaluated mainly by the order of magnitude of its time complexity, i.e. by its asymptotic time complexity. The time complexity of MatrixMultiply is usually written T(n) = O(n^3), where f(n) = n^3 is the frequency of statement (5) in the algorithm. The following examples show how to find the time complexity of an algorithm.

Exchanging the contents of i and j:

    temp = i;  i = j;  j = temp;

Each of the three statements above has frequency 1, so the execution time of this program segment is a constant independent of the problem size n. The time complexity of such an algorithm is of constant order, written T(n) = O(1). Note: if the execution time of an algorithm does not grow with the problem size n, then even if the algorithm contains thousands of statements its execution time is only a (large) constant, and its time complexity is O(1).
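Returning to formula (1.1), the count can also be checked numerically. The sketch below (written for illustration; the function name count_statements is an assumption, not from the original text) increments a counter once for every loop test and once for every body statement of the triple loop in MatrixMultiply, and compares the total with 2n^3 + 3n^2 + 2n + 1:

    #include <stdio.h>

    /* Count every loop test and every body statement of the MatrixMultiply loops. */
    long long count_statements(int n)
    {
        long long t = 0;
        for (int i = 0; ; i++) { t++; if (!(i < n)) break;         /* test (1): n+1        */
            for (int j = 0; ; j++) { t++; if (!(j < n)) break;     /* test (2): n(n+1)     */
                t++;                                               /* statement (3): n^2   */
                for (int k = 0; ; k++) { t++; if (!(k < n)) break; /* test (4): n^2(n+1)   */
                    t++;                                           /* statement (5): n^3   */
                }
            }
        }
        return t;
    }

    int main(void)
    {
        for (int n = 1; n <= 100; n *= 10)
            printf("n=%3d  counted=%lld  formula=%lld\n",
                   n, count_statements(n), 2LL*n*n*n + 3LL*n*n + 2*n + 1);
        return 0;
    }

For n = 100 both the counter and formula (1.1) give 2,030,201, and as n grows the ratio of the count to n^3 approaches the constant 2, which is exactly the sense in which T(n) = O(n^3).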
Counting a variable, example one:

    (1)  x = 0;  y = 0;
    (2)  for (k = 1; k <= n; k++)
    (3)      x++;
    (4)  for (i = 1; i <= n; i++)
    (5)      for (j = 1; j <= n; j++)
    (6)          y++;

For a loop statement we need consider only the number of times the statements in the loop body are executed, ignoring the loop's own steps (initialization, the test against the final value, and the control transfer). The statement executed most often in the segment above is (6), with frequency f(n) = n^2, so the time complexity of the segment is T(n) = O(n^2). When several loop statements are nested, the time complexity of the algorithm is determined by the frequency f(n) of the innermost statement of the most deeply nested loop.

Counting a variable, example two:

    (1)  x = 1;
    (2)  for (i = 1; i <= n; i++)
    (3)      for (j = 1; j <= i; j++)
    (4)          for (k = 1; k <= j; k++)
    (5)              x++;

The statement executed most often in this segment is (5). The number of executions of the inner loops is not related directly to the problem size n but to the values of the outer loop variables, while the number of iterations of the outermost loop is related directly to n. Working from the inner loop outward, the number of executions of statement (5) is the sum of j over j = 1..i and i = 1..n, which is n(n+1)(n+2)/6, so the time complexity of the segment is T(n) = O(n^3/6 + lower-order terms) = O(n^3).

(4) The time complexity of an algorithm depends not only on the problem size but also on the initial state of the input instance. An algorithm that searches for a given value k in the array a[0..n-1] is roughly as follows:

    (1)  i = n - 1;
    (2)  while (i >= 0 && (a[i] != k))
    (3)      i--;
    (4)  return i;

The frequency of statement (3) in this algorithm depends not only on the problem size n but also on the values of the elements of a and on the value of k in the input instance: ① if no element of a is equal to k, the frequency of statement (3) is f(n) = n; ② if the last element of a is equal to k, the frequency of statement (3) is the constant 0.

Space complexity

Like time complexity, space complexity is a measure of the storage space an algorithm requires while it executes on a computer, written S(n) = O(f(n)). The storage space required during the execution of an algorithm consists of three parts:
    • the space occupied by the program of the algorithm itself;
    • the storage occupied by the input data;
    • the auxiliary space needed during execution.
In many practical problems, compressed-storage techniques are used to reduce the storage space an algorithm needs. To reduce the complexity of an algorithm, the size of the input should be taken into account and a better algorithm designed.
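As a brief illustration of the auxiliary-space term in S(n) (an assumed example written for this text, not from the original; the names reverse_copy and reverse_in_place are hypothetical), the two routines below both reverse an array of n integers: the first allocates a temporary array of n elements, so its space complexity is S(n) = O(n), while the second uses only a few scalar variables, so S(n) = O(1).

    #include <stdlib.h>
    #include <string.h>

    /* Reverses a[0..n-1] using an auxiliary array of n ints: S(n) = O(n). */
    void reverse_copy(int a[], int n)
    {
        int *tmp = malloc(n * sizeof(int));
        if (tmp == NULL)
            return;                            /* allocation failed; leave a unchanged */
        for (int i = 0; i < n; i++)
            tmp[i] = a[n - 1 - i];
        memcpy(a, tmp, n * sizeof(int));
        free(tmp);
    }

    /* Reverses a[0..n-1] in place with constant extra storage: S(n) = O(1). */
    void reverse_in_place(int a[], int n)
    {
        for (int i = 0, j = n - 1; i < j; i++, j--) {
            int t = a[i];
            a[i] = a[j];
            a[j] = t;
        }
    }

Both routines have time complexity O(n); they differ only in the auxiliary storage they require, which is what S(n) measures.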