While front-end work rarely calls for heavy algorithmic analysis, it is still well worth understanding computational complexity.
Complexity of Time
1. Time Frequency
Because the exact execution time of an algorithm cannot be calculated in advance, we evaluate its workload by counting the number of times its statements execute. This count is called the time frequency, written T(n).
2. Complexity of Time
n is called the size of the problem. As n changes, T(n) changes with it, and to describe the pattern of that change we introduce the concept of time complexity.
If there is an auxiliary function f(n) such that, as n approaches infinity, T(n)/f(n) tends to a nonzero constant, we write T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.
A simple example:
function testTime() {
  for (var i = 0; i < n; i++) {      // loop test executes n + 1 times
    for (var j = 0; j < n; j++) {    // inner loop test executes n * (n + 1) times
      console.log(1);                // loop body executes n * n times
    }
  }
}
T(n) = (n + 1) + n(n + 1) + n * n = 2n^2 + 2n + 1
T(n) = O(n^2)
3. Common time complexity
Constant order O(1), linear order O(n), logarithmic order O(log2 n), k-th power order O(n^k), exponential order O(2^n)
Comparison of time complexity:
From the (borrowed) chart, we can see:
O(1) < O(log2 n) < O(n) < O(n^k) < O(2^n)
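To make these growth orders concrete, here is a sketch of one everyday function for each order (the function names and examples are my own, not from the original text):

```javascript
// O(1): constant — the work does not depend on the input size
function getFirst(arr) {
  return arr[0];
}

// O(log2 n): logarithmic — binary search halves the range each step
function binarySearch(sorted, target) {
  var lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    var mid = (lo + hi) >> 1;
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}

// O(n): linear — a single pass over the input
function sum(arr) {
  var total = 0;
  for (var i = 0; i < arr.length; i++) total += arr[i];
  return total;
}

// O(n^2): quadratic — nested loops over the input
function hasDuplicate(arr) {
  for (var i = 0; i < arr.length; i++) {
    for (var j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}
```

For large n, the difference between these orders dwarfs any constant-factor tuning, which is why we compare algorithms by order of growth rather than raw statement counts.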
4. How to quickly calculate O (f (n))
⑴ Find the basic statement in the algorithm: the statement executed the most times, which is usually the body of the innermost loop.
⑵ Calculate the order of magnitude of the number of times the basic statement executes; that order of magnitude is O(f(n)).
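The two steps above can be sketched on a small hypothetical function (the name and body are illustrative, not from the original text):

```javascript
// Step (1): the basic statement is `count++`, the body of the innermost loop.
// Step (2): it executes n * n times, so the complexity is O(n^2).
function countPairs(n) {
  var count = 0;
  for (var i = 0; i < n; i++) {     // outer loop: n iterations
    for (var j = 0; j < n; j++) {   // inner loop: n iterations per outer pass
      count++;                      // basic statement: n * n executions
    }
  }
  return count;
}
```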
Complexity of space
Space complexity is the storage space an algorithm consumes, including the space needed to store the algorithm itself, the space for its input and output data, and the space it occupies temporarily while running.
Storage space occupied by input and output data: this is determined by the problem being solved; it is passed in by the calling function through the parameter list and does not vary with the algorithm.
Storage space occupied by the algorithm itself: this is proportional to the length of the algorithm's code; to compress this space, write a shorter algorithm.
Storage space occupied temporarily while the algorithm runs: this varies by algorithm. Some algorithms need only a small, fixed number of temporary work units that does not change with the size of the problem; these are called "in-place" algorithms, and they are the most memory-efficient. Other algorithms need a number of temporary work units that grows with the problem size n, so when n is large they occupy many storage units; the quick sort and merge sort algorithms described in Chapter 9 are examples of this kind.
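The contrast can be sketched with two small functions (my own illustrative examples, not from the original text): an in-place array reversal that uses only O(1) extra space, versus a merge sort that allocates temporary arrays proportional to n:

```javascript
// In-place: O(1) extra space — swaps elements within the input array itself
function reverseInPlace(arr) {
  for (var i = 0, j = arr.length - 1; i < j; i++, j--) {
    var tmp = arr[i];
    arr[i] = arr[j];
    arr[j] = tmp;
  }
  return arr;
}

// Not in-place: O(n) extra space — slice() and the merged array
// allocate temporary storage proportional to the input size
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  var mid = arr.length >> 1;
  var left = mergeSort(arr.slice(0, mid));
  var right = mergeSort(arr.slice(mid));
  var merged = [];
  var i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```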
Reference: http://blog.csdn.net/zolalad/article/details/11848739