In practice, the way to find the time complexity of a piece of code is to find the statement that executes most often and work out its order of growth. Sometimes this is not easy to see directly, so we use an assumption (a hypothesis) to help.
for (int i = 1; i < n; i = i * 3) cout << "hehe" << endl;
In this code the statement that executes most often is just the output statement. How many times does it run? We can deduce it as follows:
Execution count k    Value of i after the k-th execution
1                    3
2                    3^2
3                    3^3
...                  ...
k                    3^k
Suppose the loop ends after the k-th execution; the stopping condition is 3^k >= n. To make the calculation easy we take equality, 3^k = n, and this k (the number of times the innermost statement runs) is what determines the time complexity.
Solving 3^k = n gives k = log3 n, so the time complexity of the example above is O(log3 n).
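As a quick sanity check, here is a small, self-contained sketch (the sample value n = 1000000 and the count variable are illustrative additions, not part of the original): it counts the loop-body executions and prints log3 n next to the count.

#include <cmath>
#include <iostream>

int main() {
    int n = 1000000;                 // assumed sample size; any n > 1 works
    long long count = 0;             // number of times the loop body runs
    for (int i = 1; i < n; i = i * 3)
        ++count;                     // stands in for the original cout statement
    // log(n) / log(3) is log base 3 of n
    std::cout << "iterations: " << count
              << ", log3(n) = " << std::log(n) / std::log(3) << std::endl;
    return 0;
}

The printed iteration count should sit within one of log3 n, which is exactly the order of growth derived above.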
What about a recursive function? Consider the following example:
void f(int n) { if (n == 0) return; else f(n / 3); }
Just as in the previous example, we deduce the result by listing the recursive calls:
Call number k    Value of n passed to the k-th call
1                n
2                n/3
3                n/3^2
...              ...
k                n/3^(k-1)
Strictly speaking the recursion ends when n reaches 0, but to make the calculation easy we say it ends when the value passed in reaches 1 (this is not rigorous, but it does not change the order of growth). Setting n / 3^(k-1) = 1 gives k = log3 n + 1, so the time complexity is again O(log3 n).
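The same kind of check works for the recursive version. In this sketch the calls counter is an illustrative addition so the number of invocations can be observed; it is not part of the original function, and n = 1000000 is again just a sample value.

#include <cmath>
#include <iostream>

// Same recursion as above, extended with a call counter for illustration.
void f(int n, long long &calls) {
    ++calls;                         // count this invocation
    if (n == 0) return;
    f(n / 3, calls);                 // shrink the problem by a factor of 3
}

int main() {
    int n = 1000000;                 // assumed sample size
    long long calls = 0;
    f(n, calls);
    std::cout << "calls: " << calls
              << ", log3(n) = " << std::log(n) / std::log(3) << std::endl;
    return 0;
}

Because of integer division the call count differs from log3 n by a small constant, but the growth rate is the same.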
Finally, a slightly harder one:
int i = 0; int s = 0; while (s < n) { i++; s = s + i; }
Execution count    Value of i    Value of s
1                  1             1
2                  2             1 + 2
3                  3             1 + 2 + 3
...                ...           ...
k                  k             1 + 2 + 3 + ... + k
The loop stops when s reaches n, so 1 + 2 + 3 + ... + k = n, i.e. k(k + 1)/2 = n. Solving gives k roughly equal to sqrt(2n), so the time complexity is O(sqrt(n)).
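One more sanity-check sketch (again, the sample n and the counter are illustrative additions, not from the original): it runs the loop and prints the iteration count next to sqrt(2n).

#include <cmath>
#include <iostream>

int main() {
    long long n = 1000000;           // assumed sample size
    long long i = 0, s = 0;
    long long count = 0;             // number of loop-body executions
    while (s < n) {
        ++i;
        s = s + i;                   // s = 1 + 2 + ... + i
        ++count;
    }
    // k(k + 1)/2 >= n means k is roughly sqrt(2n), i.e. O(sqrt(n))
    std::cout << "iterations: " << count
              << ", sqrt(2n) = " << std::sqrt(2.0 * n) << std::endl;
    return 0;
}

The two printed numbers should agree to within one, confirming the O(sqrt(n)) bound.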
That, in short, is the hypothesis method for working out more complicated time complexities.