I have learned data structures, and I have also learned about the time complexity of algorithms. I wasn't sure I could still derive time complexity these days; roughly, it means taking the highest order of magnitude of the number of times the basic statements are executed.
For example:

i = 0;
while (i < n) { i++; }   // this statement executes n times

i = 0;
j = 0;
while (i < n)
{
    while (j < n)
    {
        j++;             // this statement executes n^2 times
    }
    i++;
    j = 0;
}
The total number of executions is n + n^2, but time complexity keeps only the highest-order term, since that is what really dominates the program's running time. So the time complexity is O(n^2).
Now, on to the actual question. The time complexity derivation for binary search is actually very simple; I just blame my head for short-circuiting. This is for anyone whose head short-circuits the same way mine did.
The key idea of binary search is this: suppose the array length is n; after one halving the remaining length is n/2, after another it is n/4, ... until the length reaches 1 (this is of course the worst case, i.e., the midpoint we pick each time is never the element we are looking for). The number of halvings is the number of basic statement executions. So let that number be x; then n * (1/2)^x = 1, which gives x = log n with base 2, and the time complexity is O(log n).