Liu zitao
When it comes to algorithms, the first thing that probably comes to mind is Big O notation. For example, the best a comparison-based sorting algorithm can achieve is O(n lg n). (Note: log and lg are not distinguished in this series; both denote the base-2 logarithm.)
To be more general, we can write O(g(n)). What is this, exactly? It is in fact a set: it stands for the collection of functions that satisfy a certain condition. And what characterizes those functions? That g(n) is an upper bound on them, or more precisely, that some constant c makes c * g(n) an upper bound on them. For example (the figure below is taken from Introduction to Algorithms):
Suppose f(n) denotes the running time of our quicksort, and we write f(n) = O(n lg n). At this point you may ask: didn't I just say that O(n lg n) denotes a set? f(n) is obviously a single function in that set, so how can we write a function equal to a set of functions? You are right: strictly speaking this notation is sloppy, but it has long been the convention. The equal sign here really stands for the membership relation between an element and a set.
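To make f(n) a bit more concrete, here is a minimal sketch of my own (not from the original post) that counts the comparisons made by a simple randomized quicksort and divides them by n lg n. The ratio staying close to a small constant as n grows is exactly what f(n) = O(n lg n) promises for the expected running time:

    import math
    import random

    def quicksort_comparisons(a):
        # Count (roughly) the comparisons made by a simple randomized quicksort.
        if len(a) <= 1:
            return 0
        pivot = random.choice(a)
        less    = [x for x in a if x < pivot]
        equal   = [x for x in a if x == pivot]   # equal elements need no further sorting
        greater = [x for x in a if x > pivot]
        # Each element is compared against the pivot; counting one comparison per
        # element only affects the constant c, not the n lg n growth rate.
        return len(a) + quicksort_comparisons(less) + quicksort_comparisons(greater)

    for n in (1000, 10000, 100000):
        data = [random.random() for _ in range(n)]
        f_n = quicksort_comparisons(data)
        print(n, f_n, round(f_n / (n * math.log2(n)), 2))   # the ratio f(n) / (n lg n)

On typical runs the last column hovers around a small constant; that constant plays the role of the c in the definition below.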
Now let's take a look at the formal definition of Big O.
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c * g(n) for all n > n0 }
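For a concrete illustration (my own example, not one from the book): take f(n) = 2n^2 + 3n and g(n) = n^2. Since 3n <= n^2 whenever n >= 3, we have 0 <= 2n^2 + 3n <= 3n^2 for all n > 3, so the constants c = 3 and n0 = 3 witness that 2n^2 + 3n = O(n^2).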
Combining the figure above with this definition, the meaning of Big O becomes obvious: it gives an upper bound on f(n).
Similarly, we can define Big Omega and Theta.
Omega(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= c * g(n) <= f(n) for all n > n0 }
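Continuing the illustration from above: 2n^2 + 3n >= 2n^2 for every n, so c = 2 and n0 = 1 witness that 2n^2 + 3n = Omega(n^2).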
This, of course, gives a lower bound on f(n).
Theta(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 <= c1 * g(n) <= f(n) <= c2 * g(n) for all n > n0 }
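Putting the two previous illustrations together: with c1 = 2, c2 = 3, and n0 = 3 we get 2n^2 <= 2n^2 + 3n <= 3n^2 for all n > 3, so 2n^2 + 3n = Theta(n^2). In other words, Theta is exactly "Big O and Big Omega at the same time".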
The figure below, taken from Introduction to Algorithms, shows the complete picture of all three notations.
Next, let's look at the concepts of Little O and Little Omega.
Looking back at the definitions above, Big O, Big Omega, and Theta all bound f(n) with "<=" or ">=", that is, the comparison allows equality. When we compare numbers, though, besides "greater than or equal to" and "less than or equal to" we also have the strict "greater than" and "less than".
For the non-strict bounds we can ask whether they are tight: 2n^2 = O(n^2) is an asymptotically tight bound, whereas 2n = O(n^2) holds but is not tight. This is where Little O comes in. In Introduction to Algorithms, o-notation (Little O) is used to denote an upper bound that is not asymptotically tight.
Let's take a look at the formal definition of Little O.
o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 such that 0 <= f(n) < c * g(n) for all n > n0 }
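For example, 2n = o(n^2): given any c > 0, choose n0 = 2/c; then for every n > n0 we have c * n > 2 and hence 2n < c * n^2. By contrast, 2n^2 is not o(n^2), because for c = 1 no n0 works: 2n^2 < 1 * n^2 never holds.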
Similarly, we can define Little Omega.
w(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 such that 0 <= c * g(n) < f(n) for all n > n0 }
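For example, n^2 = w(n): given any c > 0, choose n0 = c; then for every n > n0 we have c * n < n^2. On the other hand, 2n^2 is not w(n^2), because for c = 3 there is no n0 beyond which 3n^2 < 2n^2.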
========================================= Gorgeous divider =========================================
We can also think about these notations in terms of growth rates and limits.
For example, f(n) = o(g(n)) intuitively means that g(n) grows much faster than f(n), or, put the other way, that the growth of f(n) is negligible compared with that of g(n). (This is essentially how Wikipedia phrases it.)
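More precisely, assuming g(n) > 0 for large n, the limit of f(n)/g(n) tells the story: the ratio tends to 0 exactly when f(n) = o(g(n)), tends to infinity exactly when f(n) = w(g(n)), and if it settles at a positive constant then f(n) = Theta(g(n)) (for Theta the limit need not exist). Here is a quick numerical sanity check, a sketch of my own rather than anything from the original post:

    import math

    # (description, f, g) pairs; watch how f(n) / g(n) behaves as n grows.
    pairs = [
        ("2n vs n^2        (little o)",     lambda n: 2 * n,             lambda n: n * n),
        ("n^2 vs n         (little omega)", lambda n: n * n,             lambda n: n),
        ("2n^2+3n vs n^2   (Theta)",        lambda n: 2 * n * n + 3 * n, lambda n: n * n),
        ("n lg n vs n^2    (little o)",     lambda n: n * math.log2(n),  lambda n: n * n),
    ]

    for name, f, g in pairs:
        ratios = [round(f(n) / g(n), 4) for n in (10, 100, 1000, 10000, 100000)]
        print(name, ratios)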
The rest should be easy to work out in the same way. (*^__^*) Hehe...
Finally, here are a few common and simple mathematical formulas to serve as a reminder.