Small Notes on Introduction to Algorithms -- Chapter 3: Out with the Old, In with the New


Notes

This chapter is mainly about asymptotic notation and memories of high-school math.

The notations, in brief (a quick numeric check follows the list):

    1. Θ – upper and lower bound together (a tight bound), equivalent to f(n) ∈ [c1 * g(n), c2 * g(n)]
    2. Ω – closed-interval lower bound, best-case running time, equivalent to f(n) ∈ [c * g(n), ∞)
    3. ω – open-interval lower bound, best-case running time, equivalent to f(n) ∈ (c * g(n), ∞)
    4. O – closed-interval upper bound, worst-case running time, equivalent to f(n) ∈ [0, c * g(n)]
    5. o – open-interval upper bound, worst-case running time, equivalent to f(n) ∈ [0, c * g(n))
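
To make the Θ line concrete, here is a tiny Python check of my own (the function 3n^2 + 10n and the constants c1 = 3, c2 = 4, n0 = 10 are just an example I picked, not from the book): it verifies that f(n) stays inside the band [c1*g(n), c2*g(n)] once n >= n0.

# Toy check of the Theta definition: f(n) = 3n^2 + 10n against g(n) = n^2.
# With c1 = 3, c2 = 4 and n0 = 10 the sandwich c1*g(n) <= f(n) <= c2*g(n) holds.
def f(n): return 3 * n * n + 10 * n
def g(n): return n * n

c1, c2, n0 = 3, 4, 10
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 100000))
print("3n^2 + 10n = Theta(n^2): the band [3n^2, 4n^2] holds for 10 <= n < 100000")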

The rest is high-school math, so I won't copy it out here.

So why does Introduction to Algorithms open with mathematics? Everyone is afraid of math, and I am too; I feel it exists mainly to destroy my confidence in my own IQ... But if you want to make it in science and engineering, math has to be faced.

Chen Hao (Weibo @ "Left-Ear Mouse") has a blog post, "The Three Doors of Software Development", which describes three stages of a programmer: business functionality, business performance, and business intelligence. I'm still stuck at stages 1 and 2; if I want to reach stage 3, mathematics is a basic skill.

When I was studying physics, our teacher told us in class that physicists don't need to learn math with the rigor of mathematicians; the point is to use math flexibly and creatively in physics. The goal is to develop an intuition for how mathematical formulas relate to natural phenomena, to gain insight into the truths of the world, to become tall and handsome, and to win a rich and beautiful wife... like Yang Zhenning marrying at 82 a bride of 28... hahaha, I made up the second half. The math in algorithms works the same way.

Exercise Answers

3.1-1

Let f(n) and g(n) be asymptotically nonnegative functions. Using the basic definition of Θ-notation, prove that max(f(n), g(n)) = Θ(f(n) + g(n)).

At first I thought this was a trivial statement of the obvious, but proving it turned out to be a little fiddly...

for n > n1, c1 * k(n) <= f(n) <= c2 * k(n), f(n) = Θ(k(n))
for n > n2, c3 * l(n) <= g(n) <= c4 * l(n), g(n) = Θ(l(n))
for n > max(n1, n2), max(f(n), g(n)) <= f(n) + g(n) <= c2 * k(n) + c4 * l(n)

At this point the proof that max(f(n), g(n)) = O(f(n) + g(n)) is essentially done, but that is not enough; the Ω (lower-bound) side is still missing. Here we can be a little flexible.

max(f(n), g(n)) >= 1/2 * f(n) + 1/2 * g(n)

The advantage of this inequality is that at any point n, if f(n) >= g(n), then

max(f(n), g(n)) = f(n) >= 1/2 * f(n) + 1/2 * g(n)

By symmetry the same inequality holds when g(n) > f(n), so, hahaha, the Ω (lower-bound) side is proved as well.

for n > max(n1, n2), max(f(n), g(n)) > 1/2(c1 * k(n) + c3 * l(n))

Exploiting the properties of max is the key to this little problem... when IQ falls short, time and effort have to make up the difference...
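
A quick numeric double-check of the two inequalities used above (my own little experiment; f(n) = n^2 and g(n) = 100n + 50*sin(n)^2 are arbitrary nonnegative test functions, not anything from the book):

# Check the sandwich (f(n)+g(n))/2 <= max(f(n), g(n)) <= f(n)+g(n),
# which is exactly what makes max(f, g) = Theta(f + g).
import math

def f(n): return n * n
def g(n): return 100 * n + 50 * math.sin(n) ** 2   # arbitrary nonnegative function

for n in range(1, 10000):
    assert 0.5 * (f(n) + g(n)) <= max(f(n), g(n)) <= f(n) + g(n)
print("max(f, g) sits between (f+g)/2 and f+g for every tested n")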

3.1-2

Show that for any real constants a and b, where b > 0,

(n + a)^b = Θ(n^b)

Well, this proof is actually pretty boring: do a binomial expansion of (n + a)^b and use the fact that only the highest-order term matters; the claim follows.

Oh, here a and b are real numbers, but I guess whether they are rational or real doesn't affect the conclusion. Haha, as physics people we only chase mathematical rigor when we have the energy to spare; here we can afford to be lazy.
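
Here is a quick numeric look at why only the highest-order term matters (my own sketch; a = -7 and b = 2.5 are arbitrary choices): the ratio (n + a)^b / n^b creeps toward 1, so constants c1 and c2 sandwiching it certainly exist.

# (n + a)^b / n^b = (1 + a/n)^b -> 1 as n grows, which is the heart of
# (n + a)^b = Theta(n^b): the ratio is eventually squeezed between constants.
a, b = -7.0, 2.5
for n in [10, 100, 1000, 10**6]:
    print(n, (n + a) ** b / n ** b)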

3.1-3

Explain why the statement, "The running time of algorithm A is at least O(n^2)," is meaningless.

This is actually a GRE reading comprehension problem.

"At least" means a minimum, a lower bound, >=.

O(n^2) means <=, a maximum, an upper bound.

The two contradict each other, so the sentence is nonsense. It is about as meaningful as:

"According to one of my friends who did not want to be named, a certain Mr. Tang Malu, ..."

3.1-4

Is 2^(n+1) = O(2^n)? Is 2^(2n) = O(2^n)?

The first one: 2^(n+1) = 2 * 2^n = O(2^n), so it holds.

The second one: since n is a variable, 2^(2n) = 2^n * 2^n cannot be bounded by a constant times 2^n, so it does not hold.

3.1-5

Prove theorem 3.1.

Theorem 3.1

For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

This is just a matter of grinding through the definitions; not going to write it out, heh.

3.1-7

Prove that o(g(n)) ∩ ω(g(n)) is the empty set.

Same as above; skipping it.

3.1-8

We can extend our notation to the case of two parameters n and m that can go to infinity independently at different rates. For a given function g(n, m), we denote by O(g(n, m)) the set of functions

O(g(n,m)) = { f(n,m) : there exist positive constants c, n0, and m0 such that 0 <= f(n,m) <= cg(n,m) for all n >= n0 or m >= m0 }

Give corresponding definitions for Ω(g(n, m)) and Θ(g(n, m)).

More of the same boring work that I'm not going to do. Though perhaps in some settings a genuinely two-parameter analysis really is needed.

3.2-1

Show that if f(n) and g(n) are monotonically increasing functions, then so are the functions f(n) + g(n) and f(g(n)), and if f(n) and g(n) are in addition nonnegative, then f(n) * g(n) is monotonically increasing.

High school math problem, maybe a junior high school math problem?

3.2-2

Prove equation (3.16).

a^(logb c) = c^(logb a)          (3.16)

Moving on.

3.2-3

Prove equation (3.19). Also prove that n! = ω(2^n) and n! = o(n^n).

lg(n!) = Θ(nlgn)                (3.19)

It's a difficult one for me.

lg(n!) < lg(n^n) = nlgn

The upper bound is easy to prove; what about the lower bound?

lg(n!) = lg(n*(n-1)*...*1) = lgn + lg(n-1) + lg(n-2) + ... + lg1

Oh, the hint says to use Stirling's approximation.

n! = √(2πn) (n/e)^n (1 + Θ(1/n))
lg(n!) = lg√(2πn) + nlg(n/e) + lg(1 + Θ(1/n))
       = (1/2)lg(2πn) + n(lgn - lge) + lg(1 + c/n)
       = Θ(lgn) + Θ(nlgn) + lg(n + c) - lgn
       = Θ(nlgn)

Well, with the great Stirling the formula is easy to prove, but how did Stirling ever come up with this approximation? I can't work that out right now... math is magical.
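
Stirling is also easy to sanity-check numerically; here is my own quick look (math.lgamma gives ln(n!), which is exact enough for this): the ratio lg(n!) / (n*lgn) creeps up toward 1, consistent with lg(n!) = Θ(nlgn).

import math

# lg(n!) via the log-gamma function, compared against n*lg(n).
for n in [10, 100, 10**4, 10**6]:
    lg_fact = math.lgamma(n + 1) / math.log(2)   # lg(n!)
    print(n, lg_fact / (n * math.log2(n)))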

OK, continue to prove the other two formulas.

n! = n(n-1)...2*1

When n is bigger than some number, I guess

n(n-1)...2*1 > 2*2*...*2 (n copies of 2)

Both products have n factors, and for the leading factors the left side beats the right; only the last factor has 1 < 2, so pair things off a little differently:

(n-1)(n-2)...2 > 2*2*...*2 (n-2 copies of 2)
n*1 > 2*2 (the remaining 2 copies of 2)

That is, when n > 4 both inequalities hold, and then

n! > 2^n
(and since the spare factor n grows without bound, n! > c * 2^n eventually for any constant c, which is what ω requires)
n! = ω(2^n)

OK, continue to prove the last formula.

n! < n^n
n(n-1)...2*1 < n*n*...*n
n! = o(n^n)

Both sides have n factors and each factor on the left is at most the matching one on the right; in fact n!/n^n <= 1/n, which tends to 0, and that is exactly what little-o requires. Done.
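
The same kind of quick check (again my own, nothing deep) shows both ratios going the right way: n!/2^n blowing up and n!/n^n dying off.

import math

# n!/2^n should grow without bound (n! = omega(2^n)),
# while n!/n^n should shrink to 0 (n! = o(n^n)).
for n in [5, 10, 20, 40]:
    print(n, math.factorial(n) / 2 ** n, math.factorial(n) / n ** n)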

3.2-4 *

Is the function ⌈lgn⌉! polynomially bounded? Is the function ⌈lg lgn⌉! polynomially bounded?

For the first question, let me just try a few values:

n = 32, lgn = 5, 5! = 120
n = 16, lgn = 4, 4! = 24
n = 8,  lgn = 3, 3! = 6

Here n grows at the rate of 2^k while ⌈lgn⌉! grows at the rate of k!; which one grows faster, and by how much?

From exercise 3.2-3 above we know k! = ω(2^k), so k! grows faster, which also matches the little experiment above. Moreover the gap between k! and 2^k is bigger than any polynomial factor, so I boldly guess that ⌈lgn⌉! is not polynomially bounded. The rigorous proof is left to the mathematicians, wahaha.

The second question, about ⌈lg lgn⌉!, boils down to comparing 2^(2^k) with k!.

k = 1, 2^2^k = 4,     k! = 1
k = 2, 2^2^k = 16,    k! = 2
k = 3, 2^2^k = 256,   k! = 6
k = 4, 2^2^k = 65536, k! = 24

In other words 2^(2^k) grows far faster than k!. Turning that around: as n grows, ⌈lg lgn⌉! stays far below n, and the ratio ⌈lg lgn⌉!/n tends to 0, so I guess ⌈lg lgn⌉! is polynomially bounded, for example

⌈lg lgn⌉! = O(n)
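
To back up that guess, a tiny experiment of my own: pick n = 2^(2^k) so that lg lgn = k exactly, and watch (lg lgn)! get dwarfed by n itself.

import math

# (lg lg n)! versus n at points n = 2^(2^k), where lg lg n = k exactly.
for k in range(2, 7):
    n = 2 ** (2 ** k)
    print(f"k = {k}: (lg lg n)! = {math.factorial(k)}, n = {n:.3e}")
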
3.2-5 *

Which is asymptotically larger: lg(lg*n) or lg*(lgn)?

Honestly, I had no idea what on earth this * was supposed to mean; my mind went blank at first. OK, back to the textbook; they didn't teach this in high school.

n = 16
lg*n = 3
lg(lg*n) = lg3
lg*(lg16) = lg*4 = 2
lg3 < 2, so the second one looks a bit bigger; let's try another number.
n = 65536
lg*n = 4
lg(lg*n) = 2
lg*(lg65536) = lg*(16) = 3

Well, after trying two values it already looks like the second one, lg*(lgn), is the larger.
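
Since lg* only really clicked for me once I computed it, here is a brute-force lg* of my own in Python; it reproduces the hand calculations above and keeps agreeing at a bigger tower of 2 (relying on math.log2 accepting large Python integers, which it does).

import math

def lg_star(x):
    # Iterated logarithm: how many times lg must be applied to get x down to <= 1.
    count = 0
    while x > 1:
        x = math.log2(x)
        count += 1
    return count

# Compare lg(lg* n) with lg*(lg n) at a few towers of 2.
for n in [16, 65536, 2 ** 65536]:
    print(math.log2(lg_star(n)), lg_star(math.log2(n)))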

3.2-6

Show that the golden ratio φ and its conjugate φ' both satisfy the equation

x^2 = x + 1

This is just the definition of the golden ratio plus solving a quadratic equation; junior-high math, skipped.

3.2-7

Prove by induction that the ith Fibonacci number satisfies the equality

F[i] = (φ^i -  φ‘^i) / √5

where φ is the golden ratio and φ' is its conjugate.

Well, more induction. Obviously

F[0] = 0
F[1] = 1

Assume

F[i-1] = (φ^(i-1) - φ'^(i-1)) / √5
F[i-2] = (φ^(i-2) - φ'^(i-2)) / √5

Then

F[i] = F[i-1] + F[i-2]
     = (φ^(i-1) + φ^(i-2) - φ'^(i-1) - φ'^(i-2)) / √5
     = (φ^(i-2)(φ + 1) - φ'^(i-2)(φ' + 1)) / √5

Now tidy this up:

φ + 1 = (3 + √5) / 2
φ^2   = (1 + 2√5 + 5) / 4 = (3 + √5) / 2
φ + 1 = φ^2
φ' + 1 = φ'^2  // likewise

Substituting back, we get

F[i] = (φ^i - φ'^i) / √5

QED.
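
And a quick numeric confirmation of the closed form (my own addition), checking Binet's formula against the plain recurrence:

import math

phi = (1 + math.sqrt(5)) / 2        # golden ratio
phi_conj = (1 - math.sqrt(5)) / 2   # its conjugate

def fib(i):
    # F[i] by the recurrence F[i] = F[i-1] + F[i-2], with F[0] = 0, F[1] = 1.
    a, b = 0, 1
    for _ in range(i):
        a, b = b, a + b
    return a

for i in range(20):
    closed = (phi ** i - phi_conj ** i) / math.sqrt(5)
    assert round(closed) == fib(i)
print("F[i] = (phi^i - phi'^i)/sqrt(5) matches the recurrence for i = 0..19")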

On the Fibonacci sequence, there is a wonderful TED talk that also takes a jab at the education system; worth a look:

The magical Fibonacci sequence.

I've never really understood why this sequence is tied to the golden ratio; at least I don't see the deep connection. The formula reveals a lot, but I still lack an intuitive feel for it.

3.2-8

Show that klnk = Θ(n) implies k = Θ(n/lnn).

klnk = Θ(n), so treat it roughly as klnk = n
n / lnn = klnk / ln(klnk)
        = klnk / (lnk + lnlnk)
        < klnk / lnk = k      (once k is large enough that lnlnk > 0)

This pins down the asymptotic order of k. Of course it is a very rough argument; the constants in the inequalities have not been tracked at all. The point of this exercise is that asymptotic relations have not only upper bounds, lower bounds, transitivity, transpose symmetry and symmetry, but also a kind of inverse-function behaviour, which may be useful in some settings.
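
The inverse relationship is easy to poke at numerically too; here is my own rough sketch, solving k*lnk = n for k by bisection and comparing the result with n/lnn (the ratio should stay bounded, drifting toward 1).

import math

def solve_k(n):
    # Bisection for the k with k*ln(k) = n; k*ln(k) is increasing for k > 1.
    lo, hi = 2.0, float(n)
    while hi - lo > 1e-6 * hi:
        mid = (lo + hi) / 2
        if mid * math.log(mid) < n:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for n in [10**3, 10**6, 10**9, 10**12]:
    k = solve_k(n)
    print(n, k / (n / math.log(n)))   # bounded ratio, as k = Theta(n/ln n) predicts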

Answers to Problems

3-1 Asymptotic behavior of polynomials

Let

p(n) = ∑a[i]n^i (i = 0 to d)

where a[d] > 0, is a degree-d polynomial in n, and let k be a constant. Use the definitions of the asymptotic notations to prove the following properties.

a. if k >= d, then p(n) = O(n^k).
b. if k <= d, then p(n) = Ω(n^k).
c. if k  = d, then p(n) = Θ(n^k).
d. if k  > d, then p(n) = o(n^k).
e. if k  < d, then p(n) = ω(n^k).

Hahaha, I'm not going to do this one.

3-2 Relative Asymptotic growths

Indicate, for each pair of expressions (A, B) in the table below, whether A is O, o, Ω, ω, or Θ of B. Assume that k >= 1, ε > 0, and c > 1 are constants. Your answer should be in the form of the table with a "yes" or "no" written in each box.

    A           B           O   o   Ω   ω   Θ
a.  (lgn)^k     n^ε         √   √   ×   ×   ×
b.  n^k         c^n         √   √   ×   ×   ×
c.  √n          n^(sin n)   ×   ×   ×   ×   ×
d.  2^n         2^(n/2)     ×   ×   √   √   ×
e.  n^(lgc)     c^(lgn)     ×   ×   ×   ×   √
f.  lg(n!)      lg(n^n)     ×   ×   ×   ×   √

a. If k = 2, ε = 1, I work out that (lgn)^2 = o(n); go back and check the textbook (that's what textbooks are for, flipping back to, hahaha), and the book has this formula:

(lgn)^b = o(n^a)

b. Similarly, the book gives

n^b = o(a^n)

c. sin n swings between -1 and 1, so n^(sin n) swings between n^(-1) and n^1 and never stays on one side of √n = n^(1/2); none of the five relations hold for this pair.

d. 2^n = 2^(n/2) * 2^(n/2), so 2^n is far bigger.

e. These two expressions are actually the same thing in disguise: n^(lgc) = c^(lgn) is exactly equation (3.16). There is only one truth! Hahaha ...

f. This one is just dressed up: lg(n^n) = nlgn, and

lg(n!) = Θ(nlgn)                (3.19)

so the two are Θ of each other; the problem is just taking a detour.

This problem actually helps build some intuition, for example that exponential functions beat power functions, that no power of lgn keeps up with even a small power of n, and so on.

3-3 ordering by asymptotic growth rates

a. Rank the following functions by order of growth; that is, find an arrangement g1, g2, ..., g30 of the functions satisfying g1 = Ω(g2), g2 = Ω(g3), ..., g29 = Ω(g30). Partition your list into equivalence classes such that functions f(n) and g(n) are in the same class if and only if f(n) = Θ(g(n)).

lg(lg*n)    2^(lg*n)    (√2)^(lgn)  n^2         n!      (lgn)!
(3/2)^n     n^3         (lgn)^2     lg(n!)      2^2^n   n^(1/lgn)
lnlnn       lg*n        n*2^n       n^lglgn     lnn     1
2^lgn       (lgn)^lgn   e^n         4^lgn       (n+1)!  √(lgn)
lg*(lgn)    2^(√(2lgn)) n           2^n         nlgn    2^(2^(n+1))

This is going to be grunt work; it took ages just to copy out the problem, and even the alignment was a struggle.
Anyway, my approach is to first lay down benchmarks by class, ordered from higher to lower growth:

exponential of exponential   2^2^n, 2^(2^(n+1))
factorial                    n!, (n+1)!
exponential                  (3/2)^n, 2^n, e^n
power functions              n, n^2, n^3
logarithmic                  lnn, (lgn)^2
iterated logarithm           lg(lg*n), lg*(lgn) == lg*n
constant                     1

Note that, by definition, feeding lgn to the iterated logarithm just costs one iteration:

lg*(lgn) = (lg*n) - 1
=> lg*(lgn) = Θ(lg*n)   // this notation is too long-winded; from here on I'll just use =, >, < and the like :-)

The next job is to plug in the rest and see where it fits, one by one.

1. 2^(lg*n)
2^(lg*n) < 2^(lgn) = n^(lg2) = n

So this is smaller than the power functions; is it also smaller than the logarithmic functions?

n = 16
lg*n = 3
2^3 = 8
lgn = lg16 = 4

So it seems reasonable to guess that it is larger than the plain iterated logarithms but smaller than the logarithmic functions:

logarithmic          lnn, (lgn)^2
----                 2^(lg*n)
iterated logarithm   lg(lg*n), lg*(lgn), lg*n
2. (√2)^(lgn)
(√2)^(lgn) = n^(lg√2) = n^(1/2) = √n

Insert it as the smallest of the power functions:

power functions      (√2)^(lgn), n, n^2, n^3
3. (lgn)!
(lgn)! < n!
n = 16, (lgn)! = 4! = 24
2^16 >> 24, so (lgn)! < 2^n
16^2 = 256 >> 24, so (lgn)! < n^2
n = 32, (lgn)! = 5! = 120, n^2 = 1024
and clearly (lgn)! > (lgn)^2

Insert it as the largest of the logarithmic functions:

logarithmic          lnn, (lgn)^2, (lgn)!
4. lg(n!)
lg(n!)

By exercise 3.2-3,

lg(n!) = nlgn

so it goes in the middle of the power functions:

power functions      (√2)^(lgn), n, nlgn == lg(n!), n^2, n^3
This is getting long, so let me copy the whole list out again:

exponential of exponential   2^2^n, 2^(2^(n+1))
factorial                    n!, (n+1)!
exponential                  (3/2)^n, 2^n, e^n
power functions              (√2)^(lgn) == √n  see [2], n, nlgn == lg(n!)  see [4], n^2, n^3
logarithmic                  lnn, (lgn)^2, (lgn)!  see [3]
----                         2^(lg*n)  see [1]
iterated logarithm           lg(lg*n), lg*(lgn) == lg*n
constant                     1

The remaining ones that need to be inserted

n^(1/lgn), lnlnn, n*2^n, n^lglgn, 2^lgn, (lgn)^lgn, 4^lgn, (√lgn), 2^(√2lgn)
5. n^(1/lgn)

Let's see:

constant             1 < n^(1/2) (when n > 4)
n = 16
n^(1/lgn) = 16^(1/4) = 2
lg16 = 4
n^(1/lgn) < lgn
n = 64
n^(1/lgn) = 64^(1/6) = 2

Guess

n^(1/lgn) = 2
n^(1/lgn) = n^(log_n 2) = 2

So

constant             1, n^(1/lgn) == 2
6. lnlnn
lnlnn < lnn
but I guess lnlnn > 2^(lg*n)
logarithmic          lnn, (lgn)^2, (lgn)!
----                 lnlnn
----                 2^(lg*n)
7. n*2^n
n*2^n > 2^n
e^n = (2.71...)^n = 2^n * (1.35...)^n > 2^n * n  (eventually, since 1.35...^n outgrows n)

Insert it:

exponential          (3/2)^n, 2^n, n*2^n, e^n
8. n^lglgn
n^lglgn < n^(lgn)
n^lglgn > n^3 (once n is large enough)
So how does it compare with 2^n?
n = 8
n^(lglgn) = 8^(lg3) = (2^3)^(lg3) = (2^lg3)^3 = 3^3 = 27
2^8 = 256

Guess

exponential          (3/2)^n, 2^n, e^n
----                 n^lglgn
power functions      (√2)^(lgn) == √n, n, nlgn == lg(n!), n^2, n^3
9. 2^lgn, 4^lgn
2^lgn = n
4^lgn = n^2
power functions      (√2)^(lgn) == √n, 2^lgn == n, nlgn == lg(n!), n^2 == 4^lgn, n^3
10. (lgn)^lgn
(lgn)^lgn = n^(lglgn)
11. √(lgn)
√(lgn) < lnn
logarithmic          √(lgn), lnn, (lgn)^2, (lgn)!
12. 2^(√(2lgn))
2^(√(2lgn)) = (2^lgn)^(√(2/lgn)) = n^(√(2/lgn))
As n grows the exponent tends to 0, and in any case it drops below 0.5, so this ranks smallest among the power functions.
power functions      2^(√(2lgn)), (√2)^(lgn) == √n, n, nlgn == lg(n!), n^2, n^3

So, the final overall ranking is:

exponential of exponential
2^(2^(n+1))
2^2^n
factorial
(n+1)!
n!
exponential
e^n
n*2^n                   see [7]
2^n
(3/2)^n
-- unnamed functions --
n^lglgn == (lgn)^lgn    see [8] and [10]
power functions
n^3
n^2 == 4^lgn            see [9]
nlgn == lg(n!)          see [4]
2^lgn == n              see [9]
(√2)^(lgn) == √n        see [2]
2^(√(2lgn))             see [12]
logarithmic
(lgn)!                  see [3]  [wrong, see the correction below]
(lgn)^2
lnn
√(lgn)                  see [11]
-- unnamed functions --
lnlnn                   see [6]
2^(lg*n)                see [1]
iterated logarithm
lg*(lgn) == lg*n
lg(lg*n)
constant
n^(1/lgn) == 2          see [5]
1

Exhausted; a whole evening gone on this one problem. Now let me go check the official answer ~~

After checking the answer I found one small problem, namely with

(lgn)!

I misjudged it; the answer is

(lgn)! > n^3

The reasoning is to take logarithms on both sides:

lg(x!) = xlgx (exercise 3.2-3)
lg((lgn)!) = lgn * lg(lgn)

and take the logarithm of n^3 as well:

lg(n^3) = 3lgn

So the comparison is

lgn * lg(lgn)  vs  3lgn

Both sides carry a factor of lgn, but lg(lgn) eventually beats the constant 3, so the left side wins! I have to admit this was beyond my expectation: my manual enumeration up front led me to the wrong call, while a tidy bit of math here gives a watertight argument -_-. So what's the lesson? Can I really rely on manual enumeration in practice? For a math idiot there are only two ways to cope:

    1. Simulate it with a computer and actually plot or tabulate the two functions (see the little sketch right after this list).
    2. Find a math expert, tell him exactly where I'm unsure, and ask him to help check.
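
Option 1 is easy enough to do on the spot. Here is my own quick simulation (math.lgamma stands in for the factorial's logarithm), comparing lg((lgn)!) against lg(n^3) = 3*lgn; the left side does take over once n gets big enough.

import math

def lg_factorial(x):
    # lg(x!) computed via the log-gamma function.
    return math.lgamma(x + 1) / math.log(2)

for p in [8, 16, 24, 32, 64]:
    # n = 2^p, so lg n = p
    print(f"n = 2^{p}: lg((lg n)!) = {lg_factorial(p):7.1f}, lg(n^3) = {3 * p}")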

b. Give an example of a single nonnegative function f(n) such that for all functions gi(n) in part (a), f(n) is neither O(gi(n)) nor Ω(gi(n)).

The functions above basically blanket the axis, so to find a function that is neither O nor Ω of any of them you need something pretty violent, for example

2^(2^(2^n)) * | sin n |

On the one hand the function outgrows even the exponential-of-exponential class, and on the other hand the |sin n| factor keeps dragging it back down toward 0.

3-4 Asymptotic notation Properties

Let f(n) and g(n) be asymptotically positive functions. Prove or disprove each of the following conjectures.

a. f(n) = O(g(n)) implies g(n) = O(f(n)).

False: f(n) <= c*g(n) does not give you g(n) <= c'*f(n); take f(n) = n and g(n) = n^2, say.

b. f(n) + g(n) = Θ(min(f(n), g(n)))

It looks like 3.1-1, but max and min behave differently here; the max-style proof falls apart when I try to redo it. So go the other way and look for a counterexample:

g(n) = n^2, f(n) = n
then f(n) + g(n) = n^2 + n = Θ(n^2)
min(f(n), g(n)) = n = Θ(n)

So the claim does not hold.

c. f(n) = O(g(n)) implies lg(f(n)) = O(lg(g(n))), where lg(g(n)) >= 1 and f(n) >= 1 for all sufficiently large n.

This holds, because lg is a monotonically increasing function:

f(n) <= c*g(n) (once n exceeds some constant)
then lg(f(n)) <= lgc + lg(g(n)) <= (lgc + 1) * lg(g(n))   (using lg(g(n)) >= 1)
f(n) >= 1 guarantees lg(f(n)) >= 0, which keeps us inside this chapter's standing assumption that all the functions are nonnegative.

d. f(n) = O(g(n)) implies 2^(f(n)) = O(2^(g(n))).

This one actually fails: by exercise 3.1-4, f(n) = 2n is O(n) = O(g(n)) with g(n) = n, yet 2^(2n) is not O(2^n); the constant factor ends up in the exponent, where it is no longer harmless.

e. f(n) = O((f(n))^2).

Does not hold in general: if f(n) < 1 (for example f(n) = 1/n, which is still asymptotically positive), squaring makes it smaller, so you can't write that.

f. f(n) = O(g(n)) implies g(n) = Ω(f(n)).

Holds, straight from the definitions.

g. f(n) = Θ(f(n/2))

Does not hold in general. f(n/2) is f(n) stretched horizontally by a factor of two; for polynomials that is fine, since n/2 does not change the degree, but for an exponential function such as 2^n it breaks:

f(n) = 2^n
f(n/2) = 2^(n/2)

Clearly f(n/2) and f(n) are not in the same league (their ratio 2^(n/2) is unbounded).

h. f(n) + o(f(n)) = Θ(f(n))

Holds:

f(n) + o(f(n)) >= f(n)
f(n) + o(f(n)) <= 2f(n)   (for all sufficiently large n)

which proves it.

3-5 Variations on O and Ω

Some authors define Ω in a slightly different way than we do; let's use Ω∞ (read "omega infinity") for this alternative definition. We say that f(n) = Ω∞(g(n)) if there exists a positive constant c such that f(n) >= c*g(n) for infinitely many integers n.

a. Show that for any two functions f(n) and g(n) that are asymptotically nonnegative, either f(n) = O(g(n)) or f(n) = Ω∞(g(n)) or both, whereas this is not true if we use Ω in place of Ω∞.

At first I assumed O and Ω∞ would be mutually exclusive, but of course they can both hold at once (whenever f and g have the same order). Still, I find it hard to get a feel for what Ω∞ buys; unpacking the two definitions:

there exists c1 with f(n) <= c1*g(n) for all n > n0
there exists c2 with f(n) >= c2*g(n) for infinitely many n

b. Describe the potential advantages and disadvantages of using Ω∞ instead of Ω to characterize the running times of programs.

"For infinitely many n" is weaker than "for all n greater than n0", so it does not really pin down the behaviour of f(n); for example sin(n) + 1 = Ω∞(1) even though sin(n) just oscillates between -1 and 1. I don't quite see the beauty of this problem, so I'll skip ahead for now.

Some authors also define O in a slightly different manner; let's use O' for this alternative definition. We say that f(n) = O'(g(n)) if and only if |f(n)| = O(g(n)).

c. What happens to each direction of the "if and only if" in Theorem 3.1 if we substitute O' for O but still use Ω?

Theorem 3.1
For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

I can't quite get my head around this one; where does the absolute value come in, when this chapter assumes all the functions are nonnegative anyway... keep skipping.

d. Not copying the problem; it's another variation that departs from the chapter's standing assumptions, so I'll skip it too.

3-6 iterated functions

We can apply the iteration operator * used in the lg* function to any monotonically increasing function f(n) over the reals. For a given constant c ∈ R, we define the iterated function f_c* by

f_c*(n) = min{ i >= 0 : f^(i)(n) <= c }

which need not be well defined in all cases. In other words, the quantity f_c*(n) is the number of iterated applications of the function f required to reduce its argument down to c or less.

For each of the following functions f(n) and constants c, give as tight a bound as possible on f_c*(n).

First of all, f_c*(n) is itself a function of n, so what's wanted is a tight asymptotic bound on it.

   f(n)    |    c     |   f_c*(n)
a. n - 1   |    0     |   n

A is obvious.

b. lgn     |    1     |   lg * n  

Row b is a bit trickier: what function says how many times you have to apply lg to get down to 1? There is no closed form for that other than the book's own lg*n.

c. n/2     |    1     |   lgn

Row c is fairly clear: it asks how many times n can be halved before reaching 1, i.e. to what power 2 must be raised to get n. Check n = 2, i = 1; it fits.

d. n/2     |    2     |   lgn - 1

One iteration fewer than row c; again it checks out on small cases.

e.√n       |    2     |

This one defeats me: how many square roots before we drop below 2? I don't see a pattern, and the remaining rows are beyond me as well, so I'll just leave them...
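
Even when the closed form escapes me, the quantity itself is easy to compute by brute force. Here is my own generic helper for f_c*(n); it reproduces rows a and c of the table and at least lets me poke at row e numerically.

import math

def f_star(f, n, c):
    # f_c*(n) from problem 3-6: how many applications of f drive n down to c or less.
    i = 0
    while n > c:
        n = f(n)
        i += 1
    return i

for n in [16, 256, 65536]:
    print(n,
          f_star(lambda x: x - 1, n, 0),   # row a: comes out as n
          f_star(lambda x: x / 2, n, 1),   # row c: comes out as lg n
          f_star(math.sqrt, n, 2))         # row e: the one without an obvious pattern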

