Python Learning Notes - Day 027 - Algorithm Complexity

Source: Internet
Author: User

An algorithm is a process for solving a problem (like a recipe for cooking a dish).


The complexity of an algorithm is measured in two ways: time complexity and space complexity.



Time Complexity

First, time frequency T(n): the number of times a statement executes in an algorithm is called the frequency of that statement. The time an algorithm takes to execute can in theory be calculated, but in practice it must be measured by running tests on a machine. There is no need to test every algorithm on a machine, though; we only need to know which algorithm takes more time and which takes less. The time an algorithm consumes is proportional to the number of times its statements execute: the more executions, the more time is spent.


In T(n) above, n is called the size of the problem. As n changes, T(n) changes with it; the pattern of that change is the time complexity.

In general, the number of times the basic operation is repeated in an algorithm is a function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a non-zero constant, then f(n) is a function of the same order of magnitude as T(n). We write this as T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or simply the time complexity.


O(f(n)) is also called Big O notation; the "O" stands for order of magnitude, and it tells you how an algorithm's running time grows with the amount of data it processes. It is only a conceptual notation: it does not compute the exact amount of time an algorithm consumes.

The time frequency can differ while the time complexity is the same. For example, T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 have different frequencies but the same time complexity: both are O(n²).
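As a worked check of the limit definition above, take T(n) = n² + 3n + 4 and f(n) = n². Then

    lim_{n→∞} T(n)/f(n) = lim_{n→∞} (n² + 3n + 4)/n² = lim_{n→∞} (1 + 3/n + 4/n²) = 1 ≠ 0,

so T(n) = O(n²) by the definition. The same computation for T(n) = 4n² + 2n + 1 gives a limit of 4, which is also a non-zero constant, so it too is O(n²).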


If the number of times a statement executes is bounded by a constant, i.e. it does not grow with the problem size, then the time complexity is O(1).
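As a minimal sketch (the function name is my own, not from the note), an O(1) operation in Python:

```python
def get_first(lst):
    # Executes the same fixed number of statements no matter how long
    # lst is, so its time complexity is O(1).
    return lst[0]
```

Indexing into a Python list is itself a constant-time operation, which is why this stays O(1) regardless of the list's length.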


Common time complexities, in increasing order: constant order O(1), logarithmic order O(log n), linear order O(n), linearithmic order O(n log n), quadratic order O(n²), cubic order O(n³), ..., k-th power order O(n^k), and exponential order O(2^n). As the problem size n increases, the time complexity grows and the efficiency of the algorithm decreases.
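To make the growth rates concrete, here is a sketch (the function names are my own) that counts the basic operations for a few of these orders:

```python
def ops_linear(n):
    # O(n): the loop body runs n times.
    count = 0
    for _ in range(n):
        count += 1
    return count

def ops_quadratic(n):
    # O(n^2): the innermost statement runs n * n times.
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

def ops_logarithmic(n):
    # O(log n): n is halved on every pass through the loop.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count
```

For n = 1000 these return 1000, 1000000, and 9 respectively, which shows how quickly the higher orders come to dominate.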


When we talk about time complexity without a special statement, we generally mean the worst-case time complexity. Why the worst case? Because the worst-case time complexity is an upper bound on the algorithm's running time: it guarantees the algorithm will not run longer than this on any input.

For example, a worst-case time complexity of T(n) = O(n) means the algorithm's running time grows no faster than O(n) for any input instance.


There is also the average time complexity, which is the expected running time of the algorithm when all possible input instances occur with equal probability.


A note on the exponential order O(2^n): an algorithm whose time complexity is O(2^n) is extremely inefficient and becomes unusable once n is even slightly large.




Calculating Time Complexity

Example:

Given a sequence lst of length n, an algorithm for finding a given value k might look like this:

lst = [a1, a2, ..., an]   # a sequence of length n

n = len(lst)

i = 0

while i < n and lst[i] != k:

    i += 1    # statement 01

print(i)


In this algorithm, if no value in the sequence equals k, the execution frequency of statement 01 is f(n) = n;

if the first element of the sequence equals k, the frequency f(n) of statement 01 is the constant 0.
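The two cases can be checked by instrumenting the loop with a counter (a sketch; the helper name is my own):

```python
def search_count(lst, k):
    # Linear search that also counts how many times statement 01 runs.
    n = len(lst)
    i = 0
    count = 0
    while i < n and lst[i] != k:
        i += 1       # statement 01
        count += 1
    return i, count

# k absent: statement 01 runs n times (worst case).
# k is the first element: statement 01 runs 0 times (best case).
```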




Evaluating Algorithm Performance Using Time Complexity


Suppose two algorithms, A and B, both solve the same problem.

The time complexity of algorithm A is T(n) = 100n².

The time complexity of algorithm B is T(n) = 5n³.

When using these two algorithms to solve the problem,

If n < 20, algorithm B takes less time. For example, when n = 19:

The time for algorithm A is 100*19**2 = 36100

The time for algorithm B is 5*19**3 = 34295

If n > 20, algorithm A takes less time. For example, when n = 21:

The time for algorithm A is 100*21**2 = 44100

The time for algorithm B is 5*21**3 = 46305
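The crossover can be verified directly (a small sketch; the function names are my own):

```python
def time_a(n):
    return 100 * n ** 2   # algorithm A: T(n) = 100n^2

def time_b(n):
    return 5 * n ** 3     # algorithm B: T(n) = 5n^3

# The two curves cross where 100n^2 = 5n^3, i.e. at n = 20:
# below it B is cheaper, above it A is cheaper.
for n in (19, 20, 21):
    print(n, time_a(n), time_b(n))
```

At n = 20 the two costs are exactly equal (both 40000), which is why the prose above splits the cases at n < 20 and n > 20.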


The asymptotic time complexities of the two algorithms are O(n²) and O(n³) respectively, which capture, at a macro scale, the relative quality of the two algorithms' time complexity.

In algorithm analysis, the worst-case time complexity and the asymptotic time complexity are generally not distinguished.



Space Complexity

Space complexity is a measure of the amount of storage space an algorithm occupies while it runs, denoted S(n). It is defined as a function of the problem size n: the storage space consumed by the algorithm. The asymptotic space complexity is also often simply called the space complexity, written S(n) = O(g(n)).

It can be simply understood as the storage resources the algorithm consumes during execution.

It generally includes the following:

The storage space occupied by the algorithm itself (determined by the length of the code)

The space used by the algorithm's input/output data (determined by the size of the problem to be solved, not by the algorithm)

The space temporarily occupied while the algorithm runs (varies by algorithm)

If the extra space an algorithm occupies does not change with the size of the problem, we call the algorithm in-place: it is a storage-saving algorithm.

When the space complexity of an algorithm is a constant, i.e. it does not change with the amount of data being processed, it can be written as O(1).
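As a sketch of the difference (the function names are my own), reversing a list in place uses O(1) extra space, while building a reversed copy uses O(n):

```python
def reverse_in_place(lst):
    # O(1) extra space: only two index variables, regardless of len(lst).
    left, right = 0, len(lst) - 1
    while left < right:
        lst[left], lst[right] = lst[right], lst[left]
        left += 1
        right -= 1
    return lst

def reverse_copy(lst):
    # O(n) extra space: allocates a new list the same size as the input.
    return lst[::-1]
```

The first function is in-place in the sense defined above: its extra storage does not grow with the problem size.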



