Time complexity of Python built-in methods

Reprinted from: http://www.orangecube.NET/Python-time-complexity

This page covers the time complexity (also known as "Big O") of several operations in Python. The figures are based on the CPython implementation that was current at the time of writing (around 2011). Other Python implementations, as well as older or still-in-development versions of CPython, may have slightly different performance characteristics, but they are generally not slower by more than a factor of O(log n).

In this article, n represents the number of elements in the container, and k represents either the value of a parameter or the number of elements in the parameter.

List (list)

The average case below assumes a completely random list.

A list is implemented as an array. The largest costs occur when the list grows beyond its current allocation size, in which case every element must be moved, or when an element is inserted or deleted near the beginning, in which case every element after that position must be moved. If you need to add or remove elements at both ends, use collections.deque (a double-ended queue) instead.
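
To make the difference concrete, here is a minimal timing sketch (my own illustration, not part of the original article; the size chosen is arbitrary) comparing repeated insertion at the front of a list, which is O(n) per call, with collections.deque.appendleft, which is O(1) per call:

    # Illustrative timing sketch (not from the source article): prepending to a
    # list shifts every existing element, while deque.appendleft does not.
    from collections import deque
    from timeit import timeit

    def prepend_to_list(n):
        items = []
        for i in range(n):
            items.insert(0, i)   # O(n) per call: all existing elements shift right
        return items

    def prepend_to_deque(n):
        items = deque()
        for i in range(n):
            items.appendleft(i)  # O(1) per call
        return items

    n = 20_000  # arbitrary size, chosen only for illustration
    print("list.insert(0, x):", timeit(lambda: prepend_to_list(n), number=3))
    print("deque.appendleft :", timeit(lambda: prepend_to_deque(n), number=3))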

Operation           Average case    Worst case
Copy                O(n)            O(n)
append [Note 1]     O(1)            O(1)
insert              O(n)            O(n)
Get item            O(1)            O(1)
Set item            O(1)            O(1)
Delete item         O(n)            O(n)
Iteration           O(n)            O(n)
Get slice           O(k)            O(k)
Delete slice        O(n)            O(n)
Set slice           O(k+n)          O(k+n)
extend [Note 1]     O(k)            O(k)
Sort                O(n log n)      O(n log n)
Multiply by k       O(nk)           O(nk)
x in s              O(n)
min(s), max(s)      O(n)
Get length          O(1)            O(1)
Double-ended queue (collections.deque)

A deque (double-ended queue) is implemented internally as a doubly linked list (well, a linked list of arrays rather than of individual objects, for greater efficiency). Both ends of the queue are accessible in constant time, but looking up an element in the middle is slow, and inserting or deleting in the middle is slower still.

Operation           Average case    Worst case
Copy                O(n)            O(n)
append              O(1)            O(1)
appendleft          O(1)            O(1)
pop                 O(1)            O(1)
popleft             O(1)            O(1)
extend              O(k)            O(k)
extendleft          O(k)            O(k)
rotate              O(k)            O(k)
remove              O(n)            O(n)
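
As a brief usage sketch (my own example, not from the source), the constant-time operations act on the ends, while rotate and remove touch many elements:

    # Usage sketch (not from the source article) of the deque operations above.
    from collections import deque

    d = deque([1, 2, 3])
    d.append(4)        # O(1) -> deque([1, 2, 3, 4])
    d.appendleft(0)    # O(1) -> deque([0, 1, 2, 3, 4])
    d.pop()            # O(1), returns 4
    d.popleft()        # O(1), returns 0
    d.extend([4, 5])   # O(k) -> deque([1, 2, 3, 4, 5])
    d.rotate(2)        # O(k) -> deque([4, 5, 1, 2, 3])
    d.remove(5)        # O(n), searches from the left
    print(d)           # deque([4, 1, 2, 3])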
Set (set)

For operations not listed here, refer to dict; the implementations are very similar.

Operation                           Average case               Worst case
x in s                              O(1)                       O(n)
Union s|t                           O(len(s) + len(t))
Intersection s&t                    O(min(len(s), len(t)))     O(len(s) * len(t))
Difference s-t                      O(len(s))
s.difference_update(t)              O(len(t))
Symmetric difference s^t            O(len(s))                  O(len(s) * len(t))
s.symmetric_difference_update(t)    O(len(t))                  O(len(t) * len(s))

Looking at the source code, the difference operation ( s-t , or s.difference(t) ) and the in-place difference update ( s.difference_update(t) ) do not have the same time complexity! The former adds the elements that are in s but not in t to a new set, so it is O(len(s)); the latter removes the elements of t from s in place, so it is O(len(t)). Therefore, be careful to choose the appropriate method based on the sizes of the two sets and on whether a new set is needed.

Note that the method form s.difference(t) does not require t to be a set; any iterable object will do (unlike the s-t operator, which requires both operands to be sets).
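
A short sketch (my own example, not from the source, with arbitrary sizes) contrasting the two operations:

    # Sketch (not from the source article): difference() builds a new set in
    # O(len(s)); difference_update() shrinks s in place in O(len(t)).
    big = set(range(100_000))   # arbitrary example sizes
    small = {1, 2, 3}

    result = big - small                 # O(len(big)): copies every surviving element
    result = big.difference([1, 2, 3])   # method form accepts any iterable, not only sets

    big.difference_update(small)         # O(len(small)): removes t's elements from s in place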

Dictionary (dict)

The average cases for dictionaries below are based on the following assumptions:
1. The objects' hash functions are robust enough that collisions are rare.
2. The dictionary keys are selected at random from the set of all possible keys.
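
To illustrate why assumption 1 matters, here is a sketch (my own example, not from the source) in which a deliberately broken hash function makes every key collide, dragging dictionary insertion and lookup toward the O(n) worst case:

    # Sketch (not from the source article): a constant hash function forces all
    # keys into one bucket, so building the dict degrades from O(n) toward O(n^2).
    from timeit import timeit

    class BadKey:
        def __init__(self, value):
            self.value = value
        def __hash__(self):
            return 1                 # every key collides
        def __eq__(self, other):
            return self.value == other.value

    class GoodKey(BadKey):
        def __hash__(self):
            return hash(self.value)  # well-distributed hash

    def build(key_type, n=2_000):    # arbitrary size, for illustration only
        return {key_type(i): i for i in range(n)}

    print("good keys:", timeit(lambda: build(GoodKey), number=3))
    print("bad keys :", timeit(lambda: build(BadKey), number=3))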

Tip: if possible, use only strings as dictionary keys. This does not change the asymptotic complexity, but it can have a significant effect on the constant factor, which determines how fast your program actually runs.

Operation               Average case    Worst case
Copy [Note 2]           O(n)            O(n)
Get item                O(1)            O(n)
Set item [Note 1]       O(1)            O(n)
Delete item             O(1)            O(n)
Iteration [Note 2]      O(n)            O(n)

Note:
[Note 1] These operations rely on the "amortized" part of "amortized worst case". An individual operation may take surprisingly long, depending on the history of the container.

[Note 2] For these operations, the worst-case n is the maximum size the container has ever reached, rather than just its current size. For example, if N objects are added to a dictionary and then N-1 of them are deleted, the dictionary remains sized for N objects (at least) until another insertion is made.
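
As an aside on Note 1, a small sketch (my own illustration, not part of the original notes) makes the amortized behaviour of list append visible: the allocation grows in occasional larger jumps, so most appends are cheap and a few pay for a reallocation:

    # Sketch (not from the source article): sys.getsizeof shows the list's
    # over-allocation growing in steps as elements are appended.
    import sys

    items = []
    last_size = sys.getsizeof(items)
    for i in range(64):
        items.append(i)
        size = sys.getsizeof(items)
        if size != last_size:        # a reallocation happened on this append
            print(f"len={len(items):>3}  allocated bytes={size}")
            last_size = size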
