Two ways to implement the LRU algorithm in Python

This article introduces two ways to implement the LRU algorithm in Python: one based on OrderedDict and one based on a dict plus a list. Readers who need an LRU cache may find it a useful reference.

LRU (Least Recently Used) is a cache-eviction algorithm. Its typical scenario is storing objects in a limited space: when the space is full, existing objects are evicted according to some policy, and common policies (algorithms) include LRU, FIFO, LFU, and so on. The algorithm is used in CPU hardware caches, in page replacement between main memory and virtual memory, and in caching systems such as Redis. I have run into it both in an interview and in a written test.

The LRU algorithm itself is fairly simple. Whenever a key is accessed (queried, updated, or added; in the implementations below this happens in the get() and set() methods), the key is moved to the head (or tail) of the queue, so the keys stay ordered by last access time, descending (or ascending). When a new object must be added and the space is already full, the object at the tail (or head) of the queue is evicted.

In Python, collections.OrderedDict makes it very convenient to implement the LRU algorithm; if you would rather not use OrderedDict, it can also be done with a dict plus a list. This article mainly draws on LRU CACHE IN PYTHON, which is very well written: it not only implements the functionality but is also simple and easy to read. The code for Method One is essentially the same as in that reference; Method Two is my own, more cumbersome approach, and in fact OrderedDict itself maintains its ordering with a similar mechanism internally.

However, the implementations below have a limitation: in the cache's key:value pairs, value can only be an immutable type. If value were mutable, then for the same key every get(key) call would return a reference to the same mutable object, so modifying one of them modifies all of them, even without calling set(). That is not what we want. I can think of two workarounds: first, serialize the mutable object before storing it, i.e. convert it into an immutable object; second, keep storing the mutable object but have get() return a deep copy, so that the objects returned by different get() calls do not affect each other. I recommend the first method; a sketch of it follows. In addition, the str/unicode types are recommended for keys.
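
As an illustration of the first workaround (this sketch is not from the original article, and the helper names set_pickled and get_pickled are mine), the value can be run through the standard pickle module so that the cache only ever holds an immutable serialized copy:

import pickle

def set_pickled(cache, key, value):
    # store an immutable serialized snapshot instead of the mutable object itself
    cache.set(key, pickle.dumps(value))

def get_pickled(cache, key):
    raw = cache.get(key)
    # unpickling builds a brand-new object on every call, so callers cannot
    # modify the cached data through the value they receive
    return pickle.loads(raw) if raw is not None else None

The second workaround would instead keep storing the original object and call copy.deepcopy() on it inside get() before returning it.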

There is also a problem under concurrency: because set() writes to a shared resource, it must be locked; in fact, under concurrency every write to a shared resource should be locked. If there is no concurrency and only a single thread is involved, no lock is needed. A sketch of the locking is shown below.
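
The following is a minimal sketch of such locking (again not from the original article; the class name LockedCache is mine). It assumes a cache object that exposes get() and set(), such as the LRUCache classes defined below; since get() in those implementations also reorders entries, it is treated as a write and locked as well:

import threading

class LockedCache(object):
    '''Wrap a cache object so access to it is serialized with a lock.'''

    def __init__(self, cache):
        self.cache = cache
        self.lock = threading.Lock()

    def get(self, key):
        # get() also rewrites the ordering inside the wrapped cache,
        # so it counts as a write and is locked too
        with self.lock:
            return self.cache.get(key)

    def set(self, key, value):
        with self.lock:
            self.cache.set(key, value)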

Method One: Implement with OrderedDict (recommended)

The code is as follows:

from collections import OrderedDict

class LRUCache(OrderedDict):
    '''Cannot store mutable type objects, cannot concurrently access set()'''

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key in self.cache:
            # pop and re-insert the key so it moves to the end (most recently used)
            value = self.cache.pop(key)
            self.cache[key] = value
        else:
            value = None
        return value

    def set(self, key, value):
        if key in self.cache:
            # the key already exists: remove it so the new value is
            # re-inserted at the end (most recently used position)
            self.cache.pop(key)
            self.cache[key] = value
        else:
            if len(self.cache) == self.capacity:
                # the cache is full: evict the first (least recently used) item
                self.cache.popitem(last=False)
            self.cache[key] = value

The test code is as follows:

c = LRUCache(5)
for i in range(5, 10):
    c.set(i, 10 * i)
print c.cache, c.cache.keys()

c.get(5)
c.get(7)
print c.cache, c.cache.keys()

c.set(10, 100)
print c.cache, c.cache.keys()

c.set(9, 44)
print c.cache, c.cache.keys()

The output is as follows:

OrderedDict([(5, 50), (6, 60), (7, 70), (8, 80), (9, 90)]) [5, 6, 7, 8, 9]
OrderedDict([(6, 60), (8, 80), (9, 90), (5, 50), (7, 70)]) [6, 8, 9, 5, 7]
OrderedDict([(8, 80), (9, 90), (5, 50), (7, 70), (10, 100)]) [8, 9, 5, 7, 10]

OrderedDict([(8, 80), (5, 50), (7, 70), (10, 100), (9, 44)]) [8, 5, 7, 10, 9]
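
A side note on Method One: the code and test output above use Python 2 (print statements and a list-returning keys()). On Python 3, the same technique can be written a little more directly with OrderedDict.move_to_end(), available since Python 3.2. A minimal sketch (the class name LRUCachePy3 is mine, not from the original article):

from collections import OrderedDict

class LRUCachePy3(object):
    '''Python 3 sketch of Method One.'''

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key in self.cache:
            # mark the key as most recently used
            self.cache.move_to_end(key)
            return self.cache[key]
        return None

    def set(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        elif len(self.cache) == self.capacity:
            # evict the least recently used item (the first one)
            self.cache.popitem(last=False)
        self.cache[key] = value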

Method Two: Implement with dict + list (not recommended)

The code is as follows:

class LRUCache(object):
    '''Cannot store mutable type objects, cannot concurrently access set()'''

    def __init__(self, capacity):
        self.l = []    # keys ordered from most to least recently used
        self.d = {}    # key -> value storage
        self.capacity = capacity

    def get(self, key):
        if key in self.d:
            value = self.d[key]
            # move the key to the front of the list (most recently used)
            self.l.remove(key)
            self.l.insert(0, key)
        else:
            value = None
        return value

    def set(self, key, value):
        if key in self.d:
            self.l.remove(key)
        elif len(self.d) == self.capacity:
            # the cache is full: evict the least recently used key (the list tail)
            oldest_key = self.l.pop()
            self.d.pop(oldest_key)
        self.d[key] = value
        self.l.insert(0, key)

The test code is as follows:

c = LRUCache(5)
for i in range(5, 10):
    c.set(i, 10 * i)
print c.d, c.l

c.get(5)
c.get(7)
print c.d, c.l

c.set(10, 100)
print c.d, c.l

c.set(9, 44)
print c.d, c.l

The output is as follows:

{8: 80, 9: 90, 5: 50, 6: 60, 7: 70} [9, 8, 7, 6, 5]
{8: 80, 9: 90, 5: 50, 6: 60, 7: 70} [7, 5, 9, 8, 6]
{5: 50, 7: 70, 8: 80, 9: 90, 10: 100} [10, 7, 5, 9, 8]

{5: 50, 7: 70, 8: 80, 9: 44, 10: 100} [9, 10, 7, 5, 8]
