LRU Cache Algorithm and pylru


I hesitated a little before writing this up: the algorithm's principle is well known, the library is ready-made, and all I did was call a few functions, so what is there to write about? On reflection, though, it is still worth introducing the principle of the LRU algorithm and the basic usage of pylru.

LRU (Least Recently Used) is a memory page replacement algorithm. What is page replacement? Disk is very slow compared with memory, so instead of going to disk for every query, we set aside a region of memory and keep the commonly used data there. Later lookups can then be answered from that region without touching the disk, which "accelerates" access. That region, however, is far smaller than the disk, so which data deserves to live in it? The frequently used data, of course. But how do we decide what counts as "frequently used"? There are several strategies, such as the simple FIFO (first in, first out) and RR (round robin, i.e. time-slice rotation), which are essentially our familiar queue and stack tricks. LRU is another such strategy, built on the observation (and assumption) that the data accessed most recently is also the data most likely to be accessed again. So within that region, the most recently visited pages are treated as "frequently accessed", while pages that have not been accessed for a long time are evicted. This idea is one of the most common approaches to storage management in operating systems, and it is now widely applied under the name "cache". Caching itself shows up everywhere: registers, main memory, the network, databases, I/O, and so on. Wherever there is a speed mismatch between input and output, a cache is a powerful weapon.

That was a bit of a digression; back to the principle of the LRU algorithm. Suppose we carve out a pitifully small space as a cache: it can hold only five pages, in slots 0, 1, 2, 3 and 4. Now a sequence of queries arrives: 4, 7, 0, 7, 1, 0, 1, 2, 1, 2, 6. The whole query process goes like this: query 4, not in the cache, so fetch it from disk and put 4 into the cache; query 7, same as above; query 0, same as above; query 7, already in the cache, so 7 counts as "recently used" data and moves to the newest position; query 1, not in the cache, so fetch it from disk and put 1 into the cache; query 0, already in the cache, so 0 counts as "recently used" data and moves to the newest position; and so on for the rest of the sequence. By now you can probably see what the LRU algorithm is and why it works as a cache; a code sketch of this walk-through follows below. For a good illustrated analysis, see "Graphic cache elimination algorithm: LRU" (reference 3 below).

Now to my own story. My project needed a cache, but I had forgotten the name and only remembered the principle, so I asked in a group chat: "I need a data structure with the lookup efficiency of a dictionary but the behaviour of a queue: recently accessed data goes to the back of the line, and data that has not been accessed for a long time gets kicked out first." One of the experts in the group pointed me to LRUCache, and suddenly it clicked: that was the name! A quick Baidu search turned up plenty of explanations and Java implementations (for example http://dennis-zane.iteye.com/blog/128278), but nothing for Python. I went back to the group, and the answer came in a single word: pylru. Another search confirmed that it really exists. pylru can be installed with pip, and the download and usage instructions are on PyPI: https://pypi.python.org/pypi/pylru/1.0.9. It is ridiculously simple to use; as one person put it: Python, have you no shame, making it this easy?
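To make the walk-through above concrete, here is a minimal sketch of the same simulation in plain Python. It is not pylru, and not how any operating system actually implements page replacement; it simply uses collections.OrderedDict to track access order for the five-slot example, and the function name simulate_lru and the printed output format are my own invention for illustration.

```python
from collections import OrderedDict

def simulate_lru(capacity, queries):
    """Simulate an LRU cache of `capacity` slots over a sequence of page queries."""
    cache = OrderedDict()                    # keys ordered from least to most recently used
    states = []
    for page in queries:
        if page in cache:
            cache.move_to_end(page)          # hit: mark the page as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)    # cache full: evict the least recently used page
            cache[page] = True               # miss: "fetch from disk" and insert
        states.append(list(cache))
    return states

# The five-slot example from the text: queries 4, 7, 0, 7, 1, 0, 1, 2, 1, 2, 6.
for state in simulate_lru(5, [4, 7, 0, 7, 1, 0, 1, 2, 1, 2, 6]):
    print(state)
```

Each printed line shows the cache from least to most recently used. On the final query for 6 the cache is full, so page 4, the one that has gone longest without an access, is the one that gets evicted.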
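pylru itself exposes the cache as a dictionary-like object. The sketch below is based on the usage described on the PyPI page linked above; the lrucache class and its dict-style operations are what that page documents, but treat the details as assumptions to verify against the version you actually install.

```python
import pylru

cache = pylru.lrucache(5)     # a dict-like cache that keeps at most 5 entries

cache['spam'] = 1             # inserting a key makes it the most recently used
cache['eggs'] = 2

if 'spam' in cache:           # membership test, just like a dict
    print(cache['spam'])      # reading a key also refreshes its position

del cache['eggs']             # keys can be removed explicitly as well
```

Once more than five keys have been inserted, the least recently used one is silently dropped, which is exactly the behaviour of the walk-through above. The PyPI page also describes a decorator interface for memoizing function calls; see the documentation there.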
The library is written in pure Python, and its implementation is worth a read: it is very short, only a few hundred lines of code, and more than half of that is comments. My own test of it is a bit verbose, but the results are clear. If a future project ever involves caching and this tool comes to mind, count yourself lucky. This is exactly why it pays to have some grounding in algorithms, operating systems, computer architecture and the other fundamental courses: not because practice will really have you hand-write a sorting algorithm or a cache, but because in the right situation you can suddenly realise, "wait, this is just that algorithm I have seen before." That beats grinding away for half a day on some strange home-made data structure.

This article refers to:
1. "Modern Operating Systems", Chapter 4 (Storage Management), Section 4.4: Page Replacement Algorithms
2. A CSDN blog on the LRU algorithm: http://blog.csdn.net/luoweifu/article/details/8297084/
3. 360doc, "Graphic cache elimination algorithm: LRU": http://www.360doc.com/content/14/0704/09/10504424_391894263.shtml
4. iteye, a Java implementation of LRUCache: http://dennis-zane.iteye.com/blog/128278
5. PyPI, pylru download and usage: https://pypi.python.org/pypi/pylru/1.0.9
