Cache Algorithm – LRU

Source: Internet
Author: User

LRU

LRU is the abbreviation of "Least Recently Used". An LRU cache evicts the least recently used data to make room for newly read data, on the assumption that data read recently is also the data most likely to be read again. Under that assumption, an LRU cache improves system performance.

LRU implementation

1. New data is inserted at the head of the list;

2. Whenever the cache hits (i.e., cached data is accessed), that data is moved to the head of the list;

3. When the list is full, the data at the tail of the list is discarded.
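The three steps above can be sketched in Python. This minimal version uses `collections.OrderedDict` to stand in for the linked list; the class and method names are illustrative, not from the original article:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: the front of the ordered dict is the most
    recently used entry, the tail is the least recently used."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # key order tracks recency (front = MRU)

    def get(self, key):
        if key not in self.data:
            return None
        # Step 2: a hit moves the entry to the head of the list.
        self.data.move_to_end(key, last=False)
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # Step 3: the list is full, discard the entry at the tail.
            self.data.popitem(last=True)
        # Step 1: insert (or re-insert) the entry at the head of the list.
        self.data[key] = value
        self.data.move_to_end(key, last=False)
```

For example, with a capacity of 2, putting `a` and `b`, then reading `a`, then putting `c` evicts `b`, because `b` is the least recently used entry at that point.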

LRU Analysis

Hit rate

When there is hot data, LRU is very efficient. However, an occasional or periodic batch operation causes the LRU hit rate to drop sharply, and cache pollution becomes serious.

Complexity

Simple to implement.

Cost

A hit requires traversing the linked list to find the hit block, and then moving that data to the head of the list.

LRU-K

The K in LRU-K represents the number of recent uses, so LRU can be regarded as LRU-1. The main purpose of LRU-K is to solve the "cache pollution" problem of the LRU algorithm. Its core idea is to extend the criterion of "used once recently" to "used K times recently".

Implementation

Compared with LRU, LRU-K needs to maintain one additional queue to record the access history of all cached data. Data is placed in the cache only after it has been accessed K times. When data needs to be evicted, LRU-K evicts the data whose K-th most recent access is furthest in the past. The detailed implementation is as follows:

1. When data is accessed for the first time, it is added to the access-history list;

2. If data in the history list has not yet reached K accesses, it is evicted from that list according to a certain policy (FIFO or LRU);

3. When the access count of data in the history queue reaches K, the data's index is removed from the history queue, the data is moved into the cache queue and cached, and the cache queue is re-sorted by access time;

4. When data in the cache queue is accessed again, the queue is re-sorted;

5. When data must be evicted, the entry at the tail of the cache queue is eliminated, i.e., the data whose K-th most recent access is the oldest.
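The five steps above can be sketched in Python. This minimal version assumes a FIFO-evicted history queue and an LRU-ordered cache queue; all names, the `history_capacity` bound, and the `loader` callback are illustrative assumptions, not part of the original article:

```python
from collections import OrderedDict

class LRUKCache:
    """Minimal LRU-K sketch: an entry is promoted into the cache only
    after K recorded accesses in the history queue."""

    def __init__(self, capacity, k=2, history_capacity=100):
        self.capacity = capacity
        self.k = k
        self.history_capacity = history_capacity
        self.history = OrderedDict()  # key -> access count, FIFO eviction
        self.cache = OrderedDict()    # key -> value, LRU order (front = MRU)

    def get(self, key, loader):
        """loader(key) fetches the value from the backing store on a miss."""
        if key in self.cache:
            # Step 4: a hit in the cache queue re-sorts it.
            self.cache.move_to_end(key, last=False)
            return self.cache[key]
        # Steps 1-2: record the access in the history queue (FIFO-bounded).
        self.history[key] = self.history.get(key, 0) + 1
        if len(self.history) > self.history_capacity:
            self.history.popitem(last=False)
        value = loader(key)
        if self.history[key] >= self.k:
            # Step 3: K accesses reached - drop the history entry and
            # promote the data into the cache queue.
            del self.history[key]
            if len(self.cache) >= self.capacity:
                # Step 5: evict the entry at the tail of the cache queue.
                self.cache.popitem(last=True)
            self.cache[key] = value
            self.cache.move_to_end(key, last=False)
        return value
```

With `k=2`, the first access to a key only records history; the second access promotes the value into the cache, so a one-off scan never pollutes the cache queue.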

LRU-K has the advantages of LRU while avoiding its shortcomings. In practice, LRU-2 is usually the best choice after weighing various factors. LRU-3 or a larger K yields a higher hit rate but adapts less well: a large number of accesses is needed before stale access history is flushed out.

Analysis

Hit rate

LRU-K reduces the problem caused by "cache pollution", so its hit rate is higher than LRU's.

Complexity

The LRU-K queue is a priority queue, so the algorithmic complexity and cost are relatively high.

Cost

Because LRU-K also needs to record objects that have been accessed but not yet cached, its memory consumption is much higher than LRU's; when the volume of data is large, the memory consumption can be significant.

LRU-K also needs to sort by time (either at eviction time or continuously), so its CPU consumption is higher than LRU's.

