Cache Eviction Algorithm Series, Part 2: The LFU Family (repost)

Source: Internet
Author: User

Original address: http://www.360doc.com/content/13/0805/16/13247663_304916783.shtml

1. The LFU family

1.1. LFU

1.1.1. Principle

The LFU (Least Frequently Used) algorithm evicts data based on its historical access frequency. Its core idea: "if data has been accessed many times in the past, it is likely to be accessed frequently in the future."

1.1.2. Implementation

Each cached block in LFU carries a reference count. All blocks are sorted by reference count, and blocks with the same reference count are sorted by access time.

The specific implementation is as follows:

1. Insert new data at the tail of the queue (its reference count starts at 1);

2. When data in the queue is accessed, its reference count is incremented and the queue is re-sorted;

3. When data must be evicted, remove the block at the tail of the sorted queue.
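The three steps above can be sketched as follows. This is a minimal illustrative implementation, not from the original article: it assumes ties in reference count are broken by evicting the least recently accessed entry, and it uses a full scan at eviction time instead of a continuously sorted queue, for brevity.

```python
import itertools

class LFUCache:
    """Minimal LFU sketch (illustrative names, not from the article)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}       # key -> value
        self.counts = {}     # key -> reference count
        self.stamps = {}     # key -> last-access tick, for tie-breaking
        self.tick = itertools.count()

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1              # step 2: bump the reference count
        self.stamps[key] = next(self.tick)
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # step 3: evict the entry with the lowest (count, recency) rank
            victim = min(self.data, key=lambda k: (self.counts[k], self.stamps[k]))
            del self.data[victim], self.counts[victim], self.stamps[victim]
        self.data[key] = value             # step 1: new entries start at count 1
        self.counts[key] = self.counts.get(key, 0) + 1
        self.stamps[key] = next(self.tick)
```

A production implementation would keep entries in a sorted structure (for example, frequency buckets over a doubly linked list) instead of scanning all keys on every eviction.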

1.1.3. Analysis

• Hit ratio

In general, LFU outperforms LRU and avoids the hit-ratio drop that periodic or sporadic access patterns cause. However, LFU must keep historical access counts, and once the access pattern changes it takes LFU a long time to adapt to the new pattern. In other words, LFU suffers from a "cache pollution" effect, where stale history skews decisions about future data.

• Complexity

A queue must be maintained to record access history for all cached data, and each entry must maintain a reference count.

• Cost

Access counts must be kept for every entry, which consumes significant memory, and entries must be re-sorted by reference count, which is computationally expensive.

1.2. LFU*

1.2.1. Principle

LFU* is an improved algorithm based on LFU. Its core idea is to "evict only data that has been accessed exactly once."

1.2.2. Implementation

LFU*'s cache maintenance is the same as LFU's; the difference is at eviction time. LFU* evicts only entries whose reference count is 1. If the entries with a reference count of 1 do not free as much space as the newly added data requires, nothing is evicted and the new data is simply not cached.

1.2.3. Analysis

• Hit ratio

Similar to LFU, but because entries with a reference count greater than 1 are never evicted, LFU* cannot cache new data once the access pattern changes, so the algorithm's applicability is quite limited.

• Complexity

Only a queue of the entries with a reference count of 1 needs to be maintained.

• Cost

Much lower than LFU: there is no need to keep access history for all data or to sort by reference count; only the entries with a reference count of 1 are tracked.

1.3. LFU-Aging

1.3.1. Principle

LFU-Aging is an improved algorithm based on LFU. Its core idea is to "consider access time in addition to access count." Its main goal is to solve LFU's cache pollution problem.

1.3.2. Implementation

Although LFU-Aging considers the time factor, it does not record access timestamps directly; instead it uses the average reference count as a proxy for time.

LFU-Aging adds a maximum average reference count on top of LFU. When the average reference count of the data currently cached reaches or exceeds this maximum, every entry's reference count is reduced. There are several ways to reduce the counts: halve them, or subtract a fixed value.
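The aging step can be sketched as follows. Halving is used here, which is one of the two reduction strategies mentioned above; the function name and threshold value are illustrative.

```python
def age_counts(counts, max_avg):
    """LFU-Aging's aging step (sketch): when the average reference
    count reaches `max_avg`, halve every count, flooring at 1 so
    entries stay distinguishable from brand-new ones."""
    if counts and sum(counts.values()) / len(counts) >= max_avg:
        for k in counts:
            counts[k] = max(1, counts[k] // 2)
    return counts
```

Calling this check after every access (or every eviction) is what lets long-cached "polluting" entries decay back toward a count of 1, where normal LFU eviction can reach them.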

1.3.3. Analysis

• Hit ratio

LFU-Aging's hit ratio is similar to LFU's, but when the access pattern changes, LFU-Aging adapts to the new pattern faster, so in practice it is more effective.

• Complexity

On top of LFU, it adds the logic to check and handle the average reference count.

• Cost

Similar to LFU; in addition, the whole access list must be traversed (to age the counts) whenever the average reference count exceeds the specified threshold.

1.4. LFU*-Aging

1.4.1. Principle

A combination of LFU* and LFU-Aging.

1.4.2. Implementation

Omitted.
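Although the implementation is omitted above, the combination can be sketched by grafting the aging step onto LFU*'s eviction rule. This is purely illustrative: it assumes fixed-size entries, and `max_avg` is an arbitrary threshold.

```python
class LFUStarAgingCache:
    """LFU*-Aging sketch: evict only count-1 entries (LFU* rule) and
    halve all counts when the average count reaches max_avg (aging)."""

    def __init__(self, capacity, max_avg=3):
        self.capacity, self.max_avg = capacity, max_avg
        self.data, self.counts = {}, {}

    def _age(self):
        # aging step: shrink all counts once the average gets high, so
        # old high-count entries become evictable again
        if self.counts and sum(self.counts.values()) / len(self.counts) >= self.max_avg:
            for k in self.counts:
                self.counts[k] = max(1, self.counts[k] // 2)

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1
        self._age()
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            once = [k for k in self.data if self.counts[k] == 1]
            if not once:
                return            # LFU* rule: nothing evictable, skip caching
            del self.data[once[0]], self.counts[once[0]]
        self.data[key] = value
        self.counts[key] = self.counts.get(key, 0) + 1
```

The aging step is what fixes LFU*'s main weakness: entries whose counts have grown past 1 eventually decay back to 1 and become evictable.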

1.4.3. Analysis

• Hit ratio

Similar to LFU-Aging.

• Complexity

Simpler than LFU-Aging: no sorting by reference count is needed.

• Cost

Lower than LFU-Aging, again because no sorting by reference count is needed.

1.5. Window-LFU

1.5.1. Principle

Window-LFU is an improved version of LFU. The difference is that Window-LFU does not record the entire access history, only the history within a recent period; this "window" is where the name comes from. By contrast, the traditional LFU is also called "Perfect-LFU."

1.5.2. Implementing

The implementation is basically the same as LFU's, except that instead of the full access history, only the accesses within a recent window are recorded. Specifically:

1) Record the most recent W accesses;

2) When eviction is needed, rank the data within those W access records according to the LFU rules.

Examples are as follows (the original article illustrates them with a figure, in which each color represents a different data block and repeated colors represent repeated accesses to the same block):

Suppose the history length W is set to 9 and the cache size is 3.

Example 1: yellow is accessed 3 times, blue and orange 2 times each, with orange accessed more recently; so the cached blocks are yellow, orange, and blue.

Example 2: green is accessed 3 times, blue 2 times, and red 2 times, with blue accessed more recently; so the cached blocks are green, blue, and red.
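The windowed ranking in these examples can be sketched as follows. This is illustrative (names are mine): frequency ties are broken by most recent access, matching the examples above.

```python
from collections import Counter, deque

def window_lfu_cache(accesses, window, cache_size):
    """Rank keys by frequency within the last `window` accesses,
    breaking ties by recency, and return the keys worth caching."""
    recent = deque(accesses, maxlen=window)   # keep only the window
    freq = Counter(recent)
    # most recent position of each key, used to break frequency ties
    last_pos = {k: i for i, k in enumerate(recent)}
    # rank by (frequency, recency), highest first
    ranked = sorted(freq, key=lambda k: (freq[k], last_pos[k]), reverse=True)
    return ranked[:cache_size]
```

Because only the window is consulted, a key that dominated older history but stopped being accessed falls out of the ranking as soon as its accesses slide out of the window.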

1.5.3. Analysis

• Hit ratio

Window-LFU's hit ratio is similar to LFU's, but Window-LFU adapts to changes in the data access pattern more quickly, so its "cache pollution" problem is less severe.

• Complexity

A queue must be maintained to record the windowed access history, and sorting is required.

• Cost

Window-LFU records only part of the access history rather than all of it, so both its memory consumption and its sorting cost are lower than LFU's.

1.6. Comparison of LFU-family algorithms

Since different access patterns produce widely varying hit ratios, the comparison below is only a theoretical, qualitative analysis, not a quantitative one.

Hit ratio: Window-LFU / LFU-Aging > LFU*-Aging > LFU > LFU*

Complexity: LFU-Aging > LFU > LFU*-Aging > Window-LFU > LFU*

Cost: LFU-Aging > LFU > Window-LFU > LFU*-Aging > LFU*
