Java Simple LRU Cache implementation

Source: Internet
Author: User

Background

LinkedHashMap inherits from HashMap and provides a removeEldestEntry method, which is the key to implementing the LRU policy. HashMap internally defines three callback methods for LinkedHashMap: afterNodeAccess, afterNodeInsertion, and afterNodeRemoval. Their meanings are straightforward: they run extra behavior after a node is accessed, after a node is inserted, and after a node is removed, respectively. LinkedHashMap overrides these hooks to maintain its internal doubly linked list (afterNodeInsertion is also where removeEldestEntry is consulted), and on top of this behavior a LinkedHashMap can implement an LRU cache.
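The effect of the afterNodeAccess hook can be seen directly with a plain LinkedHashMap constructed in access order. This is a minimal sketch (the class name and values are illustrative, not from the original article):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration order is least-recently-accessed first
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);

        map.get("a"); // afterNodeAccess moves "a" to the tail (most recent)

        System.out.println(map.keySet()); // prints [b, c, a]
    }
}
```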

Implementation

To implement your own LRUCache, you only need to override the removeEldestEntry method. The code is as follows:


    import java.util.LinkedHashMap;
    import java.util.Map;

    public class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxElements;

        LRUCache(int initCap, int maxSize) {
            super(initCap, 0.75f, true); // accessOrder = true
            if (maxSize < 0) throw new IllegalArgumentException();
            this.maxElements = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxElements;
        }
    }

The code above uses a maxElements field to limit the maximum number of stored entries. When a new entry is inserted and the current count exceeds this limit, the least recently used entry is removed according to the LRU policy. Note that by default a LinkedHashMap preserves insertion order, meaning nodes are ordered by when they were inserted, so eviction would remove the first-inserted node. But we passed true as the third constructor argument, which controls how nodes are ordered: when it is true, nodes are ordered by most recent access; when it is false, they are ordered by insertion. With that, a simple LRUCache implementation is complete.
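Putting the pieces together, the eviction behavior can be exercised end to end. This is a self-contained sketch of the cache described above (the capacity of 3 and the key names are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCacheDemo {
    static class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxElements;

        LRUCache(int initCap, int maxSize) {
            super(initCap, 0.75f, true); // accessOrder = true
            if (maxSize < 0) throw new IllegalArgumentException();
            this.maxElements = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxElements;
        }
    }

    public static void main(String[] args) {
        LRUCache<String, Integer> cache = new LRUCache<>(16, 3);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.put("c", 3);
        cache.get("a");    // touch "a" so it becomes most recently used
        cache.put("d", 4); // exceeds capacity: evicts "b", the least recently used

        System.out.println(cache.keySet()); // prints [c, a, d]
    }
}
```

Because removeEldestEntry returns true only when the size exceeds maxElements, eviction happens exactly once per over-capacity insert, and the eldest entry at that moment is the least recently accessed one.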

Attention

Because LinkedHashMap itself is not thread-safe, this LRUCache is not thread-safe either. If you need to access it from multiple threads, you can wrap it like this: Map cache = Collections.synchronizedMap(new LRUCache(10, 10)). This makes individual operations such as get and put safe under multiple threads, but a cache obtained this way is still unsafe for multithreaded traversal. So you cannot freely iterate over the cache from multiple threads; the official documentation recommends synchronizing on the map returned by synchronizedMap while traversing it.
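The wrapping and the manual synchronization during traversal look like this as a minimal sketch (the LRUCache class is the one from the article; the keys and sizes are illustrative). Note that with accessOrder = true even a get() mutates the link order, which is another reason reads must be synchronized too:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLRUDemo {
    static class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxElements;

        LRUCache(int initCap, int maxSize) {
            super(initCap, 0.75f, true);
            if (maxSize < 0) throw new IllegalArgumentException();
            this.maxElements = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxElements;
        }
    }

    public static void main(String[] args) {
        // Individual get/put calls are synchronized by the wrapper
        Map<String, Integer> cache =
                Collections.synchronizedMap(new LRUCache<>(16, 2));
        cache.put("a", 1);
        cache.put("b", 2);

        // Traversal is NOT covered by the wrapper: hold the map's lock manually
        synchronized (cache) {
            for (Map.Entry<String, Integer> e : cache.entrySet()) {
                System.out.println(e.getKey() + "=" + e.getValue());
            }
        }
    }
}
```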
