Background
LinkedHashMap extends HashMap and provides a removeEldestEntry method, which is the key to implementing an LRU policy. HashMap in turn defines three callback hooks specifically for LinkedHashMap: afterNodeAccess, afterNodeInsertion, and afterNodeRemoval. Their names are self-explanatory: they run extra logic after a node is accessed, after a node is inserted, and after a node is removed, respectively. Building on these hooks, LinkedHashMap can be used to implement an LRU cache.
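Before building the cache, it helps to see the observable effect of these hooks. The following is a minimal sketch (class and variable names are mine, not from the original) showing that with access order enabled, afterNodeAccess moves an entry to the tail of the internal list on every get:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Demo (not the JDK source): with accessOrder=true, every get() moves the
// accessed entry to the tail of LinkedHashMap's internal linked list.
public class AccessOrderDemo {
    public static void main(String[] args) {
        // third constructor argument: true = access-order, false = insertion-order
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);
        map.get("a"); // afterNodeAccess moves "a" to the tail
        System.out.println(map.keySet()); // prints [b, c, a]
    }
}
```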
Implementation
To implement your own LRUCache, you only need to override the removeEldestEntry method. The code is as follows:
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int max_elements;

    public LRUCache(int initCap, int maxSize) throws IllegalArgumentException {
        // accessOrder = true: order entries by most recent access
        super(initCap, 0.75f, true);
        if (maxSize < 0) throw new IllegalArgumentException();
        max_elements = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // evict the eldest entry once the cache exceeds its capacity
        return size() > max_elements;
    }
}
The code above uses a max_elements field to cap the number of stored entries: when inserting a node, if the current size exceeds this limit, the least recently used node is removed according to the LRU policy. Note that by default LinkedHashMap maintains insertion order, i.e. nodes are kept in the sequence they were inserted, so eviction would remove the first-inserted node. However, we passed true as the third constructor argument, which controls how nodes are ordered internally: true orders nodes by most recent access, while false keeps insertion order. With that, a simple LRUCache implementation is complete.
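A short usage sketch of the cache described above; the class is repeated inline (with the same field and parameter names as the text) so the snippet compiles on its own:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    // Same implementation as in the article, nested here for a self-contained demo.
    static class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int max_elements;

        LRUCache(int initCap, int maxSize) {
            super(initCap, 0.75f, true); // true = order by most recent access
            if (maxSize < 0) throw new IllegalArgumentException();
            max_elements = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > max_elements; // evict once capacity is exceeded
        }
    }

    public static void main(String[] args) {
        LRUCache<String, Integer> cache = new LRUCache<>(16, 2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // "a" becomes the most recently used entry
        cache.put("c", 3); // capacity 2 exceeded -> evicts "b", the LRU entry
        System.out.println(cache.keySet()); // prints [a, c]
    }
}
```

Because get marks an entry as recently used, inserting "c" evicts "b" rather than "a", even though "a" was inserted first.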
Attention
Because LinkedHashMap itself is not thread-safe, this LRUCache is not thread-safe either. If you need to access it from multiple threads, you can wrap it like this: Map cache = Collections.synchronizedMap(new LRUCache(10, 10)). This makes operations such as get and put safe under multiple threads, but a cache obtained this way is still unsafe for concurrent traversal. Do not iterate over the cache from multiple threads without extra locking; the official documentation recommends synchronizing on the map returned by synchronizedMap for the duration of the traversal.
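A sketch of that wrapping, including the manual synchronization the Collections.synchronizedMap javadoc requires for iteration (the LRUCache class is repeated inline so the snippet compiles on its own; capacities are the example values from the text):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SyncLruDemo {
    // Same cache as in the article, nested here for a self-contained demo.
    static class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int max_elements;

        LRUCache(int initCap, int maxSize) {
            super(initCap, 0.75f, true);
            max_elements = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > max_elements;
        }
    }

    public static void main(String[] args) {
        Map<String, Integer> cache =
                Collections.synchronizedMap(new LRUCache<>(10, 10));
        cache.put("a", 1); // get/put are now safe across threads
        cache.put("b", 2);

        // Iteration is NOT covered by the wrapper's per-call locking:
        // hold the wrapper's monitor for the whole traversal.
        synchronized (cache) {
            for (Map.Entry<String, Integer> e : cache.entrySet()) {
                System.out.println(e.getKey() + "=" + e.getValue());
            }
        }
    }
}
```

Note that with access order enabled even get is a structural change to the linked list, which is one more reason every access, not just writes, must go through the synchronized wrapper.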