In the previous section, we implemented the random cache algorithm and the FIFO cache algorithm. Now we will implement two other well-known cache algorithms: LFU and LRU. Once again, this code is for demonstration only; to use it in a real application you would need to do some additional work.
Let's look at the implementation of the LFU cache algorithm:
```java
public synchronized Object getElement(Object key) {
    Object obj = table.get(key);
    if (obj != null) {
        CacheElement element = (CacheElement) obj;
        element.setHitCount(element.getHitCount() + 1);
        return element.getObjectValue();
    }
    return null;
}

public final synchronized void addElement(Object key, Object value) {
    int index;
    Object obj = table.get(key);
    if (obj != null) {
        // The key is already cached; just replace the value.
        CacheElement element = (CacheElement) obj;
        element.setObjectValue(value);
        element.setObjectKey(key);
        return;
    }
    if (!isFull()) {
        index = numEntries;
        ++numEntries;
    } else {
        // The cache is full: evict the least frequently used element
        // and reuse its slot.
        CacheElement element = removeLfuElement();
        index = element.getIndex();
        table.remove(element.getObjectKey());
    }
    cache[index].setObjectValue(value);
    cache[index].setObjectKey(key);
    cache[index].setIndex(index);
    table.put(key, cache[index]);
}

public CacheElement removeLfuElement() {
    CacheElement[] elements = getElementsFromTable();
    return leastHit(elements);
}

public static CacheElement leastHit(CacheElement[] elements) {
    CacheElement lowestElement = null;
    for (int i = 0; i < elements.length; i++) {
        CacheElement element = elements[i];
        if (lowestElement == null
                || element.getHitCount() < lowestElement.getHitCount()) {
            lowestElement = element;
        }
    }
    return lowestElement;
}
```
The most important piece is the leastHit method: it scans the cached elements for the one with the lowest hit count, so that element can be removed to free a slot for the new cache element.
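The same least-frequently-used scan can be sketched in a few lines with a plain HashMap of hit counts. This is an illustrative simplification (the class and method names here, such as SimpleLfu and evictLfu, are made up for this example and are not part of the code above):

```java
import java.util.HashMap;
import java.util.Map;

// A minimal, illustrative LFU sketch: each get() bumps a hit counter,
// and evictLfu() removes the key with the lowest count, mirroring the
// leastHit() scan in the article's code.
class SimpleLfu<K, V> {
    private final Map<K, V> values = new HashMap<>();
    private final Map<K, Integer> hits = new HashMap<>();

    public V get(K key) {
        V v = values.get(key);
        if (v != null) {
            hits.merge(key, 1, Integer::sum);  // increment the hit count
        }
        return v;
    }

    public void put(K key, V value) {
        values.put(key, value);
        hits.putIfAbsent(key, 0);  // keep the count if the key existed
    }

    // Find and remove the entry with the fewest hits; returns its key.
    public K evictLfu() {
        K lowest = null;
        for (Map.Entry<K, Integer> e : hits.entrySet()) {
            if (lowest == null || e.getValue() < hits.get(lowest)) {
                lowest = e.getKey();
            }
        }
        values.remove(lowest);
        hits.remove(lowest);
        return lowest;
    }
}
```

Note that, like leastHit above, eviction is a linear scan over all entries; production LFU implementations usually keep a frequency-ordered structure instead to avoid the O(n) pass.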
Now let's look at the LRU cache algorithm implementation:
```java
private void moveToFront(int index) {
    int nextIndex, prevIndex;
    if (head != index) {
        nextIndex = next[index];
        prevIndex = prev[index];
        // Only the head has a prev entry that is an invalid index,
        // so we don't check it.
        next[prevIndex] = nextIndex;
        // Make sure nextIndex is valid. If it isn't, we're at the tail
        // and don't set prev[nextIndex].
        if (nextIndex >= 0)
            prev[nextIndex] = prevIndex;
        else
            tail = prevIndex;
        prev[index] = -1;
        next[index] = head;
        prev[head] = index;
        head = index;
    }
}

public final synchronized void addElement(Object key, Object value) {
    Object obj = table.get(key);
    if (obj != null) {
        // Just replace the value, but move the entry to the front.
        CacheElement entry = (CacheElement) obj;
        entry.setObjectValue(value);
        entry.setObjectKey(key);
        moveToFront(entry.getIndex());
        return;
    }
    // If we haven't filled the cache yet, place the entry in the next
    // available spot and move it to the front.
    if (!isFull()) {
        if (numEntries > 0) {
            prev[numEntries] = tail;
            next[numEntries] = -1;
            moveToFront(numEntries);
        }
        ++numEntries;
    } else {
        // The cache is full, so we replace the tail of the list.
        table.remove(cache[tail].getObjectKey());
        moveToFront(tail);
    }
    cache[head].setObjectValue(value);
    cache[head].setObjectKey(key);
    table.put(key, cache[head]);
}
```
The logic of this code matches the LRU algorithm's description: an element that is accessed again is moved to the front of the list, and the element evicted is always the one at the tail, i.e. the least recently used.
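The trick worth noticing above is that the doubly linked list is stored in two plain int arrays rather than node objects. Below is a minimal, self-contained sketch of just that part (the class name LinkedSlots and the initial ordering are made up for this example): prev[] and next[] hold slot indices, -1 marks the ends, and moveToFront() splices a slot out of the list and relinks it at the head, exactly as in the code above.

```java
// Doubly linked list stored in int arrays, as used by the LRU cache:
// prev[i] / next[i] are the slot indices before and after slot i,
// and -1 means "no neighbor" (i.e. the head's prev or the tail's next).
class LinkedSlots {
    int head, tail;
    final int[] prev, next;

    LinkedSlots(int n) {
        prev = new int[n];
        next = new int[n];
        // Start with the slots linked in order 0 -> 1 -> ... -> n-1.
        for (int i = 0; i < n; i++) {
            prev[i] = i - 1;
            next[i] = (i == n - 1) ? -1 : i + 1;
        }
        head = 0;
        tail = n - 1;
    }

    void moveToFront(int index) {
        if (head == index) return;
        int nextIndex = next[index];
        int prevIndex = prev[index];
        next[prevIndex] = nextIndex;  // unlink from the old position
        if (nextIndex >= 0)
            prev[nextIndex] = prevIndex;
        else
            tail = prevIndex;         // we moved the old tail
        prev[index] = -1;             // relink at the head
        next[index] = head;
        prev[head] = index;
        head = index;
    }
}
```

For example, with three slots linked 0 -> 1 -> 2, calling moveToFront(2) leaves the order 2 -> 0 -> 1 with tail = 1, which is precisely the "used again, so move to the beginning" step of LRU.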
Conclusion
We have seen implementations of the LFU cache algorithm and the LRU cache algorithm. Whether to implement them on top of arrays or Java's HashMap is up to you; personally, I tend to use arrays for small cache capacities and a LinkedHashMap for large ones.
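For the LinkedHashMap route just mentioned, the JDK can do the LRU bookkeeping for you: constructing the map with accessOrder = true makes get() move an entry to the end of the iteration order, and overriding removeEldestEntry gives you an eviction hook. A minimal sketch (the class name LruCache and the capacity parameter are ours, not from the code above):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on LinkedHashMap: the third constructor argument (true)
// selects access order instead of insertion order, and removeEldestEntry()
// evicts the least recently used entry once size exceeds the capacity.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LruCache(int capacity) {
        super(16, 0.75f, true);  // initial capacity, load factor, access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}
```

Usage: in a capacity-2 cache holding "a" and "b", calling get("a") and then putting "c" evicts "b", because "b" is now the least recently used entry. Note that, unlike the synchronized methods in the code above, this sketch is not thread-safe on its own.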
In the following section, we will talk about caching frameworks and the cache algorithms they use, and make some comparisons between them.