A common approach to implementing an LRU cache in Java is to use LinkedHashMap. LinkedHashMap extends HashMap and provides a predictable iteration order. Internally, it maintains a doubly linked list running through its entries to track either insertion order or access order.
To implement an LRU cache, we can use the LinkedHashMap constructor that takes an accessOrder boolean as its third parameter. Setting accessOrder to true makes the iteration order reflect access order, from least recently to most recently accessed, which is precisely what an LRU cache needs.
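To see the effect of accessOrder in isolation, here is a minimal sketch (the class name AccessOrderDemo is just for illustration): with accessOrder set to true, reading a key moves it to the end of the iteration order.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration runs from least- to most-recently accessed
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "A");
        map.put(2, "B");
        map.put(3, "C");
        map.get(1); // touching key 1 moves it to the end of the iteration order
        System.out.println(map); // {2=B, 3=C, 1=A}
    }
}
```

Note that with accessOrder enabled, even a get() is a structural-order mutation, which matters for the thread-safety caveat discussed later.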
We can control when the oldest entry is evicted by subclassing LinkedHashMap and overriding its removeEldestEntry method. This method is invoked by put and putAll after a new entry is inserted, and its boolean return value determines whether the eldest entry should be removed.
Here is a simple example of implementing an LRU cache:
```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true); // initial capacity, load factor, access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the eldest entry once size exceeds capacity
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        System.out.println(cache); // Output: {1=A, 2=B, 3=C}
        cache.get(1);              // Access element 1
        cache.put(4, "D");         // Adding a fourth element evicts 2=B, the least recently accessed
        System.out.println(cache); // Output: {3=C, 1=A, 4=D}
    }
}
```
In this example, we create an LRU cache with a capacity of 3, then add and access elements to observe that the least recently accessed entry is evicted when a new element is added beyond the capacity.
The advantage of this method is its simplicity and direct use of Java's standard library, with no need to implement a doubly linked list from scratch. However, note that LinkedHashMap used this way is not thread-safe; with accessOrder enabled, even get() mutates internal state. If you need an LRU cache in a multithreaded environment, consider wrapping LRUCache with Collections.synchronizedMap or using another concurrency control mechanism.
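A minimal sketch of the synchronized-wrapper approach, reproducing the article's LRUCache inline so the example is self-contained (the class name SynchronizedLruDemo is just for illustration):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLruDemo {
    // Same LRU cache as in the article, reproduced here for a self-contained sketch
    static class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int capacity;

        LRUCache(int capacity) {
            super(capacity, 0.75f, true);
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > capacity;
        }
    }

    public static void main(String[] args) {
        // Collections.synchronizedMap serializes every map operation,
        // including get(), which mutates order when accessOrder is true
        Map<Integer, String> cache = Collections.synchronizedMap(new LRUCache<>(2));
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C"); // evicts 1=A
        System.out.println(cache); // {2=B, 3=C}
    }
}
```

One caveat: iterating over a synchronized wrapper still requires manually synchronizing on the wrapper object around the loop, as the wrapper only protects individual method calls.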