
How would you implement an LRU cache in Java?

2 answers


A common approach to implementing an LRU cache in Java is to use LinkedHashMap. LinkedHashMap inherits from HashMap and features predictable iteration order. Internally, it maintains a doubly linked list to track insertion order or access order.

To implement an LRU cache, we can use the LinkedHashMap constructor that takes an accessOrder boolean parameter. Setting accessOrder to true makes iteration follow access order rather than insertion order, with the most recently accessed entry last — which is precisely what we need for an LRU cache.
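To see the accessOrder flag in action on a plain LinkedHashMap before building the full cache, here is a minimal sketch (class name is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // The third constructor argument (true) switches iteration
        // from insertion order to access order.
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "A");
        map.put(2, "B");
        map.put(3, "C");
        map.get(1); // moves key 1 to the end of the iteration order
        System.out.println(map); // {2=B, 3=C, 1=A}
    }
}
```

With accessOrder set to false (the default), the same sequence would print {1=A, 2=B, 3=C}.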

We can customize when to remove the oldest entries by inheriting from LinkedHashMap and overriding its removeEldestEntry method. This method is invoked after each new element is added, and it determines whether to remove the oldest element by returning true or false.

Here is a simple example of implementing an LRU cache:

java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true); // initial capacity, load factor, access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // remove the oldest entry once size exceeds capacity
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        System.out.println(cache); // Output: {1=A, 2=B, 3=C}
        cache.get(1);              // access element 1
        cache.put(4, "D");         // evicts 2=B, the least recently accessed entry
        System.out.println(cache); // Output: {3=C, 1=A, 4=D}
    }
}

In this example, we create an LRU cache with a capacity of 3. We add and access elements, and observe whether the least recently accessed element (the oldest element) is correctly removed when a new element is added beyond the capacity.

The advantage of this method is its simplicity and direct utilization of Java's standard library, without needing to implement a doubly linked list from scratch. However, note that this usage of LinkedHashMap may not be thread-safe in a multithreaded environment. If you need to use an LRU cache in a multithreaded environment, consider wrapping LRUCache with Collections.synchronizedMap or using other concurrency control mechanisms.
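One minimal sketch of the Collections.synchronizedMap wrapping mentioned above (the class is repeated here so the snippet is self-contained; the wrapper class name is illustrative):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLRUExample {
    public static class LRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int capacity;

        public LRUCache(int capacity) {
            super(capacity, 0.75f, true); // access-order mode
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > capacity;
        }
    }

    public static void main(String[] args) {
        // Every map operation now goes through a single lock. Note that in
        // access-order mode even get() mutates the internal list, so the
        // wrapper is needed for reads as well as writes.
        Map<Integer, String> cache =
                Collections.synchronizedMap(new LRUCache<>(3));
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.put(4, "D"); // evicts 1=A, the least recently used entry
        System.out.println(cache.containsKey(1)); // false
    }
}
```

One caveat: Collections.synchronizedMap only synchronizes individual calls; iterating over the wrapped map still requires an explicit synchronized block on the wrapper.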

2024-06-29 12:07

LRU (Least Recently Used) cache is a commonly used cache eviction algorithm that removes the least recently used data to make space for new data. In Java, a common approach to implementing LRU cache is by utilizing the LinkedHashMap class. LinkedHashMap is a subclass of HashMap that maintains the order of insertion or access, making it particularly suitable for implementing LRU cache.

The following are the specific steps to implement LRU cache using LinkedHashMap:

  1. Extend the LinkedHashMap class: Create a new class that extends LinkedHashMap and override the removeEldestEntry(Map.Entry eldest) method. This method is automatically invoked after inserting elements to determine when to remove the oldest entry (i.e., the least recently accessed entry).

  2. Set capacity and access order: In the constructor, call the super method to set the initial capacity, load factor, and access order. Set the access order to true, so that after each access, the accessed element is moved to the end of the linked list, and the oldest element remains at the head of the list.

  3. Implement removeEldestEntry: In this method, compare the current cache size with the maximum capacity. Return true when the size exceeds the maximum capacity, which triggers the removal of the oldest element, i.e., the element at the head of the linked list.

Here is a concrete implementation example:

java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity; // cache capacity

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true); // enable access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // remove the oldest entry once the preset capacity is exceeded
        return size() > capacity;
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.put(3, "c");
        System.out.println(cache); // Output {1=a, 2=b, 3=c}
        cache.get(1);              // access element 1
        cache.put(4, "d");         // evicts 2=b, the least recently accessed entry
        System.out.println(cache); // Output {3=c, 1=a, 4=d}
    }
}

Advantages and Improvements

The advantage of implementing LRU cache using LinkedHashMap is that it is simple, easy to understand, and straightforward to implement. However, for more efficient concurrent processing, one might consider using other concurrent collections, such as ConcurrentHashMap combined with other data structures to implement thread-safe LRU cache. Additionally, one can leverage third-party libraries such as Google Guava's Cache builder to implement more complex caching strategies and features.
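For interviews where LinkedHashMap is off the table, the machinery it hides, namely a hash map for O(1) lookup plus a doubly linked list kept in access order, can be written by hand. A minimal, non-thread-safe sketch (class and method names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class HandRolledLRUCache<K, V> {
    private static class Node<K, V> {
        K key; V value; Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // sentinel: most recent side
    private final Node<K, V> tail = new Node<>(null, null); // sentinel: least recent side

    public HandRolledLRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToFront(node); // mark as most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {       // key already present: update and refresh
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() == capacity) {
            Node<K, V> lru = tail.prev; // least recently used entry
            unlink(lru);
            map.remove(lru.key);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        linkAfterHead(node);
    }

    private void moveToFront(Node<K, V> node) {
        unlink(node);
        linkAfterHead(node);
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkAfterHead(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}
```

Every operation is O(1): the map finds the node, and the sentinel-delimited list makes relinking branch-free. This is essentially what LinkedHashMap in access-order mode does for you.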

2024-06-29 12:07
