As a Senior Engineer, I’ve found that mastering algorithms isn’t just about memorizing their definitions but about truly understanding their applications, limitations, and performance trade-offs. One such foundational algorithm is the Least Recently Used (LRU) Cache. It’s a simple yet powerful concept that finds its way into real-world scenarios like caching images, managing session data, or optimizing memory usage.
In this blog, I’ll walk you through building your own LRU cache in Kotlin, discuss its complexities, and explore Android’s built-in LruCache class. We’ll also evaluate when it makes sense to build your own versus relying on Android’s utility.
An LRU Cache is a data structure that removes the least recently used items when the cache exceeds its capacity. It ensures efficient memory management by prioritizing frequently accessed items.
Each time an item is accessed, it’s marked as the most recently used.
When adding a new item would exceed the maximum capacity, the least recently used item is evicted.
Common real-world applications include:
Image Caching: Storing images for faster retrieval in mobile apps.
Web Browsers: Caching recently visited web pages.
Database Query Results: Storing results of expensive queries to reduce latency.
To achieve an efficient LRU cache, we rely on two data structures working in tandem:
HashMap (Dictionary): Provides O(1) access to cache items by key.
Doubly Linked List: Maintains the order of usage, with the head representing the most recently used item and the tail representing the least recently used.
This combination allows us to:
Insert/Delete in O(1) via linked list operations.
Lookup in O(1) using the hash map.
Before writing any code, let’s pin down what the cache must guarantee:
Constant-Time Operations: Retrievals, insertions, and updates should happen in O(1).
Order Management: Maintain the order of usage to quickly identify the least recently used item.
Fixed Capacity: Ensure the cache doesn’t grow uncontrollably.
Here’s a step-by-step implementation:
Each node in our doubly linked list will represent a cache entry. It will store the key-value pair along with pointers to the next and previous nodes:
// A plain class rather than a data class: a data class's generated
// equals/hashCode/toString would recurse through prev/next on this
// self-referencing structure.
class CacheNode<K, V>(
    val key: K,
    var value: V,
    var prev: CacheNode<K, V>? = null,
    var next: CacheNode<K, V>? = null
)
The LRUCache class ties everything together. Here's the complete implementation, with detailed comments explaining each step:
class LRUCache<K, V>(private val capacity: Int) {
    private val map = mutableMapOf<K, CacheNode<K, V>>() // For O(1) lookups
    private var head: CacheNode<K, V>? = null // Most recently used
    private var tail: CacheNode<K, V>? = null // Least recently used

    init {
        require(capacity > 0) { "Capacity must be positive" }
    }

    // Retrieve a value from the cache
    fun get(key: K): V? {
        val node = map[key] ?: return null
        moveToHead(node) // Accessing the node makes it most recently used
        return node.value
    }

    // Add or update a key-value pair in the cache
    fun put(key: K, value: V) {
        val existingNode = map[key]
        if (existingNode != null) {
            existingNode.value = value
            moveToHead(existingNode) // Update the usage
        } else {
            if (map.size >= capacity) {
                removeTail() // Evict the least recently used item first
            }
            val newNode = CacheNode(key, value)
            addToHead(newNode)
            map[key] = newNode
        }
    }

    // Move a node to the head (most recently used position)
    private fun moveToHead(node: CacheNode<K, V>) {
        removeNode(node)
        addToHead(node)
    }

    // Add a node to the head of the list
    private fun addToHead(node: CacheNode<K, V>) {
        node.next = head
        node.prev = null
        head?.prev = node
        head = node
        if (tail == null) {
            tail = node // If this is the first node, it's also the tail
        }
    }

    // Unlink a node from the list, patching head/tail if needed
    private fun removeNode(node: CacheNode<K, V>) {
        node.prev?.next = node.next
        node.next?.prev = node.prev
        if (node === head) head = node.next
        if (node === tail) tail = node.prev
    }

    // Remove the least recently used (tail) node
    private fun removeTail() {
        tail?.let {
            map.remove(it.key) // Remove it from the map
            removeNode(it)
        }
    }

    // Debug helper: Print the current state of the cache
    fun printCache() {
        var current = head
        while (current != null) {
            print("${current.key}:${current.value} -> ")
            current = current.next
        }
        println("NULL")
    }
}
Let’s confirm the complexity of each operation.
get(key):
HashMap lookup: O(1).
Moving the node to the head: O(1).
Total: O(1).
put(key, value):
HashMap insertion: O(1).
Adding or removing a node in the doubly linked list: O(1).
Total: O(1).
These constant-time operations make the LRU cache highly efficient, even at scale.
Let’s put the cache through a quick test:
fun main() {
    val lruCache = LRUCache<Int, String>(3)

    // Adding elements
    lruCache.put(1, "A")
    lruCache.put(2, "B")
    lruCache.put(3, "C")
    println("Initial Cache:")
    lruCache.printCache() // Output: 3:C -> 2:B -> 1:A -> NULL

    // Access an item
    println("Accessing key 2:")
    lruCache.get(2)
    lruCache.printCache() // Output: 2:B -> 3:C -> 1:A -> NULL

    // Add a new item, evicting the least recently used
    println("Adding key 4 (evicts key 1):")
    lruCache.put(4, "D")
    lruCache.printCache() // Output: 4:D -> 2:B -> 3:C -> NULL
}
Running main() prints:
Initial Cache:
3:C -> 2:B -> 1:A -> NULL
Accessing key 2:
2:B -> 3:C -> 1:A -> NULL
Adding key 4 (evicts key 1):
4:D -> 2:B -> 3:C -> NULL
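One caveat before moving on: the implementation above is not thread-safe. If the cache will be shared across threads, guard it with a lock (Android’s LruCache, covered below, synchronizes internally). Here’s a minimal sketch, using a hypothetical SynchronizedLRUCache wrapper name of my own:

// Hypothetical wrapper: serializes all access to the LRUCache above behind one lock.
class SynchronizedLRUCache<K, V>(capacity: Int) {
    private val cache = LRUCache<K, V>(capacity)
    private val lock = Any()

    // Every operation funnels through the same lock, trading throughput for safety.
    fun get(key: K): V? = synchronized(lock) { cache.get(key) }
    fun put(key: K, value: V) = synchronized(lock) { cache.put(key, value) }
}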
Why Not Just Use LinkedHashMap?
Kotlin’s LinkedHashMap can simplify LRU implementation with its accessOrder constructor parameter. While it’s an excellent choice for production, building the cache from scratch offers:
Control: Tailor the cache behavior for specific use cases.
Learning: Understand the underlying mechanics of algorithm design.
Here’s how you can implement an LRU cache with LinkedHashMap in just a few lines:
val capacity = 3 // must be defined in scope; any fixed size works
val lruCache = object : LinkedHashMap<Int, String>(capacity, 0.75f, true) { // true = access order
    override fun removeEldestEntry(eldest: MutableMap.MutableEntry<Int, String>) = size > capacity
}
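A quick sanity check of the behavior, based on LinkedHashMap’s documented access-order semantics:

lruCache[1] = "A"
lruCache[2] = "B"
lruCache[3] = "C"
lruCache[1]            // touching key 1 makes it the most recently used
lruCache[4] = "D"      // exceeds capacity, so the eldest entry (key 2) is evicted
println(lruCache.keys) // [3, 1, 4]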
Elegant, but lacks the flexibility of custom implementations.
Android’s LruCache Class
When working with Android, you’ll find that the platform already provides a convenient android.util.LruCache class for implementing an LRU caching mechanism. This built-in utility simplifies caching in Android applications and is optimized for use cases like in-memory caching of images, strings, or data objects.
Here’s the same demo using LruCache (note this runs on Android, since android.util classes aren’t available on the plain JVM):
import android.util.LruCache

fun main() {
    val lruCache = LruCache<Int, String>(3)

    lruCache.put(1, "A")
    lruCache.put(2, "B")
    lruCache.put(3, "C")
    println("Initial Cache: ${lruCache.snapshot()}")

    lruCache.get(2) // Access key 2
    println("After Accessing Key 2: ${lruCache.snapshot()}")

    lruCache.put(4, "D") // Add new, evict least recently used
    println("After Adding Key 4: ${lruCache.snapshot()}")
}
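LruCache also lets you bound the cache by the weight of its entries rather than their count, which is how image caches are typically sized. A minimal sketch following the sizeOf pattern from the Android documentation (the 4 MB budget here is an arbitrary assumption):

import android.graphics.Bitmap
import android.util.LruCache

// Cache bounded by total kilobytes of bitmap data, not by number of entries.
val bitmapCache = object : LruCache<String, Bitmap>(4 * 1024) { // 4 MB, expressed in KB
    override fun sizeOf(key: String, value: Bitmap): Int {
        // Each entry's cost in KB; eviction starts once the sum exceeds maxSize.
        return value.byteCount / 1024
    }
}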
While LruCache is powerful, there are specific reasons why building your own LRU cache might still be the better choice:
Custom eviction policies: prioritized, time-based, or weighted eviction (a sketch follows after this list), plus fine-tuned memory usage for specific applications.
Portability: LruCache is Android-specific, whereas custom implementations work across platforms, like Kotlin Multiplatform.
Learning: building your own cache deepens your understanding of algorithm design and of optimizing time and space complexity.
No lock-in: custom caches avoid Android framework lock-in, enabling easier migration to other technologies.
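As an example of a policy the stock API can’t express, here’s a minimal sketch of a time-aware variant layered on the LRUCache built earlier. TtlLruCache and its ttlMillis parameter are hypothetical names of my own, not part of any library:

// Hypothetical variant: values expire after ttlMillis, on top of normal LRU eviction.
class TtlLruCache<K, V>(capacity: Int, private val ttlMillis: Long) {
    // Pair each value with the moment it was stored.
    private data class Timed<T>(val value: T, val storedAt: Long)

    private val cache = LRUCache<K, Timed<V>>(capacity)

    fun put(key: K, value: V) {
        cache.put(key, Timed(value, System.currentTimeMillis()))
    }

    // Stale entries read as misses, even if they haven't been evicted yet.
    fun get(key: K): V? {
        val timed = cache.get(key) ?: return null
        val expired = System.currentTimeMillis() - timed.storedAt > ttlMillis
        return if (expired) null else timed.value
    }
}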
LruCache vs. Building Your Own?
Use android.util.LruCache When...
1. You need a quick, reliable cache for Android-specific use cases like image or data caching.
2. Your cache is memory-bound and doesn’t require integration with external systems.
3. You want to save development time and rely on well-tested code.
4. The default LruCache API fits your project needs.
Build Your Own LRU Cache When...
1. You need complete flexibility over the cache’s behavior or require custom eviction policies.
2. Your use case spans multiple platforms (e.g., KMP) or requires distributed caching strategies.
3. You’re optimizing for edge cases or fine-tuning performance for specific scenarios.
4. You want to deeply understand and control caching algorithms for educational or architectural reasons.
The Android LruCache class is a reliable tool for many applications, but building your own LRU cache offers flexibility, deeper insights, and cross-platform compatibility. Whether you choose LruCache or a custom solution, the key is to align the approach with your project’s requirements.