Mastering LRU Cache in Kotlin

As a Senior Engineer, I’ve found that mastering algorithms isn’t just about memorizing their definitions but about truly understanding their applications, limitations, and performance trade-offs. One such foundational algorithm is the Least Recently Used (LRU) cache. It’s a simple yet powerful concept that finds its way into real-world scenarios like caching images, managing session data, and optimizing memory usage.

In this blog, I’ll walk you through building your own LRU cache in Kotlin, discuss its complexities, and explore Android’s built-in LruCache class. We’ll also evaluate when it makes sense to build your own versus relying on Android’s utility.

What is an LRU Cache?

An LRU Cache is a fixed-capacity data structure that evicts the least recently used item when the cache exceeds its capacity. It keeps memory usage bounded by prioritizing recently accessed items over stale ones.

How It Works

  • Each time an item is accessed, it’s marked as recently used.

  • If the cache exceeds its maximum capacity, the least recently used item is evicted. For example, in a capacity-2 cache holding A and B where A was accessed most recently, inserting C evicts B.

Real-World Applications

  1. Image Caching: Storing images for faster retrieval in mobile apps.

  2. Web Browsers: Caching recently visited web pages.

  3. Database Query Results: Storing results of expensive queries to reduce latency.

Building Your Own LRU Cache in Kotlin

Algorithmic Foundation of LRU

To achieve an efficient LRU cache, we rely on two data structures working in tandem:

  1. HashMap (Dictionary): Provides O(1) access to cache items by key.

  2. Doubly Linked List: Maintains the order of usage, with the head representing the most recently used item and the tail representing the least recently used.

This combination allows us to:

  • Insert/Delete in O(1) via linked list operations.

  • Lookup in O(1) using the hash map.
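
Conceptually, after inserting keys 1, 2, and 3 (most recent last), the two structures look like this (an illustrative snapshot that matches the usage example later in this post):

map  : { 1 → node(1:A), 2 → node(2:B), 3 → node(3:C) }
list : head → 3:C ⇄ 2:B ⇄ 1:A ← tail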

Key Design Goals of an LRU Cache:

  1. Constant-Time Operations: Retrievals, insertions, and updates should happen in O(1).

  2. Order Management: Maintain the order of usage to quickly identify the least recently used item.

  3. Fixed Capacity: Ensure the cache doesn’t grow uncontrollably.

Kotlin Implementation

Here’s a step-by-step implementation:

1. CacheNode Class

Each node in our doubly linked list will represent a cache entry. It will store the key-value pair along with pointers to the next and previous nodes:

// A plain class rather than a data class: the generated equals/hashCode/toString
// of a data class would traverse prev/next and recurse through the linked list.
class CacheNode<K, V>(
    val key: K,
    var value: V,
    var prev: CacheNode<K, V>? = null,
    var next: CacheNode<K, V>? = null
)

2. The LRUCache Class

The LRUCache class ties everything together. Here's the complete implementation, with detailed comments explaining each step:

class LRUCache<K, V>(private val capacity: Int) {
    init {
        require(capacity > 0) { "Capacity must be positive" }
    }

    private val map = mutableMapOf<K, CacheNode<K, V>>() // For O(1) lookups by key
    private var head: CacheNode<K, V>? = null // Most recently used
    private var tail: CacheNode<K, V>? = null // Least recently used

    // Retrieve a value from the cache
    fun get(key: K): V? {
        val node = map[key] ?: return null
        moveToHead(node) // Accessing the node makes it most recently used
        return node.value
    }

    // Add or update a key-value pair in the cache
    fun put(key: K, value: V) {
        val existingNode = map[key]
        if (existingNode != null) {
            existingNode.value = value
            moveToHead(existingNode) // Update the usage
        } else {
            val newNode = CacheNode(key, value)
            if (map.size >= capacity) {
                removeTail() // Evict the least recently used item
            }
            addToHead(newNode)
            map[key] = newNode
        }
    }

    // Move a node to the head (most recently used position)
    private fun moveToHead(node: CacheNode<K, V>) {
        removeNode(node)
        addToHead(node)
    }

    // Add a node to the head of the list
    private fun addToHead(node: CacheNode<K, V>) {
        node.next = head
        node.prev = null
        head?.prev = node
        head = node
        if (tail == null) {
            tail = node // If this is the first node, it’s also the tail
        }
    }

    // Remove a node from the list; use identity (===) so nodes are compared by reference
    private fun removeNode(node: CacheNode<K, V>) {
        node.prev?.next = node.next
        node.next?.prev = node.prev
        if (node === head) head = node.next
        if (node === tail) tail = node.prev
    }

    // Remove the least recently used (tail) node
    private fun removeTail() {
        tail?.let {
            map.remove(it.key) // Remove it from the map
            removeNode(it)
        }
    }

    // Debug helper: Print the current state of the cache
    fun printCache() {
        var current = head
        while (current != null) {
            print("${current.key}:${current.value} -> ")
            current = current.next
        }
        println("NULL")
    }
}
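
One caveat: this implementation is not thread-safe, because the map and list are mutated without any synchronization (android.util.LruCache, by contrast, synchronizes its operations internally). If you need concurrent access, here is a minimal sketch of a coarse-grained wrapper, assuming you delegate to the LRUCache above rather than modify it:

class SynchronizedLRUCache<K, V>(capacity: Int) {
    private val delegate = LRUCache<K, V>(capacity)
    private val lock = Any() // A single lock guarding all cache state

    fun get(key: K): V? = synchronized(lock) { delegate.get(key) }

    fun put(key: K, value: V) = synchronized(lock) { delegate.put(key, value) }
}

Because every operation is O(1), the critical sections are tiny, so a single coarse lock keeps the code simple without much contention in practice.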

Analyzing the Time Complexity

  1. get(key):

    • HashMap lookup: O(1).

    • Moving the node to the head: O(1).
      Total: O(1).

  2. put(key, value):

    • HashMap insertion: O(1).

    • Adding or removing a node in the doubly linked list: O(1).
      Total: O(1).

These constant-time operations make the LRU cache highly efficient, even at scale. The trade-off is space: the hash map and the linked-list pointers add O(n) overhead, where n is the capacity.

Usage Example

fun main() {
    val lruCache = LRUCache<Int, String>(3)

    // Adding elements
    lruCache.put(1, "A")
    lruCache.put(2, "B")
    lruCache.put(3, "C")
    println("Initial Cache:")
    lruCache.printCache() // Output: 3:C -> 2:B -> 1:A -> NULL

    // Access an item
    println("Accessing key 2:")
    lruCache.get(2)
    lruCache.printCache() // Output: 2:B -> 3:C -> 1:A -> NULL

    // Add a new item, evicting the least recently used
    println("Adding key 4 (evicts key 1):")
    lruCache.put(4, "D")
    lruCache.printCache() // Output: 4:D -> 2:B -> 3:C -> NULL
}

Output

Initial Cache:
3:C -> 2:B -> 1:A -> NULL
Accessing key 2:
2:B -> 3:C -> 1:A -> NULL
Adding key 4 (evicts key 1):
4:D -> 2:B -> 3:C -> NULL

Why Not Use LinkedHashMap?

On the JVM, LinkedHashMap can simplify an LRU implementation via its accessOrder constructor parameter. While it’s an excellent choice for production code, building the cache from scratch offers:

  1. Control: Tailor the cache behavior for specific use cases.

  2. Learning: Understand the underlying mechanics of algorithm design.

Here’s how you can implement an LRU cache with LinkedHashMap in just a few lines:

val capacity = 3
val lruCache = object : LinkedHashMap<Int, String>(capacity, 0.75f, true) {
    // accessOrder = true reorders entries on access; evict once size exceeds capacity
    override fun removeEldestEntry(eldest: MutableMap.MutableEntry<Int, String>): Boolean =
        size > capacity
}
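
A quick usage sketch to confirm this behaves like our hand-rolled cache (reusing the lruCache defined above):

fun main() {
    lruCache[1] = "A"
    lruCache[2] = "B"
    lruCache[3] = "C"
    lruCache[2]            // Touch key 2 so it becomes most recently used
    lruCache[4] = "D"      // Exceeds capacity: evicts key 1, the least recently used
    println(lruCache.keys) // [3, 2, 4]
}

Note that with accessOrder = true, LinkedHashMap iterates from least to most recently used, the opposite direction of our custom printCache output.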

Elegant, but lacks the flexibility of custom implementations.

Exploring Android's Built-In LruCache Class

When working with Android, you’ll find that the platform already provides a convenient android.util.LruCache class for implementing an LRU caching mechanism. This built-in utility simplifies caching in Android applications and is optimized for in-memory use cases like caching images, strings, or other data objects.

Example Usage of LruCache

import android.util.LruCache

fun main() {
    val lruCache = LruCache<Int, String>(3)

    lruCache.put(1, "A")
    lruCache.put(2, "B")
    lruCache.put(3, "C")
    println("Initial Cache: ${lruCache.snapshot()}") // {1=A, 2=B, 3=C}

    lruCache.get(2) // Access key 2 so it becomes most recently used
    println("After Accessing Key 2: ${lruCache.snapshot()}") // {1=A, 3=C, 2=B}

    lruCache.put(4, "D") // Add a new entry, evicting the least recently used (key 1)
    println("After Adding Key 4: ${lruCache.snapshot()}") // {3=C, 2=B, 4=D}
}

Note that snapshot() returns a copy of the cache ordered from least to most recently accessed, which is why key 2 appears last after it is read.
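
By default, LruCache counts every entry as size 1, so the constructor argument is simply a maximum number of entries. For memory-bound caches, the typical image-caching scenario, you can override sizeOf so that maxSize is measured in kilobytes instead. A minimal sketch in the spirit of the official bitmap-caching example (the 4 MB budget is an arbitrary choice):

import android.graphics.Bitmap
import android.util.LruCache

// Cache bitmaps up to 4 MB in total, measured in kilobytes rather than entry count
val bitmapCache = object : LruCache<String, Bitmap>(4 * 1024) {
    override fun sizeOf(key: String, value: Bitmap): Int {
        // Report each entry's size in KB; eviction kicks in when the sum exceeds maxSize
        return value.byteCount / 1024
    }
}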

Why Build Your Own LRU Cache?

While LruCache is powerful, there are specific reasons why building your own LRU cache might still be the better choice:

1. Customization

  • Prioritized eviction policies, such as time-based expiry or per-entry weights (see the sketch after this list).

  • Fine-tuning memory usage for specific applications.
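
As an illustration of the first point, time-based expiry can be layered on top of the custom LRUCache built earlier. This is a hypothetical sketch: the ExpiringLRUCache name, the expiryMillis parameter, and the TimedValue wrapper are illustrative choices, not part of any standard API.

class ExpiringLRUCache<K, V>(capacity: Int, private val expiryMillis: Long) {
    // Hypothetical wrapper pairing each value with the time it was written
    private class TimedValue<T>(val value: T, val storedAt: Long)

    private val cache = LRUCache<K, TimedValue<V>>(capacity)

    fun put(key: K, value: V) {
        cache.put(key, TimedValue(value, System.currentTimeMillis()))
    }

    fun get(key: K): V? {
        val timed = cache.get(key) ?: return null
        // Entries older than expiryMillis count as misses; the stale node is
        // left in place and ages out through normal LRU eviction
        return if (System.currentTimeMillis() - timed.storedAt <= expiryMillis) timed.value else null
    }
}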

2. Cross-Platform Compatibility

LruCache is Android-specific. Custom implementations work across platforms, like Kotlin Multiplatform.

3. Algorithm Learning and Control

Building your own cache deepens your understanding of:

  • Algorithm design.

  • Optimizing time and space complexities.

4. Framework Independence

Custom caches avoid Android framework lock-in, enabling easier migration to other technologies.

When to Use LruCache vs. Building Your Own?

Use android.util.LruCache When...

  1. You need a quick, reliable cache for Android-specific use cases like image or data caching.

  2. Your cache is memory-bound and doesn’t require integration with external systems.

  3. You want to save development time and rely on well-tested code.

  4. The default LruCache API fits your project’s needs.

Build Your Own LRU Cache When...

  1. You need complete flexibility over the cache’s behavior or require custom eviction policies.

  2. Your use case spans multiple platforms (e.g., Kotlin Multiplatform) or requires distributed caching strategies.

  3. You’re optimizing for edge cases or fine-tuning performance for specific scenarios.

  4. You want to deeply understand and control caching algorithms for educational or architectural reasons.

Final Thoughts

The Android LruCache class is a reliable tool for many applications, but building your own LRU cache offers flexibility, deeper insights, and cross-platform compatibility. Whether you choose LruCache or a custom solution, the key is to align the approach with your project’s requirements.
