Table of contents
1. Introduction
2. What is LRU Cache?
3. LRU Cache Operations
4. Data structures needed to implement LRU Cache
5. Algorithm to Implement LRU cache
   5.1. Example
6. Implementation-1
   6.1. Java
7. Implementation of LRU using Doubly Linked List and Hashing
   7.1. Java
8. Advantages of LRU Cache
9. Disadvantages of LRU Cache
10. Real-World Applications of LRU Cache
11. Frequently Asked Questions
   11.1. What is the best data structure to implement LRU cache?
   11.2. What is the stack implementation of LRU?
   11.3. What is the difference between LRU and MRU cache?
   11.4. What is the LRU algorithm for Cache?
12. Conclusion
Last Updated: Jan 7, 2025

LRU Cache Implementation


Introduction

Least Recently Used (LRU) is a widely used technique for managing memory efficiently. It keeps data ordered by how recently it was used, so the items that have gone unused the longest are easy to identify and evict, something earlier policies such as FIFO and LIFO could not do.


In this article, we’ll discuss the implementation of the LRU cache. Before that, you can revise your concepts of Memory management techniques in OS.

What is LRU Cache?

LRU (Least Recently Used) Cache is a data structure that stores a limited number of items, discarding the least recently used item when it reaches capacity. It provides fast access to frequently used data by maintaining items in order of their last access time.

LRU Cache Operations 

An LRU cache supports the following core operations; a minimal interface sketch follows this list:

  • Get: Retrieve an item and mark it as most recently used
  • Put: Add a new item or update an existing one
  • Evict: Remove the least recently used item when cache is full
  • Update order: Adjust item positions based on access
  • Check capacity: Ensure cache size doesn't exceed limit
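
Taken together, these operations form a small API. The sketch below shows a minimal Java interface for them; the interface name and method signatures are our own illustration, not a standard library type.

// A minimal, illustrative interface for the operations listed above.
// The name LruCache and these signatures are assumptions for this sketch.
interface LruCache<K, V> {
    // Retrieve an item and mark it as most recently used; null if absent
    V get(K key);

    // Add a new item or update an existing one, evicting the least
    // recently used entry if the cache is already at capacity
    void put(K key, V value);

    // Current number of cached items, used for the capacity check
    int size();
}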

Data structures needed to implement LRU Cache

LRU cache can be implemented by combining these two data structures: 

  1. A doubly linked list used as a queue: the most recently used item sits at the head of the list and the least recently used item at the tail. 
  2. A hash set (or hash map): to store references of the keys currently present in the cache, so membership checks take O(1) time. A library-based sketch that combines both structures follows this list.
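
As an aside, Java's standard library already combines these two structures: a LinkedHashMap constructed with accessOrder = true keeps its entries on an internal doubly linked list in access order, and overriding removeEldestEntry gives automatic eviction. The sketch below shows one minimal way to use it (the class name is ours); the hand-rolled implementations later in this article build the same machinery explicitly.

import java.util.LinkedHashMap;
import java.util.Map;

// A minimal LRU cache built on LinkedHashMap. With accessOrder = true,
// every successful get() moves the entry to the most recently used end,
// and removeEldestEntry() evicts the least recently used entry once the
// map grows past the chosen capacity. The class name is illustrative.
class LinkedHashMapLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashMapLruCache(int capacity) {
        // 16 and 0.75f are the default initial capacity and load factor;
        // the third argument switches the map to access order
        super(16, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the eldest (least recently used) entry when over capacity
        return size() > capacity;
    }
}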

Algorithm to Implement LRU cache

In an operating system, a process's address space is divided into pages, and memory is divided into frames. Every time a process executes, the pages it references are brought into the LRU cache.
 

If a page needed by the processor is not present in the LRU cache, we bring it in: we add a new node to the queue and record its reference in the hash table. If the cache (the doubly linked list) is already full, we must first evict the least recently used page to make room for the required one.
 

If the page is already present somewhere in the cache, we simply move it to the front. A compact sketch of this routine follows; after that, let's walk through an example.
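
The sketch below summarizes one page access. The names (access, cache, inCache, frames) are assumptions for illustration; the complete, runnable versions appear in the implementation sections further below.

// One page access, assuming:
//   Deque<Integer> cache  - the queue, most recently used page at the front
//   Set<Integer> inCache  - the pages currently resident in the cache
//   int frames            - the fixed capacity of the cache
void access(int page) {
    if (inCache.contains(page)) {
        // Page hit: unlink the page from wherever it currently sits
        cache.remove(page);
    } else if (cache.size() == frames) {
        // Page miss on a full cache: evict the least recently used page
        int victim = cache.removeLast();
        inCache.remove(victim);
    }
    // In every case, the accessed page becomes the most recently used one
    cache.addFirst(page);
    inCache.add(page);
}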

Example

Initially, there are no pages in memory. The cache is implemented as a queue, so insertion and deletion take place as shown in the illustration below:

[Illustration: insertion and deletion in the LRU queue]

The processing of the reference string using the given LRU cache is shown below.

[Figures 1–3: step-by-step processing of the reference string in the LRU cache]

You can also read about the Longest Consecutive Sequence.

Implementation-1

The implementation of the above algorithm is given below. 

  • We've used Java's LinkedList through the Deque interface, since it is a doubly linked list internally and gives us the queue/deque operations we need (adding at the front, removing from the rear). 
  • We've used a HashSet to check in constant time whether a page is present in the cache.

Java

import java.util.*;

public class LRU_Cache {

    // Doubly linked list implementation of a queue (front = most recently used)
    private Deque<Integer> LRU_Queue;

    // Hash set to check in O(1) whether a page is present in the cache
    private HashSet<Integer> hashSet;

    // Total number of frames in the LRU Cache
    private final int TOTAL_FRAMES;

    // Constructor for initialization
    LRU_Cache(int capacity) {
        LRU_Queue = new LinkedList<>();
        hashSet = new HashSet<>();
        TOTAL_FRAMES = capacity;
    }

    // Referencing pages one by one
    public void call(int page) {
        // If the page is not present in the LRU Cache
        if (!hashSet.contains(page)) {
            if (LRU_Queue.size() == TOTAL_FRAMES) {
                // The cache is full: remove the least recently used page from the rear
                int last = LRU_Queue.removeLast();
                hashSet.remove(last);
            }
        }
        // The page is already present in the cache: unlink it from its current position
        else {
            LRU_Queue.remove(page);
        }
        // Insert the page at the front of the cache (most recently used)
        LRU_Queue.push(page);
        // Record the page in the hash set
        hashSet.add(page);
    }

    public void display() {
        Iterator<Integer> itr = LRU_Queue.iterator();
        System.out.println("The LRU Cache contains:");
        while (itr.hasNext()) {
            System.out.println("|_" + itr.next() + "_|");
        }
    }

    public static void main(String[] args) {
        LRU_Cache cache = new LRU_Cache(3);
        cache.call(1);
        cache.call(2);
        cache.call(3);
        cache.call(4);
        cache.call(1);
        cache.call(2);
        cache.display();
        cache.call(5);
        cache.call(1);
        cache.call(2);
        cache.call(4);
        cache.call(3);
        cache.display();
    }
}

Output

The LRU Cache contains:
|_2_|
|_1_|
|_4_|
The LRU Cache contains:
|_3_|
|_4_|
|_2_|

Time Complexity

The time complexity of the above approach is dominated by the linked-list operations.

  • Adding an element at the front takes O(1) time. 
  • Removing an arbitrary element (LRU_Queue.remove(page)) requires a linear scan of the list, so it takes O(N) time.

Space Complexity

The space complexity of this approach is O(N), where N is the total number of frames in the cache. 

Also see, Difference Between HashMap and HashTable in Java

Implementation of LRU using Doubly Linked List and Hashing

In this approach, we use a hash map from keys to doubly linked list nodes. It is more efficient than the first one because both the get and set operations run in O(1) time.


Java

import java.util.*;

// A node of the doubly linked list: stores one key-value pair
class Cache_Node {
    int key;
    int value;
    Cache_Node prev_node;
    Cache_Node next_node;

    public Cache_Node(int key, int value) {
        this.key = key;
        this.value = value;
    }
}

// A class to implement LRU Cache
class LRUCache {
    private HashMap<Integer, Cache_Node> map;
    private int number_of_frames, count;
    private Cache_Node head, tail;

    public LRUCache(int number_of_frames) {
        this.number_of_frames = number_of_frames;
        map = new HashMap<>();
        head = tail = null;
        count = 0;
    }

    // Unlink a node from the doubly linked list, updating head/tail if needed
    public void deleteNode(Cache_Node node) {
        if (node.prev_node != null) {
            node.prev_node.next_node = node.next_node;
        } else {
            head = node.next_node;
        }
        if (node.next_node != null) {
            node.next_node.prev_node = node.prev_node;
        } else {
            tail = node.prev_node;
        }
        node.prev_node = null;
        node.next_node = null;
    }

    // Insert a node at the head (most recently used position)
    public void addToHead(Cache_Node node) {
        if (head == null) {
            head = tail = node;
            return;
        }
        node.next_node = head;
        head.prev_node = node;
        head = node;
    }

    public int get(int key) {
        if (map.get(key) != null) {
            Cache_Node node = map.get(key);
            int result = node.value;
            // Move the accessed node to the head
            deleteNode(node);
            addToHead(node);
            return result;
        }
        return -1;
    }

    public void set(int key, int value) {
        if (map.get(key) != null) {
            // The key already exists: update its value and move it to the head
            Cache_Node node = map.get(key);
            node.value = value;
            deleteNode(node);
            addToHead(node);
        } else {
            Cache_Node node = new Cache_Node(key, value);
            map.put(key, node);
            if (count < number_of_frames) {
                count++;
            } else {
                // The cache is full: evict the least recently used node at the tail
                Cache_Node last = tail;
                deleteNode(last);
                map.remove(last.key);
            }
            addToHead(node);
        }
    }

    public void printCache() {
        Cache_Node temp = head;
        while (temp != null) {
            System.out.println("| " + temp.key + ": " + temp.value + " |");
            temp = temp.next_node;
        }
    }
}

public class LRU_Test {
    public static void main(String[] args) {
        // Creates a cache with three frames
        LRUCache cache = new LRUCache(3);

        cache.set(1, 10);
        cache.set(2, 20);
        cache.set(3, 30);
        System.out.println("Current sequence in cache");
        System.out.println(" -------");
        cache.printCache();
        System.out.println(" -------");
        System.out.println("Value for the key: 2 is " + cache.get(2));
        cache.set(4, 40);
        System.out.println("Current sequence in cache");
        System.out.println(" -------");
        cache.printCache();
        System.out.println(" -------");
        System.out.println("Value for the key: 1 is " + cache.get(1));
        System.out.println("Value for the key: 3 is " + cache.get(3));
        System.out.println("Value for the key: 4 is " + cache.get(4));

        System.out.println("Current sequence in cache");
        System.out.println(" -------");
        cache.printCache();
        System.out.println(" -------");
    }
}

Output

Current sequence in cache
-------
| 3: 30 |
| 2: 20 |
| 1: 10 |
-------
Value for the key: 2 is 20
Current sequence in cache
-------
| 4: 40 |
| 2: 20 |
| 3: 30 |
-------
Value for the key: 1 is -1
Value for the key: 3 is 30
Value for the key: 4 is 40
Current sequence in cache
-------
| 4: 40 |
| 3: 30 |
| 2: 20 |
-------

Time Complexity 

The hash map makes the get operation O(1), and the doubly linked list lets us add or remove a node in O(1). As a result, both get and set run in constant time.

 

Space Complexity

The space complexity of this approach is O(N), where N is the total number of nodes in the cache.

 

Now, let's look at the advantages and limitations of the LRU cache, some of its real-world applications, and finally the frequently asked questions on this topic.

Check out this problem - Queue Implementation

Advantages of LRU Cache

  • Efficient Memory Management
    LRU ensures frequently accessed data stays in memory, leading to more efficient memory utilization by evicting the least recently used items.
  • Improved Cache Hit Ratio
    Since it prioritizes recently accessed data, LRU often leads to a higher cache hit ratio compared to simpler strategies like FIFO.
  • Good for Temporal Locality
    LRU is highly effective in scenarios where recent items are likely to be accessed again soon, leveraging temporal locality patterns in data access.
  • Widely Used in Practice
    LRU is easy to implement and widely used in various systems like web browsers, databases, and operating systems for efficient cache management.

Disadvantages of LRU Cache

  • High Overhead in Maintaining Order
    Maintaining the access order of items in the cache requires additional data structures like linked lists, increasing implementation complexity and overhead.
  • Expensive in Large Systems
    LRU’s operations can be costly in systems with large cache sizes, as it requires keeping track of the usage order for every item.
  • Not Always Optimal
    In workloads with unpredictable access patterns, LRU may not perform well and can lead to suboptimal eviction choices compared to other algorithms.
  • Memory Usage
    Additional memory is needed to store metadata (like timestamps or access order), which can be significant in systems with limited resources.

Real-World Applications of LRU Cache

  1. Web Browsers
    Browsers use LRU to store recently accessed webpages or images in cache, so frequently visited sites load faster without fetching from the network.
  2. Database Systems
    LRU is applied to manage database buffers, keeping frequently accessed data in memory to reduce disk access times and enhance query performance.
  3. Operating Systems
    OS memory management utilizes LRU for paging and virtual memory systems, evicting the least recently used pages when memory is low.
  4. Content Delivery Networks (CDNs)
    CDNs use LRU to cache popular content closer to users. Less frequently accessed content is removed, making room for the most current or popular data.

Frequently Asked Questions

What is the best data structure to implement LRU cache?

The best data structure to implement an LRU cache is a combination of a HashMap (or dictionary) and a doubly linked list. The HashMap provides constant-time lookups, while the doubly linked list efficiently tracks the order of usage, supporting quick insertions and deletions.

What is the stack implementation of LRU?

In the stack implementation of LRU, a stack is used to track cache items by usage. When an item is accessed, it is moved to the top of the stack. When the cache exceeds its capacity, the least recently used item at the bottom is removed.

What is the difference between LRU and MRU cache?

LRU (Least Recently Used) evicts the least recently accessed items, assuming they are least likely to be used again. MRU (Most Recently Used) evicts the most recently accessed items, assuming they are most likely to be used soon. LRU works well for temporal locality, while MRU suits specific cases like bursty access patterns.

What is the LRU algorithm for Cache?

The LRU algorithm keeps recently used items near the top of the cache and moves an item to the top whenever it is accessed. When the cache limit is reached, the items that have been accessed least recently are evicted, starting from the bottom.

Conclusion

In this article, we’ve learned about the implementation of LRU Cache. LRU cache is an essential technique for efficient memory management, especially in systems that frequently access data. By ensuring that the least recently used items are evicted first, LRU effectively optimizes cache hit ratios and memory utilization. Its implementation, often using a combination of a HashMap and a doubly linked list, offers fast access and eviction operations.

You can also prepare the top Operating System Interview Questions and their answers asked by leading tech companies.
