Table of contents
1. Introduction
2. What is the LFU Cache algorithm?
3. Implementation
  3.1. Example
  3.2. Data structures required
  3.3. Code
4. Frequently Asked Questions
  4.1. What are the different caching algorithms?
  4.2. What is the difference between LFU and LRU algorithms?
5. Conclusion
Last Updated: Mar 27, 2024

Least Frequently Used Cache

Author Pakhi Garg

Introduction

To understand the Least Frequently Used Cache, let's get a quick recap of what caching is.

Caching is the process of storing data close to where it is needed, rather than at a distant location from which fetching it would be an expensive operation. Caching is used everywhere, from our CPU to our browser.

Deciding which data is stored where, and which data is removed when space runs out, is the task of cache replacement algorithms. In this article, we will study the Least Frequently Used (LFU) cache.


What is the LFU Cache algorithm?

Least Frequently Used (LFU) is one of the most famous caching algorithms.

As the name suggests, whenever a new block enters a full cache, LFU removes the block that has been used least frequently. For this purpose, a frequency count is maintained for each cache block, which helps determine the least frequently used one. A block is removed only when the cache overflows; otherwise, the new block is simply accommodated in the cache with a frequency count of 1, and every later access to a block increases its count by 1. If all the cache blocks present have the same frequency count, the block to remove is chosen on a FIFO (first-in, first-out) basis.

Implementation

To implement the LFU cache algorithm, we must perform two functions: put(key, value) and get(key).

  • put(key, value): This function takes a key and a value as parameters and stores the value against the key. If the key already exists, it updates the value stored against that key.
  • get(key): This function takes a key as a parameter and returns the value stored against it. It also increments the frequency count of that key by 1.


To implement the LFU Cache algorithm, a counter is created alongside a cache block when the put(key, value) function inserts it. This counter is increased every time the block is accessed via the get(key) function. If the cache overflows, the cache block with the minimum count value is removed. If all the cache blocks present in the cache have the same count value, the block to remove is decided on a FIFO basis.
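Before the full O(1) implementation in the Code section, the counter-plus-FIFO rule can be sketched with a deliberately simple cache that scans all entries on eviction (O(n), not O(1)). The class name SimpleLFU and its fields are illustrative, not part of the article's code:

```java
import java.util.*;

// Simplified LFU sketch: O(n) eviction scan instead of the O(1)
// frequency lists used in the full implementation below.
class SimpleLFU {
    private final int capacity;
    private long tick = 0; // insertion clock, used for the FIFO tie-break
    // key -> {value, frequency count, insertion tick}
    private final Map<Integer, long[]> entries = new HashMap<>();

    SimpleLFU(int capacity) { this.capacity = capacity; }

    int get(int key) {
        long[] e = entries.get(key);
        if (e == null) return -1;
        e[1]++;                       // every access bumps the frequency
        return (int) e[0];
    }

    void put(int key, int value) {
        if (capacity == 0) return;
        long[] e = entries.get(key);
        if (e != null) {              // key exists: update value, bump count
            e[0] = value;
            e[1]++;
            return;
        }
        if (entries.size() == capacity) {
            // evict the entry with the lowest frequency;
            // on a tie, the one inserted earliest (FIFO)
            int victim = 0;
            long bestFreq = Long.MAX_VALUE, bestTick = Long.MAX_VALUE;
            for (Map.Entry<Integer, long[]> en : entries.entrySet()) {
                long f = en.getValue()[1], t = en.getValue()[2];
                if (f < bestFreq || (f == bestFreq && t < bestTick)) {
                    bestFreq = f; bestTick = t; victim = en.getKey();
                }
            }
            entries.remove(victim);
        }
        entries.put(key, new long[]{value, 1, tick++}); // new blocks start at frequency 1
    }

    public static void main(String[] args) {
        SimpleLFU cache = new SimpleLFU(3);
        cache.put(1, 10); cache.put(2, 20); cache.put(3, 30);
        cache.put(4, 40); // all at frequency 1, so key 1 (oldest) is evicted
        System.out.println(cache.get(1)); // -1: key 1 was evicted
        System.out.println(cache.get(3)); // 30
    }
}
```

This trades the O(1) guarantee for readability; the frequencyMap-based design described next removes the linear scan.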

Example

Let’s understand the LFU Cache algorithm using an example.

We have a cache of size 3, and we will perform some put(key, value) and get(key) operations.

1. put(1,10)

First, we will check the cache’s capacity. Since the cache is empty, we will simply insert the key and value against the frequency count 1.


2. put(2,20)

We will check whether the cache has spare capacity. The current size is 1, so we simply insert the key and value against the frequency count 1.


3. put(3,30)

The cache is again not full; the current size is 2, so we simply insert the key and value against the frequency count 1.


4. put(4,40)

The current size is 3, so we can't insert the new block directly; we need to remove one first. For that, we find the least frequently used block. Since all the blocks are at the same frequency, i.e. 1, the block to remove is decided on a FIFO basis. The block with key 1 was entered first, so it is removed, and the block with key 4 is inserted.


5. get(3)

To get the value at key 3, we simply look up key 3 and return the value stored against it. This access also increases the frequency count of key 3 to 2.


Data structures required

For implementing the LFU Cache algorithm, the required data structures are:

  1. map<freq,DoublyLinkedList> frequencyMap
    We will require a hashmap with the frequency (an integer) as the key and a doubly linked list as the value. This frequencyMap keeps track of all the cache blocks present at each frequency.
    A doubly linked list is used so that we can insert a new node (cache block) at its head and remove a node at its tail in constant time.
     
  2. map<key,DLLNode> cache
    We will require another hashmap with the key (an integer) as the key and the corresponding DLLNode as the value, so that each key maps directly to its node in the linked lists.
     
  3. curSize
    We will require an integer variable named curSize to keep a record of the current size of the cache.
     
  4. minFrequency
    We will require another integer variable named minFrequency, which stores the minimum frequency count among all the cache blocks currently in the cache.

Code

Below is the implementation of LFU Cache algorithm in Java:

import java.util.*;
class LFUCache {
    final int capacity;
    int curSize;
    int minFrequency;
    Map<Integer, DLLNode> cache;
    Map<Integer, DoubleLinkedList> frequencyMap;

    public LFUCache(int capacity) {
        this.capacity = capacity;
        this.curSize = 0;
        this.minFrequency = 0;
        this.cache = new HashMap<>();
        this.frequencyMap = new HashMap<>();
    }

    // Implementing get(key) function 
    public int get(int key) {
        DLLNode curNode = cache.get(key);
        if (curNode == null) {
            return -1;
        }
        updateNode(curNode);
        return curNode.val;
    }

    // Implementing put(key, value) function 
    public void put(int key, int value) {
        // if the cache size is 0
        if (capacity == 0) {
            return;
        }
        // if the key already exists, update its frequency 
        if (cache.containsKey(key)) {
            DLLNode curNode = cache.get(key);
            curNode.val = value;
            updateNode(curNode);
        }
        // if the key does not exist
        else {
            curSize++;
            if (curSize > capacity) {
                // get minimum frequency list 
                DoubleLinkedList minFreqList = frequencyMap.get(minFrequency);
                cache.remove(minFreqList.tail.prev.key);
                minFreqList.removeNode(minFreqList.tail.prev);
                curSize--;
            }
            // reset min frequency to 1 because of adding new node 
            minFrequency = 1;
            DLLNode newNode = new DLLNode(key, value);
            /** get the list with frequency 1, and then add new node into the list, as well as into LFU cache **/
            DoubleLinkedList curList = frequencyMap.getOrDefault(1, new DoubleLinkedList());
            curList.addNode(newNode);
            frequencyMap.put(1, curList);
            cache.put(key, newNode);
            System.out.println("Cache block "+key+" added");
        }
    }
    
    // update the doubly linked list 
    public void updateNode(DLLNode curNode) {
        int curFreq = curNode.frequency;
        DoubleLinkedList curList = frequencyMap.get(curFreq);
        curList.removeNode(curNode);

        /** if the current list is the last list which has lowest frequency and the current node is the only node in that list, we need to remove the entire list and then increase min frequency value by 1 **/
        if (curFreq == minFrequency && curList.listSize == 0) {
            minFrequency++;
        }

        curNode.frequency++;

        /** add the current node to another list having current frequency + 1,
         if we do not have the list with this frequency, initialise it **/
        DoubleLinkedList newList = frequencyMap.getOrDefault(curNode.frequency, new DoubleLinkedList());
        newList.addNode(curNode);
        frequencyMap.put(curNode.frequency, newList);
    }

    class DLLNode {
        int key;
        int val;
        int frequency;
        DLLNode prev;
        DLLNode next;

        public DLLNode(int key, int val) {
            this.key = key;
            this.val = val;
            this.frequency = 1;
        }
    }

    // Implementing Doubly Linked List 
    class DoubleLinkedList {
        int listSize;
        DLLNode head;
        DLLNode tail;
        public DoubleLinkedList() {
            this.listSize = 0;
            this.head = new DLLNode(0, 0);
            this.tail = new DLLNode(0, 0);
            head.next = tail;
            tail.prev = head;
        }

        /** add new node into head of list and increase list size by 1 **/
        public void addNode(DLLNode curNode) {
            DLLNode nextNode = head.next;
            curNode.next = nextNode;
            curNode.prev = head;
            head.next = curNode;
            nextNode.prev = curNode;
            listSize++;
        }

        // remove input node and decrease list size by 1
        public void removeNode(DLLNode curNode) {
            DLLNode prevNode = curNode.prev;
            DLLNode nextNode = curNode.next;
            prevNode.next = nextNode;
            nextNode.prev = prevNode;
            listSize--;
        }
    }

    // Driver code
    public static void main(String args[]) {
        LFUCache obj=new LFUCache(3);
        obj.put(1,10);
        obj.put(2,20);
        obj.put(3,30);
        obj.put(4,40);
        int val=obj.get(3);
        System.out.println("The value against cache block 3 is "+val);
    }
}


The output of the above code will be-

Cache block 1 added
Cache block 2 added
Cache block 3 added
Cache block 4 added
The value against cache block 3 is 30

Time complexity: both get(key) and put(key, value) run in O(1) time, since every step is a hashmap lookup or a constant-time doubly linked list insertion or removal.


Frequently Asked Questions

What are the different caching algorithms?

There are many caching algorithms. Some of them are the First In First Out (FIFO) algorithm, the Least Recently Used (LRU) algorithm, the Least Frequently Used (LFU) algorithm, and the Random Replacement (RR) algorithm.

What is the difference between LFU and LRU algorithms?

In LFU, the cache block with the lowest access frequency is removed, while in LRU, the cache block that has gone unused for the longest time is removed.
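The difference shows up under the same access pattern. Below is a minimal LRU sketch built on Java's LinkedHashMap in access-order mode (the class name SimpleLRU is an illustrative choice, not from this article): a key that is accessed most often can still be evicted by LRU if it was not accessed recently, whereas LFU would keep it.

```java
import java.util.*;

// LRU sketch for contrast with LFU, built on LinkedHashMap's
// access-order mode and removeEldestEntry hook.
class SimpleLRU<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    SimpleLRU(int capacity) {
        super(16, 0.75f, true); // accessOrder = true: get() moves an entry to the back
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // drop the least recently used entry on overflow
    }

    public static void main(String[] args) {
        SimpleLRU<Integer, Integer> cache = new SimpleLRU<>(3);
        cache.put(1, 10); cache.put(2, 20); cache.put(3, 30);
        cache.get(1); cache.get(1);   // key 1 is used most often...
        cache.get(2); cache.get(3);   // ...but least recently
        cache.put(4, 40);             // LRU evicts key 1; LFU would have evicted key 2
        System.out.println(cache.get(1)); // null: evicted despite the highest use count
    }
}
```

With this sequence, LFU would keep key 1 (frequency 3) and evict key 2 (lowest frequency, entered first), while LRU evicts key 1 because it is the least recently touched.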

Conclusion

To summarise this article, we have studied the least frequently used cache algorithm. We have also understood how to implement this algorithm.

Reader, don’t stop here. Continue your learning journey in operating systems with Coding Ninjas by taking the OS course.


Happy Learning!
