Implement a Least Recently Used (LRU) Cache

Introduction

In this post, we are going to discuss the Least Recently Used (LRU) cache and its implementation. There are many cache replacement algorithms, and each of them has its own eviction policy and implementation.

Simple Queue-based policies

    • First In First Out (FIFO): In this algorithm, the cache works as a FIFO queue; items are evicted in the same order in which they were inserted.
    • Last In First Out (LIFO): In this algorithm, the cache works as a stack; the element inserted most recently is evicted first.

Recency-Based Algorithms

    • Most Recently Used (MRU): In this algorithm, the most recently accessed element is removed from the cache once the cache is full and a new item needs to come into the cache.

    • Least Recently Used (LRU): In this algorithm, the least recently accessed element (the element whose last access happened before that of any other element) is removed from the cache once the cache has no space left for new elements.

In this post, we are going to discuss the implementation of the LRU cache using Java.

Frequency-Based Algorithms

Most Frequently Used (MFU) and Least Frequently Used (LFU): these policies evict elements based on how often they are accessed rather than how recently.

Implementation

To implement LRU, we need to maintain the order in which elements are accessed so that we can determine the recency of each element. So we need two data structures: one for maintaining the access order and another for looking up an element quickly, i.e. in O(1) time.

We will use a Doubly Linked List and a HashMap.
LRU can be implemented with other approaches as well, but since the Doubly Linked List and HashMap combination is very common, I am using it here.

We will maintain a Node class for storing the key, the value, and references to the next and previous nodes.

The next and previous pointers link the nodes together into a Doubly Linked List.

A HashMap will be used for storing <Key, Node> pairs; it gives fast access to a particular node by its key.
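
For illustration, a minimal sketch of these building blocks might look like the following (assuming integer keys and values; the class and field names are my own choices, not taken from the original listing):

    import java.util.HashMap;
    import java.util.Map;

    // Node of the doubly linked list: holds the key, the value and the
    // links to its neighbours. The key is stored so that the HashMap entry
    // can also be removed when this node is evicted.
    class Node {
        int key;
        int value;
        Node prev;    // previous node in the doubly linked list
        Node next;    // next node in the doubly linked list

        Node(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    // The cache keeps a HashMap for O(1) lookup by key and the two ends of
    // the doubly linked list for O(1) recency updates.
    class LRUCacheSkeleton {
        Map<Integer, Node> map = new HashMap<>();
        Node head;    // most recently used end of the list
        Node tail;    // least recently used end of the list
    }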

In this implementation, both the get and put operations run in constant time.

Put method: When a key is inserted for the first time, a new node object is created and inserted at the beginning (head) of the list.
If the key is already present, the value of its node is replaced and the node is moved to the head.
If the cache is full, the last element (the node at the tail, which is the least recently used) is evicted first, and then the new element is inserted.

Get method: It checks whether the key is present in the cache. If it is present, the corresponding node is moved to the head of the list and its value is returned.
If the key is not present, -1 is returned.
Below is the complete code, with comments for better understanding.
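
The listing below is a minimal sketch of such an implementation: it assumes a non-thread-safe cache with int keys and values and a fixed capacity of at least 1, and the class, field, and helper-method names are illustrative choices rather than names from the original listing.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch of an LRU cache backed by a HashMap and a doubly linked list.
    // Not thread-safe; capacity is assumed to be at least 1.
    public class LRUCache {

        // Doubly-linked-list node holding a key/value pair.
        private static class Node {
            int key;
            int value;
            Node prev;
            Node next;

            Node(int key, int value) {
                this.key = key;
                this.value = value;
            }
        }

        private final int capacity;
        private final Map<Integer, Node> map = new HashMap<>();

        // Dummy head and tail sentinels: head.next is the most recently used
        // node, tail.prev is the least recently used node.
        private final Node head = new Node(0, 0);
        private final Node tail = new Node(0, 0);

        public LRUCache(int capacity) {
            this.capacity = capacity;
            head.next = tail;
            tail.prev = head;
        }

        // Returns the value and marks the key as most recently used,
        // or -1 if the key is not in the cache.
        public int get(int key) {
            Node node = map.get(key);
            if (node == null) {
                return -1;
            }
            moveToHead(node);
            return node.value;
        }

        // Inserts or updates a key. When the cache is full, the least
        // recently used node (the one just before the tail) is evicted.
        public void put(int key, int value) {
            Node node = map.get(key);
            if (node != null) {
                node.value = value;   // key already present: update and refresh
                moveToHead(node);
                return;
            }
            if (map.size() == capacity) {
                Node lru = tail.prev; // least recently used node
                unlink(lru);
                map.remove(lru.key);
            }
            Node fresh = new Node(key, value);
            map.put(key, fresh);
            addAfterHead(fresh);
        }

        // --- doubly linked list helpers ---

        private void unlink(Node node) {
            node.prev.next = node.next;
            node.next.prev = node.prev;
        }

        private void addAfterHead(Node node) {
            node.next = head.next;
            node.prev = head;
            head.next.prev = node;
            head.next = node;
        }

        private void moveToHead(Node node) {
            unlink(node);
            addAfterHead(node);
        }

        public static void main(String[] args) {
            LRUCache cache = new LRUCache(2);
            cache.put(1, 10);
            cache.put(2, 20);
            System.out.println(cache.get(1)); // 10; key 1 becomes most recent
            cache.put(3, 30);                 // evicts key 2, the LRU entry
            System.out.println(cache.get(2)); // -1; key 2 was evicted
        }
    }

Using dummy head and tail sentinel nodes keeps the insert and unlink operations free of null checks at the ends of the list.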

Now let’s discuss some common uses of an LRU Cache.

  • In database systems, for fast query results.
  • In operating systems, to minimize page faults.
  • In text editors and IDEs, to store frequently used files or code snippets.
  • In text prediction and autocompletion tools.

Thanks for reading.
