
LRU Cache

Build an LRU cache with Python's OrderedDict, and understand why caches are essential for performance in real systems.

David Miller
December 21, 2025

LRU means **Least Recently Used**.

When the cache is full, the item that has gone unused for the longest time is evicted.

Where it's used:
- web servers
- databases
- API clients
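
Before writing one by hand, note that Python's standard library already ships an LRU cache as the `functools.lru_cache` decorator. A minimal sketch, where `fetch_user` is just a made-up placeholder for an expensive lookup:

```python
from functools import lru_cache

@lru_cache(maxsize=128)          # keep at most 128 distinct results
def fetch_user(user_id):
    # stand-in for an expensive lookup (database query, API call, ...)
    return {"id": user_id}

fetch_user(1)                    # computed and stored
fetch_user(1)                    # answered from the cache
print(fetch_user.cache_info())   # hits=1, misses=1, maxsize=128, currsize=1
```

Writing your own is still worth doing once: it shows exactly what the decorator manages for you.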

Simple LRU using OrderedDict

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, cap):
        self.cap = cap
        self.data = OrderedDict()   # keys kept in usage order, oldest first

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.cap:
            self.data.popitem(last=False)  # evict the least recently used item
```

Example

```python
c = LRUCache(2)
c.put(1, "A")
c.put(2, "B")
c.get(1)        # key 1 becomes the most recently used
c.put(3, "C")   # cache is full, so key 2 (least recently used) is evicted
```

Graph: LRU idea

```mermaid
flowchart LR
    A[Oldest] --> B[Cache] --> C[Newest]
```
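
OrderedDict hides the mechanics behind that picture. As a rough, illustrative sketch (the `Node` and `LinkedLRUCache` names are made up here), the same behaviour can be built from a plain dict plus a doubly linked list, which shows why both get and put stay O(1):

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LinkedLRUCache:
    """LRU cache built from a plain dict plus a doubly linked list.

    The dict gives O(1) key lookup; the list keeps usage order, with the
    least recently used node sitting just after the head sentinel.
    """

    def __init__(self, cap):
        self.cap = cap
        self.map = {}                       # key -> Node
        self.head = Node()                  # sentinel on the oldest side
        self.tail = Node()                  # sentinel on the newest side
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _append(self, node):
        # insert just before the tail sentinel (most recently used position)
        last = self.tail.prev
        last.next = node
        node.prev, node.next = last, self.tail
        self.tail.prev = node

    def get(self, key):
        node = self.map.get(key)
        if node is None:
            return None
        self._unlink(node)
        self._append(node)                  # mark as most recently used
        return node.value

    def put(self, key, value):
        if self.cap <= 0:
            return                          # nothing to store in a zero-capacity cache
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._append(node)
            return
        if len(self.map) >= self.cap:
            oldest = self.head.next         # least recently used node
            self._unlink(oldest)
            del self.map[oldest.key]
        node = Node(key, value)
        self.map[key] = node
        self._append(node)
```

A `collections.deque` is sometimes suggested for tracking the order, but removing an arbitrary key from a deque costs O(n), so the linked-list (or OrderedDict) approach is the usual choice.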

Remember
- caches trade memory for speed
- LRU keeps recently used items

#Python #Advanced #Cache