This blog is part of the Caching 101 to Advanced series.
To keep it approachable, we build the cache up in levels. Starting from level 0 with practical caching using keys, we then add TTLs, the hit/miss flow, and the trade-offs of invalidation and eviction, all focused on keeping reads fast and data correct, with clear JavaScript examples.
- Level 0: The Primitive Cache (Using Map)
- 0. Naive cache (no eviction, no TTL)
- Level 1: Address the primitive cache's problems one by one
- 0. Cache with TTL (time-based expiry)
- 1. Cache with size limit
- Level 2: LRU cache (real-world, interview-grade)
- Level 3: Plug cache into a fake “DB call”
Level 0: The simplest possible cache (baseline)
Before fancy terms, a cache is just:
- A key
- A value
- Stored in fast memory
Goal
Understand what caching really means in code.
Implementation
```javascript
class SimpleCache {
  constructor() {
    this.store = new Map();
  }

  get(key) {
    return this.store.get(key);
  }

  set(key, value) {
    this.store.set(key, value);
  }
}
```
Usage
```javascript
const cache = new SimpleCache();
cache.set("user:1", { id: 1, name: "Hari" });
console.log(cache.get("user:1")); // { id: 1, name: "Hari" }
```
What you learned
- Cache is just a fast key–value store
- `Map` is better than `{}` here: any value can be a key, there are no prototype-key collisions, lookups are O(1), and it preserves insertion order (which we exploit later)
Problem
- Cache grows forever
- No expiry
- No eviction
This is not production-safe.
Level 1.0: Add TTL (time-based expiration)
Now we solve the stale data problem.
Idea
Store:
- value
- expiry timestamp
Implementation
```javascript
class TTLCache {
  constructor() {
    this.store = new Map();
  }

  set(key, value, ttlMs) {
    const expiresAt = Date.now() + ttlMs;
    this.store.set(key, { value, expiresAt });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }
}
```
Usage
```javascript
const cache = new TTLCache();
cache.set("token", "abc123", 2000);
console.log("The cache lookup:", cache.get("token")); // The cache lookup: abc123

setTimeout(() => {
  // The cache lookup after a few secs: null
  console.log("The cache lookup after a few secs:", cache.get("token"));
}, 3000);
```
What you learned
- TTL is invalidation, not eviction
- Expired entries must be cleaned up
- Lazy deletion (on `get`) is common
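Lazy deletion alone never touches keys that nobody reads again, so they keep holding memory. A common complement (sketched here as an illustration, not part of the cache above) is an active sweep that walks the store and drops expired entries:

```javascript
// Same TTLCache idea as above, plus an active cleanup sweep.
class SweepingTTLCache {
  constructor() {
    this.store = new Map();
  }

  set(key, value, ttlMs) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazy deletion, as before
      return null;
    }
    return entry.value;
  }

  // Active deletion: walk all entries and drop the expired ones.
  // Deleting during Map iteration is safe in JavaScript.
  cleanup() {
    const now = Date.now();
    for (const [key, entry] of this.store) {
      if (now > entry.expiresAt) this.store.delete(key);
    }
  }
}
```

In a real service you would run `cleanup()` on a timer (e.g. `setInterval`), which is roughly what Redis does with its periodic expiry cycle on top of lazy expiry.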
Problem
- Cache can still grow forever
- No control over memory size
Level 1.1: Add a size limit (FIFO-style eviction)
Now we introduce capacity constraints.
Idea
When cache is full:
- Remove the oldest entry
This is FIFO eviction.
Implementation
```javascript
class SizeLimitedCache {
  constructor(limit) {
    this.limit = limit;
    this.store = new Map();
  }

  set(key, value) {
    // Only evict when inserting a brand-new key at capacity;
    // overwriting an existing key must not evict anything.
    if (!this.store.has(key) && this.store.size >= this.limit) {
      const oldestKey = this.store.keys().next().value;
      this.store.delete(oldestKey);
    }
    this.store.set(key, value);
  }

  get(key) {
    return this.store.get(key) ?? null;
  }
}
```
Usage
```javascript
const cache = new SizeLimitedCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.set("c", 3); // evicts "a", the oldest insertion
console.log(cache.get("a")); // null
```
What you learned
- Capacity-based eviction is required
- FIFO is simple but not optimal
Problem
- Recently used items can still be evicted
- This hurts performance
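To see that hurt concretely, here is a small standalone demo (the class is repeated so the snippet runs on its own): `"a"` is read immediately before the insert, yet FIFO still evicts it, because FIFO only looks at insertion order.

```javascript
// FIFO ignores usage: "a" is the most recently *read* key,
// but also the oldest *inserted* key, so it gets evicted anyway.
class SizeLimitedCache {
  constructor(limit) {
    this.limit = limit;
    this.store = new Map();
  }

  set(key, value) {
    if (!this.store.has(key) && this.store.size >= this.limit) {
      const oldestKey = this.store.keys().next().value;
      this.store.delete(oldestKey);
    }
    this.store.set(key, value);
  }

  get(key) {
    return this.store.get(key) ?? null;
  }
}

const cache = new SizeLimitedCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");    // "a" was just used...
cache.set("c", 3); // ...but FIFO still evicts "a"
console.log(cache.get("a")); // null: the hot key is gone
```

This is exactly the gap LRU closes in the next level.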
Level 2: LRU Cache (real-world cache)
Now we build something production-worthy.
LRU idea (core intuition)
- Recently used data is more likely to be used again
- On every `get` or `set`, move item to the “most recent” position
- Evict the least recently used item
- All operations should be O(1)
Why Map works perfectly
- Maintains insertion order
- Deleting + re-inserting moves item to the end
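A quick standalone check of that ordering property (not part of the cache code, just a demonstration of `Map` behavior):

```javascript
const m = new Map([["a", 1], ["b", 2], ["c", 3]]);

// Delete + re-insert moves "a" to the end of the iteration order.
m.delete("a");
m.set("a", 1);

console.log([...m.keys()]); // ["b", "c", "a"]
```

The "end" of the Map therefore acts as the most-recently-used position, and the first key is always the least recently used.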
Implementation
```javascript
class LRUCache {
  constructor(limit) {
    this.limit = limit;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) return null;
    const value = this.cache.get(key);
    // Re-insert to mark as most recently used.
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.limit) {
      // First key in iteration order is the least recently used.
      const lruKey = this.cache.keys().next().value;
      this.cache.delete(lruKey);
    }
    this.cache.set(key, value);
  }
}
```
Usage
```javascript
const cache = new LRUCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a"); // "a" becomes most recently used
cache.set("c", 3); // evicts "b"
console.log(cache.get("b")); // null
```
What you learned
- LRU ≠ FIFO
- Eviction is based on usage, not insertion
- This is how Redis and many caches behave internally
In JavaScript, LRU can be implemented using `Map` alone because it preserves insertion order. However, the doubly linked list approach remains the canonical solution because it works consistently across languages and makes the cache behavior explicit.
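For reference, here is one possible doubly-linked-list sketch (a hand-rolled illustration, not canonical code from any library): the `Map` gives O(1) key-to-node lookup, and the list gives O(1) move-to-front and evict-from-tail.

```javascript
// LRU via hash map + doubly linked list.
// Head = most recently used, tail = least recently used.
class Node {
  constructor(key, value) {
    this.key = key;
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

class LinkedLRUCache {
  constructor(limit) {
    this.limit = limit;
    this.map = new Map(); // key -> Node
    this.head = null;
    this.tail = null;
  }

  #unlink(node) {
    if (node.prev) node.prev.next = node.next; else this.head = node.next;
    if (node.next) node.next.prev = node.prev; else this.tail = node.prev;
    node.prev = node.next = null;
  }

  #pushFront(node) {
    node.next = this.head;
    if (this.head) this.head.prev = node;
    this.head = node;
    if (!this.tail) this.tail = node;
  }

  get(key) {
    const node = this.map.get(key);
    if (!node) return null;
    this.#unlink(node);    // move to front on every access
    this.#pushFront(node);
    return node.value;
  }

  set(key, value) {
    const existing = this.map.get(key);
    if (existing) {
      existing.value = value;
      this.#unlink(existing);
      this.#pushFront(existing);
      return;
    }
    if (this.map.size >= this.limit) {
      const lru = this.tail; // evict least recently used
      this.#unlink(lru);
      this.map.delete(lru.key);
    }
    const node = new Node(key, value);
    this.map.set(key, node);
    this.#pushFront(node);
  }
}
```

The behavior matches the `Map`-only version; the payoff is that this technique carries over to languages whose hash maps do not preserve insertion order.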
Level 3: Use cache with a fake database (cache-aside pattern)
Now let’s simulate a real backend scenario.
Fake DB
```javascript
function fetchUserFromDB(id) {
  console.log("Fetching from DB...");
  return { id, name: "User " + id };
}
```
Cached service
```javascript
const userCache = new LRUCache(3);

function getUser(id) {
  const cached = userCache.get(id);
  if (cached) {
    console.log("Cache hit");
    return cached;
  }
  console.log("Cache miss");
  const user = fetchUserFromDB(id);
  userCache.set(id, user);
  return user;
}
```
Usage
```javascript
getUser(1); // DB
getUser(1); // Cache
getUser(2); // DB
getUser(3); // DB
getUser(4); // DB; evicts user 1, the least recently used
```
Conclusion
We have now:
- Built cache from first principles
- Understood TTL vs eviction
- Implemented LRU
- Used cache like real backend systems
This is caching learned the easy way: one problem at a time.
Next steps
- Add TTL to LRU
- Handle async DB calls
- Prevent cache stampede
- Swap in Redis with same interface
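As a teaser for those next steps, one common stampede guard is to cache the in-flight Promise itself, so concurrent misses share a single DB call. This sketch assumes `fetchUserFromDB` becomes async; it uses plain `Map`s rather than the LRU above to keep the idea isolated, and all names are illustrative:

```javascript
// Assumed async version of the fake DB call.
async function fetchUserFromDB(id) {
  console.log("Fetching from DB...");
  return { id, name: "User " + id };
}

const userCache = new Map(); // id -> resolved user
const inflight = new Map();  // id -> pending Promise

async function getUser(id) {
  if (userCache.has(id)) return userCache.get(id); // cache hit
  if (inflight.has(id)) return inflight.get(id);   // join the in-flight call
  const promise = fetchUserFromDB(id)
    .then((user) => {
      userCache.set(id, user);                     // fill cache on success
      return user;
    })
    .finally(() => inflight.delete(id));           // always clear in-flight slot
  inflight.set(id, promise);
  return promise;
}
```

If a hundred requests miss on the same key at once, only the first one hits the DB; the other ninety-nine await the same Promise instead of stampeding the database.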
Let me know what you want to build next, and we’ll go deeper together.