
What are common caching strategies? Explain Cache-Aside, Write-Through, and Write-Behind.


Three Main Caching Strategies

1. Cache-Aside (Lazy Loading)

The most common strategy. The application manages the cache itself.

Read flow:

1. Check Cache (Redis)
2. Cache Hit → return immediately
3. Cache Miss → query DB → write to Cache → return

Write flow:

1. Write to database
2. Delete (or update) the corresponding key in Cache

Pros: Only caches data that is actually accessed; ideal for read-heavy workloads.
Cons: The first access always misses; application code must handle TTLs and cache invalidation.
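The read and write flows above can be sketched in a few lines of Python. This is a minimal illustration, not production code: plain dicts stand in for Redis and the database, and the `CacheAside` class name and its methods are made up for this example.

```python
import time

class CacheAside:
    """Cache-aside sketch: a dict with TTLs stands in for Redis,
    and `db` (a plain dict here) stands in for the database."""

    def __init__(self, db, ttl_seconds=60):
        self.db = db
        self.ttl = ttl_seconds
        self.cache = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self.cache.get(key)
        if entry and entry[1] > time.time():
            return entry[0]                       # cache hit: return immediately
        value = self.db.get(key)                  # cache miss: query the DB
        if value is not None:
            self.cache[key] = (value, time.time() + self.ttl)
        return value

    def put(self, key, value):
        self.db[key] = value                      # 1. write to the database
        self.cache.pop(key, None)                 # 2. delete the cached key
```

Note that `put` deletes rather than updates the cached key; the next read repopulates it, which avoids racing writers leaving stale values behind.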


2. Write-Through

Synchronously updates both Cache and DB on every write.

Write data → update Cache AND DB synchronously

Pros: Cache is always consistent with the DB; no stale data.
Cons: Higher write latency; may cache data that is never read.
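A write-through sketch, under the same assumptions as before (dicts stand in for Redis and the database; the class name is hypothetical):

```python
class WriteThrough:
    """Write-through sketch: every write updates cache and DB together."""

    def __init__(self, db):
        self.db = db
        self.cache = {}

    def put(self, key, value):
        # Update both stores synchronously; the write completes only
        # after both succeed, so reads never observe stale data.
        self.cache[key] = value
        self.db[key] = value

    def get(self, key):
        # The cache is kept in lockstep with the DB, so a hit is
        # always current; fall back to the DB for unseen keys.
        return self.cache.get(key, self.db.get(key))
```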


3. Write-Behind / Write-Back

Writes update only the Cache; DB is updated asynchronously in batches.

Write data → update Cache only → batch flush to DB asynchronously

Pros: Lowest write latency; great for write-heavy scenarios (counters, view counts).
Cons: Risk of data loss if the cache node fails before flushing (a consistency tradeoff).
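A write-behind sketch along the same lines. Real implementations flush from a background worker or queue; here a size threshold triggers the batch flush so the idea stays self-contained. The names and the `batch_size` trigger are assumptions for illustration.

```python
class WriteBehind:
    """Write-behind sketch: writes hit only the cache; dirty keys are
    flushed to the DB in batches once a threshold is reached."""

    def __init__(self, db, batch_size=3):
        self.db = db
        self.cache = {}
        self.dirty = set()        # keys written but not yet persisted
        self.batch_size = batch_size

    def put(self, key, value):
        self.cache[key] = value   # fast path: cache only
        self.dirty.add(key)
        if len(self.dirty) >= self.batch_size:
            self.flush()

    def flush(self):
        # In production this runs asynchronously; anything still in
        # `dirty` when the cache node crashes is lost -- the tradeoff
        # described above.
        for key in self.dirty:
            self.db[key] = self.cache[key]
        self.dirty.clear()
```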


Summary

| Strategy | Consistency | Write Perf | Best For |
|---|---|---|---|
| Cache-Aside | Eventually consistent | Medium | Read-heavy (product pages, user profiles) |
| Write-Through | Strong | Slower | Financial transactions, order data |
| Write-Behind | Eventually consistent | Fastest | View counts, likes (tolerates slight lag) |

Interview bonus: Mention the Cache Stampede problem (many requests miss an expired key at once and all hit the DB) and its solutions (a per-key Mutex Lock, or Probabilistic Early Expiration), plus implementing distributed locks in Redis with SETNX (or the modern `SET key value NX EX`, which sets the lock and its expiry atomically).
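The mutex-lock approach can be sketched in-process with Python threads. In a distributed setup the per-key lock would be a Redis SETNX key instead of a `threading.Lock`; the function and variable names here are hypothetical.

```python
import threading
import time

_locks = {}                        # per-key mutexes (SETNX plays this role in Redis)
_locks_guard = threading.Lock()    # protects the _locks dict itself
cache = {}                         # key -> (value, expires_at)

def get_with_stampede_protection(key, load_from_db, ttl=60):
    """On a miss, only one caller rebuilds the value; concurrent
    callers block on the same per-key lock instead of all hitting
    the DB at once."""
    entry = cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]
    with _locks_guard:
        lock = _locks.setdefault(key, threading.Lock())
    with lock:
        entry = cache.get(key)     # re-check: another thread may have rebuilt it
        if entry and entry[1] > time.time():
            return entry[0]
        value = load_from_db(key)  # exactly one thread reaches the DB
        cache[key] = (value, time.time() + ttl)
        return value
```

The double-check after acquiring the lock is the key detail: waiters that lose the race find the freshly cached value and return without touching the DB.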


Copyright © 2026 Wood All Rights Reserved · FE Interview Hub