Module 4: Caching
Speed up reads and reduce load: caching strategies, eviction policies, distributed caches, and the hardest problem in CS — cache invalidation.
The three fundamental write strategies (write-through, write-back, and write-around): their consistency guarantees, performance characteristics, and ideal use cases.
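A minimal sketch contrasting the three strategies, using plain dicts as stand-ins for the cache and the backing store (the class and method names here are illustrative, not a real cache client's API):

```python
class WriteStrategies:
    def __init__(self):
        self.cache = {}
        self.store = {}     # stand-in for the database
        self.dirty = set()  # keys awaiting flush (write-back only)

    def write_through(self, key, value):
        # Write to cache AND store synchronously: cache and store never
        # diverge, but every write pays the store's latency.
        self.cache[key] = value
        self.store[key] = value

    def write_around(self, key, value):
        # Write only to the store; the cache fills on the next read miss.
        # Avoids polluting the cache with rarely-read data.
        self.store[key] = value
        self.cache.pop(key, None)  # drop any now-stale cached copy

    def write_back(self, key, value):
        # Write only to the cache and mark it dirty: fast writes, but
        # data is lost if the cache dies before flush() runs.
        self.cache[key] = value
        self.dirty.add(key)

    def flush(self):
        # Persist dirty entries in a batch (in production, asynchronously).
        for key in self.dirty:
            self.store[key] = self.cache[key]
        self.dirty.clear()
```

The trade-off is visible in the code: write-through pays two writes for consistency, write-around keeps cold data out of the cache, and write-back defers durability to `flush()`.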
LRU, LFU, FIFO, TTL, and random eviction. How each policy works, memory overhead, and when to pick one over another.
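LRU is the most commonly implemented of these policies; a compact sketch using Python's `OrderedDict`, which keeps insertion order and makes "move to most-recent" an O(1) operation:

```python
from collections import OrderedDict

class LRUCache:
    """Least-recently-used cache: on overflow, evict the entry that has
    gone the longest without being read or written."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used
```

The memory overhead is one linked-list node per entry, which is the price LRU pays over FIFO or random eviction for tracking recency.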
In-depth comparison of Redis and Memcached: data structures, persistence, clustering, pub/sub, and choosing between them.
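Whichever store you choose, the read path usually follows the same cache-aside pattern; a sketch written against a minimal get/set-with-TTL surface, with a dict-backed fake client standing in (both redis-py and pymemcache expose equivalent operations, but `FakeClient` and its method shapes here are assumptions for illustration):

```python
import time

class FakeClient:
    """Dict-backed stand-in for a Redis or Memcached client:
    get(key) -> value or None, set(key, value, ttl_seconds)."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        value, expires_at = self._data.get(key, (None, 0.0))
        if value is not None and time.monotonic() < expires_at:
            return value
        return None  # missing or expired

    def set(self, key, value, ttl):
        self._data[key] = (value, time.monotonic() + ttl)

def cache_aside(client, key, loader, ttl=60):
    """On a miss, call the loader (e.g. a DB query), then populate."""
    value = client.get(key)
    if value is None:
        value = loader(key)
        client.set(key, value, ttl)
    return value
```

Because the access pattern is identical either way, the Redis-vs-Memcached decision hinges on the surrounding features this lesson covers: rich data structures, persistence, and clustering versus Memcached's simpler multithreaded design.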
How CDN edge caching works: cache headers (Cache-Control, ETag), cache keys, purging strategies, and dynamic content caching.
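The ETag mechanics can be sketched from the origin's side: hash the body into a validator, and answer a conditional GET with 304 (no body) when the client's or edge's copy still matches. The function shapes here are illustrative, not any particular framework's API:

```python
import hashlib

def make_etag(body):
    # Strong ETag derived from a content hash (one common convention).
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def respond(body, if_none_match=None):
    """Return (status, headers, body) for a GET, honoring If-None-Match."""
    etag = make_etag(body)
    headers = {
        "ETag": etag,
        # Edges and browsers may reuse this response for 300 seconds.
        "Cache-Control": "public, max-age=300",
    }
    if if_none_match == etag:
        return 304, headers, b""  # cached copy is still valid: no body sent
    return 200, headers, body
```

`max-age` bounds staleness while the ETag makes revalidation cheap once the age expires; CDN purging exists for the cases where you cannot wait out either.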
In-process caching, request-scoped caching, memoization, and computed caching. When local caches beat distributed ones.
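Memoization is the simplest local cache, and Python ships it in the standard library:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Naive exponential recursion becomes linear once each result is
    # memoized; the cache lives only in this process, illustrating why
    # local caches beat distributed ones for cheap, hot computations.
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

No network hop, no serialization, no invalidation protocol; the trade-off is that every process warms its own copy and nothing is shared across machines.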
Cache invalidation in practice: TTL-based, event-driven, and versioned invalidation, plus cache stampede prevention and stale-while-revalidate.
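Stampede prevention in one picture: when a hot key expires, only one caller should recompute it while the rest wait. A per-key-lock sketch (class and method names are hypothetical; production systems often use a distributed lock or probabilistic early expiry instead):

```python
import threading

class StampedeGuard:
    """On a miss, exactly one thread per key runs the loader; concurrent
    callers block on the key's lock and then read the fresh value."""

    def __init__(self):
        self._cache = {}
        self._locks = {}
        self._meta_lock = threading.Lock()

    def _lock_for(self, key):
        with self._meta_lock:
            return self._locks.setdefault(key, threading.Lock())

    def get(self, key, loader):
        value = self._cache.get(key)
        if value is not None:
            return value
        with self._lock_for(key):          # serialize recomputation per key
            value = self._cache.get(key)   # double-check after acquiring
            if value is None:
                value = loader(key)        # expensive DB query / render
                self._cache[key] = value
        return value
```

Stale-while-revalidate is the softer variant of the same idea: serve the expired value immediately and refresh it in the background, so no caller ever blocks on the loader.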
Scaling caches across nodes: consistent hashing, replication, cache clusters, and handling node failures without triggering a cache avalanche.
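The core of consistent hashing fits in a short class: hash nodes onto a ring (many virtual points each, so load spreads evenly), then route each key clockwise to the nearest node. When a node dies, only its own keys move, which is what prevents the full-cluster remap that triggers an avalanche. The replica count and hash choice below are illustrative assumptions:

```python
import bisect
import hashlib

class HashRing:
    """Minimal consistent-hash ring with virtual nodes."""

    def __init__(self, nodes, replicas=100):
        self.replicas = replicas
        self._ring = []  # sorted list of (hash, node)
        for node in nodes:
            self.add(node)

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add(self, node):
        # Each physical node owns `replicas` points on the ring.
        for i in range(self.replicas):
            bisect.insort(self._ring, (self._hash(f"{node}#{i}"), node))

    def remove(self, node):
        self._ring = [(h, n) for h, n in self._ring if n != node]

    def node_for(self, key):
        # Walk clockwise to the first virtual node at or after the key.
        h = self._hash(key)
        idx = bisect.bisect(self._ring, (h,))
        if idx == len(self._ring):
            idx = 0  # wrap around the ring
        return self._ring[idx][1]
```

Removing one node reassigns only the keys it owned; every other key keeps its placement, so the surviving caches stay warm.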