
When does caching go from being helpful to being a pain in system design?

Everyone brings up caching as a magic fix for performance, especially in interview prep. But I've been wondering: when does adding a cache actually cause more trouble than it's worth? I'm not just talking about how hard cache invalidation is, because yeah, that's a pain. I'm more interested in the specific situations where the operational overhead of managing a cache, the added system complexity, or the new failure modes it introduces just aren't worth it.

Like, what if your data is accessed almost uniformly at random, so your hit rate is terrible? Or your dataset is tiny and the database is already screaming fast? And what about when serving stale data is a real problem, say in a real-time system where old data is actively wrong? When do you skip caching, even when you're worried about speed?
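To make the "accessed super randomly" case concrete, here's a back-of-envelope sketch of expected read latency with a look-aside cache. The latency numbers are made up for illustration (not measurements from any real system): a cache lookup costs 1 ms, a database query costs 5 ms, and every miss pays for both. At a low hit rate, the cache lookup on every miss makes reads *slower* than hitting the database directly.

```python
# Hypothetical latency numbers, chosen only for illustration.
CACHE_MS = 1.0  # cost of one cache lookup (hit or miss)
DB_MS = 5.0     # cost of one database query

def expected_latency_ms(hit_rate: float) -> float:
    """Expected read latency with a look-aside cache.

    A hit pays only the cache lookup; a miss pays the cache
    lookup *plus* the database query behind it.
    """
    miss_rate = 1.0 - hit_rate
    return hit_rate * CACHE_MS + miss_rate * (CACHE_MS + DB_MS)

for hit_rate in (0.1, 0.5, 0.9):
    print(f"hit rate {hit_rate:.0%}: "
          f"{expected_latency_ms(hit_rate):.1f} ms with cache, "
          f"{DB_MS:.1f} ms without")
```

With these numbers, a 10% hit rate gives 5.5 ms per read versus 5.0 ms with no cache at all: you pay the cache lookup on nearly every request and still hit the database. The break-even hit rate depends entirely on the cache-to-database latency ratio, which is exactly why "tiny dataset, screaming-fast database" is a case where caching can be a net loss.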
15 comments
