Core Cache Access Patterns explains that a cache is not only defined by where it lives, but by how the application uses it. A cache-aside read path behaves differently from read-through loading. A synchronous write-through path has very different guarantees from write-behind persistence. The pattern determines who owns the cache logic, where failures surface, how stale reads can get, and whether acknowledged writes can be lost.
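The read-path distinction can be sketched in a few lines. This is a minimal illustration, not a specific library's API: the in-memory `cache` dict, `load_from_store`, and the function names are hypothetical stand-ins.

```python
# Hypothetical cache-aside sketch: the application owns miss handling.
# A dict stands in for the cache; load_from_store stands in for a database read.
cache: dict = {}

def load_from_store(key):
    # Illustrative stand-in for a slow backing-store lookup.
    return f"value-for-{key}"

def get_cache_aside(key):
    # Cache-aside: the application checks the cache, and on a miss it
    # loads from the store and populates the cache itself.
    value = cache.get(key)
    if value is None:
        value = load_from_store(key)  # miss: app fetches from the store
        cache[key] = value            # app writes the entry back
    return value
```

In a read-through design, the `if value is None` branch would live inside the cache layer instead: the application calls a single `get`, and the cache infrastructure invokes the loader on a miss. The control flow is the same; what changes is who owns it.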
The four lessons move through the main read and write styles used in practical systems. The first covers the common cache-aside pattern, where the application owns miss handling. The second covers read-through designs, where the cache infrastructure loads data on a miss. The third covers write-through flows that keep the cache and the backing store synchronized in one path. The fourth covers write-behind and write-back approaches that trade immediate durability for speed and batching.
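The write-path contrast between the third and fourth lessons can be sketched the same way. This is an illustrative sketch under stated assumptions: two dicts stand in for the cache and the durable store, and a deque stands in for the write-behind buffer; none of these names come from a real library.

```python
from collections import deque

cache: dict = {}
store: dict = {}   # stand-in for durable storage
pending = deque()  # write-behind buffer of deferred store writes

def write_through(key, value):
    # Write-through: one synchronous path. The store write happens before
    # the caller is acknowledged, so a store failure surfaces immediately.
    store[key] = value
    cache[key] = value

def write_behind(key, value):
    # Write-behind: acknowledge after updating the cache only.
    # The store write is deferred, so it can be batched -- but anything
    # still in the buffer is lost if the process dies before a flush.
    cache[key] = value
    pending.append((key, value))

def flush_pending():
    # Drain deferred writes to the store (in practice, batched on a timer).
    while pending:
        key, value = pending.popleft()
        store[key] = value
```

Running `write_behind` leaves the store stale until `flush_pending` runs, which is exactly the durability gap the fourth lesson examines.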
Use this chapter when you need to choose the control flow around the cache, not just the storage layer. The goal is to leave the child lessons with a clearer sense of where loading logic belongs, how writes propagate, and which failure modes each pattern introduces.