Cache-managed loading, where the cache infrastructure fills entries on miss instead of the application doing it directly.
Read-through caching moves miss handling into the cache layer itself. Instead of the application checking cache and then loading from the store manually, the application reads through an abstraction that either returns a cached value or uses a configured loader to fetch and populate the cache automatically.
This centralizes the loading policy and can make application code cleaner. It also makes the cache behavior more uniform across services. The trade-off is tighter coupling between the cache mechanism and the backing data source. The loader logic, refresh rules, and error behavior now live closer to infrastructure than to plain application code.
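To make the contrast concrete, here is a sketch of the cache-aside shape that read-through replaces. The `MemoryCache` class and `getUserCacheAside` function are illustrative stand-ins, not a real client; a production setup would use Redis or similar.

```typescript
// Illustrative in-memory cache; a real deployment would use Redis or similar.
class MemoryCache {
  private store = new Map<string, string>();
  async get(key: string): Promise<string | null> {
    return this.store.get(key) ?? null;
  }
  async set(key: string, value: string): Promise<void> {
    this.store.set(key, value);
  }
}

const cache = new MemoryCache();

// Cache-aside: every caller repeats the check-load-populate dance itself.
async function getUserCacheAside(id: string): Promise<{ id: string }> {
  const cached = await cache.get(`user:${id}`);
  if (cached !== null) {
    return JSON.parse(cached);
  }
  const user = { id }; // stand-in for a real store lookup
  await cache.set(`user:${id}`, JSON.stringify(user));
  return user;
}
```

Every service that needs users repeats this block; read-through hoists it into the cache layer so it is written once.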
```mermaid
sequenceDiagram
    participant App
    participant CacheLayer
    participant Store
    App->>CacheLayer: read(key)
    alt hit
        CacheLayer-->>App: cached value
    else miss
        CacheLayer->>Store: load(key)
        Store-->>CacheLayer: value
        CacheLayer-->>App: value
    end
```
This pattern is useful when the team wants one consistent loading path instead of repeated cache-aside logic scattered across application services. It can also make stampede control, refresh policy, and loader instrumentation easier to standardize. But it is only a win if the coupling is acceptable. The cache is no longer just a storage layer. It becomes part of the data access contract.
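Stampede control is one concrete example of something that becomes easy to standardize once loading is centralized. Below is a minimal single-flight sketch: concurrent misses for the same key share one in-flight load instead of each hitting the store. The `inFlight` map and `singleFlight` name are illustrative assumptions, not part of any particular library.

```typescript
// Deduplicate concurrent misses: all callers for the same key await one load.
const inFlight = new Map<string, Promise<unknown>>();

async function singleFlight<T>(
  key: string,
  loader: (key: string) => Promise<T>
): Promise<T> {
  const pending = inFlight.get(key);
  if (pending) {
    // Another caller already triggered this load; piggyback on it.
    return pending as Promise<T>;
  }

  const p = loader(key).finally(() => inFlight.delete(key));
  inFlight.set(key, p);
  return p;
}
```

Because the cache layer owns the loading path, this guard protects every caller automatically; with cache-aside, each service would have to implement it separately.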
Read-through is strongest when:
- many services would otherwise duplicate the same cache-aside miss logic,
- loading a value depends only on the key, so one configured loader can serve all callers,
- the team wants a single place to standardize refresh policy, stampede control, and loader instrumentation.
It is weaker when data loading is highly context-specific or when infrastructure-level coupling would make testing and evolution harder.
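The testing concern can be softened by injecting both the cache client and the loader rather than reaching for globals. The sketch below assumes a hypothetical `CacheClient` interface and a `fakeCache` test double; neither name comes from a real library.

```typescript
interface CacheClient {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

// Read-through with injected dependencies: testable without real infrastructure.
async function readThroughWith<T>(
  cache: CacheClient,
  key: string,
  ttlSeconds: number,
  loader: (key: string) => Promise<T>
): Promise<T> {
  const cached = await cache.get(key);
  if (cached !== null) {
    return JSON.parse(cached) as T;
  }
  const fresh = await loader(key);
  await cache.set(key, JSON.stringify(fresh), ttlSeconds);
  return fresh;
}

// A trivial in-memory fake for tests; TTL is ignored here for simplicity.
function fakeCache(): CacheClient {
  const store = new Map<string, string>();
  return {
    async get(key) {
      return store.get(key) ?? null;
    },
    async set(key, value) {
      store.set(key, value);
    },
  };
}
```

A test can then assert that the loader runs exactly once for repeated reads, with no cache server or database in the loop.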
This example sketches a loader-based cache API. The key idea is that the caller asks for the value, while the cache abstraction decides whether to hit the store.
```ts
// Minimal client interfaces; a real setup would wire these to Redis and a DB.
interface CacheClient {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}
declare const cache: CacheClient;

interface Profile {
  id: string;
  name: string;
}
declare const profileStore: { fetch: (key: string) => Promise<Profile> };

type Loader<T> = (key: string) => Promise<T>;

async function readThrough<T>(
  key: string,
  ttlSeconds: number,
  loader: Loader<T>
): Promise<T> {
  const cached = await cache.get(key);
  if (cached !== null) {
    return JSON.parse(cached) as T;
  }

  // Miss: the cache layer, not the caller, invokes the loader and populates.
  const fresh = await loader(key);
  await cache.set(key, JSON.stringify(fresh), ttlSeconds);
  return fresh;
}

// Wrapping fetch in an arrow preserves `this` if the store method relies on it.
const profile = await readThrough("user:42", 120, (key) => profileStore.fetch(key));
```
What to notice:
- The caller supplies a loader and never talks to the store directly; the cache abstraction decides when to invoke it.
- The TTL is set at population time, so expiry policy rides along with the single loading path.
- Values are JSON-serialized at the cache boundary, which keeps the cache storage-agnostic but assumes serializable values.
- The miss check compares against `null` rather than truthiness, so legitimately falsy cached values are not reloaded.
Why might a team prefer read-through over cache-aside even though it increases infrastructure coupling?
The stronger answer is that it can centralize loading behavior, reduce duplicated miss logic, and make refresh policy more consistent across callers. The trade-off is that the cache layer now owns more of the data access contract and must be treated as a first-class dependency.