Imagine you’re a librarian. Every time someone asks for a book, you could walk to the massive warehouse (database) to find it. Or, you could keep the 100 most popular books on a cart right next to you (cache). When someone asks for a popular book, you grab it instantly. That’s caching.
Caching is storing frequently accessed data in fast storage (usually memory) to avoid slow operations like database queries or external API calls.
| Problem | How Caching Solves It |
|---|---|
| Slow Database Queries | Cache stores results, avoiding repeated queries |
| High Database Load | Cache hits absorb repeat reads, often cutting database requests by 90%+ |
| Expensive External APIs | Cache API responses, avoid rate limits |
| Repeated Computations | Cache expensive calculation results |
| Geographic Latency | Cache data closer to users (CDN) |
There are four main ways to integrate caching into your application. Each has different trade-offs.
Cache-aside (lazy loading): the most common pattern. Your application manages the cache directly.
How it works: The application checks the cache first. On a miss it reads from the database, stores the result in the cache, and returns it. Writes go to the database, and the cached entry is invalidated or updated. (Both code examples later in this section implement cache-aside.)
When to use: General-purpose caching for read-heavy workloads where you want explicit control. It fits most applications.
Trade-offs: Simple and flexible, but the application carries the caching logic, the first request after a miss pays full database latency, and entries can be stale until they expire or are invalidated.
Read-through: the cache acts as a proxy. Your application only talks to the cache; the cache handles database access.
How it works: On a miss, the cache itself loads the value from the database via a configured loader, stores it, and returns it to the caller. A minimal sketch follows below.
When to use: Read-heavy applications where you want caching logic kept out of application code, typically with a cache library or service that supports loaders.
Trade-offs: Simpler application code, but you depend on the cache provider's loading mechanism, and the first read of each key still pays full latency.
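A minimal Python sketch of read-through behavior, assuming a hypothetical `ReadThroughCache` wrapper and an illustrative `load_user_from_db` loader function (neither is a specific library API):

```python
import time
from typing import Any, Callable

class ReadThroughCache:
    """Read-through cache: callers never touch the database directly;
    on a miss the cache itself invokes the configured loader."""

    def __init__(self, loader: Callable[[Any], Any], ttl: int = 300):
        self.loader = loader          # e.g. a function that queries the database
        self.ttl = ttl                # time to live in seconds
        self._store: dict = {}

    def get(self, key: Any) -> Any:
        entry = self._store.get(key)
        if entry is not None:
            value, ts = entry
            if time.time() - ts < self.ttl:
                return value          # cache hit
        # Cache miss: the cache (not the caller) loads from the source
        value = self.loader(key)
        self._store[key] = (value, time.time())
        return value

# Usage: the application only ever sees the cache
def load_user_from_db(user_id: int) -> dict:
    return {"id": user_id, "name": "John"}   # stand-in for a real query

users = ReadThroughCache(load_user_from_db, ttl=300)
print(users.get(42))   # miss -> loads from the "database", then caches
print(users.get(42))   # hit  -> served from memory
```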
Write-through: writes go to both the cache and the database synchronously, keeping them in sync.
How it works: Every write is applied to the database and mirrored into the cache before the call returns; reads are served from the cache. A sketch follows below.
When to use: Critical data where read-after-write consistency matters and you can tolerate slower writes.
Trade-offs: Strong consistency between cache and database, but every write pays database latency, and you may cache data that is never read.
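A minimal Python sketch of write-through behavior, assuming a hypothetical `WriteThroughCache` class and a `db_write` callable standing in for a real database write (both illustrative):

```python
from typing import Any, Callable

class WriteThroughCache:
    """Write-through cache: every write goes to the database first,
    then is mirrored into the cache, so reads always see committed data."""

    def __init__(self, db_write: Callable[[Any, Any], None]):
        self.db_write = db_write      # e.g. an INSERT/UPDATE against the database
        self._store: dict = {}

    def put(self, key: Any, value: Any) -> None:
        self.db_write(key, value)     # synchronous DB write: slower, but durable
        self._store[key] = value      # cache updated only after the DB succeeds

    def get(self, key: Any):
        return self._store.get(key)   # reads are served from the cache

# Usage (a plain dict stands in for the database)
db: dict = {}
cache = WriteThroughCache(db_write=lambda k, v: db.__setitem__(k, v))
cache.put("user:1", {"id": 1, "name": "John"})
print(cache.get("user:1"), db["user:1"])   # cache and "database" stay in sync
```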
Write-behind (write-back): write to the cache immediately; the database write happens later. Fastest writes, but risky.
How it works: Writes land in the cache and are queued; a background process flushes them to the database asynchronously, often in batches. A sketch follows below.
When to use: Very high write volume where latency matters more than immediate durability, such as counters, metrics, or activity logs.
Trade-offs: Very low write latency and the ability to batch database writes, but data queued in the cache can be lost on a crash, and consistency is only eventual.
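A minimal Python sketch of write-behind behavior, assuming a hypothetical `WriteBehindCache` class with a single background flush thread and an illustrative `db_write` callable (a real implementation would batch writes and handle failures):

```python
import queue
import threading
from typing import Any, Callable

class WriteBehindCache:
    """Write-behind cache: writes hit the in-memory store immediately and are
    flushed to the database asynchronously by a background worker. Writes that
    are queued but not yet flushed are lost if the process crashes."""

    def __init__(self, db_write: Callable[[Any, Any], None]):
        self._store: dict = {}
        self._queue: queue.Queue = queue.Queue()
        self._db_write = db_write
        threading.Thread(target=self._flush_loop, daemon=True).start()

    def put(self, key: Any, value: Any) -> None:
        self._store[key] = value          # fast: memory only
        self._queue.put((key, value))     # DB write deferred to the worker

    def get(self, key: Any):
        return self._store.get(key)

    def flush_pending(self) -> None:
        self._queue.join()                # block until queued writes are persisted

    def _flush_loop(self) -> None:
        while True:
            key, value = self._queue.get()
            self._db_write(key, value)    # eventually written to the database
            self._queue.task_done()

# Usage (a plain dict stands in for the database)
db: dict = {}
cache = WriteBehindCache(db_write=lambda k, v: db.__setitem__(k, v))
cache.put("user:1", {"id": 1, "name": "John"})   # returns immediately
cache.flush_pending()                             # demo only: wait for the flush
print(db["user:1"])
```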
At the code level, these caching patterns typically translate to the decorator pattern and repository abstractions.
The decorator pattern is perfect for adding caching to existing repositories:
```python
from functools import wraps
from typing import Callable, Any
import time

class CacheDecorator:
    def __init__(self, cache: dict, ttl: int = 300):
        self.cache = cache
        self.ttl = ttl  # Time to live in seconds

    def __call__(self, func: Callable) -> Callable:
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Create cache key from function args
            cache_key = f"{func.__name__}:{args}:{kwargs}"

            # Check cache (cache-aside pattern)
            if cache_key in self.cache:
                cached_data, timestamp = self.cache[cache_key]
                if time.time() - timestamp < self.ttl:
                    return cached_data

            # Cache miss - fetch from source
            result = func(*args, **kwargs)

            # Store in cache
            self.cache[cache_key] = (result, time.time())
            return result

        return wrapper

# Usage
cache = {}

@CacheDecorator(cache, ttl=300)
def get_user(user_id: int):
    # Simulate database query
    return {"id": user_id, "name": "John"}
```

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class CacheDecorator<T, R> {
    private final Map<String, CacheEntry<R>> cache;
    private final long ttlMillis;

    public CacheDecorator(long ttlSeconds) {
        this.cache = new ConcurrentHashMap<>();
        this.ttlMillis = ttlSeconds * 1000;
    }

    public R apply(String key, Function<T, R> function, T input) {
        // Check cache (cache-aside pattern)
        CacheEntry<R> entry = cache.get(key);
        if (entry != null && !entry.isExpired(ttlMillis)) {
            return entry.value;
        }

        // Cache miss - fetch from source
        R result = function.apply(input);

        // Store in cache
        cache.put(key, new CacheEntry<>(result, System.currentTimeMillis()));
        return result;
    }

    private static class CacheEntry<R> {
        final R value;
        final long timestamp;

        CacheEntry(R value, long timestamp) {
            this.value = value;
            this.timestamp = timestamp;
        }

        boolean isExpired(long ttlMillis) {
            return System.currentTimeMillis() - timestamp > ttlMillis;
        }
    }
}
```

A more complete example showing cache-aside in a repository:
```python
from abc import ABC, abstractmethod
from typing import Optional

class UserRepository(ABC):
    @abstractmethod
    def get_user(self, user_id: int) -> Optional[dict]:
        pass

class DatabaseUserRepository(UserRepository):
    def get_user(self, user_id: int) -> Optional[dict]:
        # Simulate database query
        return {"id": user_id, "name": "John"}

class CachedUserRepository(UserRepository):
    def __init__(self, db_repo: UserRepository, cache: dict):
        self.db_repo = db_repo
        self.cache = cache

    def get_user(self, user_id: int) -> Optional[dict]:
        # Cache-aside pattern
        cache_key = f"user:{user_id}"

        # Check cache first
        if cache_key in self.cache:
            return self.cache[cache_key]

        # Cache miss - fetch from DB
        user = self.db_repo.get_user(user_id)

        # Store in cache
        if user:
            self.cache[cache_key] = user

        return user
```

```java
import java.util.Map;
import java.util.Optional;

// Simple value type used by the repositories below
record User(int id, String name) {}

public interface UserRepository {
    Optional<User> getUser(int userId);
}

class DatabaseUserRepository implements UserRepository {
    public Optional<User> getUser(int userId) {
        // Simulate database query
        return Optional.of(new User(userId, "John"));
    }
}

class CachedUserRepository implements UserRepository {
    private final UserRepository dbRepo;
    private final Map<String, User> cache;

    public CachedUserRepository(UserRepository dbRepo, Map<String, User> cache) {
        this.dbRepo = dbRepo;
        this.cache = cache;
    }

    public Optional<User> getUser(int userId) {
        // Cache-aside pattern
        String cacheKey = "user:" + userId;

        // Check cache first
        if (cache.containsKey(cacheKey)) {
            return Optional.of(cache.get(cacheKey));
        }

        // Cache miss - fetch from DB
        Optional<User> user = dbRepo.getUser(userId);

        // Store in cache
        user.ifPresent(u -> cache.put(cacheKey, u));

        return user;
    }
}
```

| Pattern | Read Latency | Write Latency | Consistency | Complexity | Use Case |
|---|---|---|---|---|---|
| Cache-Aside | Low (cache hit) | Low | Eventual | Medium | Most applications |
| Read-Through | Low (cache hit) | Low | Eventual | Low | Read-heavy apps |
| Write-Through | Low (cache hit) | High (waits for DB) | Strong | Medium | Critical data |
| Write-Behind | Low (cache hit) | Very Low | Eventual | High | High write volume |
🎯 Cache-Aside is King
Most applications use cache-aside. It’s flexible, understandable, and gives you control.
⚡ Speed Matters
In-memory cache lookups are often 100x or more faster than database queries. At scale, this difference is massive.
🔄 Consistency Trade-offs
Faster writes (write-behind) = weaker consistency. Stronger consistency (write-through) = slower writes.
🏗️ Decorator Pattern
Use the decorator pattern to add caching transparently to existing repositories.