What are the differences between Redis cache penetration, cache breakdown, and cache avalanche? How to solve them?

February 19, 19:36

Caching strategy is a core concern when using Redis as a cache: you need to handle cache penetration, cache breakdown, and cache avalanche, and you also need to design a reasonable cache update strategy.

1. Cache Penetration

Problem Description: Cache penetration occurs when requests query data that does not exist at all. Because the cache never holds such data, every request falls through to the database; a large volume of these requests (whether accidental or malicious) puts heavy pressure on the database.

Solutions:

Solution 1: Cache Null Objects

```java
public User getUserById(Long id) {
    User user = redis.get("user:" + id);
    if (user != null) {
        // A cached "NULL" marker means the id is known not to exist
        return user.equals("NULL") ? null : user;
    }
    user = db.queryUserById(id);
    if (user == null) {
        redis.set("user:" + id, "NULL", 300); // cache the null marker, expire in 5 minutes
    } else {
        redis.set("user:" + id, user, 3600);
    }
    return user;
}
```

Solution 2: Bloom Filter

```java
// Check the bloom filter first: if the key cannot exist, skip cache and database entirely
if (!bloomFilter.mightContain("user:" + id)) {
    return null;
}
User user = redis.get("user:" + id);
if (user != null) {
    return user;
}
user = db.queryUserById(id);
if (user != null) {
    redis.set("user:" + id, user, 3600);
}
return user;
```
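The snippet above assumes a bloomFilter instance that has already been populated with every valid key. A minimal sketch of building such a filter with Guava, assuming the set of valid ids can be preloaded at startup (the db.queryAllUserIds() helper and the sizing numbers are only illustrative):

```java
import java.nio.charset.StandardCharsets;
import com.google.common.hash.BloomFilter;
import com.google.common.hash.Funnels;

// Size the filter for the expected number of valid keys and an acceptable false-positive rate
BloomFilter<String> bloomFilter = BloomFilter.create(
        Funnels.stringFunnel(StandardCharsets.UTF_8),
        1_000_000,  // expected number of valid user ids (assumption)
        0.01);      // roughly 1% false positives

// Preload every valid key, e.g. at application startup
for (Long userId : db.queryAllUserIds()) {  // hypothetical helper
    bloomFilter.put("user:" + userId);
}
```

A bloom filter never reports a false negative, so existing keys are never rejected; the small false-positive rate only means a few non-existent keys still reach the cache and database.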

2. Cache Breakdown

Problem Description: Cache breakdown occurs at the moment a hot key expires: a large number of concurrent requests for that key all miss the cache at the same time and hit the database together.

Solutions:

Solution 1: Mutex Lock

```java
public User getUserById(Long id) {
    User user = redis.get("user:" + id);
    if (user != null) {
        return user;
    }
    String lockKey = "lock:user:" + id;
    if (redis.setnx(lockKey, "1", 10)) { // acquire the distributed lock, 10 seconds expiration
        try {
            user = redis.get("user:" + id); // double-check: another thread may have rebuilt it
            if (user == null) {
                user = db.queryUserById(id);
                redis.set("user:" + id, user, 3600);
            }
        } finally {
            redis.del(lockKey); // only the lock holder releases the lock
        }
        return user;
    }
    // Lock is held by another thread: wait briefly and retry
    try {
        Thread.sleep(100);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
    return getUserById(id);
}
```
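One known weakness of a plain setnx lock is that a thread whose lock already expired can delete a lock now held by someone else. A hedged refinement, sketched here with the Jedis client (the db call stands in for the article's pseudo database client, and the cache write-back is omitted for brevity), stores a unique token as the lock value and releases it with an atomic compare-and-delete Lua script:

```java
import java.util.Collections;
import java.util.UUID;
import redis.clients.jedis.Jedis;
import redis.clients.jedis.params.SetParams;

public User loadWithSafeLock(Jedis jedis, Long id) {
    String lockKey = "lock:user:" + id;
    String token = UUID.randomUUID().toString(); // unique value identifies the lock owner

    // SET key value NX EX 10 -- acquire the lock and set its expiration atomically
    String ok = jedis.set(lockKey, token, SetParams.setParams().nx().ex(10));
    if (!"OK".equals(ok)) {
        return null; // lock not acquired; the caller can wait and retry
    }
    try {
        return db.queryUserById(id); // rebuild work goes here
    } finally {
        // Release only if we still own the lock: compare and delete in one atomic step
        String script =
            "if redis.call('get', KEYS[1]) == ARGV[1] then " +
            "  return redis.call('del', KEYS[1]) " +
            "else return 0 end";
        jedis.eval(script, Collections.singletonList(lockKey), Collections.singletonList(token));
    }
}
```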

Solution 2: Logical Expiration

```java
public User getUserById(Long id) {
    String value = redis.get("user:" + id);
    if (value != null) {
        JSONObject json = JSON.parseObject(value);
        // Compare the logical expiration timestamp stored inside the cached value
        if (json.getLong("expireTime") < System.currentTimeMillis()) {
            asyncUpdateCache(id); // rebuild asynchronously; callers keep getting the stale copy
        }
        return json.getObject("data", User.class);
    }
    // Cache miss: load from the database and write back with a logical expiration time
    User user = db.queryUserById(id);
    JSONObject json = new JSONObject();
    json.put("data", user);
    json.put("expireTime", System.currentTimeMillis() + 3600 * 1000L);
    redis.set("user:" + id, json.toJSONString()); // no Redis TTL; expiration is handled in application code
    return user;
}
```
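asyncUpdateCache is referenced above but not shown. One possible sketch, reusing this article's pseudo redis and db clients plus a small thread pool (the rebuild lock key is an assumption):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import com.alibaba.fastjson.JSONObject;

private final ExecutorService cacheRebuildPool = Executors.newFixedThreadPool(4);

private void asyncUpdateCache(Long id) {
    cacheRebuildPool.submit(() -> {
        // Short-lived lock so only one worker rebuilds the same key
        if (!redis.setnx("lock:rebuild:user:" + id, "1", 10)) {
            return;
        }
        try {
            User user = db.queryUserById(id);
            JSONObject json = new JSONObject();
            json.put("data", user);
            json.put("expireTime", System.currentTimeMillis() + 3600 * 1000L);
            redis.set("user:" + id, json.toJSONString());
        } finally {
            redis.del("lock:rebuild:user:" + id);
        }
    });
}
```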

3. Cache Avalanche

Problem Description: Cache avalanche occurs when a large number of keys expire at the same time, or Redis itself goes down, so that a flood of requests hits the database directly.

Solutions:

Solution 1: Set Random Expiration Time

```java
// Add a random offset to the TTL so keys do not all expire at the same moment
int expire = 3600 + new Random().nextInt(600); // 3600-4199 seconds
redis.set("user:" + id, user, expire);
```

Solution 2: Cache Warm-up

```java
// Warm up the cache when the application starts
@PostConstruct
public void init() {
    List<User> users = db.queryAllUsers();
    for (User user : users) {
        redis.set("user:" + user.getId(), user, 3600);
    }
}
```

Solution 3: Use Redis High Availability

  • Use Redis Sentinel or Redis Cluster (see the connection sketch below)
  • Avoid a single point of failure
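
A minimal sketch of connecting through Sentinel with the Jedis client; the master name "mymaster" and the sentinel addresses are placeholders for an actual deployment:

```java
import java.util.HashSet;
import java.util.Set;
import redis.clients.jedis.Jedis;
import redis.clients.jedis.JedisSentinelPool;

// Sentinel addresses (placeholders); each sentinel monitors the master named "mymaster"
Set<String> sentinels = new HashSet<>();
sentinels.add("192.168.1.10:26379");
sentinels.add("192.168.1.11:26379");
sentinels.add("192.168.1.12:26379");

JedisSentinelPool pool = new JedisSentinelPool("mymaster", sentinels);

// The pool resolves the current master from the sentinels, so the application
// keeps working after Sentinel promotes a replica during a failover
try (Jedis jedis = pool.getResource()) {
    jedis.set("user:1", "...");
}
```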

4. Cache Update Strategy

Strategy 1: Cache Aside Pattern

```java
// Read: check the cache first, fall back to the database and backfill
public User getUserById(Long id) {
    User user = redis.get("user:" + id);
    if (user != null) {
        return user;
    }
    user = db.queryUserById(id);
    redis.set("user:" + id, user, 3600);
    return user;
}

// Write: update the database, then delete (not update) the cache entry
public void updateUser(User user) {
    db.updateUser(user);
    redis.del("user:" + user.getId());
}
```

Strategy 2: Write Through Pattern

```java
public void updateUser(User user) {
    db.updateUser(user);
    redis.set("user:" + user.getId(), user, 3600); // keep cache and database in sync on every write
}
```

Strategy 3: Write Behind Pattern

```java
public void updateUser(User user) {
    redis.set("user:" + user.getId(), user, 3600); // update the cache first
    asyncWriteToDB(user); // flush to the database asynchronously
}
```
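asyncWriteToDB is referenced above but not shown. A minimal sketch using an in-memory queue and a background worker (a production version would add batching, retries, and shutdown handling):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Writes are buffered here and drained by a background thread
private final BlockingQueue<User> writeQueue = new LinkedBlockingQueue<>();

private void asyncWriteToDB(User user) {
    writeQueue.offer(user);
}

// Started once, e.g. from a @PostConstruct method
private void startWriteBehindWorker() {
    Thread worker = new Thread(() -> {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                User user = writeQueue.take(); // blocks until a write is queued
                db.updateUser(user);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    });
    worker.setDaemon(true);
    worker.start();
}
```

The trade-off is durability: writes buffered only in memory are lost if the process crashes before the queue is drained.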

5. Cache Consistency

Problem Description: The cache and the database can hold different values for the same data, so readers may see stale (dirty) data.

Solutions:

Solution 1: Delayed Double Delete

```java
public void updateUser(User user) {
    db.updateUser(user);
    redis.del("user:" + user.getId()); // first delete
    try {
        // Wait long enough for in-flight reads to finish and possibly backfill stale data
        Thread.sleep(500);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
    redis.del("user:" + user.getId()); // second delete removes any stale backfill
}
```

Solution 2: Subscribe to Binlog

```java
// Subscribe to the database binlog (e.g. via Canal) and refresh the cache when rows change
@CanalEventListener
public class CacheUpdateListener {

    @ListenPoint(destination = "example", schema = "test", table = "user")
    public void onEvent(CanalEntry.Entry entry) {
        // Parse the binlog entry and update the corresponding cache key
        User user = parseUserFromBinlog(entry);
        redis.set("user:" + user.getId(), user, 3600);
    }
}
```

6. Cache Warm-up

Problem Description: Cache is empty when system starts, many requests directly hit the database.

Solutions:

Solution 1: Scheduled Task Warm-up

```java
@Scheduled(cron = "0 0 2 * * ?") // warm up at 2 AM every day
public void warmUpCache() {
    List<User> users = db.queryHotUsers();
    for (User user : users) {
        redis.set("user:" + user.getId(), user, 3600);
    }
}
```

Solution 2: Async Loading

```java
public User getUserById(Long id) {
    User user = redis.get("user:" + id);
    if (user != null) {
        return user;
    }
    // Trigger an asynchronous cache load and serve this request from the database
    asyncLoadCache(id);
    return db.queryUserById(id);
}
```

7. Cache Degradation

Problem Description: How to ensure system availability when Redis fails.

Solutions:

Solution 1: Fall Back to the Database

```java
public User getUserById(Long id) {
    try {
        User user = redis.get("user:" + id);
        if (user != null) {
            return user;
        }
    } catch (Exception e) {
        log.error("Redis error", e);
    }
    // Redis failed or missed: query the database directly
    return db.queryUserById(id);
}
```

Solution 2: Use Local Cache

```java
public User getUserById(Long id) {
    try {
        User user = redis.get("user:" + id);
        if (user != null) {
            return user;
        }
    } catch (Exception e) {
        log.error("Redis error", e);
        // Redis is down: serve from the local in-process cache
        return localCache.get("user:" + id);
    }
    User user = db.queryUserById(id);
    redis.set("user:" + id, user, 3600);
    return user;
}
```
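localCache in the snippet above is not defined. One way to back it, sketched with Caffeine (the size and TTL are only illustrative); note that Caffeine's Cache exposes getIfPresent rather than get for a plain lookup:

```java
import java.util.concurrent.TimeUnit;
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

// Small in-process cache used only as a fallback while Redis is unreachable
private final Cache<String, User> localCache = Caffeine.newBuilder()
        .maximumSize(10_000)                    // bound memory use
        .expireAfterWrite(5, TimeUnit.MINUTES)  // accept slightly stale data during an outage
        .build();

// Populate it on successful reads so it has something to serve during an outage
private void cacheLocally(Long id, User user) {
    localCache.put("user:" + id, user);
}

// Fallback lookup (may return null if the entry was never cached locally)
private User readLocal(Long id) {
    return localCache.getIfPresent("user:" + id);
}
```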

Summary

A Redis caching strategy has to account for cache penetration, cache breakdown, and cache avalanche, and it also needs a sensible cache update strategy and consistency scheme. In practice, choose the approach that fits the specific business scenario, and keep monitoring cache hit rate and performance so the strategy can be adjusted in time.

Tags: Redis