Table of Contents

- [1. Introduction](#1-introduction)
  - [1.1 Why Caching Matters](#11-why-caching-matters)
  - [1.2 Overview of Django's Cache Framework](#12-overview-of-djangos-cache-framework)
- [2. Django Caching Basics](#2-django-caching-basics)
  - [2.1 Cache Configuration](#21-cache-configuration)
  - [2.2 Using the Cache API](#22-using-the-cache-api)
- [3. Redis Caching in Depth](#3-redis-caching-in-depth)
  - [3.1 Redis Configuration and Strengths](#31-redis-configuration-and-strengths)
  - [3.2 Advanced Redis Features](#32-advanced-redis-features)
  - [3.3 Redis Performance Optimization](#33-redis-performance-optimization)
- [4. Memcached in Depth](#4-memcached-in-depth)
  - [4.1 Memcached Configuration and Characteristics](#41-memcached-configuration-and-characteristics)
  - [4.2 Memcached Features and Best Practices](#42-memcached-features-and-best-practices)
- [5. Database Caching](#5-database-caching)
  - [5.1 Database Cache Configuration](#51-database-cache-configuration)
  - [5.2 Database Cache Behavior and Usage](#52-database-cache-behavior-and-usage)
- [6. Comparing the Three Backends](#6-comparing-the-three-backends)
  - [6.1 Feature Comparison](#61-feature-comparison)
  - [6.2 Hands-on Performance Testing](#62-hands-on-performance-testing)
- [7. Real-World Application](#7-real-world-application)
  - [7.1 E-commerce Platform Caching Strategy](#71-e-commerce-platform-caching-strategy)
  - [7.2 Cache Monitoring and Maintenance](#72-cache-monitoring-and-maintenance)
- [8. Conclusion](#8-conclusion)
  - [8.1 Strategy Summary](#81-strategy-summary)
  - [8.2 Best-Practice Recommendations](#82-best-practice-recommendations)
  - [8.3 Selection Guide](#83-selection-guide)
Django Caching Strategies: Redis, Memcached, and Database Caching Compared
1. Introduction
1.1 Why Caching Matters
In modern web applications, caching is one of the key techniques for improving performance. Django ships with a powerful cache framework that supports multiple backends. A well-designed caching strategy can:
- Significantly improve response times by cutting database queries and computation
- Reduce server load by lowering CPU and memory usage
- Improve scalability by supporting more concurrent users
- Improve the user experience through faster page loads
1.2 Overview of Django's Cache Framework
Django's cache framework offers caching at several levels of granularity:
```python
# Django's caching layers, from coarsest to finest:
# 1. Per-site caching
# 2. Per-view caching
# 3. Template fragment caching
# 4. The low-level cache API
from django.core.cache import cache
from django.views.decorators.cache import cache_page
from django.core.cache.backends.base import BaseCache
```
2. Django Caching Basics
2.1 Cache Configuration
Django supports several cache backends, all configured through the CACHES setting:
```python
# Cache configuration examples for settings.py
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'unique-snowflake',
        'TIMEOUT': 300,  # seconds; 300 (5 minutes) is also Django's default
        'OPTIONS': {
            'MAX_ENTRIES': 1000,   # maximum number of cached entries
            'CULL_FREQUENCY': 3,   # evict 1/3 of entries when MAX_ENTRIES is reached
        },
        'KEY_PREFIX': 'myapp',  # prefix prepended to every cache key
        'VERSION': 1,           # default cache key version
    }
}

# Local-memory cache for development
DEVELOPMENT_CACHE = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'dev-cache',
        'TIMEOUT': 60 * 15,  # 15 minutes
        'OPTIONS': {
            'MAX_ENTRIES': 1000,
        }
    }
}

# File-system cache
FILE_CACHE = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': '/var/tmp/django_cache',
        'TIMEOUT': 60 * 60 * 24,  # 24 hours
        'OPTIONS': {
            'MAX_ENTRIES': 10000,
        }
    }
}
```
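The MAX_ENTRIES / CULL_FREQUENCY pair above controls eviction: once the entry count exceeds MAX_ENTRIES, the backend deletes roughly 1/CULL_FREQUENCY of the entries (and CULL_FREQUENCY = 0 clears the whole cache). A minimal sketch of that policy, using a plain dict as the store rather than Django's internals (the function name is illustrative):

```python
def cull(store: dict, max_entries: int, cull_frequency: int) -> None:
    """Evict ~1/cull_frequency of entries once the store exceeds max_entries.

    Mirrors the spirit of Django's locmem/db culling; cull_frequency == 0
    means "drop everything", matching Django's documented behavior.
    """
    if len(store) <= max_entries:
        return                      # under the limit: nothing to do
    if cull_frequency == 0:
        store.clear()               # documented special case: wipe the cache
        return
    doomed = list(store)[: len(store) // cull_frequency]
    for key in doomed:
        del store[key]


store = {f"k{i}": i for i in range(9)}
cull(store, max_entries=6, cull_frequency=3)  # 9 // 3 = 3 entries evicted
```

After the call, six of the nine entries remain; the real backends apply the same ratio but pick victims by age or at random rather than dict order.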
2.2 Using the Cache API
Django exposes a uniform cache API regardless of the backend in use:
```python
# Basic usage of Django's cache API
from django.core.cache import cache, caches
from django.http import HttpResponse
from django.views.decorators.cache import cache_page


class BasicCacheOperations:
    """Basic cache operations."""

    @staticmethod
    def demonstrate_basic_operations():
        """Demonstrate the core operations."""
        # Set a value
        cache.set('my_key', 'my_value', timeout=300)

        # Get a value
        value = cache.get('my_key')
        print(f"Cached value: {value}")

        # Get, computing and storing the value only if it is missing
        value = cache.get_or_set('computed_value', expensive_computation, 300)

        # Set several values at once
        cache.set_many({'key1': 'value1', 'key2': 'value2'})

        # Get several values at once
        values = cache.get_many(['key1', 'key2'])
        print(f"Multiple values: {values}")

        # Delete a key
        cache.delete('my_key')

        # Clear the entire cache
        cache.clear()

        # Check whether a key exists
        if cache.has_key('my_key'):  # equivalent to: 'my_key' in cache
            print("Key exists")
        else:
            print("Key does not exist")

    @staticmethod
    def advanced_cache_operations():
        """More advanced operations."""
        # Atomic "set only if absent"
        added = cache.add('unique_key', 'value', timeout=300)
        if added:
            print("Key added")
        else:
            print("Key already existed")

        # Increment / decrement
        cache.set('counter', 0)
        cache.incr('counter')        # 1
        cache.incr('counter', 10)    # 11
        cache.decr('counter', 5)     # 6

        # Get, then delete
        value = cache.get('my_key', default=None, version=None)
        if value is not None:
            cache.delete('my_key')

        # Use a non-default cache configuration
        # (requires a 'special' alias in CACHES)
        special_cache = caches['special']
        special_cache.set('special_key', 'special_value')


def expensive_computation():
    """Simulate an expensive computation."""
    import time
    time.sleep(2)
    return "computed result"


# The view-cache decorator
@cache_page(60 * 15)  # cache for 15 minutes
def my_view(request):
    """A view cached with the decorator."""
    return HttpResponse("cached content")


# Conditional caching
class ConditionalCaching:
    """Conditional caching examples. Note: cache_page is designed for view
    functions; on class-based views apply it via method_decorator."""

    @cache_page(60 * 5)
    def cached_view(self, request):
        """Basic view caching: this view is cached for 5 minutes."""
        return HttpResponse("cached view")

    def vary_on_headers_view(self, request):
        """Cache varied by a request header (see also the
        django.views.decorators.vary.vary_on_headers decorator)."""
        response = HttpResponse("cached per User-Agent")
        response['Vary'] = 'User-Agent'
        return response

    @cache_page(60 * 10, key_prefix='user_specific')
    def user_specific_cache(self, request):
        """Per-user caching; note the response should also vary on Cookie
        for the cache entries to be truly per-user."""
        if request.user.is_authenticated:
            return HttpResponse(f"cache for user {request.user.username}")
        return HttpResponse("cache for anonymous users")
```
3. Redis Caching in Depth
3.1 Redis Configuration and Strengths
Redis is a high-performance key-value store and an excellent cache backend:
```python
# Redis cache configuration (using the django-redis package)
REDIS_CACHE_CONFIG = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': [
            'redis://127.0.0.1:6379/1',    # primary
            # 'redis://127.0.0.1:6378/1',  # replica (optional)
        ],
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
            'PASSWORD': 'your_password',   # if authentication is enabled
            'SOCKET_CONNECT_TIMEOUT': 5,   # connect timeout (seconds)
            'SOCKET_TIMEOUT': 5,           # socket timeout (seconds)
            'CONNECTION_POOL_KWARGS': {
                'max_connections': 100,    # connection pool size
            },
            'COMPRESSOR': 'django_redis.compressors.zlib.ZlibCompressor',
            'SERIALIZER': 'django_redis.serializers.json.JSONSerializer',
            # PICKLE_VERSION only applies to the default pickle serializer;
            # it is ignored when the JSON serializer above is in use.
            'PICKLE_VERSION': -1,
        },
        'KEY_PREFIX': 'myapp',
        'TIMEOUT': 60 * 60 * 24,  # 24 hours
    },
    'session': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/2',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        }
    }
}
# Installation: pip install django-redis
```
3.2 Advanced Redis Features
```python
# Advanced Redis features via the raw client
import time

import redis
from django_redis import get_redis_connection


class RedisAdvancedFeatures:
    """Advanced Redis features."""

    def __init__(self):
        self.redis_client = get_redis_connection("default")

    def demonstrate_data_structures(self):
        """Redis data structures."""
        # Strings
        self.redis_client.set('user:1:name', 'Zhang San')
        self.redis_client.setex('user:1:session', 3600, 'session_data')

        # Hashes: good for storing objects
        self.redis_client.hset('user:1:profile', 'name', 'Zhang San')
        self.redis_client.hset('user:1:profile', 'email', 'zhangsan@example.com')
        self.redis_client.hset('user:1:profile', 'age', 25)
        self.redis_client.expire('user:1:profile', 3600)

        # Lists: good for timelines and simple queues
        self.redis_client.lpush('user:1:activities', 'logged in')
        self.redis_client.lpush('user:1:activities', 'browsed products')
        self.redis_client.ltrim('user:1:activities', 0, 99)  # keep the latest 100

        # Sets: good for tags and uniqueness
        self.redis_client.sadd('product:1:tags', 'electronics', 'phone', 'apple')
        self.redis_client.sadd('user:1:followed_products', 1, 2, 3)

        # Sorted sets: good for leaderboards
        self.redis_client.zadd('product_ranking', {'product:1': 100, 'product:2': 85})
        self.redis_client.zincrby('product_ranking', 10, 'product:1')

    def demonstrate_advanced_patterns(self):
        """Advanced caching patterns."""
        # Cache-penetration protection: cache the "not found" result too
        def get_product_details(product_id):
            cache_key = f'product:{product_id}:details'
            data = self.redis_client.get(cache_key)
            if data is not None:
                if data == b'NULL':  # cached negative result
                    return None
                return data
            # Fall back to the database
            from myapp.models import Product
            try:
                product = Product.objects.get(id=product_id)
                self.redis_client.setex(cache_key, 3600, product.to_json())
                return product.to_json()
            except Product.DoesNotExist:
                # Cache the miss for 5 minutes to block repeated lookups
                self.redis_client.setex(cache_key, 300, 'NULL')
                return None

        # Distributed lock (classic SETNX pattern; note SETNX followed by
        # EXPIRE is not atomic -- prefer SET with nx=True, ex=timeout)
        def acquire_lock(lock_name, acquire_timeout=10, lock_timeout=30):
            """Acquire a distributed lock."""
            identifier = str(time.time())
            lock_key = f'lock:{lock_name}'
            end = time.time() + acquire_timeout
            while time.time() < end:
                if self.redis_client.setnx(lock_key, identifier):
                    self.redis_client.expire(lock_key, lock_timeout)
                    return identifier
                elif self.redis_client.ttl(lock_key) == -1:
                    # Repair a lock that was left without an expiry
                    self.redis_client.expire(lock_key, lock_timeout)
                time.sleep(0.001)
            return False

        def release_lock(lock_name, identifier):
            """Release the lock only if we still own it."""
            lock_key = f'lock:{lock_name}'
            # A Lua script makes the check-and-delete atomic
            lua_script = """
            if redis.call("get", KEYS[1]) == ARGV[1] then
                return redis.call("del", KEYS[1])
            else
                return 0
            end
            """
            result = self.redis_client.eval(lua_script, 1, lock_key, identifier)
            return result == 1

        # Using the lock
        lock_identifier = acquire_lock('product_update')
        if lock_identifier:
            try:
                update_product_inventory()  # mutually exclusive section
            finally:
                release_lock('product_update', lock_identifier)

    def demonstrate_pub_sub(self):
        """Publish/subscribe."""
        # Publish a message
        def publish_event(channel, message):
            self.redis_client.publish(channel, message)

        # The subscriber usually runs in a separate process
        def start_subscriber():
            pubsub = self.redis_client.pubsub()
            pubsub.subscribe('user_registered', 'order_created')
            for message in pubsub.listen():
                if message['type'] == 'message':
                    channel = message['channel'].decode()
                    data = message['data'].decode()
                    print(f"Message received: channel={channel}, data={data}")
                    if channel == 'user_registered':
                        self.handle_user_registered(data)
                    elif channel == 'order_created':
                        self.handle_order_created(data)

    def demonstrate_pipeline(self):
        """Pipelining: fewer network round trips."""
        pipeline = self.redis_client.pipeline()
        for i in range(100):
            pipeline.set(f'key:{i}', f'value:{i}')
        pipeline.execute()  # send all commands in one batch

    def demonstrate_lua_scripting(self):
        """Lua scripting for atomic operations."""
        # Atomically increment and return a counter
        lua_script = """
        local current = redis.call('get', KEYS[1])
        if current then
            current = tonumber(current)
            redis.call('set', KEYS[1], current + 1)
            return current + 1
        else
            redis.call('set', KEYS[1], 1)
            return 1
        end
        """
        counter = self.redis_client.eval(lua_script, 1, 'my_counter')
        print(f"Counter value: {counter}")


def update_product_inventory():
    """Simulate updating product inventory."""
    time.sleep(1)
    print("Inventory updated")
```
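As noted in the comments above, the setnx-then-expire sequence has a crash window between the two commands that can leave a lock with no TTL; modern Redis collapses both into one atomic command, `SET key value NX EX seconds`. The sketch below emulates that semantics with an in-process dict and an injectable clock so it runs without a Redis server; against redis-py you would call `client.set(lock_key, identifier, nx=True, ex=lock_timeout)` instead. The `FakeRedisLock` class is illustrative, not part of any library:

```python
import time
import uuid
from typing import Optional


class FakeRedisLock:
    """In-process stand-in for SET key value NX EX plus the Lua release script."""

    def __init__(self, clock=time.monotonic):
        self._store = {}     # lock name -> (identifier, expiry timestamp)
        self._clock = clock  # injectable for testing

    def acquire(self, name: str, timeout: float) -> Optional[str]:
        """Take the lock if it is free or expired; return an owner token."""
        now = self._clock()
        holder = self._store.get(name)
        if holder is not None and holder[1] > now:
            return None                       # a live lock is held by someone else
        ident = uuid.uuid4().hex              # unique token, as in the Redis pattern
        self._store[name] = (ident, now + timeout)  # NX + EX in one atomic step
        return ident

    def release(self, name: str, ident: str) -> bool:
        """Mirror the Lua script: only the token holder may delete the lock."""
        holder = self._store.get(name)
        if holder is not None and holder[0] == ident:
            del self._store[name]
            return True
        return False


lock = FakeRedisLock()
token = lock.acquire("product_update", timeout=30)
```

The expiry check on acquire plays the role Redis's key TTL plays: a crashed holder simply times out instead of deadlocking everyone else.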
3.3 Redis Performance Optimization
```python
# Redis performance-optimization strategies
from django_redis import get_redis_connection


class RedisPerformanceOptimization:
    """Redis performance optimization."""

    def __init__(self):
        self.redis_client = get_redis_connection("default")

    def memory_optimization(self):
        """Memory-optimization strategies."""
        # 1. Use the right data structure; small hashes use the compact
        #    ziplist encoding (renamed "listpack" in Redis 7)
        self.redis_client.config_set('hash-max-ziplist-entries', 512)
        self.redis_client.config_set('hash-max-ziplist-value', 64)

        # 2. Compress large, repetitive values (django-redis can do this
        #    transparently via the COMPRESSOR option)
        self.redis_client.set('compressed_key', 'lots of repetitive data...', ex=3600)

        # 3. Always set sensible expiry times
        self.redis_client.setex('temporary_data', 3600, 'temporary data')  # 1 hour

        # 4. Monitor memory usage
        info = self.redis_client.info('memory')
        print(f"Used memory: {info['used_memory_human']}")
        print(f"Fragmentation ratio: {info['mem_fragmentation_ratio']}")

    def connection_optimization(self):
        """Connection tuning: use a connection pool. In django-redis the pool
        is normally configured via CONNECTION_POOL_KWARGS in settings; the
        option names below are illustrative."""
        from django_redis.pool import ConnectionFactory
        connection_factory = ConnectionFactory(options={
            'CONNECTION_POOL_KWARGS': {
                'max_connections': 100,
                'socket_connect_timeout': 5,
                'socket_timeout': 5,
                'retry_on_timeout': True,
            }
        })

    def key_design_optimization(self):
        """Key-design guidelines."""
        # Good key design
        good_keys = [
            'user:123:profile',   # colon-separated namespaces
            'product:456:views',  # meaningful names
            'session:789:data',   # includes the data type
        ]
        # Key designs to avoid
        bad_keys = [
            'a',                                          # meaningless
            'very_long_key_name_that_takes_more_memory',  # wastes memory
            'user_profile_123',                           # inconsistent naming
        ]
        # Store related fields together in a hash
        user_data = {
            'name': 'Zhang San',
            'email': 'zhangsan@example.com',
            'last_login': '2024-01-01 10:00:00'
        }
        self.redis_client.hset('user:123', mapping=user_data)

    def monitoring_and_analysis(self):
        """Monitoring and analysis."""
        info = self.redis_client.info()
        # Key metrics
        metrics = {
            'connected_clients': info['connected_clients'],
            'used_memory': info['used_memory_human'],
            'instantaneous_ops_per_sec': info['instantaneous_ops_per_sec'],
            'keyspace_hits': info['keyspace_hits'],
            'keyspace_misses': info['keyspace_misses'],
        }
        # Hit rate
        hits = info['keyspace_hits']
        misses = info['keyspace_misses']
        total = hits + misses
        hit_rate = (hits / total) * 100 if total > 0 else 0
        print(f"Cache hit rate: {hit_rate:.2f}%")

        # Slow log
        slow_log = self.redis_client.slowlog_get(10)  # last 10 slow commands
        for entry in slow_log:
            print(f"Slow command: {entry}")

    def backup_and_recovery(self):
        """Backup and recovery strategy."""
        def create_backup():
            # In production, BGSAVE snapshots in the background
            self.redis_client.bgsave()
            print("Redis backup started")
            # Check when the last snapshot completed
            last_save = self.redis_client.lastsave()
            print(f"Last save time: {last_save}")

        # AOF persistence settings (redis.conf)
        config = {
            'appendonly': 'yes',
            'appendfsync': 'everysec',  # fsync once per second
            'auto-aof-rewrite-percentage': 100,
            'auto-aof-rewrite-min-size': '64mb',
        }


# Redis Cluster configuration. Note: django-redis has no built-in cluster
# client; real deployments typically use redis-py's RedisCluster or a
# dedicated cluster-aware backend, so treat this block as illustrative.
REDIS_CLUSTER_CONFIG = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': [
            'redis://127.0.0.1:7000/1',
            'redis://127.0.0.1:7001/1',
            'redis://127.0.0.1:7002/1',
        ],
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        }
    }
}
```
4. Memcached in Depth
4.1 Memcached Configuration and Characteristics
Memcached is a high-performance, distributed in-memory object caching system:
```python
# Memcached cache configuration
MEMCACHED_CACHE_CONFIG = {
    'default': {
        # Note: this python-memcached backend was removed in Django 4.1;
        # on modern Django use PyMemcacheCache or PyLibMCCache instead.
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': [
            '127.0.0.1:11211',      # single server
            # '192.168.1.1:11211',  # or several servers
            # '192.168.1.2:11211',
        ],
        'OPTIONS': {
            'server_max_value_length': 1024 * 1024 * 4,  # 4 MB max value (python-memcached option)
        },
        'KEY_PREFIX': 'myapp',
        'TIMEOUT': 3600,  # 1 hour
        'VERSION': 1,
    },
    'session': {
        'BACKEND': 'django.core.cache.backends.memcached.PyLibMCCache',
        'LOCATION': '127.0.0.1:11211',
    }
}
# Installation:
# pip install python-memcached  # for MemcachedCache
# pip install pylibmc           # for PyLibMCCache (C-based, faster)
```
4.2 Memcached Features and Best Practices
```python
# Memcached features and best practices
import memcache

from django.core.cache import cache
from django.http import HttpResponse


class MemcachedFeatures:
    """Memcached feature walkthrough."""

    def __init__(self):
        # Talk to memcached directly through the python-memcached client
        self.mc = memcache.Client(['127.0.0.1:11211'], debug=0)

    def demonstrate_basic_operations(self):
        """Basic operations."""
        # Set values (the client pickles non-string types)
        self.mc.set('string_key', 'string_value')
        self.mc.set('int_key', 42)
        self.mc.set('dict_key', {'name': 'Zhang San', 'age': 25})

        # Get values
        string_value = self.mc.get('string_key')
        int_value = self.mc.get('int_key')
        dict_value = self.mc.get('dict_key')
        print(f"String value: {string_value}")
        print(f"Integer value: {int_value}")
        print(f"Dict value: {dict_value}")

        # Batch operations
        keys = ['key1', 'key2', 'key3']
        values = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
        self.mc.set_multi(values)
        results = self.mc.get_multi(keys)
        print(f"Batch results: {results}")

    def demonstrate_advanced_features(self):
        """Advanced features."""
        # Atomic operations
        self.mc.add('unique_key', 'initial_value')    # set only if the key is absent
        self.mc.replace('existing_key', 'new_value')  # set only if the key exists

        # Increment / decrement
        self.mc.set('counter', 0)
        self.mc.incr('counter')       # 1
        self.mc.incr('counter', 10)   # 11
        self.mc.decr('counter', 5)    # 6

        # Deletion
        self.mc.delete('key_to_delete')
        self.mc.delete_multi(['key1', 'key2', 'key3'])

    def demonstrate_distributed_nature(self):
        """Distribution across servers."""
        # The client shards data across servers automatically
        servers = [
            '127.0.0.1:11211',
            '127.0.0.1:11212',
            '127.0.0.1:11213',
        ]
        distributed_mc = memcache.Client(servers, debug=0)
        # Keys are spread over the server list by client-side hashing
        for i in range(100):
            distributed_mc.set(f'key_{i}', f'value_{i}')

        # Per-server statistics
        stats = distributed_mc.get_stats()
        for server, server_stats in stats:
            print(f"Server {server} stats:")
            for stat_name, stat_value in server_stats.items():
                print(f"  {stat_name}: {stat_value}")

    def best_practices(self):
        """Memcached best practices."""
        # 1. Key design
        good_key = 'user:123:profile'  # meaningful key name
        bad_key = 'u123p'              # opaque key name

        # 2. Keep values small; memcached's default item limit is 1 MB,
        #    so this 1 MB payload may be rejected once overhead is added
        large_data = 'x' * (1024 * 1024)
        self.mc.set('large_key', large_data)

        # 3. Use sensible expiry times
        self.mc.set('short_lived', 'data', time=60)   # 1 minute
        self.mc.set('long_lived', 'data', time=3600)  # 1 hour

        # 4. Handle failures gracefully
        try:
            value = self.mc.get('some_key')
            if value is None:
                # Cache miss: rebuild from the database
                value = self.fetch_from_database()
                self.mc.set('some_key', value, time=300)
        except Exception as e:
            print(f"Memcached error: {e}")
            # Degrade to a direct database query
            value = self.fetch_from_database()

    def performance_monitoring(self):
        """Performance monitoring."""
        stats = self.mc.get_stats()
        for server, server_stats in stats:
            print(f"\nServer {server}:")
            # Key metrics
            hits = int(server_stats.get('get_hits', 0))
            misses = int(server_stats.get('get_misses', 0))
            total_gets = hits + misses
            hit_rate = (hits / total_gets * 100) if total_gets > 0 else 0
            print(f"  Hit rate: {hit_rate:.2f}%")
            print(f"  Current connections: {server_stats.get('curr_connections', 0)}")
            print(f"  Memory used: {server_stats.get('bytes', 0)} bytes")
            print(f"  Items cached: {server_stats.get('curr_items', 0)}")

    def fetch_from_database(self):
        """Simulate a database query."""
        import time
        time.sleep(0.1)  # pretend the query takes 100 ms
        return "database data"


# Memcached with Django
class MemcachedDjangoIntegration:
    """Using Memcached through Django's cache framework."""

    @staticmethod
    def view_caching():
        """View caching."""
        from django.views.decorators.cache import cache_page

        @cache_page(60 * 15, cache='default')  # cache for 15 minutes
        def cached_view(request):
            return HttpResponse("cached content")
        return cached_view

    @staticmethod
    def template_fragment_caching():
        """Template fragment caching (used inside a template)."""
        template_code = """
        {% load cache %}
        {% cache 500 sidebar request.user.username %}
            ... sidebar content ...
        {% endcache %}
        """
        return template_code

    @staticmethod
    def low_level_caching():
        """Low-level cache API usage."""
        # Cache the result of an expensive query
        def get_popular_products():
            cache_key = 'popular_products'
            products = cache.get(cache_key)
            if products is None:
                from myapp.models import Product
                products = list(Product.objects.filter(
                    is_active=True
                ).order_by('-view_count')[:10])
                cache.set(cache_key, products, 3600)  # cache for 1 hour
            return products
        return get_popular_products


# Memcached-backed sessions
MEMCACHED_SESSION_CONFIG = {
    'SESSION_ENGINE': 'django.contrib.sessions.backends.cache',
    'SESSION_CACHE_ALIAS': 'default',
}
```
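The "automatic distribution" shown in demonstrate_distributed_nature is purely client-side: the client hashes each key and picks a server, with no coordination between the memcached nodes themselves. python-memcached defaults to a simple hash-modulo scheme, sketched below with address strings standing in for live connections (the helper name is mine, and real clients often upgrade to consistent hashing such as ketama, since modulo remaps most keys whenever the server list changes):

```python
import zlib


def pick_server(key: str, servers: list) -> str:
    """Client-side sharding: hash the key, take it modulo the server count.

    Deterministic, so every client with the same server list agrees on
    where each key lives -- which is all memcached distribution requires.
    """
    h = zlib.crc32(key.encode("utf-8"))
    return servers[h % len(servers)]


servers = ["127.0.0.1:11211", "127.0.0.1:11212", "127.0.0.1:11213"]
# Where would each of 100 keys land?
placement = {f"key_{i}": pick_server(f"key_{i}", servers) for i in range(100)}
```

Because the mapping depends only on the key and the server list, gets and sets from different processes route consistently without any server-side state.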
5. Database Caching
5.1 Database Cache Configuration
The database backend stores cache entries in a database table, which suits small applications:
```python
# Database cache configuration
DATABASE_CACHE_CONFIG = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'my_cache_table',  # name of the cache table
        'TIMEOUT': 300,
        'OPTIONS': {
            'MAX_ENTRIES': 10000,
            'CULL_FREQUENCY': 3,
        },
        'KEY_PREFIX': 'myapp',
    },
    'long_term': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'long_term_cache',
        'TIMEOUT': 60 * 60 * 24 * 7,  # 1 week
    }
}

# Create the cache tables first:
# python manage.py createcachetable [table_name]
# python manage.py createcachetable my_cache_table
# python manage.py createcachetable long_term_cache
```
5.2 Database Cache Behavior and Usage
```python
# Database cache behavior and usage
import time

from django.core.cache import cache, caches
from django.db import connection, models
from django.utils import timezone


class DatabaseCacheFeatures:
    """Database cache behavior."""

    def demonstrate_usage(self):
        """Basic usage is identical to any other backend."""
        cache.set('db_cached_key', 'db_cached_value', 300)
        value = cache.get('db_cached_key')
        print(f"Database-cached value: {value}")

        # Batch operations
        data = {
            'key1': 'value1',
            'key2': 'value2',
            'key3': 'value3',
        }
        cache.set_many(data)
        results = cache.get_many(['key1', 'key2', 'key3'])
        print(f"Batch results: {results}")

    def performance_considerations(self):
        """Performance considerations."""
        start_time = time.time()
        # A burst of cache writes
        for i in range(100):
            cache.set(f'test_key_{i}', f'test_value_{i}')
        end_time = time.time()
        print(f"100 set operations took: {end_time - start_time:.3f}s")

        # Compare against an in-memory cache
        # (requires a 'locmem' alias in CACHES)
        mem_cache = caches['locmem']
        start_time = time.time()
        for i in range(100):
            mem_cache.set(f'test_key_{i}', f'test_value_{i}')
        end_time = time.time()
        print(f"100 in-memory set operations took: {end_time - start_time:.3f}s")

    def maintenance_operations(self):
        """Maintenance operations."""
        # The backend culls expired rows itself; its _cull method is a
        # private API whose signature varies across Django versions.
        from django.core.cache.backends.db import DatabaseCache
        db_cache = DatabaseCache('my_cache_table', {})
        # db_cache._cull(...)  # private API; prefer the manual SQL below

        # Manual cleanup with raw SQL; the expires column is a datetime
        with connection.cursor() as cursor:
            cursor.execute(
                "DELETE FROM my_cache_table WHERE expires < %s",
                [timezone.now()]
            )

        # Cache statistics
        with connection.cursor() as cursor:
            cursor.execute("SELECT COUNT(*) FROM my_cache_table")
            total_entries = cursor.fetchone()[0]
            cursor.execute(
                "SELECT COUNT(*) FROM my_cache_table WHERE expires < %s",
                [timezone.now()]
            )
            expired_entries = cursor.fetchone()[0]
        print(f"Total entries: {total_entries}")
        print(f"Expired entries: {expired_entries}")
        print(f"Live entries: {total_entries - expired_entries}")


class CustomDatabaseCacheModel(models.Model):
    """Custom database cache model."""
    cache_key = models.CharField(max_length=255, unique=True)
    value = models.TextField()
    expires = models.DateTimeField(db_index=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        db_table = 'custom_cache_table'
        indexes = [
            models.Index(fields=['expires']),
        ]

    @classmethod
    def cleanup_expired(cls):
        """Delete expired entries."""
        cls.objects.filter(expires__lt=timezone.now()).delete()

    @classmethod
    def get_cache_stats(cls):
        """Cache statistics."""
        total = cls.objects.count()
        expired = cls.objects.filter(expires__lt=timezone.now()).count()
        valid = total - expired
        return {
            'total_entries': total,
            'expired_entries': expired,
            'valid_entries': valid,
            'expired_percentage': (expired / total * 100) if total > 0 else 0,
        }


class CustomDatabaseCache:
    """A simplified custom database cache backend."""

    def __init__(self, location, params):
        self._table = location
        self._params = params
        self._max_entries = params.get('MAX_ENTRIES', 300)
        self._cull_frequency = params.get('CULL_FREQUENCY', 3)

    def get(self, key, default=None, version=None):
        """Fetch a value, lazily deleting it if it has expired."""
        try:
            item = CustomDatabaseCacheModel.objects.get(
                cache_key=self.make_key(key, version)
            )
            if item.expires < timezone.now():
                item.delete()
                return default
            return self.decode(item.value)
        except CustomDatabaseCacheModel.DoesNotExist:
            return default

    def set(self, key, value, timeout=None, version=None):
        """Store a value."""
        if timeout is None:
            timeout = self._params.get('TIMEOUT', 300)
        expires = timezone.now() + timezone.timedelta(seconds=timeout)
        try:
            item = CustomDatabaseCacheModel.objects.get(
                cache_key=self.make_key(key, version)
            )
            item.value = self.encode(value)
            item.expires = expires
            item.save()
        except CustomDatabaseCacheModel.DoesNotExist:
            CustomDatabaseCacheModel.objects.create(
                cache_key=self.make_key(key, version),
                value=self.encode(value),
                expires=expires
            )
        # Cull if we have grown past the limit
        if CustomDatabaseCacheModel.objects.count() > self._max_entries:
            self._cull()

    def _cull(self):
        """Evict entries: expired ones first, then the oldest."""
        CustomDatabaseCacheModel.cleanup_expired()
        # If we are still over the limit, drop the oldest entries
        count = CustomDatabaseCacheModel.objects.count()
        if count > self._max_entries:
            num_to_delete = count // self._cull_frequency
            items_to_delete = CustomDatabaseCacheModel.objects.order_by('created_at')[:num_to_delete]
            CustomDatabaseCacheModel.objects.filter(
                id__in=[item.id for item in items_to_delete]
            ).delete()

    def make_key(self, key, version=None):
        """Build the stored cache key."""
        return f"{self._params.get('KEY_PREFIX', '')}:{version or 1}:{key}"

    def encode(self, value):
        """Serialize a value (pickle; a real implementation would base64-encode
        the resulting bytes before storing them in a TextField)."""
        import pickle
        return pickle.dumps(value)

    def decode(self, value):
        """Deserialize a value."""
        import pickle
        return pickle.loads(value)
```
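The custom backend above leans on two behaviors worth isolating: entries past their expiry read as misses, and reads of expired entries delete them lazily. The dict-backed sketch below, with an injectable clock, captures both without a database (the class name is illustrative; Django's real DatabaseCache keeps its rows in SQL):

```python
from typing import Any, Optional


class TTLStore:
    """Expiring key-value store with lazy deletion on read."""

    def __init__(self, clock):
        self._data = {}      # key -> (value, expires_at)
        self._clock = clock  # injectable time source

    def set(self, key: str, value: Any, timeout: float) -> None:
        self._data[key] = (value, self._clock() + timeout)

    def get(self, key: str, default: Optional[Any] = None) -> Any:
        item = self._data.get(key)
        if item is None:
            return default
        value, expires_at = item
        if expires_at <= self._clock():  # expired: delete lazily, report a miss
            del self._data[key]
            return default
        return value


now = [0.0]
store = TTLStore(clock=lambda: now[0])
store.set("k", "v", timeout=300)
```

Advancing `now` past 300 makes `store.get("k")` return the default and drop the stale row, exactly the path CustomDatabaseCache.get takes when `item.expires < timezone.now()`.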
6. Comparing the Three Backends
6.1 Feature Comparison
```python
# Feature comparison of the three cache backends
import time
from dataclasses import dataclass
from typing import List

from django.core.cache import caches


@dataclass
class CacheBackendComparison:
    """One backend's profile."""
    name: str
    backend: str
    performance: str
    scalability: str
    data_structures: List[str]
    persistence: bool
    memory_usage: str
    use_cases: List[str]


class CacheComparison:
    """Backend comparison."""

    @staticmethod
    def feature_comparison():
        """Feature comparison."""
        comparisons = {
            'Redis': CacheBackendComparison(
                name='Redis',
                backend='django_redis.cache.RedisCache',
                performance='very high',
                scalability='excellent; supports clustering',
                data_structures=['strings', 'hashes', 'lists', 'sets', 'sorted sets'],
                persistence=True,
                memory_usage='moderate, tunable',
                use_cases=['session storage', 'message queues', 'leaderboards', 'real-time analytics']
            ),
            'Memcached': CacheBackendComparison(
                name='Memcached',
                backend='django.core.cache.backends.memcached.MemcachedCache',
                performance='very high',
                scalability='excellent; distributed by design',
                data_structures=['strings'],
                persistence=False,
                memory_usage='low',
                use_cases=['page caching', 'session storage', 'query-result caching']
            ),
            'Database': CacheBackendComparison(
                name='Database',
                backend='django.core.cache.backends.db.DatabaseCache',
                performance='low',
                scalability='limited',
                data_structures=['strings'],
                persistence=True,
                memory_usage='depends on the database',
                use_cases=['small applications', 'development', 'long-lived caches']
            )
        }
        return comparisons

    @staticmethod
    def performance_benchmark():
        """Benchmark the configured backends."""
        test_cases = [
            {
                'name': 'single set',
                'operations': 1000,
                'function': lambda cache: cache.set('test_key', 'test_value')
            },
            {
                'name': 'single get',
                'operations': 1000,
                'function': lambda cache: cache.get('test_key')
            },
            {
                'name': 'batch set',
                'operations': 100,
                'function': lambda cache: cache.set_many(
                    {f'key_{i}': f'value_{i}' for i in range(100)}
                )
            },
            {
                'name': 'batch get',
                'operations': 100,
                'function': lambda cache: cache.get_many(
                    [f'key_{i}' for i in range(100)]
                )
            }
        ]

        results = {}
        for cache_name in ['default', 'redis', 'memcached', 'database']:
            try:
                cache_backend = caches[cache_name]
                results[cache_name] = {}
                for test_case in test_cases:
                    # Warm up
                    test_case['function'](cache_backend)
                    # Measure
                    start_time = time.time()
                    for _ in range(test_case['operations']):
                        test_case['function'](cache_backend)
                    end_time = time.time()
                    duration = end_time - start_time
                    results[cache_name][test_case['name']] = {
                        'total_time': duration,
                        'ops_per_second': test_case['operations'] / duration
                    }
            except Exception as e:
                print(f"Error benchmarking {cache_name}: {e}")
        return results

    @staticmethod
    def memory_usage_comparison():
        """Storage-efficiency comparison."""
        test_data = {
            'small_string': 'a' * 100,    # 100 bytes
            'medium_string': 'b' * 1024,  # 1 KB
            'large_string': 'c' * 10240,  # 10 KB
            'dict_data': {'key1': 'value1', 'key2': 'value2', 'key3': [1, 2, 3]},
        }
        memory_usage = {}
        for cache_name in ['redis', 'memcached', 'database']:
            try:
                cache_backend = caches[cache_name]
                memory_usage[cache_name] = {}
                for data_name, data_value in test_data.items():
                    key = f'memory_test_{data_name}'
                    cache_backend.set(key, data_value)
                    # Actual memory use must be read from monitoring tools
                    # (e.g. Redis INFO, memcached stats); this is a placeholder.
                    memory_usage[cache_name][data_name] = 'see monitoring tools'
            except Exception as e:
                print(f"Error in memory test for {cache_name}: {e}")
        return memory_usage


# Selection guide
class CacheSelectionGuide:
    """Cache selection guide."""

    @staticmethod
    def select_by_use_case():
        """Recommendations by use case."""
        guidelines = {
            'high-concurrency web app': {
                'recommended': ['Redis', 'Memcached'],
                'why': 'high throughput, low latency',
                'tips': 'use connection pooling and sensible timeouts'
            },
            'session storage': {
                'recommended': ['Redis'],
                'why': 'rich data structures and optional persistence',
                'tips': 'set sane expiry times and enable persistence'
            },
            'page caching': {
                'recommended': ['Memcached'],
                'why': 'pure in-memory operation, extremely fast',
                'tips': 'spread load across multiple servers'
            },
            'message queue': {
                'recommended': ['Redis'],
                'why': 'lists and publish/subscribe support',
                'tips': 'use the purpose-built data structures'
            },
            'development environment': {
                'recommended': ['Database', 'LocalMemory'],
                'why': 'simple, no extra services needed',
                'tips': 'the defaults are fine'
            },
            'small project': {
                'recommended': ['Database'],
                'why': 'no additional infrastructure',
                'tips': 'cull expired entries regularly'
            }
        }
        return guidelines

    @staticmethod
    def hybrid_cache_strategy():
        """Hybrid caching strategies."""
        strategies = {
            'tiered caching': {
                'description': 'multiple cache levels, e.g. local memory + Redis',
                'implementation': """
                # Pseudocode
                def get_data(key):
                    # 1. Check the local in-memory cache
                    value = local_cache.get(key)
                    if value:
                        return value
                    # 2. Check Redis
                    value = redis_cache.get(key)
                    if value:
                        local_cache.set(key, value, SHORT_TIMEOUT)
                        return value
                    # 3. Fall back to the database
                    value = database_query(key)
                    redis_cache.set(key, value, LONG_TIMEOUT)
                    local_cache.set(key, value, SHORT_TIMEOUT)
                    return value
                """
            },
            'read/write split': {
                'description': 'reads go to Memcached; writes and complex data structures go to Redis',
                'implementation': """
                # Configure two cache backends
                CACHES = {
                    'read': {
                        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
                        'LOCATION': '127.0.0.1:11211'
                    },
                    'write': {
                        'BACKEND': 'django_redis.cache.RedisCache',
                        'LOCATION': 'redis://127.0.0.1:6379/1'
                    }
                }

                # Reads hit Memcached
                def get_cached_data(key):
                    return caches['read'].get(key)

                # Writes go to Redis
                def set_cached_data(key, value):
                    caches['write'].set(key, value)
                """
            },
            'cache warming': {
                'description': 'preload hot data during off-peak hours',
                'implementation': """
                def preload_cache():
                    # Preload popular products
                    popular_products = Product.objects.filter(
                        is_popular=True
                    )[:100]
                    for product in popular_products:
                        cache_key = f'product:{product.id}'
                        cache.set(cache_key, product.to_dict(), 3600)
                """
            }
        }
        return strategies
```
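The tiered-caching pseudocode above is easy to make concrete. In this sketch both tiers are plain dicts so it runs anywhere; in production the local tier would be a LocMemCache (or a per-process dict with a short TTL) and the remote tier Redis. Timeouts are omitted for brevity, and the class and attribute names are my own:

```python
class TwoTierCache:
    """Layered read path: local dict -> shared cache -> loader, filling upward."""

    def __init__(self, loader):
        self.local = {}       # per-process tier (would be LocMemCache)
        self.remote = {}      # shared tier (would be Redis)
        self.loader = loader  # source of truth, e.g. a database query
        self.loads = 0        # how often we fell through to the loader

    def get(self, key):
        if key in self.local:        # fastest tier
            return self.local[key]
        if key in self.remote:       # shared tier: promote to local on the way out
            value = self.remote[key]
            self.local[key] = value
            return value
        value = self.loader(key)     # full miss: hit the source of truth
        self.loads += 1
        self.remote[key] = value     # fill the long-lived tier
        self.local[key] = value      # and the short-lived one
        return value


cache = TwoTierCache(loader=lambda key: f"db:{key}")
cache.get("product:1")  # loads from the "database" and fills both tiers
```

Once a key is in the remote tier, even a fresh process (empty local tier) avoids the database: its first read promotes the value locally, which is the entire point of the layering.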
6.2 Hands-on Performance Testing
```python
# Hands-on performance tests
import statistics
import threading
import time

from django.core.cache import caches
from django.test import TestCase


class CachePerformanceTests(TestCase):
    """Cache performance tests."""

    def setUp(self):
        """Test fixtures."""
        self.test_data_sizes = [100, 1024, 10240]  # 100 B, 1 KB, 10 KB
        self.operation_counts = [100, 1000, 10000]

    def test_single_operation_performance(self):
        """Single-operation latency."""
        results = {}
        for cache_name in ['default', 'redis', 'memcached', 'database']:
            try:
                cache_backend = caches[cache_name]
                results[cache_name] = {}
                for size in self.test_data_sizes:
                    test_data = 'x' * size
                    key = f'perf_test_{size}'

                    # Time set operations
                    set_times = []
                    for _ in range(100):
                        start_time = time.perf_counter()
                        cache_backend.set(key, test_data)
                        end_time = time.perf_counter()
                        set_times.append(end_time - start_time)

                    # Time get operations
                    get_times = []
                    for _ in range(100):
                        start_time = time.perf_counter()
                        cache_backend.get(key)
                        end_time = time.perf_counter()
                        get_times.append(end_time - start_time)

                    results[cache_name][size] = {
                        'set_avg': statistics.mean(set_times),
                        'set_std': statistics.stdev(set_times),
                        'get_avg': statistics.mean(get_times),
                        'get_std': statistics.stdev(get_times),
                    }
            except Exception as e:
                print(f"Error testing {cache_name}: {e}")
        return results

    def test_concurrent_performance(self):
        """Concurrent throughput."""
        results = {}

        def worker(cache_backend, operation_count, results_list):
            """Worker thread: timed set+get pairs."""
            thread_times = []
            for i in range(operation_count):
                key = f'concurrent_test_{threading.current_thread().name}_{i}'
                value = f'value_{i}'
                start_time = time.perf_counter()
                cache_backend.set(key, value)
                cache_backend.get(key)
                end_time = time.perf_counter()
                thread_times.append(end_time - start_time)
            results_list.extend(thread_times)

        for cache_name in ['default', 'redis', 'memcached']:
            try:
                cache_backend = caches[cache_name]
                thread_count = 10
                operations_per_thread = 100
                threads = []
                all_times = []

                # Start the worker threads
                for i in range(thread_count):
                    thread = threading.Thread(
                        target=worker,
                        args=(cache_backend, operations_per_thread, all_times),
                        name=f'Thread_{i}'
                    )
                    threads.append(thread)
                    thread.start()

                # Wait for all threads to finish
                for thread in threads:
                    thread.join()

                results[cache_name] = {
                    'total_operations': len(all_times),
                    'avg_time': statistics.mean(all_times),
                    'std_time': statistics.stdev(all_times),
                    'total_time': sum(all_times),
                    # Note: summing per-thread times overstates wall-clock time
                    # under concurrency, so this throughput figure is conservative.
                    'ops_per_second': len(all_times) / sum(all_times)
                }
            except Exception as e:
                print(f"Error in concurrency test for {cache_name}: {e}")
        return results

    def test_memory_efficiency(self):
        """Memory overhead."""
        results = {}
        for cache_name in ['redis', 'memcached']:
            try:
                cache_backend = caches[cache_name]
                # Store a large batch of entries
                entry_count = 10000
                entry_size = 1024  # 1 KB each

                start_memory = self._get_cache_memory_usage(cache_name)
                for i in range(entry_count):
                    key = f'mem_test_{i}'
                    value = 'x' * entry_size
                    cache_backend.set(key, value, timeout=3600)
                end_memory = self._get_cache_memory_usage(cache_name)
                memory_used = end_memory - start_memory

                results[cache_name] = {
                    'entries_stored': entry_count,
                    'total_data_size': entry_count * entry_size,
                    'memory_used': memory_used,
                    'overhead_ratio': memory_used / (entry_count * entry_size)
                }

                # Clean up the test data
                for i in range(entry_count):
                    cache_backend.delete(f'mem_test_{i}')
            except Exception as e:
                print(f"Error in memory test for {cache_name}: {e}")
        return results

    def _get_cache_memory_usage(self, cache_name):
        """Read memory usage from the backend, where possible."""
        # Backend-specific: Redis exposes INFO, memcached exposes stats
        if cache_name == 'redis':
            redis_client = caches['redis'].client.get_client()
            info = redis_client.info('memory')
            return info['used_memory']
        elif cache_name == 'memcached':
            # Memcached memory usage comes from its stats output;
            # 0 is a placeholder here.
            return 0
        else:
            return 0

    def generate_performance_report(self):
        """Print a performance report."""
        print("Cache performance report")
        print("=" * 50)

        # Single-operation latency
        single_op_results = self.test_single_operation_performance()
        print("\n1. Single-operation latency (seconds):")
        for cache_name, sizes in single_op_results.items():
            print(f"\n{cache_name}:")
            for size, metrics in sizes.items():
                print(f"  payload {size} B:")
                print(f"    set: {metrics['set_avg']:.6f} ± {metrics['set_std']:.6f}")
                print(f"    get: {metrics['get_avg']:.6f} ± {metrics['get_std']:.6f}")

        # Concurrency
        concurrent_results = self.test_concurrent_performance()
        print("\n2. Concurrent throughput:")
        for cache_name, metrics in concurrent_results.items():
            print(f"\n{cache_name}:")
            print(f"  total operations: {metrics['total_operations']}")
            print(f"  average latency: {metrics['avg_time']:.6f}s")
            print(f"  throughput: {metrics['ops_per_second']:.2f} ops/s")

        return {
            'single_operation': single_op_results,
            'concurrent': concurrent_results,
        }
```
7. 实际应用案例
7.1 电商平台缓存策略
python
# 电商平台缓存策略
from django.core.cache import cache, caches
from django.db import models
import json
class ECommerceCacheStrategy:
"""电商平台缓存策略"""
def __init__(self):
self.redis_cache = caches['redis']
self.memcached_cache = caches['memcached']
def product_caching(self):
"""商品信息缓存"""
def get_product_details(product_id):
"""获取商品详情(带缓存)"""
cache_key = f'product:{product_id}:details'
# 先尝试从Redis获取
product_data = self.redis_cache.get(cache_key)
if product_data:
return json.loads(product_data)
# 从数据库查询
from myapp.models import Product
try:
product = Product.objects.select_related(
'category', 'brand'
).prefetch_related(
'images', 'variants'
).get(id=product_id)
# 序列化数据
product_data = {
'id': product.id,
'name': product.name,
'price': str(product.price),
'category': product.category.name,
'brand': product.brand.name,
'images': [img.url for img in product.images.all()],
'variants': [
{
'size': variant.size,
'color': variant.color,
'stock': variant.stock
}
for variant in product.variants.all()
]
}
# 缓存到Redis(1小时)
self.redis_cache.setex(
cache_key,
3600,
json.dumps(product_data)
)
return product_data
except Product.DoesNotExist:
# 缓存空值,防止缓存穿透(5分钟)
self.redis_cache.setex(cache_key, 300, 'NULL')
return None
return get_product_details
def product_list_caching(self):
"""商品列表缓存"""
def get_product_list(category_id=None, page=1, page_size=20):
"""获取商品列表(带缓存)"""
            # 键中带上page_size,避免不同分页大小的结果互相覆盖
            cache_key = f'product_list:cat_{category_id}:page_{page}:size_{page_size}'
# 使用Memcached缓存列表数据(性能更好)
product_list = self.memcached_cache.get(cache_key)
if product_list:
return json.loads(product_list)
# 从数据库查询
from myapp.models import Product
queryset = Product.objects.filter(is_active=True)
if category_id:
queryset = queryset.filter(category_id=category_id)
products = list(queryset.order_by('-created_at')[
(page-1)*page_size : page*page_size
])
# 序列化
product_list = [
{
'id': p.id,
'name': p.name,
'price': str(p.price),
'image': p.primary_image.url if p.primary_image else None
}
for p in products
]
# 缓存到Memcached(15分钟)
self.memcached_cache.set(
cache_key,
json.dumps(product_list),
900
)
return product_list
return get_product_list
def shopping_cart_caching(self):
"""购物车缓存"""
def get_user_cart(user_id):
"""获取用户购物车"""
cache_key = f'cart:user:{user_id}'
cart_data = self.redis_cache.get(cache_key)
if cart_data:
return json.loads(cart_data)
# 从数据库加载购物车
from myapp.models import ShoppingCart, CartItem
try:
cart = ShoppingCart.objects.get(user_id=user_id)
items = CartItem.objects.filter(cart=cart).select_related('product')
cart_data = {
'id': cart.id,
'items': [
{
'product_id': item.product_id,
'product_name': item.product.name,
'quantity': item.quantity,
'price': str(item.product.price)
}
for item in items
],
'total_amount': str(cart.total_amount)
}
# 缓存到Redis(30分钟)
                self.redis_cache.set(
                    cache_key,
                    json.dumps(cart_data),
                    1800
                )
return cart_data
except ShoppingCart.DoesNotExist:
return None
def update_user_cart(user_id, cart_data):
"""更新用户购物车缓存"""
cache_key = f'cart:user:{user_id}'
            self.redis_cache.set(
                cache_key,
                json.dumps(cart_data),
                1800
            )
return get_user_cart, update_user_cart
def inventory_caching(self):
"""库存缓存"""
def get_product_stock(product_id):
"""获取商品库存(带缓存)"""
cache_key = f'inventory:product:{product_id}'
stock = self.redis_cache.get(cache_key)
if stock is not None:
return int(stock)
# 从数据库查询
from myapp.models import Product
try:
product = Product.objects.get(id=product_id)
# 缓存库存(5分钟)
                self.redis_cache.set(
                    cache_key,
                    product.stock_quantity,
                    300
                )
return product.stock_quantity
except Product.DoesNotExist:
return 0
def update_product_stock(product_id, new_stock):
"""更新商品库存缓存"""
cache_key = f'inventory:product:{product_id}'
            self.redis_cache.set(cache_key, new_stock, 300)
# 同时更新商品详情缓存中的库存信息
detail_key = f'product:{product_id}:details'
product_data = self.redis_cache.get(detail_key)
if product_data and product_data != 'NULL':
data = json.loads(product_data)
data['stock'] = new_stock
                self.redis_cache.set(detail_key, json.dumps(data), 3600)
return get_product_stock, update_product_stock
def recommendation_caching(self):
"""推荐系统缓存"""
def get_user_recommendations(user_id):
"""获取用户个性化推荐"""
cache_key = f'recommendations:user:{user_id}'
recommendations = self.redis_cache.get(cache_key)
if recommendations:
return json.loads(recommendations)
# 生成推荐(这里简化处理)
from myapp.models import Product
recommended_products = Product.objects.filter(
is_active=True
).order_by('-view_count')[:10]
recommendations = [
{
'id': p.id,
'name': p.name,
'price': str(p.price),
'image': p.primary_image.url if p.primary_image else None
}
for p in recommended_products
]
# 缓存推荐结果(1小时)
            self.redis_cache.set(
                cache_key,
                json.dumps(recommendations),
                3600
            )
return recommendations
return get_user_recommendations
def cache_invalidation_strategy(self):
"""缓存失效策略"""
def invalidate_product_cache(product_id):
"""使商品相关缓存失效"""
# 商品详情缓存
self.redis_cache.delete(f'product:{product_id}:details')
# 库存缓存
self.redis_cache.delete(f'inventory:product:{product_id}')
# 推荐缓存(所有用户)
# 注意:在实际应用中可能需要更精细的控制
# 商品列表缓存(清除所有相关列表缓存)
# 这里可以使用Redis的keys命令或维护一个索引
def invalidate_user_cart(user_id):
"""使用户购物车缓存失效"""
self.redis_cache.delete(f'cart:user:{user_id}')
return invalidate_product_cache, invalidate_user_cart
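python
上面的失效策略中提到,商品列表缓存可以通过"维护一个索引"来精确清除。这一思路可以用与具体后端无关的纯 Python 草图说明(仅演示原理;生产环境中索引本身也应存放在 Redis 中,例如用 SADD/SMEMBERS 维护键集合):

```python
class ListCacheIndex:
    """为派生的列表缓存键维护索引,便于按分类批量失效(原理示意)。"""

    def __init__(self):
        self._cache = {}   # 模拟缓存存储
        self._index = {}   # category_id -> 该分类下所有列表缓存键

    def set_list(self, category_id, page, data):
        key = f'product_list:cat_{category_id}:page_{page}'
        self._cache[key] = data
        # 写缓存的同时登记索引
        self._index.setdefault(category_id, set()).add(key)

    def get_list(self, category_id, page):
        return self._cache.get(f'product_list:cat_{category_id}:page_{page}')

    def invalidate_category(self, category_id):
        # 商品变更时,按索引精确删除该分类下的全部列表缓存
        for key in self._index.pop(category_id, set()):
            self._cache.pop(key, None)
```

相比用 KEYS/SCAN 按模式匹配,索引方式的删除范围明确,也避免了对大键空间的遍历。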
7.2 缓存监控和维护
python
# 缓存监控和维护
import json
import logging
from datetime import datetime
from django.core.cache import caches
logger = logging.getLogger(__name__)
class CacheMonitor:
"""缓存监控"""
def __init__(self):
self.redis_cache = caches['redis']
self.memcached_cache = caches['memcached']
def collect_metrics(self):
"""收集缓存指标"""
metrics = {
'timestamp': datetime.now().isoformat(),
'redis': self._get_redis_metrics(),
'memcached': self._get_memcached_metrics(),
}
return metrics
def _get_redis_metrics(self):
"""获取Redis指标"""
try:
redis_client = self.redis_cache.client.get_client()
info = redis_client.info()
return {
'used_memory': info['used_memory'],
'used_memory_human': info['used_memory_human'],
'connected_clients': info['connected_clients'],
'keyspace_hits': info['keyspace_hits'],
'keyspace_misses': info['keyspace_misses'],
'instantaneous_ops_per_sec': info['instantaneous_ops_per_sec'],
'hit_rate': self._calculate_hit_rate(info),
                'keys_count': sum(
                    info[db]['keys'] for db in info
                    if db.startswith('db')
                )
}
except Exception as e:
logger.error(f"获取Redis指标失败: {e}")
return {}
def _get_memcached_metrics(self):
"""获取Memcached指标"""
try:
# 这里需要根据使用的Memcached客户端调整
stats = self.memcached_cache._cache.get_stats()
if stats:
server_stats = stats[0][1] # 取第一个服务器的统计
hits = int(server_stats.get('get_hits', 0))
misses = int(server_stats.get('get_misses', 0))
total = hits + misses
return {
'curr_items': int(server_stats.get('curr_items', 0)),
'total_connections': int(server_stats.get('total_connections', 0)),
                    'cmd_get': int(server_stats.get('cmd_get', 0)),
'cmd_set': int(server_stats.get('cmd_set', 0)),
'get_hits': hits,
'get_misses': misses,
'hit_rate': (hits / total * 100) if total > 0 else 0,
'bytes': int(server_stats.get('bytes', 0)),
                }
            return {}
except Exception as e:
logger.error(f"获取Memcached指标失败: {e}")
return {}
def _calculate_hit_rate(self, info):
"""计算命中率"""
hits = info['keyspace_hits']
misses = info['keyspace_misses']
total = hits + misses
return (hits / total * 100) if total > 0 else 0
def check_health(self):
"""检查缓存健康状态"""
health_status = {
'redis': self._check_redis_health(),
'memcached': self._check_memcached_health(),
'overall': 'healthy'
}
# 检查整体状态
if (health_status['redis'] != 'healthy' or
health_status['memcached'] != 'healthy'):
health_status['overall'] = 'unhealthy'
logger.warning("缓存健康状态异常")
return health_status
def _check_redis_health(self):
"""检查Redis健康状态"""
try:
redis_client = self.redis_cache.client.get_client()
redis_client.ping()
            # 检查内存使用(maxmemory为0表示未设置上限,跳过该项检查)
            info = redis_client.info('memory')
            if info.get('maxmemory', 0) and info['used_memory'] > 0.9 * info['maxmemory']:
                return 'warning'  # 内存使用超过90%
return 'healthy'
except Exception as e:
logger.error(f"Redis健康检查失败: {e}")
return 'unhealthy'
def _check_memcached_health(self):
"""检查Memcached健康状态"""
try:
# 简单的设置获取测试
test_key = 'health_check'
test_value = 'test'
self.memcached_cache.set(test_key, test_value, 10)
retrieved_value = self.memcached_cache.get(test_key)
if retrieved_value == test_value:
return 'healthy'
else:
return 'unhealthy'
except Exception as e:
logger.error(f"Memcached健康检查失败: {e}")
return 'unhealthy'
class CacheMaintenance:
"""缓存维护"""
def __init__(self):
self.redis_cache = caches['redis']
def cleanup_expired_keys(self):
"""清理过期键"""
# Redis会自动清理过期键
# 这里可以执行一些手动的清理操作
logger.info("执行缓存清理任务")
# 清理特定的缓存模式
patterns_to_clean = [
'temp:*',
'session:*', # 过期的会话
]
for pattern in patterns_to_clean:
self._delete_keys_by_pattern(pattern)
def _delete_keys_by_pattern(self, pattern):
"""按模式删除键"""
try:
redis_client = self.redis_cache.client.get_client()
# 使用SCAN命令避免阻塞
cursor = 0
deleted_count = 0
while True:
cursor, keys = redis_client.scan(
cursor=cursor,
match=pattern,
count=100
)
if keys:
redis_client.delete(*keys)
deleted_count += len(keys)
if cursor == 0:
break
logger.info(f"删除模式 {pattern} 的 {deleted_count} 个键")
except Exception as e:
logger.error(f"删除模式 {pattern} 的键时出错: {e}")
def optimize_redis(self):
"""优化Redis"""
try:
redis_client = self.redis_cache.client.get_client()
# 执行内存优化
redis_client.memory_purge()
# 如果使用AOF,可以执行重写
# redis_client.bgrewriteaof()
logger.info("Redis优化完成")
except Exception as e:
logger.error(f"Redis优化失败: {e}")
def backup_redis(self):
"""备份Redis数据"""
try:
redis_client = self.redis_cache.client.get_client()
# 执行后台保存
redis_client.bgsave()
logger.info("Redis备份已启动")
except Exception as e:
logger.error(f"Redis备份失败: {e}")
# 缓存管理命令
from django.core.management.base import BaseCommand
class Command(BaseCommand):
"""自定义缓存管理命令"""
help = '缓存维护和管理命令'
def add_arguments(self, parser):
parser.add_argument(
'--cleanup',
action='store_true',
help='清理过期缓存'
)
parser.add_argument(
'--metrics',
action='store_true',
help='显示缓存指标'
)
parser.add_argument(
'--health',
action='store_true',
help='检查缓存健康状态'
)
def handle(self, *args, **options):
monitor = CacheMonitor()
maintenance = CacheMaintenance()
if options['cleanup']:
self.stdout.write("开始清理缓存...")
maintenance.cleanup_expired_keys()
self.stdout.write(
self.style.SUCCESS("缓存清理完成")
)
if options['metrics']:
metrics = monitor.collect_metrics()
self.stdout.write("缓存指标:")
self.stdout.write(json.dumps(metrics, indent=2))
if options['health']:
health_status = monitor.check_health()
self.stdout.write("缓存健康状态:")
self.stdout.write(json.dumps(health_status, indent=2))
if health_status['overall'] == 'healthy':
self.stdout.write(
self.style.SUCCESS("所有缓存系统健康")
)
else:
self.stdout.write(
self.style.ERROR("缓存系统存在异常")
)
8. 总结
8.1 缓存策略总结
通过以上分析和测试,我们可以得出以下结论:
- Redis:
  - 优势:功能丰富,支持多种数据结构,可持久化,适合复杂场景
  - 适用场景:会话存储、消息队列、排行榜、实时分析
  - 推荐配置:使用连接池,合理设置内存限制和持久化策略
- Memcached:
  - 优势:纯内存操作,性能极高,分布式特性好
  - 适用场景:页面缓存、会话存储、简单的键值缓存
  - 推荐配置:多服务器分布,合理设置内存大小
- 数据库缓存:
  - 优势:无需额外基础设施,数据持久化
  - 适用场景:小型应用、开发环境、长期缓存
  - 推荐配置:定期清理过期数据,使用合适的索引
8.2 最佳实践建议
- 分层缓存策略:
python
# 本地内存 + Redis 分层缓存
def get_data_with_layers(key):
    # L1: 本地内存缓存(短暂)
    value = local_cache.get(key)
    if value:
        return value
    # L2: Redis缓存(中等时长)
    value = redis_cache.get(key)
    if value:
        local_cache.set(key, value, 60)  # 缓存1分钟到本地
        return value
    # L3: 数据库(源数据)
    value = database_query(key)
    redis_cache.set(key, value, 3600)  # 缓存1小时到Redis
    local_cache.set(key, value, 60)  # 缓存1分钟到本地
    return value
- 缓存失效策略:
- 设置合理的过期时间
- 使用版本控制
- 及时清理相关缓存
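其中"版本控制"是指在缓存键中带上版本号,失效时只需递增版本,旧数据随过期自动淘汰。Django 的缓存 API 本身支持 version 参数和 `cache.incr_version()`;其原理可以用如下简化草图说明(纯 Python,仅演示思路):

```python
class VersionedCache:
    """用递增版本号实现"逻辑失效"的简化示例。"""

    def __init__(self):
        self._store = {}     # (版本号, 前缀, 键) -> 值
        self._versions = {}  # 键前缀 -> 当前版本号

    def _ver(self, prefix):
        return self._versions.get(prefix, 1)

    def set(self, prefix, key, value):
        self._store[(self._ver(prefix), prefix, key)] = value

    def get(self, prefix, key):
        return self._store.get((self._ver(prefix), prefix, key))

    def invalidate(self, prefix):
        # 递增版本号后,旧版本数据不再被读取,等待过期清理即可
        self._versions[prefix] = self._ver(prefix) + 1
```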
- 监控和告警:
- 监控缓存命中率
- 设置内存使用告警
- 定期检查缓存健康状态
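命中率告警的判断逻辑可以抽成一个小函数,便于在定时任务中复用(80% 的阈值仅为示例假设,应结合业务基线调整):

```python
def evaluate_hit_rate(hits, misses, min_hit_rate=80.0):
    """根据命中/未命中次数计算命中率,低于阈值时返回告警状态。"""
    total = hits + misses
    # 无任何请求时视为健康,同时避免除零
    hit_rate = (hits / total * 100) if total else 100.0
    status = 'healthy' if hit_rate >= min_hit_rate else 'warning'
    return {'hit_rate': round(hit_rate, 2), 'status': status}
```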
8.3 选择指南
| 场景 | 推荐方案 | 理由 |
|---|---|---|
| 高并发Web应用 | Redis + Memcached | Redis处理复杂数据,Memcached处理简单缓存 |
| 会话存储 | Redis | 支持丰富的数据结构,可持久化 |
| 页面缓存 | Memcached | 纯内存操作,性能最佳 |
| 消息系统 | Redis | 支持发布订阅和列表操作 |
| 开发环境 | 数据库缓存 | 简单易用,无需额外服务 |
| 小型项目 | Redis | 功能全面,易于扩展 |
通过合理的缓存策略选择和配置,可以显著提升Django应用的性能和用户体验。建议根据具体业务需求、团队技术栈和基础设施情况来选择最合适的缓存方案。