Solving the multi-process shared-cache problem with the Sanic framework

I have recently been building a project with the Sanic framework, and today I needed to share a cache across its worker processes. After a lot of searching I learned that the multiprocessing module is the way to do it, but importing it and using it directly inside Sanic raised errors, so I turned to the official documentation for the answer.

If you prefer, you can go straight to the official documentation.

Roughly, it says the following:

Python provides a few methods for exchanging objects, synchronizing, and sharing state between processes. This usually involves objects from the multiprocessing and ctypes modules.

If you are familiar with these objects and how to work with them, you will be happy to know that Sanic provides an API for sharing these objects between your worker processes. If you are not familiar, you are encouraged to read through the Python documentation linked above and try some of the examples before proceeding with implementing shared context.

Similar to how application context allows an application to share state across the lifetime of the application with app.ctx, shared context provides the same for the special objects mentioned above. This context is available as app.shared_ctx and should ONLY be used to share objects intended for this purpose.

The shared_ctx will:

- NOT share regular objects like int, dict, or list
- NOT share state between Sanic instances running on different machines
- NOT share state to non-worker processes
- only share state between server workers managed by the same Manager
Attaching an inappropriate object to shared_ctx will likely result in a warning, and not an error. You should be careful to not accidentally add an unsafe object to shared_ctx as it may not work as expected. If you are directed here because of one of those warnings, you might have accidentally used an unsafe object in shared_ctx.

In order to create a shared object you must create it in the main process and attach it inside of the main_process_start listener.
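
Before wiring anything into Sanic, it is worth trying the plain multiprocessing primitives the docs point to. Here is a minimal sketch, independent of Sanic (the function and variable names are mine, for illustration only), showing a Manager-backed dict shared between two ordinary processes:

```python
import multiprocessing


def writer(shared):
    # The child process writes through the proxy; the parent sees the change
    shared["answer"] = 42


if __name__ == "__main__":
    manager = multiprocessing.Manager()
    shared = manager.dict()  # a proxy object, safe to pass between processes

    p = multiprocessing.Process(target=writer, args=(shared,))
    p.start()
    p.join()

    print(shared["answer"])  # prints 42, written by the child process
```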

Back to Sanic, the rule is simple: create the shared object in the main process, attach it to app.shared_ctx inside the main_process_start listener, and only the server workers managed by the same Sanic manager will see it.
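
To make the "regular objects are not shared" point concrete, here is a small sketch of a safe versus an unsafe attachment; the app name and attribute names are made up, and the warning behaviour is the one described in the quoted docs:

```python
import multiprocessing

from sanic import Sanic

app = Sanic("SharedCtxDemo")


@app.main_process_start
async def attach_shared_objects(app):
    # Process-safe: a Manager dict proxy, visible to every worker
    app.shared_ctx.cache = multiprocessing.Manager().dict()

    # Not process-safe: a plain dict will likely trigger the warning above,
    # and each worker would only ever see its own independent copy
    # app.shared_ctx.plain = {}
```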

Here is a complete small example:

```python
import multiprocessing
from sanic import Sanic, response
from sanic.log import logger

app = Sanic("Hw-Licence-System")
app.config.REQUEST_TIMEOUT = 180


# Create the shared Manager dict in the main process and attach it to shared_ctx
@app.main_process_start
async def main_process_start(app):
    app.shared_ctx.cache = multiprocessing.Manager().dict()

@app.route("/api/v1/get_data", methods=["GET"])
async def get_data(request):
    product_name = request.args.get("product_name")
    shared_cache = request.app.shared_ctx.cache
    # Try to read the value from the shared cache first
    if product_name in shared_cache:
        data = shared_cache[product_name]
        return response.json({"status": True, "data": data})

    # Cache miss: produce the value and store it in the shared cache
    logger.info("get data from server")
    shared_cache[product_name] = "123"
    # Read it back from the shared cache and return it
    if product_name in shared_cache:
        data = shared_cache[product_name]
        return response.json({"status": True, "data": data})
    else:
        return response.json({"status": False, "message": "Data not found"})
    
    
if __name__ == "__main__":
    app.run(host="0.0.0.0", port=3000, workers=4)
```
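
A quick way to convince yourself that the four workers really share the cache is to hit the endpoint a few times and watch the log. The client below is only a sketch; it assumes the server above is running locally and that the requests package is installed (neither is part of the original post):

```python
import requests

# Requests may be served by any of the four workers; only the very first one
# should log "get data from server" -- the rest should find "demo" already
# present in the shared Manager dict and return the cached value.
for _ in range(5):
    resp = requests.get(
        "http://127.0.0.1:3000/api/v1/get_data",
        params={"product_name": "demo"},
        timeout=5,
    )
    print(resp.json())
```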