Sharing a cache across worker processes with the Sanic framework

I have recently been building a project with the Sanic framework, and today I needed to share a cache across multiple worker processes. After a lot of searching I learned that the multiprocessing module is the way to do it, but importing it and using it directly in Sanic raised errors, so I turned to the official documentation.

You can jump straight to the official documentation here.

The gist of it is as follows:

> Python provides a few methods for exchanging objects, synchronizing, and sharing state between processes. This usually involves objects from the multiprocessing and ctypes modules.
>
> If you are familiar with these objects and how to work with them, you will be happy to know that Sanic provides an API for sharing these objects between your worker processes. If you are not familiar, you are encouraged to read through the Python documentation linked above and try some of the examples before proceeding with implementing shared context.
>
> Similar to how application context allows an application to share state across the lifetime of the application with app.ctx, shared context provides the same for the special objects mentioned above. This context is available as app.shared_ctx and should ONLY be used to share objects intended for this purpose.
>
> The shared_ctx will:
>
> - NOT share regular objects like int, dict, or list
> - NOT share state between Sanic instances running on different machines
> - NOT share state to non-worker processes
> - only share state between server workers managed by the same Manager
>
> Attaching an inappropriate object to shared_ctx will likely result in a warning, and not an error. You should be careful to not accidentally add an unsafe object to shared_ctx as it may not work as expected. If you are directed here because of one of those warnings, you might have accidentally used an unsafe object in shared_ctx.
>
> In order to create a shared object you must create it in the main process and attach it inside of the main_process_start listener.

In short: only multiprocessing-aware objects (Manager proxies, Queue, Value, and the like) can be shared, they are only shared between workers managed by the same Sanic manager process, and they must be created in the main process and attached to app.shared_ctx inside the main_process_start listener.
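To make those rules concrete, here is a minimal sketch of my own (the app name and listener name are made up, and it is not taken from the Sanic docs) showing what does and does not belong on shared_ctx:

```python
import multiprocessing

from sanic import Sanic

app = Sanic("SharedCtxRules")  # hypothetical app name, for illustration only


@app.main_process_start
async def attach_shared_objects(app):
    # Safe: multiprocessing-aware objects, created in the main process
    app.shared_ctx.queue = multiprocessing.Queue()
    app.shared_ctx.counter = multiprocessing.Value("i", 0)

    # Not shared: a plain dict is a regular object, so each worker would
    # only ever see its own copy, and Sanic will likely log a warning
    # app.shared_ctx.plain = {}
```

Anything that is not a multiprocessing or ctypes object should stay on the ordinary app.ctx instead.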

A small example:

```python
import multiprocessing
from sanic import Sanic, response
from sanic.log import logger

app = Sanic("Hw-Licence-System")
app.config.REQUEST_TIMEOUT = 180


# Create the shared Manager dict in the main process and attach it to shared_ctx
@app.main_process_start
async def main_process_start(app):
    app.shared_ctx.cache = multiprocessing.Manager().dict()

@app.route("/api/v1/get_data", methods=["GET"])
async def get_data(request):
    product_name = request.args.get("product_name")
    shared_cache = request.app.shared_ctx.cache
    # Try to get the data from the shared cache
    if product_name in shared_cache:
        data = shared_cache[product_name]
        return response.json({"status": True, "data": data})

    # Not cached yet: store it in the shared cache
    logger.info("get data from server")
    shared_cache[product_name] = "123"
    # Fetch the data back and return it
    if product_name in shared_cache:
        data = shared_cache[product_name]
        return response.json({"status": True, "data": data})
    else:
        return response.json({"status": False, "message": "Data not found"})
    
    
if __name__ == "__main__":
    app.run(host="0.0.0.0", port=3000, workers=4)
```
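You can exercise the endpoint with something like `curl "http://127.0.0.1:3000/api/v1/get_data?product_name=demo"`; whichever of the four workers handles the request reads and writes the same Manager-backed dict, so repeated requests hit the cache no matter which worker answers. As a variation, below is a small sketch of my own (the app name, route, and handler names are made up) that shares a hit counter across workers using multiprocessing.Value, one of the ctypes-backed objects the documentation mentions:

```python
import multiprocessing

from sanic import Sanic, response

app = Sanic("Shared-Counter-Demo")  # hypothetical app name


@app.main_process_start
async def attach_counter(app):
    # "i" is a signed C int; the synchronized Value carries its own lock
    app.shared_ctx.counter = multiprocessing.Value("i", 0)


@app.route("/api/v1/hits", methods=["GET"])
async def hits(request):
    counter = request.app.shared_ctx.counter
    with counter.get_lock():  # serialize increments across workers
        counter.value += 1
        current = counter.value
    return response.json({"hits": current})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=3000, workers=4)
```

Unlike the Manager().dict() approach, a Value lives in shared memory and does not depend on a separate manager process staying alive, but it can only hold a single primitive value rather than a mapping.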