Solving multi-process shared caching with the Sanic framework

I have been building a project with the Sanic framework recently, and today I needed to share a cache across multiple worker processes. After a lot of searching I knew the multiprocessing module was the tool for the job, but importing it and using it directly raised errors, so I turned to the official documentation to solve the problem.

You can go straight to the official documentation here.

The gist of it is as follows:

Python provides a few methods for exchanging objects(opens new window), synchronizing(opens new window), and sharing state(opens new window) between processes. This usually involves objects from the multiprocessing and ctypes modules.

If you are familiar with these objects and how to work with them, you will be happy to know that Sanic provides an API for sharing these objects between your worker processes. If you are not familiar, you are encouraged to read through the Python documentation linked above and try some of the examples before proceeding with implementing shared context.

Similar to how application context allows an application to share state across the lifetime of the application with app.ctx, shared context provides the same for the special objects mentioned above. This context is available as app.shared_ctx and should ONLY be used to share objects intended for this purpose.

The shared_ctx will:

- NOT share regular objects like int, dict, or list
- NOT share state between Sanic instances running on different machines
- NOT share state to non-worker processes
- only share state between server workers managed by the same Manager
Attaching an inappropriate object to shared_ctx will likely result in a warning, and not an error. You should be careful to not accidentally add an unsafe object to shared_ctx as it may not work as expected. If you are directed here because of one of those warnings, you might have accidentally used an unsafe object in shared_ctx.

In order to create a shared object you must create it in the main process and attach it inside of the main_process_start listener.

In short: only process-safe objects from the multiprocessing and ctypes modules belong in app.shared_ctx, they are shared only between the worker processes managed by the same Sanic worker Manager, and they must be created in the main process and attached inside the main_process_start listener.
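
Before my cache example, here is a minimal sketch of that pattern using a multiprocessing.Value counter (the app name, route, and port are my own illustrative choices, not from the docs or from my project):

```python
import multiprocessing

from sanic import Sanic, response

app = Sanic("SharedCounterDemo")


# Create the process-safe object in the main process and attach it to shared_ctx
@app.main_process_start
async def setup_shared_counter(app):
    app.shared_ctx.counter = multiprocessing.Value("i", 0)


# Every worker process sees the same underlying counter
@app.route("/hits")
async def hits(request):
    counter = request.app.shared_ctx.counter
    with counter.get_lock():  # the Value carries its own lock, so increments stay safe
        counter.value += 1
        current = counter.value
    return response.json({"hits": current})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000, workers=4)
```

No matter which worker handles a request, it increments the same underlying counter, which is exactly the kind of process-safe object shared_ctx is designed to carry.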

Here is the small example I ended up with, using a Manager dict as the shared cache:

```python
import multiprocessing
from sanic import Sanic, response
from sanic.log import logger

app = Sanic("Hw-Licence-System")
app.config.REQUEST_TIMEOUT = 180


# Create the shared Manager dict in the main process and attach it to shared_ctx
@app.main_process_start
async def main_process_start(app):
    app.shared_ctx.cache = multiprocessing.Manager().dict()

@app.route("/api/v1/get_data", methods=["GET"])
async def get_data(request):
    product_name = request.args.get("product_name")
    shared_cache = request.app.shared_ctx.cache
    # Try to read the data from the shared cache
    if product_name in shared_cache:
        data = shared_cache[product_name]
        return response.json({"status": True, "data": data})

    # Cache miss: write a value into the shared cache
    logger.info("get data from server")
    shared_cache[product_name] = "123"
    # Read the value back and return it
    if product_name in shared_cache:
        data = shared_cache[product_name]
        return response.json({"status": True, "data": data})
    else:
        return response.json({"status": False, "message": "Data not found"})
    
    
if __name__ == "__main__":
    app.run(host="0.0.0.0", port=3000, workers=4)
```
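
Run the script and hit http://localhost:3000/api/v1/get_data?product_name=foo a few times: the first request logs "get data from server" and writes to the shared dict, and every later request is served from the cache no matter which of the four workers picks it up.

One caveat about the example: the `in` check and the assignment are two separate round trips to the manager process, so two workers can both miss and both write. For a cache like this that is harmless, but if the check-and-set needs to be atomic, you can attach a manager lock next to the dict. A rough sketch (the set_data endpoint and its names are hypothetical, not part of my actual code):

```python
# Sketch: this listener would replace the main_process_start listener above
@app.main_process_start
async def attach_shared_cache(app):
    manager = multiprocessing.Manager()
    app.shared_ctx.cache = manager.dict()
    app.shared_ctx.cache_lock = manager.Lock()


@app.route("/api/v1/set_data", methods=["POST"])
async def set_data(request):
    product_name = request.args.get("product_name")
    cache = request.app.shared_ctx.cache
    lock = request.app.shared_ctx.cache_lock
    # Hold the manager lock so the check-then-write is atomic across all workers
    with lock:
        if product_name not in cache:
            cache[product_name] = (request.json or {}).get("data")
    return response.json({"status": True, "data": cache[product_name]})
```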