【ollama】(5): Start the ollama image locally with docker-compose, change the model storage location, and download the qwen-0.5b model; it is blazingly fast

1. The ollama project

Ollama is a powerful framework for deploying LLMs, including inside Docker containers. It makes deploying and managing LLMs very simple: after a short installation, a single command is enough to run an open-source large language model such as Llama 2 locally.
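
As a minimal illustration (assuming a native install of the ollama CLI rather than the Docker setup below), running a model really is a single command:

# downloads llama2 on first use, then opens an interactive chat prompt
ollama run llama2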

https://ollama.com/

https://www.bilibili.com/video/BV1HC411Y7P1/?vd_source=4b290247452adda4e56d84b659b0c8a2

【ollama】(5): Start the ollama image locally with docker-compose and download the qwen-0.5b model; it is blazingly fast

2. The full docker-compose configuration is as follows:

version: '3.5'

services:

##################### Deploy large models with ollama #####################

# OLLAMA_HOST       The host:port to bind to (default "127.0.0.1:11434")
# OLLAMA_ORIGINS    A comma separated list of allowed origins.
# OLLAMA_MODELS     The path to the models directory (default is "~/.ollama/models")

  ollama:
    restart: always
    container_name: ollama
    image: ollama/ollama
    ports:
      - 8000:8000
    environment:
      - OLLAMA_HOST=0.0.0.0:8000
      - OLLAMA_MODELS=/data/models
    volumes:
      - ./models/:/data/models
    # start the server with the "serve" command
    command: serve

Start the Docker service:

docker-compose up -d
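
Before the chat endpoint can answer, the qwen:0.5b model has to be pulled into the container. A minimal sketch, assuming the container name ollama from the compose file above; either run the ollama CLI inside the container or call the pull API through the mapped port:

# pull the model with the CLI inside the running container
docker exec -it ollama ollama pull qwen:0.5b

# or pull it over the HTTP API exposed on port 8000
curl http://localhost:8000/api/pull -d '{"name": "qwen:0.5b"}'

Because OLLAMA_MODELS points at /data/models and that path is mounted from ./models, the downloaded model files end up under ./models on the host, which is exactly the storage-location change mentioned in the title.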

Then you can test it with a request:

curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "qwen:0.5b",
        "stream": true,
        "messages": [
            {
                "role": "user",
                "content": "你好"
            }
        ]
    }'

The response streams back as a sequence of SSE chunks:

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"你好"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":","},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"很高兴"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"为您"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"服务"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"。"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"有什么"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"我可以"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"帮助"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"您的"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"吗"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"?"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377123,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"\n"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377123,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":"stop"}]}

data: [DONE]

The response is very fast. Works well enough.
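
Since the endpoint is OpenAI-compatible, the same request also works without streaming. A small sketch with "stream": false, which should return a single chat.completion JSON object instead of SSE chunks:

curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "qwen:0.5b",
        "stream": false,
        "messages": [
            {"role": "user", "content": "你好"}
        ]
    }'

For the same reason, existing OpenAI SDK clients can be pointed at this server by setting the base URL to http://localhost:8000/v1.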
