【ollama】(5): Start the Ollama image locally with docker-compose, change the model storage location, and download the qwen-0.5b model; it is blazingly fast

1, The Ollama project

Ollama is a powerful framework designed for deploying LLMs in Docker containers. Its main job is to make deploying and managing an LLM inside a Docker container straightforward. It lets you get a large model running locally very quickly: after a simple install, a single command is enough to run an open-source large language model such as Llama 2.
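
For example (a minimal illustration of that one-command workflow, using the Llama 2 model mentioned above; the qwen:0.5b model used later in this post works the same way):

# pull the model on first use and start an interactive chat session
ollama run llama2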

https://ollama.com/

https://www.bilibili.com/video/BV1HC411Y7P1/?vd_source=4b290247452adda4e56d84b659b0c8a2

【ollama】(5): Start the Ollama image locally with docker-compose and download the qwen-0.5b model; it is blazingly fast

2, The full docker-compose configuration is as follows:

version: '3.5'

services:

##################### 使用ollama部署大模型 #####################

# OLLAMA_HOST       The host:port to bind to (default "127.0.0.1:11434")
# OLLAMA_ORIGINS    A comma separated list of allowed origins.
# OLLAMA_MODELS     The path to the models directory (default is "~/.ollama/models")

  ollama:
    restart: always
    container_name: ollama
    image: ollama/ollama
    ports:
      - 8000:8000
    environment:
      - OLLAMA_HOST=0.0.0.0:8000
      - OLLAMA_MODELS=/data/models
    volumes:
      - ./models/:/data/models
    # start the server (runs `ollama serve`)
    command: serve
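
Before starting it, you can sanity-check the file: `docker-compose config` validates the YAML and prints the resolved configuration.

# validate the compose file and print the resolved configuration
docker-compose config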

Start the Docker service:

docker-compose up -d
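
Next, pull the qwen:0.5b model inside the running container and confirm it is available. This is a short sketch assuming the container name ollama from the compose file above and the model tag qwen:0.5b used in the test below:

# download the qwen 0.5b model into the mounted models directory
docker exec -it ollama ollama pull qwen:0.5b

# confirm the model is registered
docker exec -it ollama ollama list

# optionally, follow the server logs while it downloads
docker-compose logs -f ollama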

Then you can test it with the following command:

curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "qwen:0.5b","stream":true,
        "messages": [
            {
                "role": "user",
                "content": "你好"
            }
        ]
    }'

The streaming output comes back almost instantly:
data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"你好"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":","},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"很高兴"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"为您"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"服务"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"。"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"有什么"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"我可以"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"帮助"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"您的"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"吗"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377122,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"?"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377123,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"\n"},"finish_reason":null}]}

data: {"id":"chatcmpl-163","object":"chat.completion.chunk","created":1710377123,"model":"qwen:0.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":"stop"}]}

data: [DONE]

The response speed is really fast. Pretty decent overall.
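
Because OLLAMA_MODELS points at /data/models and that path is bind-mounted from ./models, the downloaded model files end up next to the compose file on the host. A quick check (blobs/ and manifests/ is the layout current Ollama versions use for model storage):

# the model data now lives on the host, outside the container
ls ./models
# expected subdirectories: blobs/  manifests/

# total size of the downloaded models
du -sh ./models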
