I. Introduction to Chainlit
1. Official documentation
The official Chainlit documentation: https://docs.chainlit.io
2. Installing Python
Install Python 3.10 on a CentOS server.
Install the build dependencies:
bash
# install the build dependencies
sudo yum install gcc openssl-devel bzip2-devel libffi-devel zlib-devel wget sqlite-devel
# download the Python source
wget https://www.python.org/ftp/python/3.10.0/Python-3.10.0.tgz
# extract it
tar -zxvf Python-3.10.0.tgz
Build and install Python 3.10:
bash
# enter the source directory
cd Python-3.10.0
# configure the build with optimizations
./configure --enable-optimizations
# compile with 8 parallel jobs
make -j 8
# install as python3.10 without overwriting the system python
sudo make altinstall
Verify the installation:
bash
# check the installed version
python3.10 --version
3. Installing Chainlit
Create a virtual environment:
bash
# create a virtual environment
python3.10 -m venv myenv
# activate the virtual environment
source myenv/bin/activate
# deactivate it when you are done
deactivate
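To confirm which interpreter the activated environment is using, a quick check from inside the venv (a tiny sketch) is:
python
import sys

# when myenv is active, both paths point inside the myenv/ directory
print(sys.executable)
print(sys.prefix)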
Install the dependencies:
bash
# install chainlit
pip install chainlit
# install langchain
pip install langchain
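To confirm both packages landed in the virtual environment, a small sanity check such as the following can be run (a minimal sketch; the printed versions depend on what pip resolved):
python
# sanity check: both packages are importable and report a version
from importlib.metadata import version

for pkg in ("chainlit", "langchain"):
    print(pkg, version(pkg))  # raises PackageNotFoundError if a package is missing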
Create a file named azure_demo.py with the following content:
python
import os
import chainlit as cl
from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    HumanMessage,
    SystemMessage
)

# company Azure OpenAI credentials
os.environ["OPENAI_API_KEY"] = 'xxxxx'
os.environ["OPENAI_API_BASE"] = 'https://opencatgpt.openai.azure.com/'
os.environ["OPENAI_API_TYPE"] = 'azure'
os.environ["OPENAI_API_VERSION"] = '2023-05-15'

chat = ChatOpenAI(model_name="gpt-35-turbo", engine="gpt-35-turbo")
history = [SystemMessage(content="你是一个聊天机器人,请回答下列问题。\n")]

@cl.on_message  # this function will be called every time a user inputs a message in the UI
async def main(message: str):
    # history = [SystemMessage(content="你是一个聊天机器人,请回答下列问题。\n")]
    history.append(HumanMessage(content=message))
    # run the blocking LangChain call in a worker thread
    res = await cl.make_async(sync_func)()
    # res = chat(history)
    # print(res.content)
    # this is an intermediate step
    # await cl.Message(author="Tool 1", content=f"Response from tool1", indent=1).send()
    # send back the final answer
    history.append(res)
    await cl.Message(content=f"{res.content}").send()

def sync_func():
    return chat(history)
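A note on cl.make_async above: chat(history) is a blocking call, so wrapping it with cl.make_async runs it in a worker thread and keeps Chainlit's event loop responsive. The same pattern in isolation (a minimal sketch; slow_call is a hypothetical stand-in for any blocking SDK call):
python
import time
import chainlit as cl

def slow_call(prompt: str) -> str:
    # hypothetical stand-in for a blocking call such as chat(history)
    time.sleep(2)
    return f"echo: {prompt}"

@cl.on_message
async def reply(message: str):
    # awaiting the wrapped function does not block the event loop
    result = await cl.make_async(slow_call)(message)
    await cl.Message(content=result).send()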
Option 2:
python
import openai
import chainlit as cl

openai.proxy = 'http://127.0.0.1:7890'
openai.api_key = "xxxx"

# model_name = "text-davinci-003"
model_name = "gpt-3.5-turbo"

settings = {
    "temperature": 0.7,
    "max_tokens": 500,
    "top_p": 1,
    "frequency_penalty": 0,
    "presence_penalty": 0,
}

@cl.on_chat_start
def start_chat():
    cl.user_session.set(
        "message_history",
        [{"role": "system", "content": "You are a helpful assistant."}],
    )

@cl.on_message
async def main(message: str):
    message_history = cl.user_session.get("message_history")
    message_history.append({"role": "user", "content": message})

    msg = cl.Message(content="")

    # stream the completion token by token into the UI
    async for stream_resp in await openai.ChatCompletion.acreate(
        model=model_name, messages=message_history, stream=True, **settings
    ):
        token = stream_resp.choices[0]["delta"].get("content", "")
        await msg.stream_token(token)

    message_history.append({"role": "assistant", "content": msg.content})
    await msg.send()
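One thing to watch in this version: message_history grows on every turn, so a long conversation will eventually exceed the model's context window. A simple safeguard (a sketch; MAX_TURNS is an arbitrary cap, not something Chainlit or OpenAI require) is to keep the system prompt plus only the most recent turns before each request:
python
MAX_TURNS = 10  # arbitrary cap on how many recent messages to keep

def trim_history(message_history: list) -> list:
    # keep the leading system message and only the last MAX_TURNS entries
    system, rest = message_history[:1], message_history[1:]
    return system + rest[-MAX_TURNS:]

Calling trim_history(message_history) just before the acreate call keeps the request size bounded.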
4. Startup script
Start the app in the background (Chainlit serves on port 8000 by default):
bash
nohup chainlit run azure_demo.py &
5. One-command restart
bash
# print the PID of any running azure_demo process
echo `ps -ef | grep azure_demo | grep -v grep | awk '{print $2}'`
# kill it
kill -9 `ps -ef | grep azure_demo | grep -v grep | awk '{print $2}'`
# re-enter the project directory, activate the venv, and restart in the background
cd /kwan/chainlit
python3.10 -m venv myenv
source myenv/bin/activate
nohup chainlit run azure_demo.py >/dev/null 2>&1 & exit
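If a Python script is preferred over raw shell for this restart flow, the same steps can be driven from the standard library (a sketch; it assumes the project lives in /kwan/chainlit and the myenv virtual environment already exists):
python
import subprocess

# find any running azure_demo process (same ps/grep/awk pipeline as above)
pids = subprocess.run(
    "ps -ef | grep azure_demo | grep -v grep | awk '{print $2}'",
    shell=True, capture_output=True, text=True,
).stdout.split()

# stop the old process(es)
for pid in pids:
    subprocess.run(["kill", "-9", pid])

# restart Chainlit in the background from the project directory, inside the venv
subprocess.Popen(
    "source myenv/bin/activate && nohup chainlit run azure_demo.py >/dev/null 2>&1 &",
    shell=True, executable="/bin/bash", cwd="/kwan/chainlit",
)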
II. Docker deployment
1. GitHub repository
https://github.com/amjadraza/langchain-chainlit-docker-deployment-template
2. Dockerfile
dockerfile
FROM python:3.11-slim-buster as builder
#RUN apt-get update && apt-get install -y git
RUN pip install poetry==1.4.2 -i https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple/ \
&& pip install DBUtils==3.0.3 -i https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple/ \
&& pip install PyMySQL==1.1.0 -i https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple/
ENV POETRY_NO_INTERACTION=1 \
POETRY_VIRTUALENVS_IN_PROJECT=1 \
POETRY_VIRTUALENVS_CREATE=1 \
POETRY_CACHE_DIR=/tmp/poetry_cache
ENV HOST=0.0.0.0
ENV LISTEN_PORT 8000
EXPOSE 8000
WORKDIR /app
COPY pyproject.toml poetry.lock ./
RUN poetry config repositories.clearlydefined https://pypi.tuna.tsinghua.edu.cn/simple/
RUN poetry config cache-dir /kwan/chainlit/demo
RUN poetry config virtualenvs.create false
RUN poetry install --without dev --no-root && rm -rf $POETRY_CACHE_DIR
# The runtime image, used to just run the code provided its virtual environment
FROM python:3.11-slim-buster as runtime
ENV VIRTUAL_ENV=/app/.venv \
PATH="/app/.venv/bin:$PATH"
COPY --from=builder ${VIRTUAL_ENV} ${VIRTUAL_ENV}
COPY ./demo_app ./demo_app
COPY ./.chainlit ./.chainlit
COPY chainlit.md ./
CMD ["chainlit", "run", "demo_app/main.py"]
3. Adding dependencies
toml
# add these dependencies to pyproject.toml
[tool.poetry.dependencies]
python = "^3.10"
langchain = "0.0.199"
openai = "0.27.8"
chainlit = "0.5.2"
DBUtils = "3.0.3"
PyMySQL = "1.1.0"
Then refresh the lock file:
bash
# poetry update regenerates poetry.lock from the updated pyproject.toml
poetry update
4. Deployment steps
bash
# create the working directory
mkdir -p /kwan/chainlit
# enter it
cd /kwan/chainlit
# clone the source
git clone https://github.com/amjadraza/langchain-chainlit-docker-deployment-template
# enter the project directory
cd /kwan/chainlit/langchain-chainlit-docker-deployment-template
# edit the application code under this directory:
# /kwan/chainlit/langchain-chainlit-docker-deployment-template/demo_app
# build the image
DOCKER_BUILDKIT=1 docker build --target=runtime . -t langchain-chainlit-chat-app:latest
# start the container
docker run -d --name langchain-chainlit-chat-app -p 8000:8000 langchain-chainlit-chat-app
# remove the container
docker rm -f langchain-chainlit-chat-app
# follow the container logs
docker logs -f langchain-chainlit-chat-app
# list all containers
docker ps -a
5. Updating the configuration
bash
# edit the Chainlit configuration under .chainlit/
cd /kwan/chainlit/.chainlit
# edit the welcome page markdown (chainlit.md)
cd /kwan/chainlit
6. Access verification
text
# public URL
http://120.79.36.53:8000/
# company intranet URL
http://10.201.0.6:8000/
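A scripted check can confirm the service answers over HTTP (a sketch; it only verifies that the Chainlit UI responds, and the URL below is the public address from this example):
python
# healthcheck: verify the Chainlit app answers over HTTP
import urllib.request

URL = "http://120.79.36.53:8000/"  # replace with your own host and port

with urllib.request.urlopen(URL, timeout=5) as resp:
    print(resp.status)  # 200 means the Chainlit UI is being served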