Building a ChatGPT Conversation Backend with FastAPI

Reference: Building a ChatGPT conversation backend with FastAPI

Goal: build a local web page with a ChatGPT-like conversation experience, where the generated reply streams back character by character.

A First Call to ChatGPT

python

import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": False
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.json())

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))
bash
{'id': 'chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ', 'object': 'chat.completion', 'created': 1727316691, 'model': 'gpt-3.5-turbo', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 9, 'completion_tokens': 9, 'total_tokens': 18}, 'system_fingerprint': 'fp_808245b034'}

Parsing the response: here the complete reply is returned in a single message.

bash

{
    "id": "chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ",  # unique identifier used to trace the request
    "object": "chat.completion",  # object type: a chat completion
    "created": 1727316691,  # Unix timestamp of when the response was created
    "model": "gpt-3.5-turbo",  # name of the model used
    "choices": [  # list of choices; it may contain several replies, here only one
        {
            "index": 0,  # index of this choice
            "message": {  # the message of this choice
                "role": "assistant",  # role of the message, here the assistant
                "content": "Hello! How can I assist you today?"  # message content
            },
            "finish_reason": "stop"  # why generation finished; "stop" means the model decided to stop generating
        }
    ],
    "usage": {  # token usage statistics
        "prompt_tokens": 9,  # number of prompt tokens
        "completion_tokens": 9,  # number of completion tokens
        "total_tokens": 18  # total number of tokens
    },
    "system_fingerprint": "fp_808245b034"  # system fingerprint identifying the backend configuration that served the request
}
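
To pull the assistant's reply out of this one-shot response, index into choices[0].message.content. A minimal sketch, not part of the original script, where resp stands in for the dictionary returned by response.json():

python

# Minimal sketch: extract the assistant's reply from a non-streaming response.
# `resp` stands in for the dict returned by response.json() above.
resp = {
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I assist you today?"},
            "finish_reason": "stop",
        }
    ]
}
answer = resp["choices"][0]["message"]["content"]
print(answer)  # -> Hello! How can I assist you today?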

Streaming Calls to ChatGPT

Change "stream": False to "stream": True in the code above, and print response.text instead of response.json():

python

import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.text)

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))

You can see that GPT returns the result one token at a time:

bash
data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" How"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" can"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" I"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" assist"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" you"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" today"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"?"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
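
Each line of this output is a Server-Sent Events record: the prefix "data: " followed by a JSON chunk whose choices[0].delta carries the next fragment of text, until the literal [DONE] marker ends the stream. As a minimal illustration before automating this below (the raw_line value is copied from the output above):

python

import json

# Minimal sketch: decode a single raw SSE line from the stream above.
raw_line = 'data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}'
chunk = json.loads(raw_line.removeprefix("data: "))  # removeprefix requires Python 3.9+
print(chunk["choices"][0]["delta"].get("content", ""))  # -> Hello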

Processing the Streamed Response Line by Line

Modify the corresponding part of the code:

python

    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                print(line)
The complete script:

python

import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict
import json
from collections import defaultdict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the request and yield response chunks.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                choices = data.get("choices")
                if not choices or choices[0].get("finish_reason") is not None:
                    return
                yield choices[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    async for i in request(message):
        print(i)

if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
{'index': 0, 'delta': {'role': 'assistant', 'content': ''}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '好'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': ','}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '有'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '什'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '么'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '可以'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '帮'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '助'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '的'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '吗'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '?'}, 'finish_reason': None}
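
The chat helper above prints one chunk dictionary per line. To reproduce the character-by-character "typing" effect directly in the terminal, one option (a sketch reusing the request() generator above, not part of the original code) is to print only the delta text without a newline:

python

# Sketch: stream the reply to the terminal character by character,
# reusing the async generator request() defined above.
async def chat_typewriter(inp: str):
    message = [{"role": "user", "content": inp}]
    async for choice in request(message):
        content = choice.get("delta", {}).get("content")
        if content:
            print(content, end="", flush=True)  # no newline -> typewriter effect
    print()  # final newline once the stream ends

# asyncio.run(chat_typewriter("你好啊"))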

Wrapping the request and chat Methods

python

import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict
import json
from collections import defaultdict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the request and yield response chunks.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                choices = data.get("choices")
                # finish_reason lives on the choice itself, not inside delta
                if not choices or choices[0].get("finish_reason") is not None:
                    return
                yield choices[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    chat_msg = defaultdict(str)
    async for i in request(message):
        if i.get("delta").get("role"):
            chat_msg["role"] = i.get("delta").get("role")
        if i.get("delta").get("content"):
            chat_msg["content"] += i.get("delta").get("content")
    print(chat_msg)

if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
defaultdict(<class 'str'>, {'role': 'assistant', 'content': '你好!有什么我可以帮助你的吗?'})
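
With the accumulated chat_msg in hand, a multi-turn conversation only requires appending each assistant reply back into the message history before the next question. A rough terminal-loop sketch (not in the original article), reusing request() and defaultdict from above:

python

# Sketch: a simple multi-turn loop in the terminal, reusing request() from above.
async def chat_loop():
    message = []
    while True:
        inp = input("You: ")
        if inp == "quit":
            break
        message.append({"role": "user", "content": inp})
        chat_msg = defaultdict(str)
        async for choice in request(message):
            delta = choice.get("delta", {})
            if delta.get("role"):
                chat_msg["role"] = delta["role"]
            if delta.get("content"):
                chat_msg["content"] += delta["content"]
        print("Assistant:", chat_msg["content"])
        message.append(dict(chat_msg))  # keep the reply in the history

# asyncio.run(chat_loop())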

Wrapping It with FastAPI

python

import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict
from fastapi import FastAPI, WebSocket
from fastapi.middleware.cors import CORSMiddleware

import json
from collections import defaultdict
from fastapi.responses import HTMLResponse

app = FastAPI()
dotenv.load_dotenv('./env')


# app.add_middleware(
#     CORSMiddleware,
#     allow_origins=["*"],
#     allow_credentials=True,
#     allow_methods=["*"],
#     allow_headers=["*"],
# )


@app.get("/")
async def root():
    return {"message": "Hello World"}

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the request and yield response chunks.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                choices = data.get("choices")
                # finish_reason lives on the choice itself, not inside delta
                if not choices or choices[0].get("finish_reason") is not None:
                    return
                yield choices[0]

@app.websocket("/chat")
async def chat(websocket: WebSocket):
    await websocket.accept()
    message = []
    while True:
        data = await websocket.receive_text()
        if data == "quit":
            await websocket.close()
            break
        message.append({"role": "user", "content": data})
        chat_msg = defaultdict(str)
        async for i in request(message):
            if i.get("delta").get("role"):
                chat_msg["role"] = i.get("delta").get("role")
            if i.get("delta").get("content"):
                chat_msg["content"] += i.get("delta").get("content")
                await websocket.send_text(i.get("delta").get("content"))
        message.append(chat_msg)  # keep the assistant reply in the conversation history

if __name__ == '__main__':
    import uvicorn
    # assumes this file is saved as main.py
    uvicorn.run("main:app", host="127.0.0.1", port=8080, reload=True)

Calling It from an HTML Page

html

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Title</title>
</head>
<body>
    <p>Connection status: <span id="status">not connected</span></p>
    <p>Reply: <span id="message"></span></p>
    <p><input id="inp"></p>
    <button type="submit" id="submit">Submit</button>
<script>
    let status = document.getElementById("status")
    let message = document.getElementById("message")
    let inp = document.getElementById("inp")
    let submit = document.getElementById("submit")

    let socket = new WebSocket("ws://127.0.0.1:8080/chat")
    socket.addEventListener("open", (event)=>{
        status.innerText = "connected"
    })
    socket.addEventListener("error", (event)=>{
        status.innerText = "failed"
    })
    socket.addEventListener("close", (event)=>{
        status.innerText = "closed"
        console.log("WebSocket closed:", event)
    })
    socket.addEventListener("message", (event)=>{
        message.innerText += event.data
    })
    submit.addEventListener("click", ()=>{
        socket.send(inp.value)
    })
</script>
</body>
</html>
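
The FastAPI script above already imports HTMLResponse without using it. One way to put it to work (an assumption, not shown in the original) is to serve this page from the same app, assuming the markup is saved as index.html next to main.py:

python

# Sketch: serve the HTML page from the same FastAPI app.
# Assumes the markup above is saved as index.html in the working directory.
from fastapi.responses import HTMLResponse

@app.get("/index", response_class=HTMLResponse)
async def index():
    with open("index.html", encoding="utf-8") as f:
        return f.read()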