Building a ChatGPT Chat Backend with FastAPI

Reference: 使用fastapi搭建ChatGPT对话后台

Goal: build a local web page with a ChatGPT-style conversation, where the generated reply comes back character by character.
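
All of the scripts below load the API key with python-dotenv from a file named env in the working directory. The contents of that file aren't shown in the article; the code only reads OPENAI_API_KEY (OPENAI_API_BASE appears only in a commented-out debug print), so a quick sanity check before running anything could look like this sketch:

python
import os
import dotenv

dotenv.load_dotenv('./env')  # note: the article uses "env", not the more common ".env"
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY must be set in ./env"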

A first call to ChatGPT

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List,Dict
dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers ={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val, # [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": False
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.json())

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))
bash
{'id': 'chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ', 'object': 'chat.completion', 'created': 1727316691, 'model': 'gpt-3.5-turbo', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 9, 'completion_tokens': 9, 'total_tokens': 18}, 'system_fingerprint': 'fp_808245b034'}

Parsing the response. With this call, the full message content comes back in one go:

bash
{
    "id": "chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ",  # unique identifier used to trace the request
    "object": "chat.completion",  # object type: a chat completion
    "created": 1727316691,  # Unix timestamp of when the response was created
    "model": "gpt-3.5-turbo",  # the model that was used
    "choices": [  # list of choices; it may contain several replies, here just one
        {
            "index": 0,  # index of this choice
            "message": {  # the message of this choice
                "role": "assistant",  # message role, here the assistant
                "content": "Hello! How can I assist you today?"  # message content
            },
            "finish_reason": "stop"  # why generation finished; "stop" means the model decided to stop generating
        }
    ],
    "usage": {  # token usage
        "prompt_tokens": 9,  # number of prompt tokens
        "completion_tokens": 9,  # number of completion tokens
        "total_tokens": 18  # total number of tokens
    },
    "system_fingerprint": "fp_808245b034"  # system fingerprint, identifies the system environment that served the request
}
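
With stream set to False the whole reply arrives at once, so extracting the text is just a matter of indexing into the first choice. A minimal sketch (e.g. inside the request function above, instead of only printing the raw JSON):

python
data = response.json()
reply = data["choices"][0]["message"]["content"]
print(reply)  # Hello! How can I assist you today?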

Streaming calls to ChatGPT

In the code above, change "stream" to True and print response.text instead of response.json():

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List,Dict
dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers ={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val, # [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.text)

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))

You can see that GPT returns the result a word at a time:

bash
data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" How"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" can"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" I"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" assist"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" you"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" today"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"?"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
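
Each non-empty line of the stream is a Server-Sent Events record of the form data: <json>, and the stream finishes with data: [DONE]. The next section processes the whole stream line by line; as a minimal illustration of decoding a single chunk (the example string below is abbreviated from the output above):

python
import json

chunk = 'data: {"choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}'
payload = chunk.removeprefix("data: ")
if payload != "[DONE]":
    delta = json.loads(payload)["choices"][0]["delta"]
    print(delta.get("content", ""))  # -> Hello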

Processing the streaming response line by line

Modify the request code accordingly:

python
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                print(line)
python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List,Dict
import json
from collections import defaultdict
dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers ={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val, # [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ","")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    async for i in request(message):
        print(i)
if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
{'index': 0, 'delta': {'role': 'assistant', 'content': ''}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '好'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': ','}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '有'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '什'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '么'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '可以'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '帮'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '助'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '的'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '吗'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '?'}, 'finish_reason': None}

Wrapping the request and chat methods

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List,Dict
import json
from collections import defaultdict
dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers ={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val, # [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ","")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("delta").get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    chat_msg = defaultdict(str)
    async for i in request(message):
        if i.get("delta").get("role"):
            chat_msg["role"] = i.get("delta").get("role")
        if i.get("delta").get("content"):
            chat_msg["content"] += i.get("delta").get("content")
    print(chat_msg)
if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
defaultdict(<class 'str'>, {'role': 'assistant', 'content': '你好!有什么我可以帮助你的吗?'})
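
The accumulated chat_msg now holds the whole assistant reply in a single dict. In the FastAPI version below, that dict is appended back onto the message list after every turn, so later requests carry the full conversation history and the chat becomes multi-turn.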

Wrapping it with FastAPI

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List,Dict
from fastapi import FastAPI, WebSocket
from fastapi.middleware.cors import CORSMiddleware

import json
from collections import defaultdict
from fastapi.responses import HTMLResponse

app = FastAPI()
dotenv.load_dotenv('./env')


# app.add_middleware(
#     CORSMiddleware,
#     allow_origins=["*"],
#     allow_credentials=True,
#     allow_methods=["*"],
#     allow_headers=["*"],
# )


@app.get("/")
async def root():
    return {"message": "Hello World"}

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send the chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers ={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY"),

    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val, # [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ","")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("delta").get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

@app.websocket("/chat")
async def chat(websocket: WebSocket):
    await websocket.accept()
    message = []  # conversation history for this connection
    while True:
        data = await websocket.receive_text()
        if data == "quit":  # let the client end the session explicitly
            await websocket.close()
            break
        message.append({"role": "user", "content": data})
        chat_msg = defaultdict(str)
        async for i in request(message):
            if i.get("delta").get("role"):
                chat_msg["role"] = i.get("delta").get("role")
            if i.get("delta").get("content"):
                chat_msg["content"] += i.get("delta").get("content")
                # forward each token to the browser as soon as it arrives
                await websocket.send_text(i.get("delta").get("content"))
        message.append(chat_msg)  # keep the assistant reply in the history for multi-turn context

if __name__ == '__main__':
    import uvicorn
    uvicorn.run("main:app", host="127.0.0.1",port=8080,reload=True)
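
To try the /chat endpoint without a browser, a small script can drive the WebSocket directly. A minimal sketch using the third-party websockets package (not used elsewhere in this article, so treat it as an optional test dependency); since the server sends one frame per token and never signals end-of-reply, the sketch simply stops reading after a short idle period:

python
import asyncio
import websockets  # pip install websockets

async def main():
    async with websockets.connect("ws://127.0.0.1:8080/chat") as ws:
        await ws.send("你好啊")
        reply = ""
        while True:
            try:
                token = await asyncio.wait_for(ws.recv(), timeout=2)
            except asyncio.TimeoutError:
                break
            reply += token
        print(reply)
        await ws.send("quit")  # ask the server to close the session

asyncio.run(main())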

Calling it from an HTML page

html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Title</title>
</head>
<body>
    <p>Connection status: <span id="status">Not connected</span></p>
    <p>Reply: <span id="message"></span></p>
    <p><input id="inp"></p>
    <button type="submit" id="submit">Submit</button>
<script>
    let status = document.getElementById("status")
    let message = document.getElementById("message")
    let inp = document.getElementById("inp")
    let submit = document.getElementById("submit")

    let socket = new WebSocket("ws://127.0.0.1:8080/chat")
    socket.addEventListener("open", (event)=>{
        status.innerText = "Connected"
    })
    socket.addEventListener("error", (event)=>{
        status.innerText = "Connection failed"
    })
    socket.addEventListener("close", (event)=>{
        status.innerText = "Closed"
        console.log("WebSocket closed:", event)
    })
    socket.addEventListener("message", (event)=>{
        message.innerText += event.data
    })
    submit.addEventListener("click", ()=>{
        socket.send(inp.value)
    })
</script>
</body>
</html>
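
Open this page in a browser while the FastAPI server is running. Each WebSocket message event appends one token to the message span, so the reply renders incrementally, mimicking the ChatGPT typing effect; sending quit tells the server to close the connection.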