Building a ChatGPT Chat Backend with FastAPI

Reference: 使用fastapi搭建ChatGPT对话后台

Goal: build a local web page with a ChatGPT-like chat experience, where the generated reply is streamed back to the page character by character.
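
Before running the snippets below, make sure the API key actually loads: the code reads it with dotenv from a file named env in the working directory. A minimal sanity check (a sketch, assuming that file contains an OPENAI_API_KEY=... line; OPENAI_API_BASE only appears in a commented-out print):

python
# Sketch: verify that ./env exists and provides OPENAI_API_KEY before calling the API.
import os
import dotenv

dotenv.load_dotenv('./env')
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is missing from ./env"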

A first call to ChatGPT

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": False
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.json())

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))
bash
{'id': 'chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ', 'object': 'chat.completion', 'created': 1727316691, 'model': 'gpt-3.5-turbo', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 9, 'completion_tokens': 9, 'total_tokens': 18}, 'system_fingerprint': 'fp_808245b034'}

Parsing the reply. Here the whole message content is returned in one shot:

bash
{
    "id": "chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ",  # unique identifier used to track the request
    "object": "chat.completion",  # object type: a chat completion
    "created": 1727316691,  # Unix timestamp of when the response was created
    "model": "gpt-3.5-turbo",  # model used
    "choices": [  # list of choices; may contain several replies, here only one
        {
            "index": 0,  # index of this choice
            "message": {  # the message of this choice
                "role": "assistant",  # message role, here the assistant
                "content": "Hello! How can I assist you today?"  # message content
            },
            "finish_reason": "stop"  # why generation finished; "stop" means the model decided to stop
        }
    ],
    "usage": {  # token usage
        "prompt_tokens": 9,  # number of prompt tokens
        "completion_tokens": 9,  # number of completion tokens
        "total_tokens": 18  # total number of tokens
    },
    "system_fingerprint": "fp_808245b034"  # system fingerprint identifying the serving environment
}
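
To pull just the reply text out of this structure, index into choices and take the message content. A minimal sketch (resp stands in for the dict returned by response.json() above):

python
# Sketch: extract the assistant's reply from the parsed non-streaming response.
resp = {
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I assist you today?"},
            "finish_reason": "stop"
        }
    ]
}
reply = resp["choices"][0]["message"]["content"]
print(reply)  # Hello! How can I assist you today?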

Streaming calls to ChatGPT

In the code above, change "stream": False to "stream": True and print response.text instead of response.json():

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.text)

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))

You can see that GPT returns the result one word at a time:

bash
data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" How"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" can"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" I"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" assist"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" you"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" today"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"?"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
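
Each non-empty line is a server-sent event: a "data: " prefix followed by a JSON chunk whose choices[0].delta carries the newly generated text (an empty delta plus finish_reason "stop" marks the last chunk, and a literal [DONE] line ends the stream). A quick sketch of decoding a single line (the line value here is a shortened example, not verbatim server output):

python
# Sketch: decode one SSE line from the stream above and read the new text.
import json

line = 'data: {"choices": [{"index": 0, "delta": {"content": "Hello"}, "finish_reason": null}]}'
payload = line.replace("data: ", "", 1)  # drop the SSE prefix
chunk = json.loads(payload)
print(chunk["choices"][0]["delta"].get("content"))  # -> Hello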

Processing the streamed response line by line

Modify the corresponding part of the code to open a streaming request and iterate over the response line by line:

python
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                print(line)

The full script, with request turned into an async generator that yields each choice delta:

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict
import json
from collections import defaultdict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request and yield each choice delta.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")  # strip the SSE "data: " prefix
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    async for i in request(message):
        print(i)

if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
{'index': 0, 'delta': {'role': 'assistant', 'content': ''}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '好'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': ','}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '有'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '什'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '么'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '可以'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '帮'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '助'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '的'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '吗'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '?'}, 'finish_reason': None}

Wrapping the request and chat methods

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict
import json
from collections import defaultdict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request and yield each choice delta.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")  # strip the SSE "data: " prefix
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                # finish_reason lives on the choice itself, not inside delta
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    chat_msg = defaultdict(str)  # accumulate the streamed deltas into one complete message
    async for i in request(message):
        if i.get("delta").get("role"):
            chat_msg["role"] = i.get("delta").get("role")
        if i.get("delta").get("content"):
            chat_msg["content"] += i.get("delta").get("content")
    print(chat_msg)

if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
defaultdict(<class 'str'>, {'role': 'assistant', 'content': '你好!有什么我可以帮助你的吗?'})

Wrapping it with FastAPI

python
import os
import fastapi
import dotenv
from httpx import AsyncClient
from typing import List, Dict
from fastapi import FastAPI, WebSocket
from fastapi.middleware.cors import CORSMiddleware

import json
from collections import defaultdict
from fastapi.responses import HTMLResponse

app = FastAPI()
dotenv.load_dotenv('./env')


# app.add_middleware(
#     CORSMiddleware,
#     allow_origins=["*"],
#     allow_credentials=True,
#     allow_methods=["*"],
#     allow_headers=["*"],
# )


@app.get("/")
async def root():
    return {"message": "Hello World"}

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request and yield each choice delta.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY"),
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")  # strip the SSE "data: " prefix
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                # finish_reason lives on the choice itself, not inside delta
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

@app.websocket("/chat")
async def chat(websocket: WebSocket):
    await websocket.accept()
    message = []  # conversation history for this connection
    while True:
        data = await websocket.receive_text()
        if data == "quit":
            await websocket.close()
            break
        message.append({"role": "user", "content": data})
        chat_msg = defaultdict(str)
        async for i in request(message):
            if i.get("delta").get("role"):
                chat_msg["role"] = i.get("delta").get("role")
            if i.get("delta").get("content"):
                chat_msg["content"] += i.get("delta").get("content")
                # forward each content chunk to the client as soon as it arrives
                await websocket.send_text(i.get("delta").get("content"))
        message.append(chat_msg)  # keep the assistant reply in the history

if __name__ == '__main__':
    import uvicorn
    uvicorn.run("main:app", host="127.0.0.1", port=8080, reload=True)  # assumes this file is saved as main.py
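
Besides the HTML page in the next section, the WebSocket endpoint can also be exercised from a small Python client. A minimal sketch, assuming the server above is already running on 127.0.0.1:8080 and that the third-party websockets package is installed (pip install websockets); since the server sends no end-of-reply marker, the client simply stops reading after a short idle timeout:

python
# Sketch of a test client for the /chat WebSocket endpoint defined above.
import asyncio
import websockets  # assumed dependency: pip install websockets

async def main():
    async with websockets.connect("ws://127.0.0.1:8080/chat") as ws:
        await ws.send("你好啊")
        reply = ""
        while True:
            try:
                # The server streams chunks with no terminator, so give up after 5s of silence.
                chunk = await asyncio.wait_for(ws.recv(), timeout=5)
            except asyncio.TimeoutError:
                break
            reply += chunk
            print(chunk, end="", flush=True)
        print()
        await ws.send("quit")  # ask the server to close the connection

if __name__ == '__main__':
    asyncio.run(main())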

Calling it from an HTML page

html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Title</title>
</head>
<body>
    <p>连接状态: <span id="status">未连接</span></p>
    <p>回复消息: <span id="message"></span></p>
    <p><input id="inp"></p>
    <button type="submit" id="submit">提交</button>
<script>
    let status = document.getElementById("status")
    let message = document.getElementById("message")
    let inp = document.getElementById("inp")
    let submit = document.getElementById("submit")

    let socket = new WebSocket("ws://127.0.0.1:8080/chat")
    socket.addEventListener("open", (event)=>{
        status.innerText = "已连接"
    })
    socket.addEventListener("error", (event)=>{
        status.innerText = "已失败"
    })
    socket.addEventListener("close", (event)=>{
        status.innerText = "已关闭"
        console.log("WebSocket closed:", event)
    })
    socket.addEventListener("message", (event)=>{
        message.innerText += event.data
    })
    submit.addEventListener("click", ()=>{
        socket.send(inp.value)
    })
</script>
</body>
</html>
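
To try it out, start the FastAPI server first (the script above runs uvicorn on 127.0.0.1:8080), then open this HTML file in a browser. Once the connection status shows 已连接 (connected), type a message into the input box and click 提交 (submit); the reply streams into the page chunk by chunk as the backend forwards each delta. Typing quit and submitting it closes the connection on the server side.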