Building a ChatGPT Conversation Backend with FastAPI

Reference: building a ChatGPT conversation backend with FastAPI

Goal: build a web page locally that reproduces a ChatGPT-style conversation, with the generated reply streamed back and displayed character by character.

A First Call to ChatGPT
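
The script below loads the API key from a dotenv file named env in the working directory (note the load_dotenv('./env') call, not the more common .env). A minimal sketch of what that file is assumed to contain, with a placeholder value:

bash
OPENAI_API_KEY=your-api-key-here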

python
import os
import dotenv
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": False
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.json())

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))
bash
{'id': 'chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ', 'object': 'chat.completion', 'created': 1727316691, 'model': 'gpt-3.5-turbo', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': 'Hello! How can I assist you today?'}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 9, 'completion_tokens': 9, 'total_tokens': 18}, 'system_fingerprint': 'fp_808245b034'}

Parsing the response: in this non-streaming mode the entire message content is returned in one go.

bash
{
    "id": "chatcmpl-ABYGZNDqhtZn5igtaukBbLPWrdTPZ",  # unique identifier used to track the request
    "object": "chat.completion",  # object type: a chat completion
    "created": 1727316691,  # Unix timestamp of when the response was created
    "model": "gpt-3.5-turbo",  # model used
    "choices": [  # list of choices; may contain several replies, only one here (n=1)
        {
            "index": 0,  # index of this choice
            "message": {  # the generated message
                "role": "assistant",  # message role, here the assistant
                "content": "Hello! How can I assist you today?"  # message content
            },
            "finish_reason": "stop"  # why generation ended; "stop" means the model chose to stop
        }
    ],
    "usage": {  # token usage
        "prompt_tokens": 9,  # tokens in the prompt
        "completion_tokens": 9,  # tokens in the completion
        "total_tokens": 18  # total tokens
    },
    "system_fingerprint": "fp_808245b034"  # fingerprint identifying the backend configuration that served the request
}
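
With this structure, pulling the assistant's reply out of the parsed response is a single indexing expression; a minimal sketch using the response object from the script above:

python
# The reply text lives at choices[0].message.content
reply = response.json()["choices"][0]["message"]["content"]
print(reply)  # Hello! How can I assist you today?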

Streaming Calls to ChatGPT

Modify the code above: set "stream": True in the request parameters and print response.text instead of response.json().

python
import os
import dotenv
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    async with AsyncClient() as client:
        response = await client.post(url, headers=headers, json=params, timeout=60)
        print(response.text)

if __name__ == '__main__':
    import asyncio
    asyncio.run(request([{"role": "user", "content": "Hello!"}]))

You can see that GPT returns the result word by word. The body is a Server-Sent Events stream: each line starts with "data: " and the stream ends with "data: [DONE]".

bash
data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" How"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" can"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" I"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" assist"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" you"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":" today"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{"content":"?"},"finish_reason":null}]}

data: {"id":"chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J","object":"chat.completion.chunk","created":1727316855,"model":"gpt-3.5-turbo","system_fingerprint":"fp_808245b034","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
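
Compared with the non-streaming response, each chunk replaces the message field with a delta that carries only the newly generated fragment (other fields omitted here):

bash
data: {
    "id": "chatcmpl-ABYJD1L22azzx2PK9IyqCaw2RrC7J",  # same id on every chunk of one reply
    "object": "chat.completion.chunk",  # one chunk of a streamed completion
    "choices": [{
        "index": 0,
        "delta": {"content": " How"},  # only the incremental text; the first chunk also carries "role"
        "finish_reason": null  # becomes "stop" in the final chunk, just before [DONE]
    }]
}

data: [DONE]  # sentinel marking the end of the stream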

Processing the Stream Line by Line

Modify the corresponding part of the code:

python
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                print(line)
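
The switch from client.post to client.stream matters here: post reads the whole response body before returning, while stream together with aiter_lines lets the coroutine consume the response line by line as the chunks arrive. The full script then becomes: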
python
import os
import dotenv
import json
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request and yield each parsed chunk.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    async for i in request(message):
        print(i)

if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
{'index': 0, 'delta': {'role': 'assistant', 'content': ''}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '好'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': ','}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '有'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '什'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '么'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '可以'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '帮'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '助'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '你'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '的'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '吗'}, 'finish_reason': None}
{'index': 0, 'delta': {'content': '?'}, 'finish_reason': None}

Wrapping the request and chat Functions

python
import os
import dotenv
import json
from collections import defaultdict
from httpx import AsyncClient
from typing import List, Dict

dotenv.load_dotenv('./env')

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request and yield each parsed chunk.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                # finish_reason sits at the choice level, not inside delta
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

async def chat(inp: str):
    message = [{"role": "user", "content": inp}]
    chat_msg = defaultdict(str)
    # Accumulate the streamed deltas into one complete assistant message
    async for i in request(message):
        if i.get("delta").get("role"):
            chat_msg["role"] = i.get("delta").get("role")
        if i.get("delta").get("content"):
            chat_msg["content"] += i.get("delta").get("content")
    print(chat_msg)

if __name__ == '__main__':
    import asyncio
    asyncio.run(chat("你好啊"))
bash
defaultdict(<class 'str'>, {'role': 'assistant', 'content': '你好!有什么我可以帮助你的吗?'})

Wrapping It with FastAPI

python
import os
import dotenv
import json
from collections import defaultdict
from typing import List, Dict

from httpx import AsyncClient
from fastapi import FastAPI, WebSocket
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()
dotenv.load_dotenv('./env')


# app.add_middleware(
#     CORSMiddleware,
#     allow_origins=["*"],
#     allow_credentials=True,
#     allow_methods=["*"],
#     allow_headers=["*"],
# )


@app.get("/")
async def root():
    return {"message": "Hello World"}

# print(os.getenv('OPENAI_API_BASE'))
async def request(val: List[Dict[str, str]]):
    """
    Send a streaming chat completion request and yield each parsed chunk.
    val: the conversation messages
    """
    url = "https://xiaoai.plus/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY"),
    }
    params = {
        "model": "gpt-3.5-turbo",
        "messages": val,  # e.g. [{"role": "user", "content": "Say this is a test!"}]
        "temperature": 0.7,
        "n": 1,
        "max_tokens": 3000,
        "stream": True
    }
    # async with AsyncClient() as client:
    #     response = await client.post(url, headers=headers, json=params, timeout=60)
    #     print(response.text)
    async with AsyncClient() as client:
        async with client.stream("POST", url, headers=headers, json=params, timeout=60) as response:
            async for line in response.aiter_lines():
                if line.strip() == "":
                    continue
                line = line.replace("data: ", "")
                if line.strip() == "[DONE]":
                    return
                data = json.loads(line)
                # finish_reason sits at the choice level, not inside delta
                if data.get("choices") is None or len(data.get("choices")) == 0 or data.get("choices")[0].get("finish_reason") is not None:
                    return
                yield data.get("choices")[0]

@app.websocket("/chat")
async def chat(websocket: WebSocket):
    await websocket.accept()
    message = []  # conversation history shared across turns on this connection
    while True:
        data = await websocket.receive_text()
        if data == "quit":
            await websocket.close()
            break
        message.append({"role": "user", "content": data})
        chat_msg = defaultdict(str)
        async for i in request(message):
            if i.get("delta").get("role"):
                chat_msg["role"] = i.get("delta").get("role")
            if i.get("delta").get("content"):
                chat_msg["content"] += i.get("delta").get("content")
                # forward each fragment to the browser as soon as it arrives
                await websocket.send_text(i.get("delta").get("content"))
        # keep the assistant reply in the history for the next turn
        message.append(chat_msg)

if __name__ == '__main__':
    import uvicorn
    uvicorn.run("main:app", host="127.0.0.1", port=8080, reload=True)
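
Before wiring up the HTML page, the /chat endpoint can be smoke-tested from Python. A minimal client sketch, assuming the third-party websockets package is installed (pip install websockets; it is not used anywhere in the original code):

python
import asyncio
import websockets  # assumed third-party dependency: pip install websockets

async def main():
    async with websockets.connect("ws://127.0.0.1:8080/chat") as ws:
        await ws.send("你好啊")
        # Print streamed fragments until the server has been quiet for a few seconds
        try:
            while True:
                fragment = await asyncio.wait_for(ws.recv(), timeout=5)
                print(fragment, end="", flush=True)
        except asyncio.TimeoutError:
            print()
        await ws.send("quit")  # asks the server to close the connection

asyncio.run(main())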

Calling It from an HTML Page

html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Title</title>
</head>
<body>
    <p>Connection status: <span id="status">Not connected</span></p>
    <p>Reply: <span id="message"></span></p>
    <p><input id="inp"></p>
    <button type="submit" id="submit">Send</button>
<script>
    let status = document.getElementById("status")
    let message = document.getElementById("message")
    let inp = document.getElementById("inp")
    let submit = document.getElementById("submit")

    let socket = new WebSocket("ws://127.0.0.1:8080/chat")
    socket.addEventListener("open", (event)=>{
        status.innerText = "Connected"
    })
    socket.addEventListener("error", (event)=>{
        status.innerText = "Connection failed"
    })
    socket.addEventListener("close", (event)=>{
        status.innerText = "Closed"
        console.log("WebSocket closed:", event)
    })
    socket.addEventListener("message", (event)=>{
        message.innerText += event.data
    })
    submit.addEventListener("click", ()=>{
        socket.send(inp.value)
    })
</script>
</body>
</html>
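
To try it: start the FastAPI app first (the uvicorn.run("main:app", ...) call assumes the server code is saved as main.py), then open this page in a browser. Each streamed fragment is appended to the reply span as it arrives, giving the character-by-character effect, and sending quit from the input box makes the server close the WebSocket.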