Calling the official service
You first need to register on the corresponding platform and create an API key, e.g. DeepSeek.

```python
import keyring
from openai import OpenAI

# Read the API key from the system keyring instead of hard-coding it
service_name, username = 'deepseek_test', 'API_KEY1'
API_KEY = keyring.get_password(service_name, username)

client = OpenAI(api_key=API_KEY, base_url="https://api.deepseek.com/v1")
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Are you a robot?"},
    ],
    stream=False,
)
print(response.choices[0].message.content)
```
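With `stream=False` the full reply arrives in one response; with `stream=True` the client instead yields chunks whose `choices[0].delta.content` carries text fragments. The accumulation loop can be sketched as below — the stream is simulated with plain objects so the sketch runs without an API key, but the chunk structure mirrors what the openai client yields:

```python
from types import SimpleNamespace

# Simulated chunks standing in for what
# client.chat.completions.create(..., stream=True) would yield;
# each real chunk carries a text fragment in choices[0].delta.content.
def fake_stream():
    for fragment in ["Hel", "lo", "!", None]:  # final chunk has no content
        delta = SimpleNamespace(content=fragment)
        yield SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

# The same accumulation loop works unchanged on a real stream
reply = ""
for chunk in fake_stream():
    piece = chunk.choices[0].delta.content
    if piece:
        print(piece, end="", flush=True)  # show text as it arrives
        reply += piece
print()
```

Printing each fragment as it arrives is what gives the familiar "typing" effect in chat UIs.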
Local service
You first need to deploy the model locally with Ollama, e.g. `ollama pull deepseek-r1:8b`.
To use the ollama Python library later on, install it: `pip install ollama -i https://pypi.mirrors.ustc.edu.cn/simple/`
Calling the model API with the requests library
```python
import requests

ask = input('Enter your question: ')

# The chat endpoint exposed by the local Ollama server
url = 'http://127.0.0.1:11434/api/chat'

# Request payload
data = {
    "model": "deepseek-r1:8b",
    "messages": [
        {"role": "user", "content": ask},
    ],
    "stream": False,
}

# Send the POST request
response = requests.post(url, json=data)

# Print the model's reply
print(response.json()["message"]["content"])
```
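The JSON returned by `/api/chat` contains more than just the reply. A trimmed illustration of its shape (the field values here are made up for the example, not captured output):

```python
# Illustrative non-streaming /api/chat response shape;
# the reply text lives under message.content
sample = {
    "model": "deepseek-r1:8b",
    "created_at": "2024-01-01T00:00:00Z",
    "message": {"role": "assistant", "content": "Hi, I can help with that."},
    "done": True,
}

content = sample["message"]["content"]
print(content)
```

In real code it is also worth calling `response.raise_for_status()` before parsing, so a connection or server error surfaces as an exception rather than a confusing `KeyError`.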
Calling the local model API with the openai library

```python
from openai import OpenAI

ask = input('Enter your question: ')

# Ollama exposes an OpenAI-compatible endpoint; api_key is required by
# the client but ignored by Ollama
client = OpenAI(
    base_url='http://localhost:11434/v1/',
    api_key='ollama',
)
chat_completion = client.chat.completions.create(
    messages=[
        {'role': 'user', 'content': ask},
    ],
    model='deepseek-r1:8b',
    stream=False,
    temperature=0.0,  # between 0 and 2; higher values make output more random
)
print(chat_completion.choices[0].message.content)
```
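Because both backends speak the OpenAI protocol, switching between the cloud service and the local model comes down to the `base_url` and model name. A hypothetical helper (the function and dictionary here are my own sketch, not part of any library) keeps those settings in one place so the calling code stays identical:

```python
# Hypothetical helper: per-backend settings collected in one place,
# so the chat-completion code itself never changes.
BACKENDS = {
    'deepseek': {'base_url': 'https://api.deepseek.com/v1', 'model': 'deepseek-chat'},
    'ollama':   {'base_url': 'http://localhost:11434/v1/',  'model': 'deepseek-r1:8b'},
}

def client_config(backend: str) -> dict:
    """Return the base_url and model name for the chosen backend."""
    return BACKENDS[backend]

print(client_config('ollama')['model'])
```

The returned dict can be unpacked into `OpenAI(base_url=..., api_key=...)` and the `model=` argument of `chat.completions.create`.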
Calling the model with the ollama library

```python
from ollama import chat
from ollama import ChatResponse

ask = input('Enter your question: ')

response: ChatResponse = chat(
    model='deepseek-r1:8b',
    messages=[
        {'role': 'user', 'content': ask},
    ],
)
print(response['message']['content'])  # same as response.message.content
```
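deepseek-r1 is a reasoning model, and its replies typically begin with the chain of thought wrapped in `<think>...</think>` tags. If you only want the final answer, a small sketch for stripping that part (the helper function is my own, not part of the ollama library):

```python
import re

# Hypothetical helper: drop the <think>...</think> reasoning block that
# deepseek-r1 prepends to its replies, keeping only the final answer.
def strip_think(text: str) -> str:
    return re.sub(r'<think>.*?</think>', '', text, flags=re.DOTALL).strip()

raw = "<think>The user greets me, so I should greet back.</think>Hello! How can I help?"
print(strip_think(raw))
```

`re.DOTALL` matters because the reasoning block usually spans multiple lines.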
Multi-turn conversation with message history

```python
from ollama import chat

messages = []
while True:
    ask = input('User: ')
    response = chat(
        'deepseek-r1:8b',
        messages=messages + [
            {'role': 'user', 'content': ask},
        ],
    )
    # Append both sides of the exchange so the next turn has full context
    messages += [
        {'role': 'user', 'content': ask},
        {'role': 'assistant', 'content': response.message.content},
    ]
    print(response.message.content + '\n')
```
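Since the full history is resent on every turn, the prompt grows without bound and will eventually exceed the model's context window. A simple sketch for keeping only the most recent turns (the cap of 10 turns is an arbitrary choice for illustration):

```python
# Hypothetical trimming step: keep only the last MAX_TURNS
# user/assistant pairs so the prompt sent to the model stays bounded.
MAX_TURNS = 10

def trim_history(messages: list, max_turns: int = MAX_TURNS) -> list:
    return messages[-2 * max_turns:]  # each turn adds two messages

history = [{'role': 'user', 'content': str(i)} for i in range(30)]
print(len(trim_history(history)))
```

Calling `trim_history(messages)` before each `chat(...)` in the loop above would bound the request size; fancier schemes summarize the dropped turns instead of discarding them.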