Error when attempting to add data source to Azure OpenAI api


Problem background:

My code works for a call to Azure OpenAI when I don't have a data source added. However, when I add my data source with the following parameters, I get an error:


response = client.chat.completions.create(
    messages = [
        {
            "role": "system",
            "content": "when the user provides a project name as input you should do the steps mentioned below: Step 1: Get the project band of the project from the file."
        },
        {
            "role": "user",
            "content": 'Project Name: "Test project" '
        }
    ],
    model = "GPT-3.5 Turbo",
    seed = 42,
    temperature = 0,
    max_tokens = 800,
    extra_body = {
        "dataSources": [
            {
                "type": "AzureCognitiveSearch",
                "parameters": {
                    "endpoint": os.environ["SEARCH_ENDPOINT"],
                    "key": os.environ["SEARCH_KEY"],
                    "indexName": "test-index"
                }
            }
        ]
    }
)

This gives the following error:

Exception has occurred: BadRequestError
Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: dataSources', 'type': 'invalid_request_error', 'param': None, 'code': None}}
httpx.HTTPStatusError: Client error '400 model_error' for url 'https://openai-ngap-genai-poc.openai.azure.com//openai/deployments/NTAPOC/chat/completions?api-version=2023-09-01-preview'
For more information check: https://httpstatuses.com/400

During handling of the above exception, another exception occurred:

  File "C:\Users\choran\OneDrive - Open Sky Data Systems\Documents\NTA\NTA Chatbot code\Attempting to add datasource.py", line 13, in <module>
    response = client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: dataSources', 'type': 'invalid_request_error', 'param': None, 'code': None}}

I verified that the data source details (search endpoint, key, and index name) were correct; a quick way to check them directly against the index is sketched below.
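One way to confirm those details independently of the OpenAI call is to query the Azure Cognitive Search index directly. This is only a minimal sketch, assuming the azure-search-documents package and the same environment variables used above; if it runs cleanly, the search endpoint, key, and index name are fine and the 400 is coming from the OpenAI request itself.

import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Reuse the exact values passed to dataSources above.
search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="test-index",
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

# "*" matches every document; include_total_count lets us report how many exist.
results = search_client.search(search_text="*", include_total_count=True, top=1)
print("Documents in index:", results.get_count())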

Full code here

Solution:

In my environment, when I tried the same code, I got the same error:


Error:

openai.BadRequestError: Error code: 400 - {'error': {'message':'Unrecognized request argument supplied: dataSources', 'type': 'invalid_request_error', 'param': None, 'code': None}}

You can follow this MS-DOCS guide (https://docs.microsoft.com) to use your own data with chat completions.

You can use the code below to create a chat completion with a data source on openai version 1.9.0. The key difference from your code is that the client's base_url points at the deployment's /extensions route: with api-version 2023-09-01-preview, the dataSources field in extra_body is only accepted on that extensions chat-completions endpoint, which is why the standard endpoint in your traceback rejects it with 'Unrecognized request argument supplied: dataSources'.

Code:

import os
from openai import AzureOpenAI

endpoint = os.environ["AZURE_ENDPOINT"]
deployment = "gpt-35-turbo"
apikey = os.environ["API_KEY"]

# With api-version 2023-09-01-preview, dataSources is only accepted on the
# deployment's /extensions chat-completions route, so base_url must point there.
client = AzureOpenAI(
    base_url=f"{endpoint}/openai/deployments/{deployment}/extensions",
    api_key=apikey,
    api_version="2023-09-01-preview")

for i in range(3):
    print(f'Answer Version {i + 1}\n---')

# seed=42 with temperature=0 makes the answer reproducible across runs.
completion = client.chat.completions.create(
    model=deployment,
    messages=[
        {
            "role": "system",
            "content": "When the user provides a project name as input, you should do the steps mentioned below: Step 1: Get the project band of the project from the file."
        },
        {
            "role": "user",
            "content": "Where do I go for Azure OpenAI customer support?"
        }
    ],
    seed=42,
    temperature=0,
    max_tokens=800,
    extra_body={
        "dataSources": [
            {
                "type": "AzureCognitiveSearch",
                "parameters": {
                    "endpoint": os.environ["SEARCH_ENDPOINT"],
                    "key": os.environ["SEARCH_KEY"],
                    "indexName": "test-index"
                }
            }
        ]
    }
)
print(completion.choices[0].message.content)

print("---\n")

Output:

Answer Version 1
---
Answer Version 2
---
Answer Version 3
---
You can check the Cognitive Services support options guide for help with Azure OpenAI [doc1].
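
One closing note: the dataSources field and the /extensions base_url shown above are tied to the preview API versions such as 2023-09-01-preview. If you move to api-version 2024-02-01 or later, my understanding is that "on your data" is served from the standard chat-completions endpoint and the payload switches to snake_case data_sources with type azure_search, roughly as sketched below; treat the field names as something to verify against the current Azure OpenAI documentation rather than as a drop-in replacement.

import os
from openai import AzureOpenAI

# Newer API versions: no /extensions base_url, use azure_endpoint directly.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_ENDPOINT"],
    api_key=os.environ["API_KEY"],
    api_version="2024-02-01",
)

completion = client.chat.completions.create(
    model="gpt-35-turbo",  # your deployment name
    messages=[
        {"role": "user", "content": "Where do I go for Azure OpenAI customer support?"}
    ],
    extra_body={
        "data_sources": [  # snake_case replaces dataSources
            {
                "type": "azure_search",  # replaces AzureCognitiveSearch
                "parameters": {
                    "endpoint": os.environ["SEARCH_ENDPOINT"],
                    "index_name": "test-index",
                    "authentication": {
                        "type": "api_key",
                        "key": os.environ["SEARCH_KEY"],
                    },
                },
            }
        ]
    },
)
print(completion.choices[0].message.content)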