Question: Vercel error: (Azure) OpenAI API key not found
Background:
I implemented the OpenAI API in my Next.js app with the help of the langchain library, and it works superbly on localhost, but on Vercel (Pro plan) it throws an error:
```
Error: (Azure) OpenAI API key not found
    at new OpenAIChat (file:///var/task/node_modules/langchain/dist/llms/openai-chat.js:184:19)
    at new OpenAI (file:///var/task/node_modules/langchain/dist/llms/openai.js:54:20)
    at /var/task/.next/server/pages/api/tasks/ai.js:63:21
RequestId: 472c0bdb-dbbc-4cd4-95a3-1808d0b6a5ac
Error: Runtime exited with error: exit status 1 Runtime.ExitError
```
The error leads to this code in the langchain node_module in my app:
```javascript
this.openAIApiKey =
  fields?.openAIApiKey ?? getEnvironmentVariable("OPENAI_API_KEY");
this.azureOpenAIApiKey =
  fields?.azureOpenAIApiKey ??
  getEnvironmentVariable("AZURE_OPENAI_API_KEY");
if (!this.azureOpenAIApiKey && !this.openAIApiKey) {
  throw new Error("(Azure) OpenAI API key not found");
}
```
I set the OPENAI_API_KEY environment variable (identical to the one in my .env file) in Vercel:

[screenshot: Vercel environment variables]
In my app I put the OPENAI_API_KEY in a .env file and load it in my backend:
```javascript
const apiKey = process.env.OPENAI_API_KEY;

const openAIModel = new OpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
  openAIApiKey: apiKey,
});
```
On localhost I can even post and get API requests without reading OPENAI_API_KEY in the backend at all (the langchain module takes it directly from my .env file):
```javascript
const openAIModel = new OpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
});
```
I also tried changing the model to OpenAIChat; it behaves the same as the OpenAI model on localhost, but again not on Vercel:
```javascript
const openAIModel = new OpenAIChat({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
});
```
I expected this to be enough to post requests to the OpenAI API on Vercel, but it throws the same error again and again.
Does anyone have any idea what is missing here?
Thank you in advance! Nasti
Answer:
I was facing the same error, but in my case it occurred while using Langchain with Pinecone in Node.js, in the OpenAIEmbeddings function. I fixed it with this piece of code:
```javascript
import { OpenAIEmbeddings } from "@langchain/openai";

// Pass the key explicitly instead of relying on env-var auto-detection.
// (Note: the env var here is named OPEN_API_KEY, matching the original answer.)
const embeddingsArrays = await new OpenAIEmbeddings({
  openAIApiKey: process.env.OPEN_API_KEY,
}).embedDocuments(texts); // `texts`: the array of document strings to embed (not shown in the original snippet)
```