https://docs.langchain4j.dev/tutorials/spring-boot-integration
Whether you import the low-level or the high-level dependency, the same YAML configuration can be shared:
```yaml
langchain4j:
  open-ai:
    chat-model:
      api-key: ${ALI_QWEN_API_KEY} # resolved automatically from the local environment variable
      model-name: qwen-plus
      log-requests: true
      log-responses: true
      base-url: https://dashscope.aliyuncs.com/compatible-mode/v1
```
Low-level integration
The naming convention for Spring Boot starter dependencies is langchain4j-{integration-name}-spring-boot-starter.
For example, for OpenAI (langchain4j-open-ai), the dependency is langchain4j-open-ai-spring-boot-starter:
```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
    <version>1.11.0-beta19</version>
</dependency>
```
Controller:

```java
@RestController
@RequestMapping("/low")
public class LowController {

    @Resource
    private ChatModel chatModelQwen;

    @GetMapping("/langchain4j/qwen")
    public String helloQwen(@RequestParam(value = "question", defaultValue = "你是谁?") String question) {
        return chatModelQwen.chat(question);
    }
}
```
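The controller above injects a bean named chatModelQwen that the tutorial never defines. A minimal sketch of how such a bean could be declared manually is shown below; the class name LlmConfig is illustrative, and the builder settings simply mirror the shared YAML configuration (this is an assumption about the original project, not code from the tutorial):

```java
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LlmConfig {

    // Manually built ChatModel pointing at DashScope's OpenAI-compatible endpoint.
    // The bean name "chatModelQwen" matches the @Resource field in LowController.
    @Bean
    public ChatModel chatModelQwen() {
        return OpenAiChatModel.builder()
                .apiKey(System.getenv("ALI_QWEN_API_KEY"))
                .modelName("qwen-plus")
                .baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
                .build();
    }
}
```

Note that defining such a bean alongside the starter's auto-configured one is exactly what produces the two ChatModel beans mentioned in the conflict error discussed in the high-level section.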
High-level integration
```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-spring-boot-starter</artifactId>
    <version>1.11.0-beta19</version>
</dependency>
```
Define an AI service interface, declared with @AiService:
```java
@AiService
public interface Assistant {
    String chat(String question);
}
```
Note: if multiple ChatModel beans exist at the same time, you must specify chatModel explicitly.
Otherwise, the following error is thrown:
```text
dev.langchain4j.service.IllegalConfigurationException: Conflict: multiple beans of type dev.langchain4j.model.chat.ChatModel are found: [chatModelQwen, openAiChatModel]. Please specify which one you wish to wire in the @AiService annotation like this: @AiService(wiringMode = EXPLICIT, chatModel = "<beanName>").
```
```java
@AiService(
        wiringMode = AiServiceWiringMode.EXPLICIT,
        chatModel = "chatModelQwen"
)
public interface Assistant {
    String chat(String question);
}
```
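As an aside, an @AiService interface is not limited to a bare chat method; LangChain4j also supports prompt annotations such as @SystemMessage from dev.langchain4j.service. A hedged sketch, not part of the original tutorial (the TranslatorAssistant name and prompt text are illustrative):

```java
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.spring.AiService;
import dev.langchain4j.service.spring.AiServiceWiringMode;

// Illustrative only: every call to translate() is prefixed with the system prompt.
@AiService(wiringMode = AiServiceWiringMode.EXPLICIT, chatModel = "chatModelQwen")
public interface TranslatorAssistant {

    @SystemMessage("You are a professional translator. Translate the user's text into English.")
    String translate(String text);
}
```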
Use the AI service interface as a regular service:
```java
@RestController
@RequestMapping("/high")
public class HighController {

    @Resource
    private Assistant assistant;

    @GetMapping("/langchain4j/qwen")
    public String helloQwen(@RequestParam(value = "question", defaultValue = "你是谁?") String question) {
        return assistant.chat(question);
    }
}
```
Test the endpoint: http://localhost:9001/high/langchain4j/qwen?question=xxx