LangChain4j Study Series (1): Calling a Local Ollama Instance

LangChain4j is an open-source framework for building LLM applications that appeared earlier than Spring AI; its community is mature and highly active. Below is a demonstration of calling a local Ollama instance with LangChain4j.

1. Core pom Dependencies

<!-- LangChain4j core -->
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j</artifactId>
    <version>1.1.0</version>
</dependency>

<!-- LangChain4j Ollama integration -->
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-ollama</artifactId>
    <version>1.1.0-rc1</version>
</dependency>


2. Registering the Model Beans

import java.time.Duration;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

@Configuration
public class OllamaConfig {

    @Value("${ollama.base-url:http://localhost:11434}")
    private String ollamaBaseUrl;

    @Value("${ollama.model:qwen3:0.6b}")
    private String ollamaModel;

    @Value("${ollama.timeout:60}")
    private Integer timeoutSeconds;

    /**
     * Configure the Ollama chat model (blocking).
     *
     * @return the ChatModel instance
     */
    @Bean
    public ChatModel chatModel() {
        return OllamaChatModel.builder()
                .baseUrl(ollamaBaseUrl)
                .modelName(ollamaModel)
                .timeout(Duration.ofSeconds(timeoutSeconds))
                .logRequests(true)
                .logResponses(true)
                .build();
    }

    /**
     * Configure the Ollama chat model (streaming).
     *
     * @return the StreamingChatModel instance
     */
    @Bean
    public StreamingChatModel streamingChatModel() {
        return OllamaStreamingChatModel.builder()
                .baseUrl(ollamaBaseUrl)
                .modelName(ollamaModel)
                .timeout(Duration.ofSeconds(timeoutSeconds))
                .logRequests(true)
                .logResponses(true)
                .build();
    }
}

Note: unlike Spring AI, streaming responses in LangChain4j require a separate model type, StreamingChatModel.
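The difference between the two styles can be illustrated with a minimal, self-contained sketch (plain Java, no LangChain4j dependency; the interface names below are simplified stand-ins for the real API, not LangChain4j's actual types):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class StreamingSketch {
    /** Stand-in for a blocking chat model: one call, one full answer. */
    interface BlockingChat {
        String chat(String prompt);
    }

    /** Stand-in for a streaming chat model: tokens are pushed to callbacks as they arrive. */
    interface StreamingChat {
        void chat(String prompt, Consumer<String> onPartial, Runnable onComplete);
    }

    public static void main(String[] args) {
        BlockingChat blocking = prompt -> "Hello, " + prompt + "!";
        // Blocking style: the caller waits for the complete response.
        System.out.println(blocking.chat("world"));

        StreamingChat streaming = (prompt, onPartial, onComplete) -> {
            for (String token : new String[] {"Hello", ", ", prompt, "!"}) {
                onPartial.accept(token); // delivered piece by piece
            }
            onComplete.run();
        };
        // Streaming style: partial responses accumulate via callbacks.
        List<String> parts = new ArrayList<>();
        streaming.chat("world", parts::add,
                () -> System.out.println(String.join("", parts)));
    }
}
```

This is why two beans are needed: the blocking model returns a value, while the streaming model's return type is void and all output flows through the handler.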

3. YAML Configuration

server:
  port: 8080
  servlet:
    context-path: /

spring:
  application:
    name: longchain4j-study

# Logging configuration (a top-level key; nesting it under spring would be ignored)
logging:
  level:
    com.example.longchain4jstudy: DEBUG
    dev.langchain4j: DEBUG
  pattern:
    console: "%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n"

# Ollama settings
ollama:
  base-url: http://localhost:11434
  model: qwen3:0.6b
  timeout: 60

# Application info
info:
  app:
    name: LangChain4j Study
    version: 1.0.0
    description: LangChain4j study project - Ollama chat example

4. API Examples

    @Autowired
    private ChatModel chatModel;

    @Autowired
    private StreamingChatModel streamingChatModel;

    /**
     * Send a chat message (GET).
     *
     * @param prompt the user's message
     * @return the chat response
     */
    @GetMapping(value = "/chat", produces = MediaType.TEXT_PLAIN_VALUE)
    public ResponseEntity<String> chat(@RequestParam String prompt) {
        log.info("Received chat request: {}", prompt);

        try {
            String aiResponse = chatModel.chat(prompt);
            return ResponseEntity.ok(aiResponse);

        } catch (Exception e) {
            log.error("Error communicating with Ollama", e);
            // Report the failure with a 500 status instead of a misleading 200
            return ResponseEntity.internalServerError()
                    .body("Sorry, an error occurred while processing your request: " + e.getMessage());
        }
    }

    /**
     * Streaming chat (GET).
     *
     * @param prompt the user's message
     * @return the streamed chat response
     */
    @GetMapping(value = "/chat/stream", produces = "text/html;charset=utf-8")
    public Flux<String> chatStream(@RequestParam String prompt) {
        log.info("Received streaming chat request: {}", prompt);

        Sinks.Many<String> sink = Sinks.many().unicast().onBackpressureBuffer();

        streamingChatModel.chat(prompt, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String s) {
                log.info("Received partial response: {}", s);
                // Push the HTML-escaped chunk downstream
                sink.tryEmitNext(escapeToHtml(s));
            }

            @Override
            public void onCompleteResponse(ChatResponse chatResponse) {
                log.info("Streaming response complete");
                sink.tryEmitComplete();
            }

            @Override
            public void onError(Throwable throwable) {
                log.error("Error during streaming response", throwable);
                sink.tryEmitError(throwable);
            }
        });

        return sink.asFlux();
    }
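The Sinks-to-Flux bridge used above is an instance of a general callback-to-publisher pattern: callbacks push items into a publisher, and a subscriber consumes them in order. A self-contained sketch of the same idea using only the JDK's Flow API (java.util.concurrent.SubmissionPublisher, so no Reactor or LangChain4j required; the class and token values are illustrative):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;
import java.util.concurrent.TimeUnit;

public class CallbackBridgeSketch {
    public static void main(String[] args) throws InterruptedException {
        SubmissionPublisher<String> publisher = new SubmissionPublisher<>();
        StringBuilder received = new StringBuilder();
        CountDownLatch done = new CountDownLatch(1);

        // Subscriber plays the role of the HTTP client consuming the Flux.
        publisher.subscribe(new Flow.Subscriber<String>() {
            public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            public void onNext(String item) { received.append(item); }
            public void onError(Throwable t) { done.countDown(); }
            public void onComplete() { done.countDown(); }
        });

        // Simulate the streaming handler pushing partial responses.
        for (String token : new String[] {"streamed ", "tokens ", "arrive ", "in order"}) {
            publisher.submit(token); // analogous to sink.tryEmitNext(...)
        }
        publisher.close();           // analogous to sink.tryEmitComplete()

        done.await(5, TimeUnit.SECONDS);
        System.out.println(received);
    }
}
```

The controller method returns the Flux immediately; the model's callbacks fill it asynchronously, exactly as the publisher above keeps delivering items after `main` has moved on.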

5. Running the Example

Source code for this article:

yjmyzz/longchain4j-study at day01

References:
https://docs.langchain4j.dev/
https://docs.langchain4j.info/