
Spring Cloud Microservices: Integrating with LLM Services

Introduction

Financial systems are typically built on Java Spring Cloud microservice architectures, so Python-based LLM services need to integrate with them.

Calling an LLM API from Spring Boot

java
// LLMService.java
@Service
public class LLMService {
    
    private final RestTemplate restTemplate;
    private final String llmApiUrl;
    
    public LLMService(RestTemplate restTemplate, 
                      @Value("${llm.api.url}") String llmApiUrl) {
        this.restTemplate = restTemplate;
        this.llmApiUrl = llmApiUrl;
    }
    
    public String analyzeRisk(LoanApplication application) {
        Map<String, Object> request = new HashMap<>();
        request.put("message", buildPrompt(application));
        request.put("model", "qwen-plus");
        
        // Use exchange() with a ParameterizedTypeReference instead of the
        // raw Map.class, and guard against a null body before dereferencing.
        ResponseEntity<Map<String, Object>> response = restTemplate.exchange(
            llmApiUrl + "/chat",
            HttpMethod.POST,
            new HttpEntity<>(request),
            new ParameterizedTypeReference<Map<String, Object>>() {}
        );
        
        Map<String, Object> body = response.getBody();
        if (body == null) {
            throw new IllegalStateException("Empty response from LLM service");
        }
        return (String) body.get("answer");
    }
    
    private String buildPrompt(LoanApplication app) {
        return String.format(
            "Analyze loan application risk: company %s, revenue %d (10k CNY), debt ratio %.1f%%",
            app.getCompanyName(),
            app.getRevenue() / 10000,
            app.getDebtRatio() * 100
        );
    }
}
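Because `llm.api.url` (below) points at a logical service name rather than a concrete host, the `RestTemplate` injected above must be load-balanced so the name is resolved through the registry. A minimal configuration sketch (the class name is illustrative):

```java
import org.springframework.cloud.client.loadbalancer.LoadBalanced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestTemplate;

@Configuration
public class RestClientConfig {

    // @LoadBalanced lets RestTemplate resolve logical service names such as
    // http://llm-api-service/ through the Nacos service registry instead of DNS.
    @Bean
    @LoadBalanced
    public RestTemplate restTemplate() {
        return new RestTemplate();
    }
}
```

Without this bean, calls to `http://llm-api-service/chat` would fail unless the service name happened to resolve in DNS.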

Service Registration (Nacos)

yaml
# application.yml
spring:
  application:
    name: loan-risk-service
  cloud:
    nacos:
      discovery:
        server-addr: localhost:8848

llm:
  api:
    url: http://llm-api-service/  # called by service name (requires a load-balanced client)

Feign Client

java
// Note: only the service name is given; setting url= would bypass
// Nacos discovery and load balancing and call the host directly.
@FeignClient(name = "llm-api-service")
public interface LLMApiClient {
    
    @PostMapping("/chat")
    ChatResponse chat(@RequestBody ChatRequest request);
    
    @PostMapping("/rag/query")
    RAGResponse ragQuery(@RequestBody RAGRequest request);
}

// Usage
@Autowired
private LLMApiClient llmApiClient;

public String getRiskAnalysis(String companyInfo) {
    ChatRequest request = new ChatRequest();
    request.setMessage("Analyze risk: " + companyInfo);
    request.setModel("qwen-plus");
    
    ChatResponse response = llmApiClient.chat(request);
    return response.getAnswer();
}
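When llm-api-service is down or slow, the Feign call should degrade rather than propagate an exception. A sketch of a fallback implementation (the `ChatResponse`/`RAGResponse` setters are assumed to match the DTOs used above; the fallback is wired with `fallback = LLMApiClientFallback.class` on the `@FeignClient` annotation and requires the Feign circuit breaker to be enabled):

```java
import org.springframework.stereotype.Component;

// Returned when the circuit breaker is open or the call fails; enabled via
// spring.cloud.openfeign.circuitbreaker.enabled=true in application.yml.
@Component
public class LLMApiClientFallback implements LLMApiClient {

    @Override
    public ChatResponse chat(ChatRequest request) {
        ChatResponse degraded = new ChatResponse();
        degraded.setAnswer("LLM service temporarily unavailable; please retry later.");
        return degraded;
    }

    @Override
    public RAGResponse ragQuery(RAGRequest request) {
        // An empty result is a safe default for retrieval queries.
        return new RAGResponse();
    }
}
```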

API Gateway (Spring Cloud Gateway)

yaml
# gateway application.yml
spring:
  cloud:
    gateway:
      routes:
      - id: llm-api
        uri: lb://llm-api-service
        predicates:
        - Path=/api/llm/**
        filters:
        - StripPrefix=2
        - name: RequestRateLimiter
          args:
            redis-rate-limiter.replenishRate: 10
            redis-rate-limiter.burstCapacity: 20
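The `RequestRateLimiter` filter above depends on `spring-boot-starter-data-redis-reactive` plus a `KeyResolver` bean that decides what each rate-limit bucket is keyed on; without one the filter refuses all requests. A per-client-IP sketch (bean and class names are illustrative):

```java
import org.springframework.cloud.gateway.filter.ratelimit.KeyResolver;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.publisher.Mono;

@Configuration
public class RateLimitConfig {

    // Keys each request by caller IP, so the replenishRate of 10 req/s
    // applies per client rather than globally across all callers.
    @Bean
    public KeyResolver ipKeyResolver() {
        return exchange -> Mono.just(
            exchange.getRequest().getRemoteAddress() != null
                ? exchange.getRequest().getRemoteAddress().getAddress().getHostAddress()
                : "unknown"
        );
    }
}
```

Keying by an authenticated user ID or API key is usually preferable in production, since many clients can share one IP behind a NAT.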

Key Integration Points

  1. Register the Python LLM service with Nacos so Java services can call it by service name
  2. Rate-limit at the API gateway to keep LLM API costs from running over budget
  3. Use Feign with a circuit breaker for fallback and degradation (Hystrix is in maintenance mode; Resilience4j or Sentinel are the current choices)
  4. For asynchronous calls, use Spring WebFlux + WebClient

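Integration point 4 can be sketched as a non-blocking call against the same `/chat` contract used earlier (a sketch: the class name is illustrative, and the `WebClient.Builder` is assumed to be `@LoadBalanced` so the service name resolves through Nacos):

```java
import java.time.Duration;
import java.util.Map;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

public class AsyncLLMService {

    private final WebClient webClient;

    public AsyncLLMService(WebClient.Builder builder) {
        // Base URL uses the logical service name registered in Nacos.
        this.webClient = builder.baseUrl("http://llm-api-service").build();
    }

    // Returns a Mono so the calling thread is never blocked while the
    // LLM generates its answer.
    public Mono<String> analyzeRiskAsync(String prompt) {
        return webClient.post()
            .uri("/chat")
            .bodyValue(Map.of("message", prompt, "model", "qwen-plus"))
            .retrieve()
            .bodyToMono(Map.class)
            .map(body -> (String) body.get("answer"))
            .timeout(Duration.ofSeconds(30)); // LLM calls can be slow; fail fast past 30s
    }
}
```

Subscribing to the returned `Mono` (or composing it into a reactive controller) frees servlet threads during the long LLM round-trip, which matters when many concurrent requests fan out to the model.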
Content on this site compiled by 褚成志, for learning reference only.