Conversation

@Arkissa (Contributor) commented Jan 3, 2026

Add API support via AIClient. The request structure is built as a type-safe eDSL that follows each vendor's API as closely as possible, to avoid receiving erroneous responses.

  1. Model capability constraints: the Model.Supported interfaces declare what a Model can do. Some models, for example, only support text chat and cannot process images; a Model expresses its capabilities by implementing the corresponding Model.Supported interfaces, which reduces the chance of receiving an erroneous response.
  2. Java's type system constrains the types of the request-structure fields, avoiding the type errors that come with editing raw JSON strings.
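The capability markers from item 1 can be sketched roughly like this. Only the names Model, Model.Supported.Text, and Model.Supported.Image come from this PR; the nested-interface layout and the acceptsImages helper are illustrative assumptions, not the actual source:

```java
public class CapabilityDemo {
    // Sketch: a Model advertises capabilities by implementing marker
    // interfaces, so unsupported content can be rejected before the
    // request is ever sent.
    interface Model {
        String getName();

        interface Supported {
            interface Text {}   // model accepts text content
            interface Image {}  // model accepts image content
        }
    }

    // A text-and-vision model implements both markers.
    static class VlPlus implements Model, Model.Supported.Text, Model.Supported.Image {
        public String getName() { return "qwen3-vl-plus"; }
    }

    // Hypothetical check a client could run before sending image content.
    static boolean acceptsImages(Model m) {
        return m instanceof Model.Supported.Image;
    }

    public static void main(String[] args) {
        Model m = new VlPlus();
        System.out.println(m.getName() + " acceptsImages=" + acceptsImages(m));
        // prints: qwen3-vl-plus acceptsImages=true
    }
}
```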
Example output:
var provider = new OpenAIProvider(
    URI.create("https://api.v3.cm/v1/chat/completions"),
    new QwenModel.Qwen3.VlPlus(),
    new Authorize.Token("token"),
    Message.of(
        new Message.System("You are a helpful assistant."),
        new Message.User(new Content.Text("你是谁?"))
    ),
    new OpenAIProvider.Options.Stream(true, false),
    new OpenAIProvider.Options.MaxInputTokens(3000),
    new OpenAIProvider.Options.MaxTokens(3000)
);

var response = new AIClient(HttpClient.newHttpClient()).send(provider);
response.map((json) -> json.toString()).forEach(System.out::println);
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"role":"assistant","content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"我是"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"阿里巴巴"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"集团"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"旗下的"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"实验室"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"研发"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"大规模"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"语言"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":"模型"},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
{"created":1767463697,"usage":null,"model":"qwen3-vl-plus","id":"chatcmpl-b202002d-5645-95b5-bbea-10843569f883","choices":[{"finish_reason":null,"delta":{"content":""},"index":0,"logprobs":null}],"system_fingerprint":null,"object":"chat.completion.chunk"}
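A consumer reconstructs the full answer from a chunk stream like the one above by concatenating the delta.content fields. A minimal stand-alone sketch, using a naive regex instead of the library's real JSONObject parsing, purely for illustration (DeltaJoin and its join method are hypothetical names, not part of the PR):

```java
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DeltaJoin {
    // Naive extraction of delta.content from OpenAI-style chunks.
    // A real consumer should use a JSON parser; this only illustrates
    // how the answer is reassembled from the stream.
    static final Pattern CONTENT =
        Pattern.compile("\"delta\":\\{[^}]*\"content\":\"([^\"]*)\"");

    static String join(List<String> chunks) {
        StringBuilder sb = new StringBuilder();
        for (String chunk : chunks) {
            Matcher m = CONTENT.matcher(chunk);
            if (m.find()) sb.append(m.group(1));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Abbreviated chunks in the shape shown above.
        List<String> chunks = List.of(
            "{\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"我是\"},\"index\":0}]}",
            "{\"choices\":[{\"delta\":{\"content\":\"阿里巴巴\"},\"index\":0}]}",
            "{\"choices\":[{\"delta\":{\"content\":\"集团\"},\"index\":0}]}"
        );
        System.out.println(join(chunks)); // prints: 我是阿里巴巴集团
    }
}
```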

@gakkiyomi (Member) commented:

It doesn't look like the large model is actually used anywhere; besides, wouldn't a separate repository be a better fit?

@adlered (Member) commented Jan 4, 2026

> It doesn't look like the large model is actually used anywhere; besides, wouldn't a separate repository be a better fit?

I'll handle the upper-layer integration on my side (it involves business matters).

@adlered adlered merged commit 1ec2f5c into FishPiOffical:master Jan 4, 2026
1 check failed
@adlered (Member) commented Jan 4, 2026


AIClient Usage Guide

Overview

AIClient is a lightweight client for calling AI models. It supports OpenAI-compatible APIs, including streaming (SSE) and non-streaming responses.

Architecture

AIClient                      # core client: sends requests and parses responses
├── Provider                  # provider interface
│   └── OpenAIProvider        # OpenAI-compatible implementation
├── Model                     # model interface
│   └── QwenModel             # Qwen (Tongyi Qianwen) model definitions
└── ModelNotSupportException  # thrown when a model lacks a required capability

Core Classes

  1. AIClient

// Create the client
HttpClient httpClient = HttpClient.newHttpClient();
AIClient aiClient = new AIClient(httpClient);

// Send a request; returns a Stream of response JSONObjects
Stream<JSONObject> response = aiClient.send(provider);

  2. OpenAIProvider

The provider builds the HTTP request and supports the following configuration:

Message types (Message):

  • Message.System(String content) - system prompt
  • Message.User(Content content) - user message
  • Message.Developer(String content) - developer message

Content types (Content):

  • Content.Text(String text) - plain text content
  • Content.Array(List<ContentType> content) - mixed content (text + images)

Configuration options (Options):

  • Options.Stream(boolean, boolean includeUsage) - streaming output
  • Options.MaxTokens(Integer n) - maximum output tokens
  • Options.MaxInputTokens(Integer n) - maximum input tokens
  • Options.Thinking(boolean) - thinking mode
  • Options.EnableSearch(boolean) - enable search
  • Options.StructureResponse.Json() - JSON response format
  • Options.StructureResponse.Text() - plain-text response format
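A hypothetical sketch of how typed options like these could map onto OpenAI-compatible request fields. The sealed-interface/record layout and the toJsonFragment helper are illustrative assumptions, not the actual implementation; only the JSON field names stream and max_tokens are standard OpenAI API fields:

```java
public class OptionsDemo {
    // Each option is a small typed value instead of a raw JSON string,
    // so the compiler rejects ill-typed request fields.
    sealed interface Option permits Stream, MaxTokens {}
    record Stream(boolean enabled, boolean includeUsage) implements Option {}
    record MaxTokens(int n) implements Option {}

    // Render one option into its request-body fragment.
    static String toJsonFragment(Option o) {
        return switch (o) {
            case Stream s -> "\"stream\":" + s.enabled();
            case MaxTokens m -> "\"max_tokens\":" + m.n();
        };
    }

    public static void main(String[] args) {
        System.out.println(toJsonFragment(new Stream(true, false)));  // prints: "stream":true
        System.out.println(toJsonFragment(new MaxTokens(2048)));      // prints: "max_tokens":2048
    }
}
```

The sealed hierarchy makes the switch exhaustive: adding a new option type forces every serializer to handle it.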
  3. The Model interface

A model declares its capabilities through marker interfaces:

  • Model.Supported.Text - supports text processing
  • Model.Supported.Image - supports image processing

Usage Examples

Basic text chat

import org.b3log.symphony.ai.*;
import org.b3log.symphony.ai.OpenAIProvider.*;
import java.net.URI;
import java.net.http.HttpClient;

// 1. Create an HttpClient
HttpClient httpClient = HttpClient.newHttpClient();
AIClient aiClient = new AIClient(httpClient);

// 2. Choose a model
Model model = new QwenModel.Qwen3.VlPlus();

// 3. Build the messages
List<Message> messages = Message.of(
    new Message.System("You are a helpful assistant."),
    new Message.User(new Provider.Content.Text("Hello, please introduce yourself."))
);

// 4. Create the Provider
URI uri = URI.create("https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions");
Provider.Authorize auth = new Provider.Authorize.Token("your-api-key");

OpenAIProvider provider = new OpenAIProvider(
    uri,
    model,
    auth,
    messages,
    new Options.Stream(true, true),
    new Options.MaxTokens(2048)
);

// 5. Send the request and handle the response
aiClient.send(provider).forEach(json -> {
    // handle each response chunk
    System.out.println(json.toString());
});

Multimodal chat with images

// Build a message that contains an image
List<Provider.ContentType> content = List.of(
    new Provider.ContentType.Text("What is in this picture?"),
    new Provider.ContentType.Image("data:image/png;base64,xxxxx", "image/png")
);

List<Message> messages = Message.of(
    new Message.User(new Provider.Content.Array(content))
);

// Use a model that supports images
Model model = new QwenModel.Qwen3.VlPlus(); // implements Model.Supported.Image

Non-streaming requests

OpenAIProvider provider = new OpenAIProvider(
    uri,
    model,
    auth,
    messages,
    new Options.Stream(false, false) // disable streaming
);

// A non-streaming request yields a single JSONObject
aiClient.send(provider).findFirst().ifPresent(json -> {
    System.out.println(json.toString());
});

Supported Models

| Model class            | Model name    | Text | Image |
| ---------------------- | ------------- | ---- | ----- |
| QwenModel.Qwen3.VlPlus | qwen3-vl-plus | ✓    | ✓     |
| QwenModel.Qwen.VlOcr   | qwen3-vl-plus | ✓    | ✓     |

Adding a Custom Model

public class CustomModel implements Model, Model.Supported.Text {
    @Override
    public String getName() {
        return "gpt-4o";
    }
}

Exception Handling

try {
    OpenAIProvider provider = new OpenAIProvider(uri, model, auth, messages);
    aiClient.send(provider).forEach(System.out::println);
} catch (ModelNotSupportException e) {
    // the model does not support a content type
    // (e.g. sending an image to a text-only model)
    System.err.println(e.toString());
} catch (IOException | InterruptedException e) {
    // network/request failure
    e.printStackTrace();
}

Response Format

Streamed responses are JSONObjects in the OpenAI format:

{
    "id": "chatcmpl-xxx",
    "object": "chat.completion.chunk",
    "choices": [{
        "index": 0,
        "delta": {
            "content": "response text fragment"
        }
    }]
}

