Merged
2 changes: 1 addition & 1 deletion .claude-plugin/plugin.json
@@ -1,6 +1,6 @@
 {
   "name": "aimock",
-  "version": "1.19.1",
+  "version": "1.19.2",
   "description": "Fixture authoring guidance for @copilotkit/aimock — LLM, multimedia, MCP, A2A, AG-UI, vector, and service mocking",
   "author": {
     "name": "CopilotKit"
10 changes: 10 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,15 @@
 # @copilotkit/aimock
 
+## [1.19.2] - 2026-05-07
+
+### Fixed
+
+- **Converse stream: double-wrapped Event Stream payloads** — `buildBedrockStreamTextEvents`, `buildBedrockStreamToolCallEvents`, and `buildBedrockStreamContentWithToolCallsEvents` emitted payloads wrapped in the event type name (e.g. `{ messageStart: { role: "assistant" } }`). The `:event-type` header already carries the event name, so AWS SDKs (botocore included) expect flat payloads (e.g. `{ role: "assistant" }`). The redundant wrapping caused botocore's `BaseEventStreamParser` to silently return empty dicts, producing `KeyError: 'role'` in downstream frameworks such as Strands Agents. (Issue #162, reported by @KMiya84377)
+- **Responses API: missing item_id on 3 SSE event types** — Added `item_id` to `response.output_text.done`, `response.content_part.added`, and `response.content_part.done` events, matching the real OpenAI Responses API shape. SDK drift shapes updated.
+- **Chat Completions: missing logprobs on choices** — Added `logprobs: null` to all streaming chunks and non-streaming choices. Removed `logprobs` from the drift allowlist so future omissions are caught.
+- **Ollama: missing created_at on /api/chat** — Added `created_at` to all 6 `/api/chat` builder functions (text, tool call, content+tools, and their streaming variants). The `/api/generate` path already had it.
+- **Gemini: error fixtures used Anthropic-style error codes** — Test fixtures and the Gemini Live WebSocket handler now use Google canonical gRPC status codes (`RESOURCE_EXHAUSTED`, `INTERNAL`) instead of `rate_limit_error` / `ERROR`.
+
 ## [1.19.1] - 2026-05-07
 
 ### Fixed
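The Converse stream fix in the changelog can be sketched in a few lines. The `EventStreamMessage` type and the `wrapped`/`flat` constants below are illustrative, not aimock's actual builders; they only show where the event name belongs relative to the payload.

```typescript
// Sketch of the Converse stream fix (illustrative names, not aimock's
// actual internals): the event name belongs in the ":event-type" header,
// so the JSON payload must stay flat.
type EventStreamMessage = {
  headers: Record<string, string>;
  payload: Record<string, unknown>;
};

// Before the fix: payload double-wrapped in the event type name, so
// botocore's parser finds no "role" key and returns an empty dict.
const wrapped: EventStreamMessage = {
  headers: { ":event-type": "messageStart" },
  payload: { messageStart: { role: "assistant" } },
};

// After the fix: flat payload, matching what the AWS SDKs expect.
const flat: EventStreamMessage = {
  headers: { ":event-type": "messageStart" },
  payload: { role: "assistant" },
};

console.log("role" in flat.payload); // true
console.log("role" in wrapped.payload); // false
```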
2 changes: 1 addition & 1 deletion charts/aimock/Chart.yaml
@@ -3,4 +3,4 @@ name: aimock
 description: Mock infrastructure for AI application testing (OpenAI, Anthropic, Gemini, MCP, A2A, vector)
 type: application
 version: 0.1.0
-appVersion: "1.19.1"
+appVersion: "1.19.2"
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
 {
   "name": "@copilotkit/aimock",
-  "version": "1.19.1",
+  "version": "1.19.2",
   "description": "Mock infrastructure for AI application testing — LLM APIs, image generation, text-to-speech, transcription, audio generation, video generation, MCP tools, A2A agents, AG-UI event streams, vector databases, search, rerank, and moderation. One package, one port, zero dependencies.",
   "license": "MIT",
   "keywords": [
8 changes: 6 additions & 2 deletions src/__tests__/api-conformance.test.ts
@@ -182,7 +182,7 @@ describe("OpenAI Chat Completions conformance", () => {
     expect(typeof json.created).toBe("number");
   });
 
-  it("choices[0] has index, message, and finish_reason", async () => {
+  it("choices[0] has index, message, logprobs, and finish_reason", async () => {
     const res = await httpPost(chatPath(), {
       model: "gpt-4",
       messages: [{ role: "user", content: "hello" }],
@@ -192,6 +192,8 @@
     const choice = json.choices[0];
     expect(choice).toHaveProperty("index");
     expect(choice).toHaveProperty("message");
+    expect(choice).toHaveProperty("logprobs");
+    expect(choice.logprobs).toBeNull();
     expect(choice).toHaveProperty("finish_reason");
     expect(choice.message.role).toBe("assistant");
     expect(typeof choice.message.content).toBe("string");
@@ -276,7 +278,7 @@ describe("OpenAI Chat Completions conformance", () => {
     expect(res.body.trimEnd()).toMatch(/data: \[DONE\]$/);
   });
 
-  it("each chunk has id, object chat.completion.chunk, created, model, choices", async () => {
+  it("each chunk has id, object chat.completion.chunk, created, model, choices with logprobs", async () => {
     const res = await httpPost(chatPath(), {
       model: "gpt-4",
       messages: [{ role: "user", content: "hello" }],
@@ -291,6 +293,8 @@
       expect(c).toHaveProperty("created");
       expect(c).toHaveProperty("model");
       expect(c).toHaveProperty("choices");
+      expect(c.choices[0]).toHaveProperty("logprobs");
+      expect(c.choices[0].logprobs).toBeNull();
     }
   });
 
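For reference, a streaming chunk that satisfies the updated conformance checks above would look roughly like this. The field values are illustrative; only the layout, with `logprobs` present and null on every choice, mirrors what the tests assert.

```typescript
// Minimal Chat Completions streaming chunk after the logprobs fix.
// Values are made up; the shape follows the conformance tests above.
const chunk = {
  id: "chatcmpl-abc123",
  object: "chat.completion.chunk",
  created: 1715040000,
  model: "gpt-4",
  choices: [
    {
      index: 0,
      delta: { content: "hello" },
      logprobs: null, // now present on every choice, null unless requested
      finish_reason: null,
    },
  ],
};

console.log("logprobs" in chunk.choices[0]); // true
console.log(chunk.choices[0].logprobs); // null
```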