Feat/add groq llm provider #204

Merged
spalen0 merged 5 commits into main from feat/add-groq-llm-provider
Apr 7, 2026
Conversation

Collaborator

@spalen0 spalen0 commented Apr 7, 2026

No description provided.

spalen0 and others added 5 commits April 7, 2026 12:18
Groq uses an OpenAI-compatible API, so it works with the existing
OpenAICompatProvider. Default model is openai/gpt-oss-safeguard-20b.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
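Since Groq exposes an OpenAI-compatible endpoint, the provider only needs a different base URL, API key, and default model. A minimal sketch of that wiring, where the `OpenAICompatConfig` class and `groq_config` helper are illustrative names (not the repo's actual API); only the Groq base URL and the default model come from the commit message:

```python
from dataclasses import dataclass

# Groq's OpenAI-compatible endpoint; default model from the commit message.
GROQ_BASE_URL = "https://api.groq.com/openai/v1"
DEFAULT_MODEL = "openai/gpt-oss-safeguard-20b"


@dataclass
class OpenAICompatConfig:
    """Settings consumed by an OpenAI-compatible provider (illustrative)."""
    base_url: str
    api_key: str
    model: str


def groq_config(api_key: str, model: str = DEFAULT_MODEL) -> OpenAICompatConfig:
    """Point the existing OpenAI-compatible client at Groq."""
    return OpenAICompatConfig(base_url=GROQ_BASE_URL, api_key=api_key, model=model)
```

Because nothing Groq-specific is needed beyond the endpoint and model name, the existing OpenAICompatProvider can be reused unchanged.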
The LLM sometimes returns '### DETAIL' or '**TLDR:**' instead of plain
'DETAIL:' / 'TLDR:'. Use regex matching to handle these variations so
the detail section is correctly extracted and uploaded to the paste service.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
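The tolerant matching described above can be sketched as follows. The exact regex in the diff is not shown here; this pattern just demonstrates the approach of accepting optional `#` heading markers and `**` bold markers around the section keywords:

```python
import re

# Accepts 'DETAIL:', '### DETAIL', '**TLDR:**', etc. The optional '#{1,4}'
# group handles markdown headings; '\*{0,2}' handles bold markers.
HEADING = r"^(?:#{1,4}\s*)?\*{0,2}(TLDR|DETAIL)\*{0,2}:?\*{0,2}\s*"


def split_sections(text: str) -> dict[str, str]:
    """Split LLM output into sections keyed by heading name."""
    sections: dict[str, str] = {}
    current = None
    for line in text.splitlines():
        m = re.match(HEADING, line, flags=re.IGNORECASE)
        if m:
            current = m.group(1).upper()
            sections[current] = line[m.end():].strip()
        elif current is not None:
            sections[current] += ("\n" + line) if sections[current] else line
    return sections
```

With this in place, the DETAIL section is found regardless of whether the model emits a plain `DETAIL:` label or a markdown-styled heading.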
Large calldata blobs cause the model to lose track of formatting
instructions at the top of the prompt. Repeating the format reminder
at the end ensures the model sees it right before generating.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
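The idea can be sketched as follows; the reminder text and function name are hypothetical, not the repo's actual prompt:

```python
FORMAT_REMINDER = "Remember: respond with a 'TLDR:' line, then a 'DETAIL:' section."


def build_prompt(instructions: str, calldata: str) -> str:
    """Assemble the prompt with the format reminder stated twice.

    Large calldata blobs can push the opening instructions out of the model's
    effective attention, so the reminder is repeated after the blob, right
    before the model starts generating.
    """
    return "\n\n".join([instructions, FORMAT_REMINDER, calldata, FORMAT_REMINDER])
```

The duplication costs a few tokens but makes the output format far more reliable on long inputs.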
The Ruff formatter adds a space inside {1,4}, turning it into an invalid
regex quantifier. Extract the heading pattern to a separate variable marked
fmt: skip to prevent this.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
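The fix can be sketched like this. The variable name and pattern are illustrative; the point is that the quantifier lives on its own line, which the `# fmt: skip` suppression comment tells Ruff's formatter to leave exactly as written:

```python
import re

# Extracted to its own variable so the '{1,4}' quantifier survives formatting.
HEADING_PREFIX = r"#{1,4}\s*"  # fmt: skip


def is_markdown_heading(line: str) -> bool:
    """True for lines such as '### DETAIL' or '# TLDR:'."""
    return re.match(HEADING_PREFIX + r"(TLDR|DETAIL)", line) is not None
```

A space in the quantifier (`{1, 4}`) would stop it from being parsed as a bounded repetition, so isolating it behind `# fmt: skip` keeps the pattern stable across formatter runs.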
@spalen0 spalen0 merged commit 316bf6a into main Apr 7, 2026
2 checks passed
@spalen0 spalen0 deleted the feat/add-groq-llm-provider branch April 7, 2026 10:47
