fix(chat): Improve agent loop tracing #303

Open
obostjancic wants to merge 1 commit into getsentry:main from obostjancic:ognjenbostjancic/tet-2304-improve-juniors-instrumentation

Conversation

obostjancic (Member) commented May 7, 2026

  • Add a tracedStreamFn wrapper that creates a gen_ai.chat Sentry span for each LLM call inside the pi-agent-core agent loop, capturing input/output messages, token usage, finish reasons, and response model
  • Inject the wrapper into Agent via the streamFn option so spans nest naturally under the existing gen_ai.invoke_agent parent
  • Document the gen_ai.invoke_agent → gen_ai.chat span hierarchy rule in the tracing spec
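The wrapper described above can be sketched roughly as follows. All names and signatures here (StreamFn, the startSpan factory, the attribute keys) are illustrative assumptions, not the PR's actual code, and error handling is omitted for brevity:

```typescript
// Minimal sketch: wrap a base streamFn so every LLM call inside the agent
// loop gets its own gen_ai.chat span. `startSpan` stands in for something
// like Sentry.startInactiveSpan; the span interface is simplified.
interface ChatSpan {
  setAttribute(key: string, value: string | number): void;
  end(): void;
}

type StreamFn = (prompt: string) => Promise<{ text: string; stopReason: string }>;

function tracedStreamFn(
  base: StreamFn,
  startSpan: (op: string) => ChatSpan,
): StreamFn {
  return async (prompt) => {
    const span = startSpan("gen_ai.chat");
    span.setAttribute("gen_ai.request.messages", prompt);
    const result = await base(prompt);
    span.setAttribute("gen_ai.response.text", result.text);
    span.setAttribute("gen_ai.response.finish_reason", result.stopReason);
    span.end();
    return result;
  };
}
```

Because the wrapper is injected via the streamFn option, each span it opens nests under whatever span is active when the agent loop runs, which is how the gen_ai.chat spans end up under gen_ai.invoke_agent.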

Before:
(screenshot: image)

After:
(screenshot: CleanShot 2026-05-07 at 12 26 05)

NOTE: All of the code was written by Claude, based on Sentry skills, the manual instrumentation docs, and docs it fetched on demand, with a bit of steering from my side.

Each iteration of the pi-agent-core agent loop now produces its own
gen_ai.chat Sentry span via a traced streamFn wrapper. This gives
visibility into individual LLM calls (input/output messages, token
usage, finish reasons) nested under the parent gen_ai.invoke_agent
span.

Co-Authored-By: Claude <noreply@anthropic.com>

vercel Bot commented May 7, 2026

@obostjancic is attempting to deploy a commit to the Sentry Team on Vercel.

A member of the Team first needs to authorize it.


cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.


Reviewed by Cursor Bugbot for commit a9661dd.

```typescript
() => {
  span.end();
},
);
```


Span leaks if success callback throws unexpectedly

Low Severity

The stream.result().then(successHandler, rejectionHandler) creates a floating promise. If any statement inside the success handler (e.g., buildChatEndAttributes or span.setAttribute) throws before span.end() is reached, the span never ends (leaks) and the error becomes an unhandled promise rejection. Wrapping the success handler body in try/finally with span.end() in finally would ensure the span always closes, matching the defensive style of the catch block at line 114.
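The suggested fix can be sketched as follows: wrap the success handler's body in try/finally so the span always ends, and return the promise instead of letting it float. The Span interface and the attribute-record shape are illustrative stand-ins, not the PR's actual code:

```typescript
// Stand-in for the Sentry span surface used here.
interface Span {
  setAttribute(key: string, value: string): void;
  end(): void;
}

// Attach span termination to a result promise. Even if attribute-building
// throws in the success path, the finally block still closes the span, and
// the returned promise lets callers observe the failure instead of leaving
// an unhandled rejection behind.
function endSpanOnResult(
  result: Promise<Record<string, string>>,
  span: Span,
): Promise<void> {
  return result.then(
    (attrs) => {
      try {
        // Anything in here may throw (cf. buildChatEndAttributes in the diff)...
        for (const [key, value] of Object.entries(attrs)) {
          span.setAttribute(key, value);
        }
      } finally {
        span.end(); // ...but the span still closes
      }
    },
    () => {
      span.end(); // rejection path also closes the span
    },
  );
}
```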


```typescript
} catch (error) {
  span.end();
  throw error;
}
```


Error spans lack status, appearing successful in Sentry

Medium Severity

When base() throws (catch block) or stream.result() rejects (rejection handler), the span is ended without setting its status to error. Unlike Sentry.startSpan (used elsewhere via withSpan), startInactiveSpan has no automatic error-status propagation. Failed LLM calls will appear as successful in Sentry's trace waterfall, undermining the tracing improvement this PR aims to deliver. The same gap exists for the success path when stopReason is "error".
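A sketch of the suggested fix: explicitly set an error status before ending the span. The Span type below is a stand-in; in Sentry's JS SDK, span status is an object of the shape { code, message }, where code 2 denotes an error (mirroring OpenTelemetry's status codes):

```typescript
// Stand-in span surface; the real span would come from the Sentry SDK.
interface Span {
  setStatus(status: { code: number; message?: string }): void;
  end(): void;
}

// End a span after a failed LLM call so it shows as errored in the trace
// waterfall instead of appearing successful. Code 2 = error status.
function endSpanAsError(span: Span, message = "internal_error"): void {
  span.setStatus({ code: 2, message });
  span.end();
}

// Hypothetical usage in the wrapper's catch block / rejection handler:
// } catch (error) {
//   endSpanAsError(span);
//   throw error;
// }
```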


