
feat(node): Add AI SDK v7 support via diagnostics_channel #20896

Open
sergical wants to merge 2 commits into develop from feat/vercel-ai-v7-dc-telemetry

Conversation

@sergical
Member

Summary

  • Adds AI SDK v7 telemetry support using node:diagnostics_channel subscription
  • v7 publishes all telemetry events to 'aisdk:telemetry' regardless of which OTel integration the user registers — we subscribe and create spans directly with gen_ai.* attributes
  • v3-v6 path is unchanged (OTel instrumentation, version-gated at <7)
  • On v3-v6, the DC subscriber is inert — the channel is never published to
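
The v7 mechanism described above can be sketched with Node's `diagnostics_channel` API. This is a minimal sketch assuming only the channel name `'aisdk:telemetry'` from this PR; the recording handler and message shape are illustrative stand-ins for the real span-creating handlers:

```typescript
import { subscribe, channel } from 'node:diagnostics_channel';

// The message shape here is illustrative; the real handlers create Sentry
// spans with gen_ai.* attributes instead of recording into an array.
type TelemetryMessage = { name: string; event: unknown };

const received: TelemetryMessage[] = [];

// On v3-v6 nothing is ever published to this channel, so the subscriber
// stays inert; on v7 every telemetry event arrives here.
subscribe('aisdk:telemetry', message => {
  received.push(message as TelemetryMessage);
});

// Simulate what AI SDK v7 would do internally when telemetry fires:
channel('aisdk:telemetry').publish({ name: 'onStart', event: { operationId: 'ai.generateText' } });
```

Publishing on a diagnostics channel is synchronous, so subscribers observe the event before `publish` returns.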

New files

  • packages/node/src/integrations/tracing/vercelai/dc-handlers.ts — event handlers mapping DC events → Sentry spans
  • packages/node/src/integrations/tracing/vercelai/dc-subscriber.ts — subscribes to 'aisdk:telemetry' and dispatches to handlers

Test coverage (14 tests, all passing)

  • generateText with and without sendDefaultPii
  • streamText
  • embed
  • Tool calls and tool errors
  • ToolLoopAgent with functionId
  • telemetry: { isEnabled: false } suppresses spans
  • ESM and CJS for all scenarios

Known limitations

  • Edge runtime (@sentry/vercel-edge) does not support node:diagnostics_channel — v7 edge users are not covered by this PR
  • If a user explicitly registers LegacyOpenTelemetry on v7, both DC and OTel paths would fire (duplicate spans). This requires manual user action and is not the recommended v7 path.

Test plan

  • v7 integration tests pass (14/14) against ai@^7.0.0-canary
  • v6 integration tests still pass (10/10) — no regressions
  • yarn format, yarn lint, yarn build:dev pass
  • Unit tests for @sentry/node and @sentry/core pass
  • CI

🤖 Generated with Claude Code

AI SDK v7 publishes all telemetry events to node:diagnostics_channel
on 'aisdk:telemetry', regardless of which OpenTelemetry integration
the user registers. This subscribes to that channel and creates spans
directly with gen_ai.* attributes — no OTel span translation needed.

- v3-v6: existing OTel instrumentation path (unchanged)
- v7+: diagnostic channel subscriber creates spans from raw events
- On v3-v6, the DC subscriber is inert (channel never published to)

Handles: generateText, streamText, generateObject, streamObject,
embed, embedMany, rerank, tool execution, tool errors, ToolLoopAgent.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@github-actions
Contributor

github-actions Bot commented May 14, 2026

size-limit report 📦

Path Size % Change Change
@sentry/browser 26.92 kB - -
@sentry/browser - with treeshaking flags 25.35 kB - -
@sentry/browser (incl. Tracing) 44.97 kB - -
@sentry/browser (incl. Tracing + Span Streaming) 46.97 kB - -
@sentry/browser (incl. Tracing, Profiling) 49.97 kB - -
@sentry/browser (incl. Tracing, Replay) 84.59 kB - -
@sentry/browser (incl. Tracing, Replay) - with treeshaking flags 74.06 kB - -
@sentry/browser (incl. Tracing, Replay with Canvas) 89.3 kB - -
@sentry/browser (incl. Tracing, Replay, Feedback) 101.91 kB - -
@sentry/browser (incl. Feedback) 44.11 kB - -
@sentry/browser (incl. sendFeedback) 31.73 kB - -
@sentry/browser (incl. FeedbackAsync) 36.84 kB - -
@sentry/browser (incl. Metrics) 28.01 kB - -
@sentry/browser (incl. Logs) 28.16 kB - -
@sentry/browser (incl. Metrics & Logs) 28.84 kB - -
@sentry/react 28.67 kB - -
@sentry/react (incl. Tracing) 47.23 kB - -
@sentry/vue 31.84 kB - -
@sentry/vue (incl. Tracing) 46.83 kB - -
@sentry/svelte 26.94 kB - -
CDN Bundle 29.34 kB - -
CDN Bundle (incl. Tracing) 47.26 kB - -
CDN Bundle (incl. Logs, Metrics) 30.71 kB - -
CDN Bundle (incl. Tracing, Logs, Metrics) 48.38 kB - -
CDN Bundle (incl. Replay, Logs, Metrics) 70.07 kB - -
CDN Bundle (incl. Tracing, Replay) 84.72 kB - -
CDN Bundle (incl. Tracing, Replay, Logs, Metrics) 85.77 kB - -
CDN Bundle (incl. Tracing, Replay, Feedback) 90.56 kB - -
CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) 91.66 kB - -
CDN Bundle - uncompressed 86.47 kB - -
CDN Bundle (incl. Tracing) - uncompressed 142.29 kB - -
CDN Bundle (incl. Logs, Metrics) - uncompressed 90.67 kB - -
CDN Bundle (incl. Tracing, Logs, Metrics) - uncompressed 145.75 kB - -
CDN Bundle (incl. Replay, Logs, Metrics) - uncompressed 215.54 kB - -
CDN Bundle (incl. Tracing, Replay) - uncompressed 261.05 kB - -
CDN Bundle (incl. Tracing, Replay, Logs, Metrics) - uncompressed 264.49 kB - -
CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed 274.75 kB - -
CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) - uncompressed 278.18 kB - -
@sentry/nextjs (client) 49.74 kB - -
@sentry/sveltekit (client) 45.45 kB - -
@sentry/node-core 61.96 kB - -
@sentry/node 168.26 kB +0.81% +1.34 kB 🔺
@sentry/node - without tracing 74.37 kB +0.01% +1 B 🔺
@sentry/aws-serverless 109.18 kB - -
@sentry/cloudflare (withSentry) - minified 170.88 kB - -
@sentry/cloudflare (withSentry) 431.1 kB - -


@sergical sergical marked this pull request as ready for review May 14, 2026 22:46
@sergical sergical requested a review from a team as a code owner May 14, 2026 22:46
Comment thread packages/node/src/integrations/tracing/vercelai/dc-handlers.ts Outdated
Comment thread packages/node/src/integrations/tracing/vercelai/dc-subscriber.ts Outdated
Comment thread packages/node/src/integrations/tracing/vercelai/dc-handlers.ts
Comment thread packages/node/src/integrations/tracing/vercelai/dc-handlers.ts
@@ -0,0 +1,65 @@
import { subscribe } from 'node:diagnostics_channel';
Member

@logaretm logaretm May 14, 2026


m: The subscriber feels pretty thin, let's just merge both files here.

Member Author


Done — merged dc-subscriber.ts into dc-handlers.ts. The subscription logic and handler dispatch now live in a single file.

[AI-generated comment]

Member


Nothing changed here, did you push? 😅 If not, then maybe keep it for now; it looks like we will add a lot of handlers over there.

Member

@logaretm logaretm left a comment


Did a first pass, some minor concerns and questions.

Comment on lines +60 to +78
interface Usage {
  inputTokens?: number;
  outputTokens?: number;
  inputTokenDetails?: { cacheReadTokens?: number; cacheWriteTokens?: number };
  outputTokenDetails?: { reasoningTokens?: number };
}

interface ContentPart {
  type: string;
  text?: string;
  toolCallId?: string;
  toolName?: string;
  input?: unknown;
}

interface ToolCall {
  toolCallId: string;
  toolName: string;
  input?: unknown;
}
Member


m: we got quite a few of these types thrown around, let's move them out in a ./types.ts file to keep things focused here.

It would be cool to add link comments to their shape in the ai-sdk source so we can track them if something breaks. Or just copy their types over here.

Member Author


Done — moved `Usage`, `ContentPart`, and `ToolCall` to `./types.ts` (as `AiSdkUsage`, `AiSdkContentPart`, `AiSdkToolCall`). Added a source link comment pointing to the AI SDK telemetry source.

[AI-generated comment]
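
A sketch of what the resulting `./types.ts` might look like, assuming the renamed interfaces from the reply; the field shapes mirror the interfaces quoted earlier in this thread, and the source-link comment is a placeholder:

```typescript
// Shapes mirror the AI SDK v7 telemetry event payloads; see the ai-sdk
// telemetry source for the canonical definitions (link in the actual PR).
export interface AiSdkUsage {
  inputTokens?: number;
  outputTokens?: number;
  inputTokenDetails?: { cacheReadTokens?: number; cacheWriteTokens?: number };
  outputTokenDetails?: { reasoningTokens?: number };
}

export interface AiSdkContentPart {
  type: string;
  text?: string;
  toolCallId?: string;
  toolName?: string;
  input?: unknown;
}

export interface AiSdkToolCall {
  toolCallId: string;
  toolName: string;
  input?: unknown;
}
```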

@@ -0,0 +1,65 @@
import { subscribe } from 'node:diagnostics_channel';
Member


Nothing changed here, did you push? 😅 If not, then maybe keep it for now; it looks like we will add a lot of handlers over there.

Comment thread packages/node/src/integrations/tracing/vercelai/dc-handlers.ts
Comment on lines +26 to +41
function mapOperationName(operationId: string): string {
  switch (operationId) {
    case 'ai.generateText':
    case 'ai.streamText':
    case 'ai.generateObject':
    case 'ai.streamObject':
      return 'invoke_agent';
    case 'ai.embed':
    case 'ai.embedMany':
      return 'embeddings';
    case 'ai.rerank':
      return 'rerank';
    default:
      return operationId;
  }
}
Member


nitpick: a simple map would work better here, no?

const opMap = {
  'ai.generateText': 'invoke_agent',
  // ...
};

My reasoning is that fallthrough switches are a bit tricky to maintain or read at a glance. A map object gives us granularity and makes it easier to maintain.

Member Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Done — replaced the switch with a `Record<AiSdkOperationId, string>` map object + nullish coalescing fallback.

[AI-generated comment]
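
The mapping from the original switch, restated as the Record described in the reply; the `AiSdkOperationId` members match the cases quoted earlier, and the fallback behavior is unchanged:

```typescript
// Operation IDs published by the AI SDK, as listed in the original switch.
type AiSdkOperationId =
  | 'ai.generateText'
  | 'ai.streamText'
  | 'ai.generateObject'
  | 'ai.streamObject'
  | 'ai.embed'
  | 'ai.embedMany'
  | 'ai.rerank';

const OPERATION_NAME_MAP: Record<AiSdkOperationId, string> = {
  'ai.generateText': 'invoke_agent',
  'ai.streamText': 'invoke_agent',
  'ai.generateObject': 'invoke_agent',
  'ai.streamObject': 'invoke_agent',
  'ai.embed': 'embeddings',
  'ai.embedMany': 'embeddings',
  'ai.rerank': 'rerank',
};

// Unknown operation IDs fall back to the raw ID, matching the old default case.
function mapOperationName(operationId: string): string {
  return (OPERATION_NAME_MAP as Record<string, string | undefined>)[operationId] ?? operationId;
}
```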


const callStates = new Map<string, CallState>();

function mapOperationName(operationId: string): string {
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

l/nit: Can we type the expected string input/output here as string literals?

Member Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Done — added `AiSdkOperationId` string literal union type for the input and the map is typed as `Record<AiSdkOperationId, string>`.

[AI-generated comment]

Comment on lines +20 to +39
    case 'onStart':
      handleOnStart(msg.event);
      break;
    case 'onLanguageModelCallStart':
      handleOnLanguageModelCallStart(msg.event);
      break;
    case 'onLanguageModelCallEnd':
      handleOnLanguageModelCallEnd(msg.event);
      break;
    case 'onToolExecutionStart':
      handleOnToolExecutionStart(msg.event);
      break;
    case 'onToolExecutionEnd':
      handleOnToolExecutionEnd(msg.event);
      break;
    case 'onEnd':
      handleOnEnd(msg.event);
      break;
    case 'onError':
      handleOnError(msg.event);
Member


h: looks like they have more events for us to instrument here, specifically around steps.

Vercel has onStepStart / onStepFinish and parents LM/tool spans under the step. We should aim for parity with their otel telemetry.

Maybe we should wait a bit for them to finalize their events shape.

Member Author


Agreed — the dc-subscriber.ts dispatch map is easy to extend once the step events stabilize. We can add `onStepStart`/`onStepFinish` handlers as a follow-up once Vercel finalizes the event shape.

[AI-generated comment]
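
A hypothetical sketch of how such a dispatch map extends; the handler bodies are stand-ins, and `onStepStart`/`onStepFinish` are the not-yet-finalized event names mentioned in this thread:

```typescript
type EventHandler = (event: unknown) => void;

// Stand-in handlers that just record which event fired; the real handlers
// create and end Sentry spans.
const seen: string[] = [];
const record = (name: string): EventHandler => () => { seen.push(name); };

const handlers: Record<string, EventHandler> = {
  onStart: record('onStart'),
  onLanguageModelCallStart: record('onLanguageModelCallStart'),
  onEnd: record('onEnd'),
  // Follow-up once Vercel finalizes the event shape:
  // onStepStart: handleOnStepStart,
  // onStepFinish: handleOnStepFinish,
};

// Unknown event names are silently ignored rather than throwing.
function dispatch(msg: { name: string; event: unknown }): void {
  handlers[msg.name]?.(msg.event);
}

dispatch({ name: 'onStart', event: {} });
dispatch({ name: 'onUnknown', event: {} }); // ignored
```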

Comment on lines +94 to +97
function normalizeFinishReason(reason: unknown): string {
  if (typeof reason !== 'string') return 'stop';
  return reason === 'tool-calls' ? 'tool_call' : reason;
}
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

nit: Feels like the reason can be typed, WDYT?

Member Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Done — added `AiSdkFinishReason` string literal union type and `normalizeFinishReason` now takes that as input.

[AI-generated comment]
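
A sketch of the typed version, assuming a plausible `AiSdkFinishReason` union; only 'stop' and 'tool-calls' are confirmed by the code in this thread, the other members are assumptions:

```typescript
// Union members beyond 'stop' and 'tool-calls' are assumed, not confirmed.
type AiSdkFinishReason = 'stop' | 'length' | 'content-filter' | 'tool-calls' | 'error' | 'other' | 'unknown';

// Missing reasons default to 'stop'; 'tool-calls' is normalized to the
// snake_case form used by the gen_ai.* attribute conventions.
function normalizeFinishReason(reason: AiSdkFinishReason | undefined): string {
  if (typeof reason !== 'string') return 'stop';
  return reason === 'tool-calls' ? 'tool_call' : reason;
}
```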

Comment on lines +99 to +116
function buildOutputMessages(content: ContentPart[], finishReason: unknown): string | undefined {
  const parts: Record<string, unknown>[] = [];
  const text = content
    .filter(p => p.type === 'text' && p.text)
    .map(p => p.text)
    .join('');
  if (text) parts.push({ type: 'text', content: text });
  for (const tc of content.filter(p => p.type === 'tool-call')) {
    parts.push({
      type: 'tool_call',
      id: tc.toolCallId,
      name: tc.toolName,
      arguments: typeof tc.input === 'string' ? tc.input : JSON.stringify(tc.input ?? {}),
    });
  }
  if (parts.length === 0) return undefined;
  return JSON.stringify([{ role: 'assistant', parts, finish_reason: normalizeFinishReason(finishReason) }]);
}
Member


m: Maybe I'm paranoid, but since we are stitching the span lifecycle ourselves here, we should probably be careful with any JSON.stringify calls; if they throw, we break the span tree.

So maybe let's create a util file for the instrumentation with a safeJsonParse helper; you would need to consider reasonable fallbacks to use in case it does fail.

Same thing for any other parses here in this file.

Member Author


Good call — created `dc-utils.ts` with a `safeStringify` helper that returns `undefined` on failure. All `JSON.stringify` calls in dc-handlers.ts now go through it, with callers skipping the attribute when it returns `undefined`.

[AI-generated comment]
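
A minimal sketch of such a `safeStringify` helper; the name and the return-`undefined` contract come from the reply above, the implementation itself is an assumption:

```typescript
// Returns undefined instead of throwing (e.g. on circular references or
// BigInt values), so a failed serialization drops the attribute rather
// than breaking the span tree.
function safeStringify(value: unknown): string | undefined {
  try {
    return JSON.stringify(value);
  } catch {
    return undefined;
  }
}

// Callers skip the attribute when serialization fails, e.g.:
// const val = safeStringify(tools);
// if (val !== undefined) attributes['gen_ai.request.available_tools'] = val;
```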

- Merge dc-subscriber.ts into dc-handlers.ts
- Move DC event types (Usage, ContentPart, ToolCall) to types.ts with source links
- Replace switch with Record map for operation name mapping
- Add AiSdkFinishReason and AiSdkOperationId string literal types
- Add safeStringify utility to prevent JSON.stringify from breaking span tree
- Clean up child spans in handleOnEnd to prevent orphaned spans
- Guard error handling in handleOnError with instanceof check

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
if (val) attributes['gen_ai.request.available_tools'] = val;
}
}
state.inferenceSpan = startInactiveSpan({ name: `generate_content ${modelId}`, attributes });
Contributor


Bug: Child spans in Vercel AI tracing handlers are not parented correctly because startInactiveSpan is called without an explicit parent or an active span in the scope.
Severity: HIGH

Suggested Fix

Explicitly pass the parent span when creating child spans. For example, when creating inferenceSpan, call startInactiveSpan({ ..., parentSpan: state.rootSpan }). Apply the same fix for tool spans. Alternatively, wrap the handler logic in withActiveSpan(state.rootSpan, ...).

Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's
not valid.

Location: packages/node/src/integrations/tracing/vercelai/dc-handlers.ts#L162

Potential issue: In the Vercel AI diagnostic channel handlers, child spans
(`inferenceSpan` and tool spans) are created using `startInactiveSpan`. However, the
`rootSpan` is not set as the active span in the execution context. When
`startInactiveSpan` is called for the child spans without an explicit `parentSpan`
argument, it looks for an active span on the current scope. Since no active span is
found, the new spans are created as disconnected root spans instead of children of the
intended `rootSpan`. This results in a broken trace hierarchy, where users will see a
flat list of unrelated spans instead of a correctly nested trace tree.

Also affects:

  • packages/node/src/integrations/tracing/vercelai/dc-handlers.ts:207~210


@cursor cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.


Reviewed by Cursor Bugbot for commit 8129519.

for (const tc of content.filter(p => p.type === 'tool-call')) {
  const args = typeof tc.input === 'string' ? tc.input : safeStringify(tc.input ?? {});
  parts.push({ type: 'tool_call', id: tc.toolCallId, name: tc.toolName, arguments: args });
}


Multiple iterations over same content array

Low Severity

In buildOutputMessages, the content array is iterated four times: .filter(), .map(), .join() for text parts, then .filter() again for tool-call parts. A single classic for loop could accumulate both text and tool-call parts in one pass.
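
A single-pass version of that loop could look like this sketch; `ContentPart` mirrors the interface from earlier in the thread, `safeStringify` is assumed to be the PR's throw-safe helper, and `collectParts` is a hypothetical extraction of the accumulation logic:

```typescript
interface ContentPart {
  type: string;
  text?: string;
  toolCallId?: string;
  toolName?: string;
  input?: unknown;
}

interface OutputPart {
  type: string;
  content?: string;
  id?: string;
  name?: string;
  arguments?: string;
}

// Assumed throw-safe serializer, as discussed earlier in the review.
function safeStringify(value: unknown): string | undefined {
  try { return JSON.stringify(value); } catch { return undefined; }
}

// One pass over content: accumulate text and tool-call parts together,
// then prepend the joined text so the output order matches the original
// (text part first, then tool calls).
function collectParts(content: ContentPart[]): OutputPart[] {
  const parts: OutputPart[] = [];
  let text = '';
  for (const p of content) {
    if (p.type === 'text' && p.text) {
      text += p.text;
    } else if (p.type === 'tool-call') {
      parts.push({
        type: 'tool_call',
        id: p.toolCallId,
        name: p.toolName,
        arguments: typeof p.input === 'string' ? p.input : safeStringify(p.input ?? {}),
      });
    }
  }
  if (text) parts.unshift({ type: 'text', content: text });
  return parts;
}
```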


Triggered by project rule: PR Review Guidelines for Cursor Bot

