Work in this release was contributed by @dmmulroy and @SAY-5. Thank you for your contributions!
Beta release of the official Hono Sentry SDK
This marks the beta release of the @sentry/hono SDK. For details on how to use it, check out the Sentry Hono SDK docs. Please reach out on GitHub if you have any feedback or concerns.
feat(browser): Add ingest_settings to v2 log envelope payload (#20453)
Inference of user data (e.g. IP address, browser name/version) on log events is now gated behind the sendDefaultPii option. Previously, this data was always inferred by default.
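To keep the previous behavior of inferring user data on log events, opt in via sendDefaultPii. A minimal sketch; the DSN is a placeholder:

```javascript
import * as Sentry from '@sentry/browser';

Sentry.init({
  dsn: '__DSN__',
  // Opt in to inferring IP address, browser name/version, etc. on log events
  sendDefaultPii: true,
});
```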
* ingest_settings to v2 metrics envelope payload (#20454)
* ignoreSpans (#20595)
* processSegmentSpan to Deno context integration (#20613)
* processSegmentSpan to node context integration (#20678)
* bundle-analyzer-scenarios dev packages (#20680)

Work in this release was contributed by @sbs44. Thank you for your contribution!
feat(cloudflare): Add trace propagation for RPC method calls (#20343)
Trace context is now propagated across Cloudflare Workers RPC calls, connecting traces between Workers and Durable Objects.
This feature is opt-in and requires setting enableRpcTracePropagation: true in your SDK configuration:
// Worker
export default Sentry.withSentry(
env => ({
dsn: env.SENTRY_DSN,
enableRpcTracePropagation: true,
}),
handler,
);
// Durable Object
export const MyDurableObject = Sentry.instrumentDurableObjectWithSentry(
  env => ({
    dsn: env.SENTRY_DSN,
    enableRpcTracePropagation: true,
  }),
  MyDurableObjectClass, // your Durable Object class
);
* brace-expansion peer-dep (#20198)
* bin scripts (#20570)
* ignoreSpans (#20513)
* SENTRY_DSN (#20528)
* ingest_settings to span v2 envelope payload (#20411)
* httpContextIntegration (#20464)
* ignoreSpans (#20512)
* idleTimeout test config (#20467)

feat(effect): Support v4 beta (#20394)
The @sentry/effect integration now supports Effect v4 beta, enabling Sentry instrumentation for the latest Effect framework version.
Read more in the Effect SDK readme.
feat(hono): Add @sentry/hono/bun for Bun runtime (#20355)
A new @sentry/hono/bun entry point adds first-class support for running Hono applications instrumented with Sentry on the Bun runtime.
Read more in the Hono SDK readme.
feat(replay): Add replayStart/replayEnd client lifecycle hooks (#20369)
New replayStart and replayEnd client lifecycle hooks let you react to replay session start and end events in your application.
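A sketch of wiring these hooks up, assuming the standard client.on(...) hook registration API; the handler bodies are illustrative only:

```javascript
import * as Sentry from '@sentry/browser';

const client = Sentry.getClient();

// React when a replay session starts
client?.on('replayStart', () => {
  console.log('Replay session started');
});

// React when a replay session ends
client?.on('replayEnd', () => {
  console.log('Replay session ended');
});
```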
* no_parent_span client outcomes for discarded spans requiring a parent (#20350)
* GoogleGenAIIstrumentedMethod typo in type name
* httpVersion in outgoing request span attributes (#20430)
* makeRequestAndWaitForEnvelope to wait for envelopes (#20208)

feat(browser): Add View Hierarchy integration (#14981)
A new viewHierarchyIntegration captures the DOM structure when an error occurs, providing a snapshot of the page state for debugging. Enable it in your Sentry configuration:
import * as Sentry from '@sentry/browser';
Sentry.init({
dsn: '__DSN__',
integrations: [Sentry.viewHierarchyIntegration()],
});
feat(cloudflare): Split alarms into multiple traces and link them (#19373)
Durable Object alarms now create separate traces for each alarm invocation, with proper linking between related alarms for better observability.
feat(cloudflare): Enable RPC trace propagation with enableRpcTracePropagation (#19991, #20345)
A new enableRpcTracePropagation option enables automatic trace propagation for Cloudflare RPC calls via .fetch(), ensuring distributed traces flow correctly across service bindings.
* rewriteSources top level option (#20142)
* eventLoopBlockIntegration (#20108)
* conversation_id only on gen_ai spans (#20274)
* ai.operationId for Vercel AI V6 operation name mapping (#20285)
* reader.closed rejection from releaseLock() in streaming (#20187)

feat(aws-serverless): Ship Lambda extension in npm package for container image Lambdas (#20133)
The Sentry Lambda extension is now included in the npm package, enabling container image-based Lambda functions to use it. Copy the extension files into your Docker image and set the tunnel option:
RUN mkdir -p /opt/sentry-extension
COPY node_modules/@sentry/aws-serverless/build/lambda-extension/sentry-extension /opt/extensions/sentry-extension
COPY node_modules/@sentry/aws-serverless/build/lambda-extension/index.mjs /opt/sentry-extension/index.mjs
RUN chmod +x /opt/extensions/sentry-extension /opt/sentry-extension/index.mjs
Sentry.init({
dsn: '__DSN__',
tunnel: 'http://localhost:9000/envelope',
});
This works with any Sentry SDK (@sentry/aws-serverless, @sentry/sveltekit, @sentry/node, etc.).
feat(cloudflare): Support basic WorkerEntrypoint (#19884)
withSentry now supports instrumenting classes extending Cloudflare's WorkerEntrypoint. This instruments fetch, scheduled, queue, and tail handlers.
* npmrc pointing to Verdaccio (#20611)

feat(hono)!: Change setup for @sentry/hono/node (init in external file) (#20497)
To improve Node.js instrumentation, the sentry() middleware exported from @sentry/hono/node no longer accepts configuration options.
Instead, you must configure the SDK by calling Sentry.init() in a dedicated instrumentation file that runs before your application code (read more in the Hono SDK readme):
// instrument.mjs (or instrument.ts)
import * as Sentry from '@sentry/hono/node';
Sentry.init({
dsn: '__DSN__',
tracesSampleRate: 1.0,
});
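The other half of the setup might then look like the sketch below. The middleware registration and the run command are assumptions based on the description above, not verified against the readme:

```javascript
// app.mjs — start with: node --import ./instrument.mjs app.mjs
import { serve } from '@hono/node-server';
import { Hono } from 'hono';
import { sentry } from '@sentry/hono/node';

const app = new Hono();

// The middleware no longer takes options; all configuration
// happens in instrument.mjs via Sentry.init().
app.use(sentry());

app.get('/', c => c.text('Hello!'));

serve(app);
```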
feat(nitro): Add @sentry/nitro SDK (#19224)
A new @sentry/nitro package provides first-class Sentry support for Nitro applications, with HTTP handler and error instrumentation, middleware tracing, request isolation, and build-time source map uploading via withSentryConfig.
Read more in the Nitro SDK docs and the Nitro SDK readme.
* http.route attribute on segment spans (#20471)
* ignoreSpans (#20514)
* svelteKitSpansEnhancement integration (#20496)
* isSentryRequest handles subdomains properly (#20530)
* .use() middleware in sub-apps from .all() handlers (#20554)
* maskAttributes works with maskAllText=false (#20491)
* sendDefaultPii for supabase integration (#20490)
* safeSetSpanJSONAttributes in cultureContext integration (#20481)
* eslint-config-next package to major (#20552)
* honoIntegration (#20397)

feat(core): Add enableTruncation option to AI integrations (#20167, #20181, #20182, #20183, #20184)
All AI integrations (OpenAI, Anthropic, Google GenAI, LangChain, LangGraph) now support an enableTruncation option to control whether large AI inputs/outputs are truncated.
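As a sketch, assuming the option is passed to the integration itself (the exact placement is an assumption, not verified against each integration's docs):

```javascript
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__DSN__',
  integrations: [
    // Hypothetical placement: disable truncation of large AI inputs/outputs
    Sentry.openAIIntegration({ enableTruncation: false }),
  ],
});
```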
feat(opentelemetry): Vendor AsyncLocalStorageContextManager (#20243)
The OpenTelemetry context manager is now vendored internally, reducing external dependencies and ensuring consistent behavior across environments.
* findInjectionIndexAfterDirectives for better readability (#20310)
* @opentelemetry/resources with inline getSentryResource() (#20327)

A WorkerEntrypoint example for the feat(cloudflare): Support basic WorkerEntrypoint entry above:
import * as Sentry from '@sentry/cloudflare';
import { WorkerEntrypoint } from 'cloudflare:workers';
class MyWorker extends WorkerEntrypoint {
async fetch(request: Request): Promise<Response> {
return new Response('Hello World!');
}
}
export default Sentry.withSentry(env => ({ dsn: env.SENTRY_DSN, tracesSampleRate: 1.0 }), MyWorker);

ref(core): Unify .do* span ops to gen_ai.generate_content (#20074)
All Vercel AI do* spans (ai.generateText.doGenerate, ai.streamText.doStream, ai.generateObject.doGenerate, ai.streamObject.doStream) now use a single unified span op gen_ai.generate_content instead of separate ops like gen_ai.generate_text, gen_ai.stream_text, gen_ai.generate_object, and gen_ai.stream_object.
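If you filter or group AI spans by op in hooks, checks against the old per-operation ops need updating. A minimal sketch; the helper name is our own and the span op is passed as a plain string for illustration:

```javascript
// Span ops emitted before this change (now unified as 'gen_ai.generate_content')
const LEGACY_AI_GENERATION_OPS = new Set([
  'gen_ai.generate_text',
  'gen_ai.stream_text',
  'gen_ai.generate_object',
  'gen_ai.stream_object',
]);

// Returns true for both the new unified op and the legacy ops,
// so the check keeps working across SDK versions.
function isAiGenerationOp(op) {
  return op === 'gen_ai.generate_content' || LEGACY_AI_GENERATION_OPS.has(op);
}
```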
ref(core): Remove provider-specific AI span attributes in favor of gen_ai attributes in sentry conventions (#20011)
The following provider-specific span attributes have been removed from the OpenAI and Anthropic AI integrations. Use the standardized gen_ai.* equivalents instead:
| Removed attribute | Replacement |
|---|---|
| openai.response.id | gen_ai.response.id |
| openai.response.model | gen_ai.response.model |
| openai.usage.prompt_tokens | gen_ai.usage.input_tokens |
| openai.usage.completion_tokens | gen_ai.usage.output_tokens |
| openai.response.timestamp | (removed, no replacement) |
| anthropic.response.timestamp | (removed, no replacement) |
If you reference these attributes in hooks (e.g. beforeSendTransaction), update them to the gen_ai.* equivalents.
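A hypothetical helper illustrating that migration; the mapping mirrors the table above, and the function name is our own:

```javascript
// Removed provider-specific attributes and their gen_ai.* replacements
const ATTRIBUTE_REPLACEMENTS = {
  'openai.response.id': 'gen_ai.response.id',
  'openai.response.model': 'gen_ai.response.model',
  'openai.usage.prompt_tokens': 'gen_ai.usage.input_tokens',
  'openai.usage.completion_tokens': 'gen_ai.usage.output_tokens',
};

// The timestamp attributes were removed with no replacement, so drop them.
const REMOVED_WITHOUT_REPLACEMENT = new Set([
  'openai.response.timestamp',
  'anthropic.response.timestamp',
]);

// Rewrites legacy keys to their replacements, leaving other keys untouched.
function migrateAiAttributes(attributes) {
  const migrated = {};
  for (const [key, value] of Object.entries(attributes)) {
    if (REMOVED_WITHOUT_REPLACEMENT.has(key)) continue;
    migrated[ATTRIBUTE_REPLACEMENTS[key] ?? key] = value;
  }
  return migrated;
}
```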
feat(core): Support embeddings in LangChain (#20017)
Adds instrumentation for LangChain embeddings (embedQuery, embedDocuments), creating gen_ai.embeddings spans. In Node.js, embedding classes from @langchain/openai, @langchain/google-genai, @langchain/mistralai, and @langchain/google-vertexai are auto-instrumented. For other runtimes, use the new instrumentLangChainEmbeddings API:
import * as Sentry from '@sentry/cloudflare';
import { OpenAIEmbeddings } from '@langchain/openai';
const embeddings = Sentry.instrumentLangChainEmbeddings(new OpenAIEmbeddings({ model: 'text-embedding-3-small' }));
await embeddings.embedQuery('Hello world');