December 23, 2025
Fix model-level and runtime header support for LLM calls (#11303)
This fixes a bug where custom headers configured on models (like anthropic-beta) were not being passed through to the underlying AI SDK calls. The fix properly handles headers from multiple sources with correct priority:
Header Priority (low to high):

1. Provider-level headers (e.g. passed to createOpenAI)
2. Model config headers (set on the model object)
3. Runtime headers (passed via modelSettings at call time)

Examples that now work:
```typescript
// Model config headers
new Agent({
  model: {
    id: 'anthropic/claude-4-5-sonnet',
    headers: { 'anthropic-beta': 'context-1m-2025-08-07' },
  },
});

// Runtime headers override config
agent.generate('...', {
  modelSettings: { headers: { 'x-custom': 'runtime-value' } },
});

// Provider-level headers preserved
const openai = createOpenAI({ headers: { 'openai-organization': 'org-123' } });
new Agent({ model: openai('gpt-4o-mini') });
```
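The priority merge behind these examples can be sketched as a simple object spread where later sources win. This is an illustrative sketch, not Mastra's actual implementation; the `mergeHeaders` helper is hypothetical:

```typescript
// Hypothetical sketch of the header merge: later spreads override earlier ones.
type HeaderMap = Record<string, string>;

function mergeHeaders(
  providerHeaders: HeaderMap,    // lowest priority
  modelConfigHeaders: HeaderMap, // middle priority
  runtimeHeaders: HeaderMap,     // highest priority
): HeaderMap {
  return { ...providerHeaders, ...modelConfigHeaders, ...runtimeHeaders };
}

const merged = mergeHeaders(
  { 'openai-organization': 'org-123' },
  { 'anthropic-beta': 'context-1m-2025-08-07', 'x-custom': 'config-value' },
  { 'x-custom': 'runtime-value' },
);
// 'x-custom' resolves to 'runtime-value'; the provider and
// model-config headers that were not overridden survive.
```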
Add helpful JSDoc comments to BundlerConfig properties (used with bundler option) (#11300)
Fix telemetry disabled configuration being ignored by decorators (#11267)
The hasActiveTelemetry() function now properly checks the enabled configuration flag before creating spans. Previously, it only checked if a tracer existed (which always returns true in OpenTelemetry), causing decorators to create spans even when telemetry: { enabled: false } was set.
What changed:

- Updated hasActiveTelemetry() to check globalThis.__TELEMETRY__?.isEnabled() before checking for tracer existence

How to use:
```typescript
// Telemetry disabled at initialization
const mastra = new Mastra({
  telemetry: { enabled: false },
});

// Or disable at runtime
Telemetry.setEnabled(false);
```
Breaking changes: None - this is a bug fix that makes the existing API work as documented.
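The guard described above can be sketched as follows. This is a minimal illustration under assumed shapes (a global handle exposing isEnabled()), not the actual code in Mastra's telemetry package:

```typescript
// Sketch of an enabled-flag guard. The global handle shape is an assumption.
type TelemetryHandle = { isEnabled(): boolean };

function hasActiveTelemetry(): boolean {
  // Check the enabled flag first: OpenTelemetry always returns a tracer
  // (a no-op one if nothing is registered), so tracer existence alone
  // is not a reliable signal that telemetry is on.
  const handle = (globalThis as any).__TELEMETRY__ as TelemetryHandle | undefined;
  if (handle?.isEnabled() === false) return false;
  const tracerExists = true; // stand-in for the tracer lookup, which always succeeds
  return tracerExists;
}

(globalThis as any).__TELEMETRY__ = { isEnabled: () => false };
const shouldCreateSpan = hasActiveTelemetry(); // false while telemetry is disabled
```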
Allow bundler.externals: true to be set. (#11300)
With this configuration, mastra build treats all dependencies (except workspace dependencies) as "external" and does not bundle them. Instead, they are added to the .mastra/output/package.json file.
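A minimal sketch of the option, assuming the bundler config is passed to the Mastra constructor like other BundlerConfig properties:

```typescript
// Sketch: treat all non-workspace dependencies as external during `mastra build`.
export const mastra = new Mastra({
  bundler: {
    externals: true,
  },
});
```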
Fix the generate system prompt by updating a deprecated function call. (#11075)
Full Changelog: fd37787