Estimated end-of-life date, accurate to within three months: 05-2027. See the support level definitions for more information.
Deprecation Notes

ray.job.submit spans are removed. Ray job submission outcome is now reported on the existing ray.job span through ray.job.submit_status.

The pin parameter in ddtrace.contrib.dbapi.TracedConnection, ddtrace.contrib.dbapi.TracedCursor, and ddtrace.contrib.dbapi_async.TracedAsyncConnection is deprecated and will be removed in version 5.0.0. To manage the configuration of DB tracing, use integration configuration and environment variables.

DD_TRACE_INFERRED_PROXY_SERVICES_ENABLED is deprecated and will be removed in 5.0.0. Use DD_TRACE_INFERRED_SPANS_ENABLED instead. The old environment variable continues to work but emits a DDTraceDeprecationWarning when set.

New Features

profiling
ASM
The ddtrace.appsec.ai_guard.integrations.litellm.DatadogAIGuardGuardrail class can be registered as a custom guardrail in the LiteLLM proxy to evaluate requests and responses against AI Guard security policies. Requires the LiteLLM proxy guardrails API v2, available since litellm>=1.46.1.

azure_cosmos
CI Visibility
Set DD_AGENTLESS_LOG_SUBMISSION_ENABLED=true for agentless setups, or DD_LOGS_INJECTION=true when using the Datadog Agent.

llama_index
Adds support for llama-index-core>=0.11.0. Traces LLM calls, query engines, retrievers, embeddings, and agents. See the llama_index documentation for more information.

tracing
Set OTEL_TRACES_EXPORTER=otlp to send spans to an OTLP endpoint instead of the Datadog Agent.

mysql
Adds tracing support for mysql.connector.aio.connect in the MySQL integration.

LLM Observability
Adds a decorator tag to LLM Observability spans that are traced by a function decorator.

pydantic_evals

Adds support for using a ReportEvaluator as a summary evaluator when its evaluate return annotation is exactly ScalarResult. The scalar value is recorded as the summary evaluation. Report evaluators that declare a broader analysis return type (for example, the full ReportAnalysis union) are not accepted as summary evaluators; use a class-based or function summary evaluator instead. Examples and further documentation can be found in our documentation [here](https://docs.datadoghq.com/llm_observability/guide/evaluation_developer_guide).

Example:
from pydantic_evals.evaluators import EqualsExpected  # built-in equality evaluator
from pydantic_evals.evaluators import ReportEvaluator
from pydantic_evals.evaluators import ReportEvaluatorContext
from pydantic_evals.reporting.analyses import ScalarResult

from ddtrace.llmobs import LLMObs

dataset = LLMObs.create_dataset(
    dataset_name="<DATASET_NAME>",
    description="<DATASET_DESCRIPTION>",
    records=[RECORD_1, RECORD_2, RECORD_3, ...],
)


class TotalCasesEvaluator(ReportEvaluator):
    # The return annotation must be exactly ScalarResult for this class
    # to be accepted as a summary evaluator.
    def evaluate(self, ctx: ReportEvaluatorContext) -> ScalarResult:
        return ScalarResult(
            title='Total Cases',
            value=len(ctx.report.cases),
            unit='cases',
        )


def my_task(input_data, config):
    return input_data["output"]


equals_expected = EqualsExpected()
summary_evaluator = TotalCasesEvaluator()

experiment = LLMObs.experiment(
    name="<EXPERIMENT_NAME>",
    task=my_task,
    dataset=dataset,
    evaluators=[equals_expected],
    summary_evaluators=[summary_evaluator],
    description="<EXPERIMENT_DESCRIPTION>",
)
result = experiment.run()
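The "exactly ScalarResult" rule above hinges on inspecting the evaluator's return annotation. A minimal sketch of that check in plain Python, using stub classes in place of the pydantic_evals types and a hypothetical is_scalar_summary_evaluator helper (not the SDK's actual logic):

```python
import typing


class ScalarResult:
    """Stub standing in for pydantic_evals' ScalarResult."""


class ReportAnalysis:
    """Stub standing in for the broader report-analysis type."""


def is_scalar_summary_evaluator(evaluator) -> bool:
    # Accept only evaluators whose `evaluate` return annotation is
    # exactly ScalarResult; broader unions are rejected.
    hints = typing.get_type_hints(type(evaluator).evaluate)
    return hints.get("return") is ScalarResult


class ScalarEvaluator:
    def evaluate(self, ctx) -> ScalarResult:
        return ScalarResult()


class BroadEvaluator:
    def evaluate(self, ctx) -> typing.Union[ScalarResult, ReportAnalysis]:
        return ScalarResult()


print(is_scalar_summary_evaluator(ScalarEvaluator()))  # True
print(is_scalar_summary_evaluator(BroadEvaluator()))   # False
```

Checking identity (`is ScalarResult`) rather than subclass or union membership is what makes the acceptance criterion strict: an evaluator annotated with the full analysis union is rejected even though it may in fact return a scalar.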
Bug Fixes

Fixed an issue where a ModuleNotFoundError could be raised at startup in Python environments without the _ctypes extension module.

Fixed an issue where spans such as invoke_agent were incorrectly appearing as siblings of their SDK parent span (e.g., call_agent) rather than being nested under it.

Fixed an issue where model_name and model_provider were reported on AWS Bedrock LLM spans as the full model_id identifier value (e.g., "amazon.nova-lite-v1:0") and "amazon_bedrock", respectively. Bedrock spans' model_name and model_provider now correctly match backend pricing data, which enables features including cost tracking.

Fixed an issue where deferred tool loading (defer_loading=True) in the Anthropic and OpenAI integrations caused LLMObs span payloads to include full tool descriptions and JSON schemas for every tool in a large catalog. Deferred tool definitions now have their description and schema stripped from span metadata, with only the tool name preserved.

Fixed an issue where abrupt process termination (os._exit, SIGKILL, segfault) caused buffered test events to be lost. To enable eager flushing, set DD_TRACE_PARTIAL_FLUSH_MIN_SPANS=1.

Fixed an issue where a failed request to the /search_commits endpoint caused the git metadata upload to fall back to sending the full 30-day commit history instead of aborting. This fallback could trigger cascading write load on the backend. The upload now aborts when search_commits fails, matching the behavior when the /packfile upload itself fails.

Fixed a missing early return in the IAST taint tracking add_aspect native function that caused redundant work when only the right operand of a string concatenation was tainted.

Fixed an issue affecting Task.replace().

Fixed a RuntimeError: coroutine ignored GeneratorExit that occurred under ASGI with async views and async middleware hooks on Python 3.13+. Async view methods and middleware hooks are now correctly detected and awaited instead of being wrapped with sync bytecode wrappers.

Fixed the svc.auto process tag attribution logic. The tag now correctly reflects the auto-detected service name derived from the script or module entrypoint, matching the service name the tracer would assign to spans.

Fixed an issue where running python -m <module> could report entrypoint.name as -m in process tags.

Fixed an issue where the network.client.ip and http.client_ip span tags were missing when client IP collection was enabled and the request had no headers.

Fixed an issue affecting Lambda spans when DD_SERVICE is not explicitly configured. Service remapping rules configured in Datadog will now apply correctly to Lambda spans.
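The python -m fix above comes down to how an entrypoint name is derived from the command line. A hedged sketch of that derivation with a hypothetical detect_entrypoint helper (not dd-trace-py's actual implementation):

```python
def detect_entrypoint(argv):
    """Derive an entrypoint name from a python command line.

    `python -m mypkg ...` should report "mypkg", not the literal "-m";
    `python app.py ...` should report "app.py".
    """
    args = argv[1:]  # drop the interpreter itself
    if args and args[0] == "-m":
        # The entrypoint is the module name that follows -m, not "-m".
        return args[1] if len(args) > 1 else None
    return args[0] if args else None


print(detect_entrypoint(["python", "-m", "mypkg"]))  # mypkg
print(detect_entrypoint(["python", "app.py"]))       # app.py
```

The bug described above is exactly what a naive `argv[1]` lookup would produce: for `python -m mypkg` it yields the flag "-m" instead of the module name that follows it.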