

Strands Agents SDK is an open-source framework from AWS for building agentic applications with tool use, memory, and model abstraction. Arize AX captures every Strands agent run — agent invocations, tool calls, model calls — via the openinference-instrumentation-strands-agents processor, which converts Strands’ native OpenTelemetry spans into OpenInference format.

Prerequisites

  • Python 3.10+
  • An Arize AX account (sign up)
  • An AWS account with Bedrock model access enabled for the model you want to call (the example uses Anthropic Claude Sonnet 4.6 — request access from the Bedrock console under Model access if you haven’t already)

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel \
  openinference-instrumentation-strands-agents \
  strands-agents boto3

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="strands-agents-tracing-example"

# AWS credentials — long-lived or SSO/STS temporary.
export AWS_ACCESS_KEY_ID="<your-aws-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-aws-secret-access-key>"
export AWS_REGION="us-east-1"

# Only required for SSO / STS / federated logins.
export AWS_SESSION_TOKEN=""  # optional
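A common failure mode is running the example in a shell where these variables were never exported. Before wiring anything up, a quick pre-flight check can catch that early. A minimal sketch; the helper name `missing_vars` is ours, not part of any SDK:

```python
import os

# Variables the instrumentation script reads at startup.
REQUIRED = ["ARIZE_SPACE_ID", "ARIZE_API_KEY", "ARIZE_PROJECT_NAME"]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
```

Run it in the same shell you will use for example.py; an empty result means the Arize side of the configuration is in place.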

Set up tracing

Strands emits OpenTelemetry spans natively through StrandsTelemetry. The OpenInference processor (StrandsAgentsToOpenInferenceProcessor) reshapes those spans into the OpenInference semantic-convention layout Arize AX expects, and the OTLP exporter then ships them to Arize AX.

# instrumentation.py
import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider

from strands.telemetry import StrandsTelemetry
from openinference.instrumentation.strands_agents import (
    StrandsAgentsToOpenInferenceProcessor,
)

# Build a TracerProvider with the Arize project name in its Resource —
# that's how a trace gets routed to a project on the Arize side.
resource = Resource.create({
    "openinference.project.name": os.environ["ARIZE_PROJECT_NAME"],
    "service.name": "strands-agents-tracing-example",
})
provider = TracerProvider(resource=resource)
provider.add_span_processor(StrandsAgentsToOpenInferenceProcessor())

# IMPORTANT: register the provider globally. Strands' `Agent` calls
# `opentelemetry.trace.get_tracer(...)`, which uses the global provider.
# StrandsTelemetry(tracer_provider=...) stores it but does NOT register
# it as global — you have to do that yourself.
trace.set_tracer_provider(provider)

# Hand the provider to StrandsTelemetry and wire the OTLP exporter to
# Arize. setup_otlp_exporter uses HTTP/Protobuf, so the endpoint is
# the HTTP /v1/traces path.
telemetry = StrandsTelemetry(tracer_provider=provider)
telemetry.setup_otlp_exporter(
    endpoint="https://otlp.arize.com/v1/traces",
    headers={
        "authorization":   os.environ["ARIZE_API_KEY"],
        "arize-space-id":  os.environ["ARIZE_SPACE_ID"],
        "arize-interface": "python",
    },
)
print("Arize AX tracing initialized for Strands Agents.")
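The global-registration pitfall flagged in the comments above comes down to how OpenTelemetry resolves tracers: library code asks a module-level global for a tracer, so a provider you build but never register is simply never consulted. The stand-in sketch below illustrates that mechanism; these classes and functions are illustrative stand-ins, not the real opentelemetry API:

```python
# Illustrative stand-in for OpenTelemetry's global-provider lookup.
# Library code calls get_tracer(), which reads a module-level provider;
# an unregistered provider is never consulted, so spans go to a no-op.

class NoOpProvider:
    def get_tracer(self, name):
        return "no-op tracer"          # emits nothing

class ConfiguredProvider:
    def get_tracer(self, name):
        return f"tracer:{name}"        # emits real spans

_global_provider = NoOpProvider()      # the default until registration

def set_tracer_provider(provider):     # analogue of trace.set_tracer_provider
    global _global_provider
    _global_provider = provider

def get_tracer(name):                  # analogue of trace.get_tracer
    return _global_provider.get_tracer(name)

# Constructing a provider alone changes nothing...
configured = ConfiguredProvider()
assert get_tracer("strands") == "no-op tracer"

# ...only registration makes library code pick it up.
set_tracer_provider(configured)
assert get_tracer("strands") == "tracer:strands"
```

This is why `trace.set_tracer_provider(provider)` is the one line you cannot skip in instrumentation.py.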

Run Strands

# example.py

# Importing instrumentation first ensures the global TracerProvider is
# set before any Strands Agent is constructed.
from instrumentation import provider

from strands import Agent
from strands.models.bedrock import BedrockModel

# BedrockModel reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY /
# AWS_REGION (and AWS_SESSION_TOKEN if present) from the environment.
# Cross-region inference profile (`us.` prefix) is required for newer
# Claude models on Bedrock.
agent = Agent(
    name="OceanAssistant",
    model=BedrockModel(model_id="us.anthropic.claude-sonnet-4-6"),
    system_prompt="Answer concisely in two sentences.",
)

result = agent("Why is the ocean salty?")
print(str(result))

# Force flush + shutdown — without these, the BatchSpanProcessor inside
# StrandsTelemetry can drop late spans when the process exits.
provider.force_flush(timeout_millis=10000)
provider.shutdown()

Expected output

Arize AX tracing initialized for Strands Agents.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.

Verify in Arize AX

  1. Open your Arize AX space and select project strands-agents-tracing-example.
  2. You should see a new trace within ~30 seconds with this shape: an invoke_agent <agentName> root span (AGENT) wrapping an execute_event_loop_cycle span (CHAIN) and a chat LLM span (model us.anthropic.claude-sonnet-4-6, prompt + response + token usage attached).
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • The agent runs but no spans appear. The most common cause is omitting trace.set_tracer_provider(provider). StrandsTelemetry(tracer_provider=...) accepts the provider but does not register it as global, so Strands’ internal trace.get_tracer(...) falls through to the no-op provider and emits no spans. Always call set_tracer_provider yourself.
  • AccessDeniedException from Bedrock. Your IAM principal needs bedrock:InvokeModel permission, and the foundation model in the example needs to be enabled under Model access in the Bedrock console.
  • ValidationException: Invocation of model ID anthropic.claude-sonnet-4-6 ... isn't supported. Newer Claude models on Bedrock require cross-region inference profiles. Prefix the model id with a geography slug — us.anthropic.claude-sonnet-4-6 (the example uses this) or eu.anthropic.claude-sonnet-4-6.
  • ExpiredTokenException. SSO / STS temporary credentials expired. Refresh and re-export AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN.
  • Spans dropped at process exit. Always provider.force_flush(...) and provider.shutdown() before the script returns; otherwise the OTLP HTTP exporter’s background batch can be cut off.
  • Other model providers. Strands ships adapters for OpenAI (strands.models.openai.OpenAIModel), Anthropic native (strands.models.anthropic.AnthropicModel), and others. The same StrandsAgentsToOpenInferenceProcessor covers every adapter — only the model construction changes.
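The flush-and-shutdown pattern from the "Spans dropped at process exit" bullet can be made crash-proof by registering it with atexit, so it runs even when the script raises partway through. A minimal sketch using a stand-in provider object; real code would pass the TracerProvider built in instrumentation.py instead:

```python
import atexit

class StubProvider:
    """Stand-in for a TracerProvider, recording calls for illustration."""
    def __init__(self):
        self.flushed = False
        self.shut_down = False

    def force_flush(self, timeout_millis=10000):
        self.flushed = True
        return True

    def shutdown(self):
        self.shut_down = True

provider = StubProvider()

def flush_and_shutdown():
    # Flush pending span batches first, then release exporter resources.
    provider.force_flush(timeout_millis=10000)
    provider.shutdown()

# Runs at interpreter exit, including after uncaught exceptions.
atexit.register(flush_and_shutdown)
```

With this in place you no longer depend on execution reaching the explicit `provider.force_flush(...)` / `provider.shutdown()` calls at the bottom of example.py.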

Resources

Strands Agents Documentation

OpenInference Strands Instrumentor

Strands Agents SDK

Restaurant Assistant Tutorial (full agent with KB + tools)